Hi Antimo,
first, sorry, I misremembered: we do have Glasses 2, so the cable should indeed work. (The reason for my confusion is that only a few weeks after we ordered our Glasses 2 the next generation was announced, and I was annoyed that Tobii didn’t indicate before ordering that a new version was to be expected very soon; now I only remember that we have the “old” version and constantly mix up the numbers.) Note that I haven’t touched the system for some months; I hope my other memories are more accurate.
So far, I have one PC with Matlab/Psychtoolbox to display the images and one computer controlling the glasses, to which I send the trigger via TTL. What else do I need?
The suggested USB cable directly connects a USB port on the PTB PC to the 3.5 mm jack of the Glasses Recording unit. There is no other computer or any network involved. The trigger is only one bit, but timing (latency and jitter) is expected to be very good (in the low-millisecond range).
You have to open the serial port; typically I keep it open during the whole block/experiment. The port address depends on the OS (this example is for Ubuntu Linux with a single USB-serial adapter):
Cfg.porthandle = IOPort( 'OpenSerialPort', '/dev/ttyUSB0', 'FlowControl=None,BaudRate=300' );
BaudRate defines the TTL pulse width. It could possibly be optimized, but this setting worked well for us. You can then send a TTL pulse:
[ nwritten, when, errmsg ] = IOPort('Write', Cfg.porthandle, uint8( 255 ) );
and close the port at the end of the block/experiment:
IOPort( 'CloseAll' );
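Putting the three calls together, a block might look like the following minimal sketch (the port path is the Linux example from above and needs adapting to your OS; the error check around the write is my addition, not required):

```matlab
% Minimal sketch of the full TTL sequence. '/dev/ttyUSB0' assumes a single
% USB-serial adapter on Linux; adjust the port address for your system.
Cfg.porthandle = IOPort('OpenSerialPort', '/dev/ttyUSB0', ...
    'FlowControl=None,BaudRate=300');

% ... present stimulus ...

% Send the TTL pulse. 'when' is the PTB GetSecs time of the write,
% useful for logging the trigger time alongside your stimulus timestamps.
[nwritten, when, errmsg] = IOPort('Write', Cfg.porthandle, uint8(255));
if nwritten ~= 1
    warning('TTL trigger not written: %s', errmsg);
end

% ... rest of the block/experiment ...

IOPort('CloseAll');  % release the port at the end of the block
```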
One thing I don’t fully understand is the network/REST API that you mention in your reply. Where do you send the network signal? I think I’m missing one piece of the setup at the moment. Could you please help me understand?
The Recording unit also provides an ethernet port and a WiFi access point for remote control of the Recording unit. The ethernet port allows lower latencies but is IPv6 only (IIRC; it caused some trouble with our PTB PC). We connect from the PTB PC via WiFi (IPv4) and curl to the Recording unit’s REST API. From PTB you can then start, pause, and stop recordings, calibrate, get the tracker status, set project/participant information, and send event information. These event messages contain an “external” timestamp (here, PTB GetSecs time, typically relative to block start; “external” from the eye-tracker’s perspective) and a character string describing the event (here, an integer number converted to char to keep the events identical with the EEG triggers). As they go over the WiFi network, they typically introduce latencies and jitter in the range of at least tens of milliseconds and are thus unsuitable for precise timing. (Side note: I found the MATLAB curl/json/REST implementation really sluggish, so we use curl via system commands instead.)
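To make the curl-via-system approach concrete, here is a rough sketch. Be warned that the IP address, endpoint path, and JSON field names below are assumptions from memory; take the exact format from Tobii’s API documentation:

```matlab
% Sketch: sending an event to the Recording unit's REST API via a curl
% system command (MATLAB's own REST/json calls were too sluggish for us).
% NOTE: address, endpoint, and JSON fields are assumptions; verify against
% Tobii's API documentation.
glassesAddr = '192.168.71.50';        % assumed default WiFi IPv4 address
eventTag    = '42';                   % integer-as-string, matching the EEG trigger
extTime     = GetSecs() - blockStart; % "external" timestamp, relative to block start

payload = sprintf('{"type":"JsonEvent","tag":"%s","ets":%.6f}', eventTag, extTime);
cmd = sprintf(['curl -s -X POST -H "Content-Type: application/json" ' ...
    '-d ''%s'' http://%s/api/events'], payload, glassesAddr);
[status, out] = system(cmd);
if status ~= 0
    warning('API event failed: %s', out);
end
```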
Both event types, TTL and API events, are stored in the raw data (including eye-tracker timestamps) and imported by Dee’s GlassesViewer. If you send both types for each event, you can match TTL and API events offline, taking the timing from the TTL pulse and the event information from the matched API event.
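The offline matching could be as simple as the following sketch (variable names are hypothetical: ttlT and apiT are eye-tracker timestamps in seconds of the TTL and API events as exported by GlassesViewer, apiTag the corresponding event strings):

```matlab
% For each TTL pulse, take the nearest later API event: timing comes from
% the TTL, the event label from the matched API event.
matchedTag = cell(size(ttlT));
for i = 1:numel(ttlT)
    dt = apiT - ttlT(i);
    dt(dt < 0) = Inf;            % the API event arrives after its TTL pulse
    [lag, j] = min(dt);
    if lag < 0.5                 % sanity window of 500 ms; tune to your jitter
        matchedTag{i} = apiTag{j};
    end
end
```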
The tablet typically used with the Glasses actually also connects to this REST API to control recordings etc., and to the livestream API for the camera signals.
Does this make the general architecture somewhat clearer? Dee, please correct me if I misremember any details! The API documentation can be downloaded from Tobii.
Best,
Andreas