Hi all,
As I'm new to this group (and to the use of the toolboxes) this will
probably be one of those RTFM questions, but I couldn't find an answer
after searching the archives, so I'll just post another message hoping
someone can point me in the right direction. Very sorry in advance if
it is indeed an RTFM question. :)
Here goes - I've programmed an experiment in Matlab using another
toolbox called Cogent Graphics. However, now that I want to use direct
feedback from the EyeLink II, I'm running into trouble and I'm
looking into other toolboxes.
What I need to do is the following: as soon as a subject fixates a
target, my stimulus has to change position (kind of like the double-step
paradigm). So just to make it clear: it should use the data it
receives from the EyeLink tracker in real time to alter the screen.
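To make that more concrete, here is a rough sketch of the kind of loop I have in mind, assuming the Eyelink Toolbox's Eyelink('NewFloatSampleAvailable') / Eyelink('NewestFloatSample') calls and standard Psychtoolbox Screen drawing; the target positions, fixation window size, dwell time, and eye index are just made-up placeholders:

% Very rough sketch -- assumes the tracker is already initialized,
% calibrated, and recording.
win = Screen('OpenWindow', 0);

targetPos   = [400 300];   % initial target position in pixels (placeholder)
steppedPos  = [600 300];   % position after the "double step" (placeholder)
fixWindow   = 50;          % fixation tolerance in pixels (placeholder)
dwellNeeded = 0.100;       % required dwell time in seconds (placeholder)
eye         = 1;           % index of the tracked eye in the sample (adjust for your setup)

dwellStart = -1;
stepped = false;

while ~stepped
    % draw the target at its current position
    Screen('FillOval', win, 255, ...
        [targetPos(1)-10, targetPos(2)-10, targetPos(1)+10, targetPos(2)+10]);
    Screen('Flip', win);

    % poll the newest gaze sample from the tracker
    if Eyelink('NewFloatSampleAvailable') > 0
        s  = Eyelink('NewestFloatSample');
        gx = s.gx(eye);    % gaze position in tracker screen coordinates
        gy = s.gy(eye);

        % has gaze stayed inside the window around the target long enough?
        if abs(gx - targetPos(1)) < fixWindow && abs(gy - targetPos(2)) < fixWindow
            if dwellStart < 0
                dwellStart = GetSecs;          % gaze just entered the window
            elseif GetSecs - dwellStart > dwellNeeded
                targetPos = steppedPos;        % fixation detected: step the target
                stepped = true;
            end
        else
            dwellStart = -1;                   % gaze left the window, reset the timer
        end
    end
end

% redraw the target at its new position
Screen('FillOval', win, 255, ...
    [targetPos(1)-10, targetPos(2)-10, targetPos(1)+10, targetPos(2)+10]);
Screen('Flip', win);

In particular I'm not sure whether the gaze coordinates in s.gx/s.gy come back in the same pixel coordinates Psychtoolbox draws in, which is part of what I'm asking about below.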
I was wondering if anyone in this group has had any experience
programming an experiment like the one I described above - one that
uses "real-time" data from the tracker in the experiment, and one that
uses the Psychtoolbox and the Eyelink Toolbox for Matlab (on Windows
XP with Matlab 6.5 or 7.1).
I'm particularly looking for information on how easy or hard it would
be to set up an experiment like that in the toolboxes I mentioned, and
on how the coordinate systems of the two toolboxes work together.
Thanks in advance guys, and keep up the great work!
- Arthur