Further, it looks like the HTC Vive Pro Eye has some additional eye tracking benefits via SRanipal.
Has anyone had any experience with PTB and eye tracking? If so, which headsets?
In an ideal scenario we would like to do binocular eye tracking, blinks, and pupil dilation.
Looks like the following headsets might support eye tracking via SRanipal? Though I am not 100% sure. The HTC Vive Pro Eye is a little old now, so ideally it would be nice to get a newer model headset, but one known to work with PTB.
Not much practical guidance I can give. My sole experience is with the HTC Vive Pro Eye, which indeed is quite old nowadays. I chose that one because it was the only HMD with eye tracking good enough to develop PTB’s current support, and one could rent it - we were too poor to buy any HMD with eye tracking. It was also old/low-spec enough to have hardware requirements low enough for my 500 Euro ALDI PC with 8 GB RAM, an onboard AMD gpu and a discrete NVidia GTX 1650 gpu, for testing on AMD and NVidia under both Linux and Windows-10, without sucking too much performance-wise. Even then the setup was barely workable for my development and testing purposes on Windows-10, because my machine was only just capable of running it. Lack of money was the main constraint, as so often.
Therefore, the old HTC Vive Pro Eye is the only one tested by myself with eye tracking for PTB under Windows-10 and later, so everything else from my side is also just guesswork, based on looking at these links and googling.
That said…
PTB’s OpenXR eyetracking should work with any PC VR HMD that supports OpenXR and the OpenXR XR_EXT_eye_gaze_interaction extension, which is likely supported by any modern VR HMD with eye tracking. It is the same extension that other VR game engines etc. also use. The downside is that this extension is rather basic and restricted: It only gives you monocular gaze samples, i.e. either from a monocular eye tracker, or, in the case of binocular eye trackers, likely a single gaze vector derived via integration / sensor fusion of the gaze data from both eyes, e.g., a weighted average of both eyes, creating a “cyclops eye” located between both real eyes, or similar. On top of that you only get a binary “tracked / not tracked” confidence indicator. There is no info about pupil diameter, eye opening etc., and eye gaze timestamps may or may not be interpolated or extrapolated between real hardware gaze samples, depending on the eyetracking runtime.
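To make the above concrete, here is a minimal, hedged sketch of how one might query such monocular gaze samples from Psychtoolbox’s OpenXR support. It assumes PTB 3.0.19+ with a working OpenXR runtime; the 'Eyetracking' requirement keyword, the reqmask bit, and the gaze-related field names (gazeStatus, gazePos, gazeDir) follow my reading of the PsychVRHMD documentation and may differ on your installation, so treat them as assumptions and consult "help PsychVRHMD" and the bundled VREyetrackingTest before relying on them:

```matlab
% Hedged sketch: query monocular "cyclops eye" gaze samples via PTB's
% OpenXR support. Assumptions to verify against "help PsychVRHMD" on
% your setup: the 'Eyetracking' basicRequirements keyword, reqmask
% bit +4 for gaze data, and the state.gazeStatus / gazePos / gazeDir
% field names and status bit meanings.
PsychDefaultSetup(2);
PsychImaging('PrepareConfiguration');
hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR', 'Eyetracking');
[win, winRect] = PsychImaging('OpenWindow', max(Screen('Screens')), 0);

while ~KbCheck
    % reqmask +4 requests eye gaze samples in addition to head tracking:
    state = PsychVRHMD('PrepareRender', hmd, [], 1 + 4);
    if bitand(state.gazeStatus(1), 2)
        % Assumed: bit 2 signals a valid, tracked gaze sample. All the
        % extension delivers is one ray: origin + unit direction vector.
        pos = state.gazePos{1};
        dir = state.gazeDir{1};
        fprintf('Gaze origin [%.2f %.2f %.2f] dir [%.2f %.2f %.2f]\n', ...
                pos(1:3), dir(1:3));
    end
    Screen('Flip', win);
end
sca;
```

Note that this is all the data the extension provides: no per-eye samples, no pupil diameter, no eye opening, just the fused gaze ray and a coarse validity flag.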
The SRAnipal eyetracking, which only works on MS-Windows 10+ with HTC VR HMDs, provides binocular eye tracking, pupil diameter, an estimate of eye opening, and gaze timestamps at up to 120 Hz. My guess would be that the HMDs you linked (+ their extra gaze tracker add-ons, for HMDs other than the Vive Focus Vision) would work with SRAnipal, mostly based on what I read here:
I should note that setting up SRAnipal with my Vive Pro Eye on Windows-10 was a nightmarish, fiddly procedure, with many false starts, trial and error etc. Part of it may be because my machine was so borderline underpowered, and SRAnipal seems to be relatively resource-intensive. Combine driving a VR HMD with eyetracking, the resource hog that modern Matlab is, a relatively low-end PC, a low-end gpu, only 8 GB of RAM, a spinning hard disc, and the sluggish nightmare that is Windows-10 on spinning hard discs and low RAM, and that may explain a lot of the trouble, e.g., running out of RAM or an overloaded cpu for any non-trivial test. A state-of-the-art machine would hopefully fare better, but our financial situation of the last 10+ years didn’t provide for luxuries beyond the bare minimum. However, the documentation and software quality of HTC’s software was just god-awful.
The link above gives you a taste. But also quite a bit of help and setup instructions - I wish I’d had that link when I did the setup in early 2023, might have avoided quite a few missteps, but I only found this website just now.
Unfortunately, all the currently standardized OpenXR extensions for eye tracking are only really designed for typical VR “consumer use” cases, like pointing at or selecting items in a 3D scene with the eyes, or driving the eye movement animations of avatars in a VR space, not really for research grade data collection. E.g., the only extension I’m aware of for binocular gaze reporting, XR_FB_eye_tracking_social for Meta Quest Pro HMDs, applies strong low-pass filtering, as it is only meant to animate the eyes of avatars. None of the extensions reports eye opening, pupil diameter, or such.
Thanks, that’s very helpful. I am going to contact HTC to dig a bit further.
I have had a PhD student who has used a PICO with an eye tracker and Unity. The output is indeed limited and that too was an utter pain to get set up and use.
I am hoping things have moved on a little as that was a few years back.
For the project that I have in mind a basic combined gaze vector would be fine. However, with my background in stereo vision I was hoping for the possibility of binocular recording.
Anyhow, I will contact HTC, dig further myself, and update this thread with any additional info.
I have had a PhD student who has used a PICO with an eye tracker and Unity. The output is indeed limited and that too was an utter pain to get set up and use.
I am hoping things have moved on a little as that was a few years back.
With standard OpenXR eye tracking, nothing has changed much afaics. The binocular, but low-pass filtered, extension is only available on Meta VR HMDs with eye tracking, probably only the Meta Quest Pro. And it is not recommended for anything but animating avatar eyes.
The “monocular” XR_EXT_eye_gaze_interaction extension (likely also what Unity uses with your PICO, as it is the one standard extension for VR eye tracking) has the widest vendor/hardware support. See the registry for a list of supported VR/AR/MR HMDs:
I think - and that’s just my personal opinion ofc. - the current limitations are somewhat intentional. There are no technical reasons that one couldn’t expose more advanced eye tracking. E.g., HTC’s implementation of the standard XR_EXT_eye_gaze_interaction extension is, as far as I could deduce during my development, nothing more than a thin wrapper around SRAnipal, using SRAnipal in the dumbest and least efficient way possible to get eye gaze data, then stripping away most of the information and feeding the bare minimum that is left into OpenXR.
One reason for this is protecting user privacy in consumer use cases: accurate info about eye gaze, pupil diameter (~ attention ~ wakefulness ~ emotional arousal?) etc. is privacy-sensitive information, and the commercial consumer HMD vendors want to steer clear of accusations of mining eye tracking data or exposing it to the wrong actors etc. (Meta, Microsoft, …). Some even make it a major marketing point to not provide direct eye tracking data to 3rd party applications at all. Apple, e.g., with its Vision Pro, builds a technically pretty good HMD, but then does its Apple thing: as the only major VR hardware vendor to my knowledge, it doesn’t support OpenXR (the good old Apple golden-handcuffs vendor lock-in strategy), and (last time I checked) it intentionally doesn’t provide any eye tracking information to 3rd party apps at all, making the device useless for any non-consumer use.
The second reason is that those officially involved in the extension standardization process, i.e. paying Khronos members, are probably quite happy to somewhat hamstring the open industry-standard eye tracking APIs for basic consumer use cases only, so they can sell their far more expensive vendor-proprietary solutions for research use and some commercial uses. Tobii was a primary author of and contributor to the XR_EXT_eye_gaze_interaction extension, and they are obviously not clueless about the needs of the eye tracking research community. But they would also like to sell you their “Ocumen” SDK to get more useful data out of VR HMDs with their eye trackers, for a mere ~1500 Euros per year per device license for academic use:
So I don’t think there will be much push from the established players to improve the situation beyond what is needed for consumer use cases.
This btw. suggests that the latest SteamVR releases might be broken wrt. eye tracking on HTC since somewhere around SteamVR 2.11.
For the project that I have in mind a basic combined gaze vector would be fine. However, with my background in stereo vision I was hoping for the possibility of binocular recording.
One option that also exists is the VR HMD eyetracker add-ons from Pupil Labs (e.g., for the Quest 3, the Pico 4 and some others), see VR & AR - Eye tracking. They seem to provide a Matlab/Octave integration, also for use with Psychtoolbox and tested by them:
Celia once used a Pupil Labs desktop eye tracker (i.e. not for VR) in a previous project under Linux and was quite happy with it. The nice thing about their offering is that most of the software seems to be open-source and/or open-source friendly, and it also works on Linux, Windows and macOS.
We don’t have experience with the VR variants, or with how well their software integrations work out of the box for VR use cases, or whether a lot of do-it-yourself is involved. At some point in early 2024, I almost managed to get them into our Psychtoolbox partnership program after days of negotiation, but my boss managed to botch that up in no time, effortlessly utilizing his “unique” sales skills and “understanding” of the Psychtoolbox project and of our customers. So that supposed collaboration, which I would have loved to have last year, both for the needed money and for optimal integration of their products into Psychtoolbox, sadly went absolutely nowhere…
Anyway, I do like their relatively open approach to eye tracking.
Just to throw this out as well: My long-term hope for a VR HMD that could be maximally flexible and useful for research use (including eye and hand tracking), especially on the software side, is the SimulaVR: https://simulavr.com/ It is Linux-powered, uses an open-source stack for everything, and could even run Psychtoolbox on the device itself. I have been following their blog for years and I am rooting for them, as a pro-class device following an open-source and open-hardware approach would be really cool!
But given that they are a not particularly well funded newcomer in a niche market, that hardware is hard and capital-intensive, and that their schedules are slipping all the time, I just hope they make it to an actual product release and don’t go belly up before.
Yes, the Tobii Ocumen seemed like nothing more than an utterly shameful cash grab. You buy the hardware and then have to pay to properly access the data from it. I would never go the route of a headset with Tobii again.
We ended up just using the basic free functionality.
It felt like buying a car and then having to pay a yearly fee to allow the wheels to rotate.