Alternative to NVIDIA 3D vision?

Hi!

Until recently we were using the NVIDIA 3D vision system for binocular rivalry (using Psychtoolbox-3 - NVision3D ).

However, our glasses have broken and NVIDIA has stopped manufacturing the system, so I was wondering if anyone else had experienced a similar issue and had any alternatives to the NVIDIA system.

Additionally, if anyone has advice on what to look for in a 3D shutter-glasses system for compatibility with Psychtoolbox, it’d be super appreciated.

Thanks in advance!

I also have the same question.
Could anyone reply, please? Thanks.

It isn’t easy to find consumer binocular-capable displays at present. The previous consumer-3D fad (almost every projector and several TVs/displays supported a 3D mode, NVIDIA et al. for gaming) is almost completely dead.

I think DLP-Link capable projectors are the main remaining consumer tech still available (also active polarising glasses that sync to the projector signal, e.g. DGD5 3D Glasses | BenQ US).

We bought a Chinese-made dual-projector system (similar to commercial cinema 3D). The potential benefit is that you don’t need active glasses, but proper alignment was close to impossible due to poor manufacturing :frowning:

If money is no obstacle, then the ProPixx has to be the best-validated stereoscopic-capable display for vision research: PROPixx - VPixx Technologies – an active polarisation modulator means the glasses can be passive, and it is PTB-optimised for great visual fidelity (great contrast / temporal resolution)!

More broadly, there are other technologies as good as or better than polarising glasses, and it seems that lenticular-based glasses-free displays in particular (remember the Nintendo 3DS?) are making a strong comeback:

Head tracking is now a trivial thing, too. I know of at least one recent clinical trial for amblyopia using a lenticular display (without head tracking) for children: Phase 2a randomised controlled feasibility trial of a new 'balanced binocular viewing' treatment for unilateral amblyopia in children age 3-8 years: trial protocol - PubMed

And the gold standard, of course, is a proper dichoptic display, which ensures zero crosstalk as each eye has its own screen. Thanks to VR headsets, there are newer tiny OLED displays with better near-eye optics that can be used for binocular display, e.g. https://www.seeya-tech.com/en/products/products2_11.html – that would require some manual design, or you could see whether a VR headset supports pass-through. Exactly how PTB’s stereoscopic display modes would work with these displays remains to be seen (note PTB recently added OpenXR support for compatible displays, but I’m not sure how OpenXR and PTB’s stereo modes interact?)
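For context, which display technology you use mostly just changes the stereomode number passed at window creation in PTB. A minimal hedged sketch, assuming a standard PTB install (stereomode 1 = frame-sequential, i.e. shutter glasses; see 'help ImagingStereoDemo' for the other modes):

```matlab
% Minimal PTB stereo loop; stereomode 1 = frame-sequential (shutter glasses).
% Dual-display, anaglyph etc. mostly just change this one number.
AssertOpenGL;
screenid = max(Screen('Screens'));
PsychImaging('PrepareConfiguration');
[win, rect] = PsychImaging('OpenWindow', screenid, 0, [], [], [], 1);

for frame = 1:600
    Screen('SelectStereoDrawBuffer', win, 0);   % draw left-eye image
    Screen('FrameOval', win, 255, CenterRect([0 0 200 200], rect));
    Screen('SelectStereoDrawBuffer', win, 1);   % draw right-eye image
    Screen('FrameOval', win, 128, CenterRect([0 0 200 200], rect));
    Screen('Flip', win);
end
sca;
```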

Thanks for your reply, Ian.

Yep, that hype-wave is over and the industry wasn’t too pleased with the lack of $$$ streaming in. That’s why companies like NVidia not only stopped sales of their consumer 3D goggles, but also actively removed support for them from their proprietary drivers, so even existing hardware becomes unusable after a graphics-card or potentially operating-system upgrade. The danger of proprietary, non-OSS systems…

Also, the Viewpixx 3D and the old Datapixx for good ol’ CRT monitors should have VESA 3-pin Mini-DIN stereo output connectors to drive suitable shutter goggles, and this should be conveniently supported by PTB. There’s also some PTB support for CRS FE1 goggles, cf. ‘help BitsPlusPlus’, section about UseFE1StereoGoggles. I wrote driver code for all this stuff, but don’t remember ever having the opportunity to actually test it in practice, due to lack of hardware, so ymmv…
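For the archives, a hedged, untested sketch of how such a Datapixx-driven goggle setup might be configured – as noted above this code path has seen little real-hardware testing, and the exact FE1 setup call and its parameters should be taken from 'help BitsPlusPlus', not from this sketch:

```matlab
% Hedged, untested sketch: frame-sequential stereo with a Datapixx/ViewPixx
% device driving shutter goggles via its VESA Mini-DIN stereo connector.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'UseDataPixx');  % route via Datapixx
[win, rect] = PsychImaging('OpenWindow', max(Screen('Screens')), 0, ...
                           [], [], [], 1);          % stereomode 1
% For CRS FE1 goggles instead, consult 'help BitsPlusPlus' for the
% 'UseFE1StereoGoggles' setup call and its exact arguments.
```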

VR HMDs are binocular by nature, and PTB’s regular stereo drawing code applies. You can set up PTB to use your HMD in 3DVR or Tracked3DVR mode, so our driver will ask OpenXR to set up a proper 3D perspective-correct projection (by use of OpenXR projection layers that are configured in field of view, view frustum, focal length etc. to be optimal for the given HMD/optics/viewer – for some definition of optimal). PTB will also provide suitable projection and modelview matrices to set up OpenGL perspective-correct 3D rendering compatible with this viewing model, updated by head tracking. The OpenXR runtime decides on all specific properties of the projection, and may take things like optics, field of view, focal length, eye-lens/screen distance (eye relief), lens separation or potentially actual IPD into account, depending on how fancy the HMD hardware is and what settings can be adjusted or measured by the hardware.

Cf. VRHMDDemo1.m, VRInputStuffTest.m, SuperShapeDemo.m, MorphDemo.m.
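In code, the tracked-3D path described above looks roughly like this – a hedged sketch following VRHMDDemo1.m, assuming a working OpenXR runtime and a connected HMD:

```matlab
% Hedged sketch of the Tracked3DVR path, following VRHMDDemo1.m.
PsychDefaultSetup(2);
PsychImaging('PrepareConfiguration');
hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR');   % 3D + head tracking
[win, rect] = PsychImaging('OpenWindow', max(Screen('Screens')));

% Per-eye projection matrices as chosen by the OpenXR runtime:
[projL, projR] = PsychVRHMD('GetStaticRenderParameters', hmd);

% Each frame, fetch head-tracked per-eye modelview matrices:
state = PsychVRHMD('PrepareRender', hmd);
% state.modelView{1} / state.modelView{2} feed GL_MODELVIEW for the
% left-/right-eye render passes; projL / projR feed GL_PROJECTION.
```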

Or instead you add one line of code to an existing stereoscopic script to request presentation on an HMD. This lets you turn any existing stereo presentation script into something HMD-compatible – in principle, e.g., VRHMDDemo.m, ImagingStereoDemo(103), some others. In this case OpenXR quad view layers are used to present the images. You can think of these as big rectangular viewscreens floating in front of the viewer’s eyes, at a fixed location and orientation relative to the eyes. Their size, location and orientation with respect to the viewer’s eyes are set up by a heuristic of mine to look ok for hopefully many use cases on hopefully many HMDs on hopefully many OpenXR runtimes. But heuristics it is, my test set of HMDs is currently n=2, and due to the severe lack of funding for PTB I didn’t have plenty of time to fine-tune that heuristic. Therefore there are functions that let you change those default locations/sizes/orientations per eye, to adapt to the specific needs of the experiment or specific properties of the HMD or subject, e.g., IPD, eye relief, etc.
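The "one line" in question is the AutoSetupHMD call; a hedged sketch of retrofitting an existing stereo script:

```matlab
% Hedged sketch: making an existing stereo script HMD-compatible.
PsychImaging('PrepareConfiguration');
hmd = PsychVRHMD('AutoSetupHMD', 'Stereoscopic');  % <- the one added line
win = PsychImaging('OpenWindow', max(Screen('Screens')));
% If no HMD is detected, hmd comes back empty, and the script can fall
% back to an ordinary onscreen stereo window (e.g. anaglyph, stereomode 8)
% by passing a stereomode to PsychImaging('OpenWindow', ...) itself.
```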

I guess VR HMDs are the new hot thing for binocular stimulation after the failure of 3D TVs, with gradually improving quality of the technology – at least as long as industry thinks there’s money to be made in this area.

The downside of using VR HMDs for research right now is the relative lack of control over how and when stuff is presented to the subject. OpenXR standardizes some aspects of this and for the first time provides an API and open standards across hardware vendors and operating systems. But there is enough wiggle room and variability in the details of different hardware and software implementations – details that may not matter at all for consumer applications, but very much for vision science.

My hope on the software side here is Monado, a free and open-source OpenXR runtime implementation, so that capable people can look at and understand how the software side of the VR stack works, and can potentially customize and improve it for their needs. In my work on PTB’s new OpenXR driver I’ve already contributed small fixes and improvements, with hopefully more substantial stuff to come. On the hardware side, I root for SimulaVR, a consumer-oriented, pro-class HMD which uses a fully open-source Linux+Monado software stack. They are still a small startup in the pre-mass-production stage, so given how difficult the hardware business is, they might go under before they get there. But as far as suitability for research and openness go, that would probably be a pretty splendid device. There’s also the ILLIXR project for hacker types who are not afraid of do-it-yourself hardware hacking.

-mario


Tl;dr: the ProPixx is superb for 3D(!), but pricey. Be cautious about expecting the Viewpixx 3D to be a compact/cheaper substitute.

Just wanted to throw in my 2¢ and support for using the ProPixx for 3D stimulus presentation. I’ve used it extensively and would 100% recommend it to anyone who can afford it. It’s rock solid, linear, fast (120 Hz per eye), and the people at VPixx are super knowledgeable, helpful, and responsive.

However, I can’t give the same full-throated support to the Viewpixx 3D. My experience with it is more secondhand (reviewing plots and saying “oh yeah, that’s not good”), but it seemed to suffer from heat-dissipation problems. Temporal precision would gradually begin to lag over the course of an extended experimental run, then abruptly recover once the cooling fans kicked on. It’s possible the issues were specific to the unit we had, or to the relatively high-demand use case (3D stereo with overlay and digital sync I/O), but the origin of the issue seemed to be the control box and monitor being squeezed into the same small enclosure. In the end, the Viewpixx 3D display was passed on to another project/user that did not use any 3D features and didn’t encounter (or didn’t test for) temperature-dependent temporal lag.

My other go-to setup was dual CRTs with Matrox display splitters, but both of those hardware solutions are long out of production by now. The HMD and OpenXR path is probably the most viable/affordable option going forward.

PTB isn’t part of my current repertoire, but I’m very happy to see its continued development. Great thanks and encouragement to Mario and all the contributors (both code and financial).


You can actually do 240 Hz per eye with the ProPixx; its shutter will work at 480 Hz (we do this in my lab).

Keith


For binocular experiments we still use a ViewPixx3D and shutter glasses. VR headsets are not designed for highly controlled binocular stimulation. If you want to know what is hitting the retina, VR at the moment is not the way to go.

Yes, you can retrieve the modelview and projection matrices, but it is not clear how accurate they are. For example, the headset can sit differently on different people’s heads. It is not clear how IPD adjustment/settings work on different headsets – is this reflected correctly in the matrices? Some headsets adjust IPD in software, others in hardware. Then you have the image warping for lens correction on top of that.
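If one does want to audit this, the matrices the runtime settled on can at least be pulled out of PTB and inspected. A hedged sketch, assuming `hmd` is the handle returned by PsychVRHMD('AutoSetupHMD', ...):

```matlab
% Hedged sketch: inspecting what the OpenXR runtime assumed.
[projL, projR] = PsychVRHMD('GetStaticRenderParameters', hmd);
state = PsychVRHMD('PrepareRender', hmd);
% The translation parts of the two per-eye camera matrices reveal the eye
% separation the runtime assumed; comparing this against the subject's
% measured IPD shows whether an IPD setting is reflected in software.
eyeSep = norm(state.modelView{1}(1:3, 4) - state.modelView{2}(1:3, 4));
```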

I am sure this is all understandable, but it adds a bunch more layers you need to understand and be careful with if the aim is controlled binocular stimulation.

Obviously, as always, it depends on what you want to achieve in your experiment. There are drawbacks for everything, you just need to know if the “error zone” is below what you require or not.

For a bit of background, the following paper is excellent. It shows how to do a VR calibration to get the matrices, and really shows what one needs to think about.

Though even there, it is only accurate for one IPD and does not take into account how the headset sits on the head.

In terms of alternatives, I am not sure. Sorry.

P


We were very much in the same boat – trying to keep NVIDIA 3D Vision equipment alive and scrounging for glasses on eBay.

These days we have two projector systems: we were fortunate to get a grant to buy a ProPixx last year, but we also got in touch with DepthQ directly and had them build a system for us based around a consumer/gaming projector.

DepthQ makes the polarizer that VPixx uses and sells as part of the ProPixx, and they advertise a “Passive 3D bundle” – they were very happy to work with us to build a slightly different version customized to our needs. We had it built around an Optoma UHZ45 (a 1080p/240 Hz laser-LED projector). The price was also pretty reasonable, including a very nice projection screen, which they sourced for us from Stewart Filmscreen. My recollection is that the projector was about $2k USD and all the work they did for us, including the screen, was another $6k or so. We’ve been very happy with it.
