Nvidia 3D Vision glasses, Windows 7

Hi,


Just looking for some up-to-date information on using Nvidia 3D Vision shutter glasses on Windows 7 with PTB. These are the ones which plug in via USB, not via a 3-pin DIN stereo port.


Basically, do they work well? I will be using a CRT, so I'm not asking about ghosting caused by screen persistence, just whether the glasses play nicely with PTB and Windows 7.


For this project there would also be the possibility of using Linux.


Thanks


Peter

---In PSYCHTOOLBOX@yahoogroups.com, <peterscarfe@...> wrote:

Ah, all is clear. 

That's annoying. Stereo rendering is getting more and more common nowadays, so it is a shame there are not more commercial systems.

We used CrystalEyes and blue-line stereo, but that was when I was a PhD student; they no longer sell the adaptor to do this. It's all just the 3-pin stereo adaptors now, for which you need a Quadro card or something like the DataPixx.

Thanks for the help / advice.

Peter

-> I guess a lot of people will look into using VR headsets for this soon? They are relatively cheap (well, the Rift DK2 is very cheap for what it has to offer), although the prices for the actual consumer mass-market products will be higher as far as I read. With PTB's new Rift support you can use the DK2 as a mono display monitor with good timing and controlled lighting conditions ("a cubicle worn on your head"), or as a stereo monitor, or for the full VR head-tracked 3D rendering show.
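
-> For illustration, a rough sketch of what that setup looks like in PTB code. Option strings and call order are as per "help PsychVRHMD" in the current beta, so treat this as illustrative, not definitive:

   % Open the Rift via the PsychVRHMD driver. 'Tracked3DVR' requests the full
   % head-tracked VR mode; 'Stereoscopic' or 'Monoscopic' would instead use
   % the DK2 as a plain stereo or mono monitor.
   PsychDefaultSetup(2);
   PsychImaging('PrepareConfiguration');
   hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR', 'LowPersistence TimingSupport');
   [win, rect] = PsychImaging('OpenWindow', max(Screen('Screens')));
   Screen('Flip', win);
   sca;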

-> That Github project won't be a real option. Using the library itself would be possible within PTB with OK effort, but that would only work on Windows, which I don't like. Another problem would be the requirement to rename matlab.exe to WoW.exe or similar, to trick the NVidia driver into thinking this isn't Matlab but "World of Warcraft". I doubt that would end well with the way Matlab's startup process works.

-> The best chance to get this working cross-platform, without awful hacks, possibly even on GPUs other than NVidia's, would be libnvstusb:

http://sourceforge.net/p/libnvstusb/code/HEAD/tree/

...and various related Linux projects around it, e.g., stuff from this project:

magestik


If libnvstusb works well enough with your glasses under Linux, I think it would be possible to integrate it into PTB's own frame-sequential stereo mode (mode 11, AFAIR?) for Linux relatively quickly. In principle it would also be possible to get that working on OSX or Windows at some point, with more effort.

Of course the risk with such a thing, based on reverse-engineering the USB protocol, is that it may only work with certain versions of the glasses, and/or could stop working with future versions of those products. I don't know what the cheapest current Quadro cards with a 3-pin mini-DIN connector cost.

-mario

Hi both,

On Fri, Oct 9, 2015 at 8:08 PM, mario.kleiner@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:
If libnvstusb works well enough with your glasses under Linux, I think it would be possible to integrate it into PTB's own frame-sequential stereo mode (mode 11, AFAIR?) for Linux relatively quickly. In principle it would also be possible to get that working on OSX or Windows at some point, with more effort.

Cool, maybe, if testing shows it's worth it... Even with Quadro cards it can be fiddly to get working, so having our own control might be better.
Of course the risk with such a thing, based on reverse-engineering the USB protocol, is that it may only work with certain versions of the glasses, and/or could stop working with future versions of those products.

True, but better something that works, and that one can hack to hopefully get working again, than no solution at all, because ...
I don't know what the cheapest current Quadro cards with a 3-pin mini-DIN connector cost.

... not much, but they have no firepower. The cheapest are passively cooled and perform like it. In terms of bang for buck, GeForce is by far the better choice. If only it wasn't arbitrarily limited...

Cheers,
Dee
---In PSYCHTOOLBOX@yahoogroups.com, <peterscarfe@...> wrote:


-> I guess a lot of people will look into using VR headsets for this soon? They are relatively cheap (well, the Rift DK2 is very cheap for what it has to offer), although the prices for the actual consumer mass-market products will be higher as far as I read. With PTB's new Rift support you can use the DK2 as a mono display monitor with good timing and controlled lighting conditions ("a cubicle worn on your head"), or as a stereo monitor, or for the full VR head-tracked 3D rendering show.

So far I have been really disappointed with the Rift. The experience you end up with is like looking at a screen positioned right next to your eye through a powerful magnifying glass. The pixels are highly visible, as are geometric and colour distortion. Oculus' approach seems to be to correct this in software rather than make better optics, and overall, at the moment, that doesn't work at all well in my opinion.

-> Interesting how perceptions differ. I've used two separate exemplars of the DK2 intensively throughout the last month and was quite happy with their quality. At least with the PTB demos I ported so far, geometric distortion for me is essentially imperceptible, color aberration is minimal and only noticeable at all to me on sharp color/contrast edges, and latency is excellent. I can't remember getting such good immersion from the very expensive high-end devices I tried at the place where money for very expensive toys was never in short supply.

Sure, the pixel raster is visible if you focus your attention on it, but it isn't as bad as I thought it would be, and at least I quickly tune that out when interacting with a 3D scene. So at least for me the fast tracking/low latency and wide field of view do the trick very well.

Doing software correction instead of using better optics was, as far as I read, the only way to get the price down to something affordable; the optics are what make the thing heavy and expensive. For me the software undistortion works very well. Have you tried the PTB demos? They tweak their software all the time; maybe the v0.5 runtime we use has better post-processing than the runtimes you used when you tried? Although PTB does all of the post-processing itself with its own imaging pipeline - it only gets the optimized undistortion warp-mesh geometry from the runtime.

Of course you may have higher requirements for your work. I'm mostly happy that this hopefully enables new use cases at more affordable prices. At least some labs already use the Rift for perception research.

I stumbled across a Microsoft project whose aim is to get better optics for the Rift, so I guess I am not the only one.  

For my work, accurate geometric rendering is essential, and with the Rift it is unclear whether you end up with a geometrically correct frustum. My guess is not, given the visible distortions. We calibrate our HMD (SX111) specifically for this purpose. We could do something similar with the Rift, but that is not going to fix the Rift's optics. The SX111 is now old (and expensive) technology, but it still far outperforms the Rift.

-> Sure, but you are comparing a pre-production prototype for around $350 with a production device whose price is "contact us for a quote" - code for fricking expensive - and with much higher weight. I assume the consumer versions of the Rift, Vive etc. will be quite a step up when they show up in 2016.


Even with the SX111, there are experiments I would not run on it, in favour of a monitor and shutter glasses, as all of the above problems are either absent or far, far better than on any current HMD.

-> There are interesting uses beyond classic VR, e.g., having a head tracker to collect behavioral responses, or the low-latency/low-persistence/deterministic timing of the display panel, which outperforms typical LCD displays and is at the same or a better level than good ol' CRTs. MovingLinesDemo is an interesting demonstration: with the space key you can switch between low- and full-persistence modes, giving the Rift either the characteristics of a good CRT monitor or of a typical LCD panel. It is interesting to see how the interaction with smooth-pursuit eye movements affects the perceived motion blur.

-> Similarly, I don't know how well typical LCD panels with cheap NVidia LCD shutter glasses and USB sync will work wrt. stereo cross-talk, compared to expensive high-end shutter glasses with ferro-magnetic shutters, 3-pin mini-DIN sync and pro graphics cards. I think there could be situations where the Rift, or better consumer-class successors, could have quite an edge. But sure, everything has its own use cases. I'm just excited about the potential to open up new applications at low cost for people with good ideas but not much money, something that wasn't possible in the past with existing VR tech.

Saying all that, things will change. VR companies are aware of the problems they need to fix. It will just take a few (hopefully not that many) iterations before they get it right, and VR does offer the potential for a much greater level of experimental realism.

-> Indeed.

Thanks for the usb library pointers. I will start having a look around as to potential options. 

-> If you'd send some NVision kit my way, I'd give adding support to PTB a try, if it works out without the need for reverse engineering. You can also test whether it would work if you have such goggles already: just install that library on a Linux system, follow their setup instructions, and run their minimal test application as instructed, to see if the goggles respond or not.

-mario


I think we agree. I published a paper in JOV earlier this year on the fantastic opportunities affordable VR will bring.

http://jov.arvojournals.org/article.aspx?articleid=2389539&resultClick=1


Research-grade VR's huge expense is simply due to the vendors having a monopoly. The main companies haven't really innovated in years, I guess mainly because they didn't have to in order to make sales.

I have had a go on a DK1, a DK2, and a Samsung version where the phone is the display (I forget the name). My hopes were super high, but I was disappointed. The main thing was the visible pixelation. I think they were running C++ code, and one was in Unity. It was not my own work, so I am unsure whether it was all done correctly or not.

I have just received a DK2, so will have a look at the PTB demos. My lab computers also finally arrived, so I should be able to get better performance than on my laptop. I look forward to looking at the PTB demos.

I'm sure the consumer versions will also be much improved. As you say, these are all just development kits at the moment, so it is hard to tell what the final product will be like. 

The one I am most excited about is the Vive, primarily because it has tracking over a larger volume. It has also received good reviews from the tech press, specifically in comparison to the Rift.

I'll let you know how things go with the shutter glasses. I am going to look into ordering some next week. ATM I am just using some with a ViewPixx3D. 

Peter




Hi Mario,

I have had a look at the PTB demos and the Rift. They work really well, much better than I have seen in previous non-PTB Rift demos. I think those must have been with earlier software and/or hardware.

The pixels are, to me, still very distracting, and far more visible than in the SX111. The field of view also seems smaller. Or at least, if I divert my gaze to the edges of the display, it gets distorted more quickly in the Rift.

Motion tracking works well also, but it is a shame that it is over such a restricted area compared to the HTC Vive. 

I am going to be working on a Matlab implementation of our lab's calibration procedure, which might be useful for PTB users who require geometrically correct frustums. ATM it is unclear whether that is what one gets with the Rift's geometry correction.

With the SX111, whilst the optics produce less distortion, we still need to calibrate to get a geometrically correct projection. This is particularly important for aligning virtual objects with real-world objects, or objects generated via robotics, and also for running experiments where you want to accurately simulate world geometry.

The procedure requires trackable markers that can be seen by both the tracking software and a standard video camera (for example, we use Vicon LEDs). Is it possible to track additional markers via the Rift SDK? Or is it limited to purely tracking the headset markers?

It is fantastic to get such good VR for a tiny fraction of the cost. And the commercial version is likely to have many things improved. 

I have also ordered an Nvidia 3D Vision kit. Our procurement process can be slow (so very slow), so I will let you know when it arrives and I have had a chance to play with it.

Thanks

Peter



Dear Peter,

I just saw this thread. I had a Quadro card (I don't think it is doable without one of their enabled cards) and the Nvidia 3D glasses (both version 1 and 2); they worked perfectly in PTB on Windows 7 and gave a great perception of depth. I used them with a variety of objects.

Just a couple of notes:

1: Ensure the Nvidia Control Panel is adjusted correctly; it can seemingly randomly switch off the 3D. It comes with a few test images which are handy for checking everything is working correctly.

2: As I had the hardware with the DIN output, I used mode 1 (see the sketch after this list).

3: Watch out for interference: the glasses use infrared and are a little susceptible to being interfered with by other nearby signals. You'll notice this when their flicker starts to jitter. Ensure that the line of sight to the IR blaster is clear.
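
For reference, a rough sketch of what mode 1 usage looks like (standard PTB calls on a stereo-capable Quadro setup; the stimulus here is just a placeholder):

   % Quad-buffered frame-sequential stereo, synced via the 3-pin DIN emitter:
   screenid = max(Screen('Screens'));
   win = Screen('OpenWindow', screenid, 0, [], [], [], 1); % stereomode 1
   Screen('SelectStereoDrawBuffer', win, 0);               % left-eye buffer
   Screen('FillRect', win, 255, [100 100 200 200]);
   Screen('SelectStereoDrawBuffer', win, 1);               % right-eye buffer
   Screen('FillRect', win, 255, [120 100 220 200]);        % shifted for disparity
   Screen('Flip', win);
   KbWait;
   sca;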

Let me know if you have any further questions,

Nick



--
Nicholas Peatfield, PhD

---In PSYCHTOOLBOX@yahoogroups.com, <peterscarfe@...> wrote:

Hi Mario,

I have had a look at the PTB demos and the Rift. They work really well, much better than I have seen in previous non-PTB Rift demos. I think those must have been with earlier software and/or hardware.

-> Good, I'm not delusional :) - at least a bit less delusional...

The pixels are, to me, still very distracting, and far more visible than in the SX111. The field of view also seems smaller. Or at least, if I divert my gaze to the edges of the display, it gets distorted more quickly in the Rift.

Motion tracking works well also, but it is a shame that it is over such a restricted area compared to the HTC Vive. 

-> It will be interesting to see how that works out in sales numbers once all the companies have consumer products out. The VR researchers are all very excited about the larger tracking volume of the Vive. On the other hand, the typical end users/early adopters - gamers - probably won't have an empty 25 square meter room to spare in their apartments to take full advantage of the Vive.

I am going to be working on a Matlab implementation of our lab's calibration procedure, which might be useful for PTB users who require geometrically correct frustums. ATM it is unclear whether that is what one gets with the Rift's geometry correction.

-> The Rift developer's manual, part of the SDK download, has some sections about such topics. The PsychVRHMD('SetupRenderingParameters', ...) function has some optional parameters to control the view frustum. I didn't try them much, as this was about getting the thing working reasonably at the defaults, so how accurate it is I don't know.
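
-> A hedged sketch of what overriding the frustum might look like - parameter order per the help text, and the fov values here are made-up placeholders, not calibrated ones:

   PsychImaging('PrepareConfiguration');
   hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR');
   % fov = [leftAngle, rightAngle, upAngle, downAngle] in degrees; placeholder
   % values only - by default the HMD-recommended frustum is used instead:
   fov = [35, 35, 40, 40];
   PsychVRHMD('SetupRenderingParameters', hmd, 'Tracked3DVR', '', 0, fov);
   win = PsychImaging('OpenWindow', max(Screen('Screens')));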


With the SX111, whilst the optics produce less distortion, we still need to calibrate to get a geometrically correct projection. This is particularly important for aligning virtual objects with real-world objects, or objects generated via robotics, and also for running experiments where you want to accurately simulate world geometry.

The procedure requires trackable markers that can be seen by both the tracking software and a standard video camera (for example, we use Vicon LEDs). Is it possible to track additional markers via the Rift SDK? Or is it limited to purely tracking the headset markers?

-> No, with the 0.5 SDK at least you don't have any access to the markers, not even the ones on the Rift. The vision-based tracking process is a black box; you only get things like head position/pose, speed and acceleration, sensor-fused from the 60 Hz camera and the 1000 Hz inertial measurement unit in the HMD. Valve's current OpenVR SDK, which would be needed for the HTC Vive, doesn't provide access to such data either.
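
-> For illustration, that per-frame fused pose is all you get, roughly like this (field names as per "help PsychVRHMD"; a sketch, not a tested recipe):

   % Fetch the sensor-fused head pose for the upcoming frame - there is no
   % raw marker access, only derived data like these per-eye matrices:
   state = PsychVRHMD('PrepareRender', hmd);
   for renderPass = 0:1
       Screen('SelectStereoDrawBuffer', win, renderPass);
       modelView = state.modelView{renderPass + 1}; % 4x4 modelview matrix
       % ... load modelView into OpenGL and draw this eye's view ...
   end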

-> On Linux at least, one can access the tracking cam as a "webcam" in the IR spectrum and do one's own image processing to find markers etc. In theory PTB itself has some fast support for realtime tracking of 2D markers, or even multi-camera 3D tracking - in practice it is a plugin I haven't open-sourced and bundled yet, as I didn't have the time or motivation to put the finishing touches on it throughout the last half year.

It is fantastic to get such good VR for a tiny fraction of the cost. And the commercial version is likely to have many things improved. 

I have also ordered an Nvidia 3D Vision kit. Our procurement process can be slow (so very slow), so I will let you know when it arrives and I have had a chance to play with it.

-> Last Sunday I actually added support for NVision on 64-bit Linux in stereomode 11 to the upcoming PTB beta, based on libnvstusb - mostly untested due to lack of hardware. After the next beta release, an UpdatePsychtoolbox on Linux + "help NVision3D" should tell you how to get it tested.

ciao,
-mario

Hi Mario,

The NVidia 3DVision glasses and emitter arrived today. I will have a look at testing on Linux once the next PTB Beta comes out. 

Is this going to be a Linux-only thing, or is it feasible to get it working on other operating systems?

Peter
---In PSYCHTOOLBOX@yahoogroups.com, <peterscarfe@...> wrote:

Hi Mario,

The NVidia 3DVision glasses and emitter arrived today. I will have a look at testing on Linux once the next PTB Beta comes out. 

-> It's out now: "help NVision3D". Should work - if it works at all - with stereomode 11. It may need some tweaking of the code by me to avoid or reduce glitches during animations. For Intel or AMD GPUs I recommend using the standard open-source graphics drivers which are installed by default. For NVidia GPUs I'd recommend installing the proprietary graphics driver; the open-source nouveau driver has some known limitations which would make it unstable in frame-sequential stereo mode.
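
-> An untested sketch of a minimal check, in case it helps: standard stereomode 11 usage, on the assumption that the new NVision support drives the emitter automatically once that mode is active:

   screenid = max(Screen('Screens'));
   win = Screen('OpenWindow', screenid, 0, [], [], [], 11); % frame-seq stereo
   dotRect = CenterRect([0 0 100 100], Screen('Rect', win));
   for i = 1:600
       Screen('SelectStereoDrawBuffer', win, 0);                 % left eye
       Screen('FillOval', win, 255, OffsetRect(dotRect, -20, 0));
       Screen('SelectStereoDrawBuffer', win, 1);                 % right eye
       Screen('FillOval', win, 255, OffsetRect(dotRect, 20, 0)); % horizontal disparity
       Screen('Flip', win);
   end
   sca;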

Is this going to be a Linux-only thing, or is it feasible to get it working on other operating systems?

-> The library was made for Linux, and even there it took quite a bit of fiddling to get it to compile: it hasn't been maintained for multiple years, so the makefiles weren't really compatible with current Linux distros anymore. I had a detailed look at the code, and assuming it works reliably on Linux, I think it would be possible, with some slightly tedious effort, to hack it to work on OSX, and maybe with more effort on Windows - although performance might be worse, as this is very timing-sensitive and pushes what one can/should do outside a kernel driver. But as long as I don't have such a device at my own free disposal, I certainly won't look into this. Hacking up the Linux side was more work than expected, and the entertainment value of coding for 12 hours without getting any reward out of it is not great.

-mario

Great. I'd be happy to send you the hardware. PM me with address details. 

P
Ok, but first we need to be sure that this works reliably on Linux for you, otherwise there isn't any point pursuing this further.

-> I'll look into this sometime this week. I have a Linux system on which I should be able to test it.


Also understand that I would take that hardware as a (very cheap) payment for my work, regardless of whether porting to OSX or Windows is successful - you wouldn't get it back either way. I'd spend at most one day trying to get it to work, sometime within the next couple of weeks, but not much more if it doesn't work.

-> Have you got any closer to coming up with some kind of structure to fund your development of PTB? I think there are plenty of labs out there which would provide some kind of funding if a clear structure were in place, even if they were funding something which is given away for free, i.e. PTB.

-> You would need to put in place a better structure than the voluntary donations thing. But we discussed this a while back, and my memory is you thought this would be too difficult.

-> Alternatively, are you interested in trying to put in a grant application to fund PTB development? Or at least looking into that possibility?

P