Hi Mario,
I have had a look at the PTB demos and the Rift. They work really well, much better than the non-PTB Rift demos I have seen previously. I think those must have used earlier software and/or hardware.
-> Good, I'm not delusional :) - At least a bit less delusional...
The visibility of the pixels is, to me, still very distracting, and far more noticeable than in the SX111. The field of view also seems smaller; or at least, if I divert my gaze to the edges of the display, the image gets distorted more quickly in the Rift.
Motion tracking also works well, but it is a shame that it covers such a restricted area compared to the HTC Vive.
-> It will be interesting to see how that works out in sales numbers once all companies have consumer products out. The VR researchers are all very excited about the larger tracking volume of the Vive. On the other hand, the typical end users/early adopters - gamers - probably won't have an empty 25 square meter room to spare in their apartments to take full advantage of the Vive.
I am going to be working on a Matlab implementation of our lab's calibration procedure, which might be useful for PTB users who require geometrically correct frustums. ATM it is unclear whether that is what one gets with the Rift's geometry correction.
-> The Rift developer's manual, part of the SDK download, has some sections about such topics. The PsychVRHMD('SetupRenderingParameters', ...) function has some optional parameters to control the view frustum. I didn't try it much beyond getting the thing working reasonably at default settings, so I don't know how accurate it is.
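-> A hedged sketch of how that looks, going by the current PsychVRHMD help text; the fov values here are made up for illustration, the default is the HMD's recommended field of view:

   % Standard PTB setup, then auto-configure the HMD for tracked 3D VR:
   PsychDefaultSetup(2);
   PsychImaging('PrepareConfiguration');
   hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR', 'LowPersistence');
   % Optionally override the per-eye view frustum before opening the window:
   % fov = [leftAngle, rightAngle, upAngle, downAngle] in degrees.
   PsychVRHMD('SetupRenderingParameters', hmd, 'Tracked3DVR', 'LowPersistence', 0, [50 50 45 45]);
   win = PsychImaging('OpenWindow', max(Screen('Screens')));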
With the SX111, whilst the optics produce less distortion, we still need to calibrate to get a geometrically correct projection. This is particularly important for aligning virtual objects with real-world objects, or objects generated via robotics, and also for running experiments where you want to simulate world geometry accurately.
The procedure requires trackable markers that can be seen by both the tracking software and a standard video camera (for example, we use Vicon LEDs). Is it possible to track additional markers via the Rift SDK? Or is it limited to purely tracking the headset markers?
-> No, with the 0.5 SDK at least you don't have any access to the markers, not even the ones on the Rift. The vision-based tracking process is a black box; you only get things like head position/pose, speed and acceleration, sensor-fused from the 60 Hz camera and the 1000 Hz inertial measurement unit in the HMD. Neither does Valve's current OpenVR SDK, which would be needed for the HTC Vive, provide access to such data.
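-> For completeness, this is roughly all one gets per frame from the sensor fusion. A hedged sketch; field names follow the current PsychVRHMD docs and may change across releases:

   % Query the sensor-fused head pose for the upcoming render pass:
   state = PsychVRHMD('PrepareRender', hmd);
   % state.tracked is a bitmask: +1 = orientation tracked, +2 = position tracked.
   for eye = 0:1
     Screen('SelectStereoDrawBuffer', win, eye);
     % 4x4 OpenGL modelview matrix of this eye's virtual camera:
     modelView = state.modelView{eye + 1};
     % ... render this eye's view of the scene using modelView ...
   end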
-> On Linux at least, one can access the tracking cam as a "webcam" in the IR spectrum and do one's own image processing to find markers etc. In theory, PTB itself has some fast support for realtime tracking of 2D markers, or even multi-camera 3D tracking and such - in practice it is a plugin I haven't open-sourced and bundled yet; I didn't have the time or motivation to put the finishing touches on it during the last half year.
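-> An untested sketch of that "webcam" route with PTB's standard videocapture functions; which deviceIndex the Rift cam gets is a guess here, so one would check the output of Screen('VideoCaptureDevices') first:

   % Enumerate capture devices; assume the first one is the Rift tracking cam:
   devs = Screen('VideoCaptureDevices');
   grabber = Screen('OpenVideoCapture', win, devs(1).DeviceIndex);
   Screen('StartVideoCapture', grabber, 60, 1);
   while ~KbCheck
     % Poll for the latest captured IR frame as a texture (0 = none pending):
     tex = Screen('GetCapturedImage', win, grabber, 0);
     if tex > 0
       Screen('DrawTexture', win, tex); % Custom marker detection would go here.
       Screen('Close', tex);
       Screen('Flip', win);
     end
   end
   Screen('StopVideoCapture', grabber);
   Screen('CloseVideoCapture', grabber);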
It is fantastic to get such good VR for a tiny fraction of the cost. And the commercial version is likely to improve many of these things.
I have also ordered an Nvidia 3D Vision kit. Our procurement process can be slow (so very slow), so I will let you know when it arrives and I have had a chance to play with it.
-> Last Sunday I actually added support for NVision on 64-Bit Linux in stereomode 11 to the upcoming PTB beta, based on libnvstusb and mostly untested due to lack of hardware. After the next beta release, an UpdatePsychtoolbox on Linux + "help NVision3D" should tell you how to get it tested.
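-> Once it is released, usage should amount to just requesting stereomode 11 at window-open time, something like this untested sketch:

   % Open a window in stereomode 11 = frame-sequential stereo via the
   % NVision USB emitter (libnvstusb, 64-Bit Linux only, upcoming beta):
   PsychDefaultSetup(2);
   win = PsychImaging('OpenWindow', max(Screen('Screens')), 0, [], [], [], 11);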
ciao,
-mario