VR/XR experiments

Thanks to the newly introduced Virtuality features, I started to think seriously about VR/XR experiments. I am excited about the possibility of using VR for rigorous presentation of stimuli in psychophysics experiments.

But first of all, is it possible to show stimuli fixed to a location in the view in any VR system? For example, can I show a fixation point at the center of the view irrespective of the user’s head direction?

Based on the ‘Stereoscopic’/‘Monoscopic’ modes of PsychOpenXR, I assumed that stimuli could be presented anywhere in the user’s view, rather than in a 3D space. But I wanted to confirm this quickly before diving into unfamiliar VR development, not to mention buying expensive VR headsets and PCs.
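
For concreteness, here is a minimal sketch of what I have in mind, untested and based only on my reading of the PsychVRHMD help text (whether ‘Monoscopic’ mode really head-locks the display is exactly what I want to confirm):

```matlab
% Untested sketch: a fixation point that (I assume) stays head-locked,
% because in 'Monoscopic' mode head tracking does not drive the view.
PsychDefaultSetup(2);
screenid = max(Screen('Screens'));

PsychImaging('PrepareConfiguration');
hmd = PsychVRHMD('AutoSetupHMD', 'Monoscopic');
[win, rect] = PsychImaging('OpenWindow', screenid, 0);
[cx, cy] = RectCenter(rect);

while ~KbCheck
    % Draw the fixation dot at the center of the view in ordinary
    % 2D window coordinates, as in any non-VR PTB script:
    Screen('DrawDots', win, [cx; cy], 10, [255 255 255], [], 2);
    Screen('Flip', win);
end
sca;
```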

Thank you in advance,

I haven’t looked at PTB’s new interface yet, though we are also building a custom headset that we want to be OpenXR compatible, and I’m pretty sure we should be able to deal with head/eye position relative to world-centric coordinates (what is the point of VR otherwise?). I assume regular PTB commands draw to a virtual screen, and while they are 2D, the virtual screen is referenced in 3D relative to the user? See the sketch below for how I understand world-fixed rendering works.
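
I haven’t tested this against the new PsychOpenXR driver, but from the older PsychVRHMD demos (e.g. VRHMDDemo1), world-fixed rendering in ‘Tracked3DVR’ mode looks roughly like this (a sketch, not verified code):

```matlab
% Sketch of world-fixed 3D rendering, adapted from the PsychVRHMD
% demo pattern; not verified against the new PsychOpenXR driver.
global GL;
PsychDefaultSetup(2);
InitializeMatlabOpenGL;
screenid = max(Screen('Screens'));

PsychImaging('PrepareConfiguration');
hmd = PsychVRHMD('AutoSetupHMD', 'Tracked3DVR');
win = PsychImaging('OpenWindow', screenid, 0);

% Per-eye projection matrices are fixed for a given headset:
[projL, projR] = PsychVRHMD('GetStaticRenderParameters', hmd);
proj = {projL, projR};

while ~KbCheck
    % Fetch this frame's head-tracked per-eye modelview matrices:
    state = PsychVRHMD('PrepareRender', hmd);
    for eye = 0:1
        Screen('SelectStereoDrawBuffer', win, eye);
        Screen('BeginOpenGL', win);
        glMatrixMode(GL.PROJECTION);
        glLoadMatrixd(proj{eye + 1});
        glMatrixMode(GL.MODELVIEW);
        glLoadMatrixd(state.modelView{eye + 1});
        % World-fixed geometry: a sphere 2 meters in front of the
        % user's starting position, stable under head movement.
        glTranslated(0, 0, -2);
        glColor3d(1, 1, 1);
        glutSolidSphere(0.1, 32, 32);
        Screen('EndOpenGL', win);
    end
    Screen('Flip', win);
end
sca;
```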

Thank you for confirming that we should be able to present stimuli not only in world-centric but also in user-centric or eye-centric coordinates. This helps a lot, because I could not find any VR demos that display objects fixed to the head or eyes.

It is great that you are building a custom headset! VR has been on the market for a while, but I think it is at a turning point with the rise of AI technologies. I will catch up with these new fields to see how we can utilize XR for psychophysics and social experiments. Thank you!