VBLTimestamp with multiple screens at different refresh rates

Hello

I'm trying to understand how to determine which monitor is queried for VBL timestamping in a multi-monitor setup.

I'm using MATLAB 2015b on Ubuntu 16 with an NVIDIA Quadro P2000 graphics card and the NVIDIA binary driver.  I'm using a Datapixx2 system and the PLDAPS code package to run experiments.  In particular, I'm using the "software overlay" mode.  There are three monitors connected: one for the command window, a second for the console display, and a third connected to the DataPixx and then to the test monitor.

The command window is on Screen 0 and the other two monitors are on Screen 1.

How can I determine which monitor is queried for time stamps?  I'd like it to be the monitor seen by the subject.

The command Screen('Flip') appears to be returning timestamps for screen 0.

Thanks,
Lee Lovejoy


Hi Mario, thanks for the reply!

PLDAPS is for controlling a Plexon (or other electrophysiology data acquisition system) from MATLAB with a DataPixx and Psychophysics Toolbox.  Mainly a creation of people in Alex Huk's lab at UT Austin, but there are several other groups using it.

The "software overlay" mode is for whenever you want a view of what the subject sees with overlaid information (like eye position), but for some reason DataPixx's mirroring-with-overlay mode isn't available or isn't appropriate.  For DataPixx, to the best of my understanding the second lookup table for transparency only works if all color is in that lookup table, with a single entry as the transparency.  So if you need color on the screen the subject sees, but also superimposed color on the console display, you'd have to make sure that whenever the color was superimposed, the CLUT called for a non-transparent row matching the background color.  Theoretically feasible as far as I know, but it would require querying the RGB value out of the texture.  As an alternative, the overlay is done in PTB and not in DataPixx.

I executed GraphicsDisplaySyncAcrossDualHeadsTestLinux.  I see a flat line with a handful of spikes, which I understand indicates that the displays are well synchronized.  PerceptualVBLSyncTest gave the homogeneous gray screen with a few yellow lines at the very top of the display.  So, the tests work as expected.

Thanks,
Lee

---In psychtoolbox@yahoogroups.com, <mario.kleiner@...> wrote :

<lee.lovejoy@...> wrote :

Hello

I'm trying to understand how to determine which monitor is queried for VBL timestamping in a multi-monitor setup.

I'm using MATLAB 2015b on Ubuntu 16 with an NVIDIA Quadro P2000 graphics card and the NVIDIA binary driver.  I'm using a Datapixx2 system and the PLDAPS code package to run experiments.  In particular, I'm using the "software overlay" mode.

-> I'm not familiar with PLDAPS, in fact didn't know it existed until reading this message. So many high-level toolkits on top of PTB, so little communication and feedback by the authors of such software :(. I'm always baffled why they don't even advertise their creations on the PTB forum or our Wiki. What is "software overlay mode" and how does it relate to such multi-display setups?

There are three monitors connected: one for the command window, a second for the console display, and a third connected to the DataPixx and then to the test monitor.

The command window is on Screen 0 and the other two monitors are on Screen 1.

-> Is there a reason not to use the Datapixx's built-in method of mirroring one video output from the graphics card to two connected monitors - one for the subject, the other for the console display? That would solve two thirds of all potential visual timing problems if one needs a console display for monitoring.

How can I determine which monitor is queried for time stamps?  I'd like it to be the monitor seen by the subject.

-> That's a good question, but not easy to answer reliably if you use NVidia's proprietary graphics driver instead of a recommended Intel or AMD graphics card with open-source drivers. On a multi-display setup with multiple outputs per screen, the open-source drivers choose the monitor output with the largest pixel area as the timing master for all scheduling and timestamping. The NVidia driver might do something similar, or something else, depending on whatever NVidia decided for a given driver version.

If your 2nd x-screen (PTB screenid 1) drives two separate video outputs, then both video outputs need to be synchronized in their display timing, ie. two *identical* display devices with identical video mode settings, and the NVidia driver must auto-sync the displays. It used to do that in the past if all conditions were right. The script GraphicsDisplaySyncAcrossDualHeadsTestLinux lets you test for such proper sync.
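For reference, that test can be run directly from the MATLAB prompt; a minimal sketch with default arguments:

```matlab
% Minimal sketch: run the dual-head sync test mentioned above from the
% MATLAB prompt. A flat trace hovering around zero, with only a few
% occasional spikes, indicates the two outputs scan out in sync.
GraphicsDisplaySyncAcrossDualHeadsTestLinux;
```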

If both outputs are synchronized, that's good. If not, then depending on whether they are in a fixed phase relationship or drifting against each other, that could cause substantial performance degradation/loss and permanent or periodic timing problems.

The whole problem can be avoided by driving both the subject and console monitor from the Datapixx's two video outputs, set up via the Datapixx('SetVideoHorizontalSplit', 0); setting. Not sure how that integrates with the PLDAPS thingy. Then PTB screen 1 would just drive one video output from the graphics card, feeding into the video input of the Datapixx.
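A minimal sketch of that setup, using the Datapixx('SetVideoHorizontalSplit', 0) call as quoted above; the Datapixx('Open') and register-commit Datapixx('RegWrRd') steps are assumptions based on how other Datapixx register settings are applied:

```matlab
% Sketch: let the Datapixx mirror one graphics card output to both the
% subject and console monitors, so PTB screen 1 drives a single video output.
% Assumes the VPixx Datapixx MEX toolbox is installed; the RegWrRd commit
% step is an assumption -- check "Datapixx Help" for your library version.
Datapixx('Open');
Datapixx('SetVideoHorizontalSplit', 0);   % 0 = automatic, as suggested above
Datapixx('RegWrRd');                      % commit the register change to the device
win = Screen('OpenWindow', 1);            % this single output feeds the Datapixx input
```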

The command Screen('Flip') appears to be returning timestamps for screen 0.

-> See "help DisplayOutputMappings": For the high precision beamposition based timestamping, PTB has to make an educated guess via a heuristic if the NVidia proprietary graphics driver is used, as that driver doesn't provide us with the actually needed information. The more video outputs are connected, the more likely the mapping is to go wrong. If you run PerceptualVBLSyncTest and you get homogeneous, tear-free, high frequency black-white flicker, and the yellow lines cluster at the very top of the screen, that probably means the timestamping corresponds to the correct video output. Otherwise you have to use Screen('Preference','ScreenToHead', 1, 0, x) for a suitable x in the range 0, 1, 2, 3 until you get the correct result.
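In script form, the remapping suggested above might look like this sketch; the particular value of x is for illustration only, and the right one has to be found by re-running the visual test:

```matlab
% Sketch: override the output-to-head mapping heuristic before opening the
% onscreen window. Arguments follow the call quoted above: screenId = 1,
% head slot = 0; x = 2 is illustrative only -- try 0..3 until
% PerceptualVBLSyncTest shows the yellow lines at the top of the subject monitor.
x = 2;
Screen('Preference', 'ScreenToHead', 1, 0, x);
PerceptualVBLSyncTest;    % re-check the mapping visually after each change
```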

-> With a Datapixx you could also use PsychDatapixx('LogOnsetTimestamps') logging to get hardware timestamps from the Datapixx, assuming your PLDAPS toolbox supports that, or at least doesn't interfere with PTB. Generally Datapixx hw timestamping incurs about 2-3 msecs of overhead per Flip and can have its own caveats; a properly working PTB setup will provide Flip timestamps as accurate as, or even better than, what Datapixx hw timestamping can do. But it is another nice way to double-check timing on a VPixx device.
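A sketch of that cross-check, using the logging function named above; the 'GetLastOnsetTimestamp' retrieval call is an assumption, and whether the returned hardware timestamp is already mapped into the GetSecs timebase should be checked in "help PsychDataPixx" before comparing the two numbers directly:

```matlab
% Sketch: compare PTB's Flip timestamp against the Datapixx hardware
% timestamp for the same flip. Assumes a window 'win' is already open on
% the screen feeding the Datapixx, and that PLDAPS doesn't interfere.
PsychDataPixx('Open');
PsychDataPixx('LogOnsetTimestamps', 1);            % enable hw timestamp logging
vbl = Screen('Flip', win);                         % PTB's own flip timestamp
hwts = PsychDataPixx('GetLastOnsetTimestamp');     % hw timestamp of that flip (assumed call)
fprintf('PTB: %.6f  Datapixx: %.6f\n', vbl, hwts); % map clocks first if timebases differ
```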

-mario

Thanks,
Lee Lovejoy


<lee.lovejoy@...> wrote :

Hi Mario, thanks for the reply!

PLDAPS is for controlling a Plexon (or other electrophysiology data acquisition system) from MATLAB with a DataPixx and Psychophysics Toolbox.  Mainly a creation of people in Alex Huk's lab at UT Austin, but there are several other groups using it.

The "software overlay" mode is for whenever you want to have a view of what the subject sees with overlaid information (like eye position) but for some reason using DataPixx's mirroring with overlay mode isn't available or isn't appropriate.  For DataPixx, to the best of my understanding the second lookup table for transparency only works if all color is in that lookup table with a single entry as the transparency.  So if you need to have color on the screen that the subject sees but also have superimposed color on the console display, you'd have to make sure that whenever the color was superimposed, the CLUT called for a non-transparent row matching the background color.  Theoretically feasible as far as I know but it would require querying what the RGB value was out of the texture.  As an alternative, the overlay is done in PTB and not in DataPixx.

I executed GraphicsDisplaySyncAcrossDualHeadsTestLinux.  I see a flat line with a handful of spikes, which I understand indicates that the displays are well synchronized.  PerceptualVBLSyncTest gave the homogenous gray screen with a few yellow lines at the very top of the display.  So, tests work as expected.

-> Ok, so that means that whatever two video outputs are measured by GraphicsDisplaySyncAcrossDualHeadsTestLinux are running synchronized, if you see a flat line with a few occasional (maybe periodic) spikes, with the flat line sitting around the vertical 0 value. Here in a local lab, we had to solve the same problem last week with an NVidia GeForce 1000 series card + proprietary driver under Linux: one monitor for subject stimulation, another as a control monitor, and a third one on X-Screen 0 for Matlab. Afaics the NVidia driver does synchronize displays if they are the same model, running at the same video resolution and refresh rate, ie. with identical settings and video clocks. Is this the case for your stimulation and control monitor - same resolution and refresh rate?

-> What you still need to find out is if the measured beamposition, and therefore the timestamps, are from the stimulation monitor (ie. the output feeding the Datapixx). Run PerceptualVBLSyncTest(1, [], [], [], 600, 0, 1) to test this visually for the outputs of X-Screen 1. What you should see is black-white flicker, with a strong horizontal black/white tear-line / crack somewhere in the lower half of the visual stimulation screen, maybe jittering a bit up and down depending on system timing. Then you should see a yellow line staying closely at or slightly below the tear-line, maybe jittering a bit as well. Both can jitter, but should stay in a roughly fixed position close to each other. The monitor where you see this is the one that is time-stamped, and thereby should be the stimulus monitor. On a properly synchronized setup, the other control monitor should show the same - yellow line and tear-line at the same vertical position. If the other monitor has a wandering tear-line then it isn't properly synchronized, which can impact performance. If the "wrong" monitor has the fixed tear-line <-> yellow line pair, then you know the timestamps refer to the wrong monitor. Screen('Preference','ScreenToHead') can then be used to virtually rewire this:

E.g., Screen('Preference','ScreenToHead', 1) for the setup of X-Screen 1 gives:

0 1
1 2

Then the assignment of display engine '1' to screen 1 for timestamping would be wrong, and you could try Screen('Preference','ScreenToHead', 1, 0, x) for x = 2 or 0, to assign engine 2 or 0 instead of engine 1, and rerun PerceptualVBLSyncTest as above, until the yellow line and tear-line do what they are supposed to do on the stimulus monitor. That setting would then need to be added to the top of your script to override the heuristic. Replugging the cables on your computer might also do the trick, instead of this virtual replugging.
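Putting the above together, a sketch of the query-remap-retest cycle; engine 2 is just the first candidate to try, per the suggestion above:

```matlab
% Sketch: query the current mapping, remap, and rerun the visual test.
mapping = Screen('Preference', 'ScreenToHead', 1)  % e.g. gives [0 1; 1 2] as quoted above
Screen('Preference', 'ScreenToHead', 1, 0, 2);     % try engine 2 for slot 0 of screen 1
PerceptualVBLSyncTest(1, [], [], [], 600, 0, 1);   % tear-line and yellow line should pair up
```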

It's not a fun activity to get this right, but unavoidable with the proprietary NVidia driver.
-mario


Thanks,
Lee


Hi Mario

Thanks for that.

Regarding the resolution and refresh rates, I've made them consistent across all connected monitors, and for good measure I'm using the same model monitor for all three.  Previously the X-Screen 0 monitor for the GUI was different.  I thought it had a lower maximum refresh rate, but now I'm not so sure.  It turns out that the DVI-D dual-link cable had a bent pin and was working as a single-link cable.  I discovered that when I swapped out the monitor and found it had the same maximum refresh rate; changing the cable fixed that particular issue.

I'm not entirely certain how these different factors contributed, but I also discovered that the powered DisplayPort to DVI-D adapter that I was using to connect the DisplayPort output of the graphics card to the DVI input of the monitor was reporting an incorrect EDID.  This wasn't initially obvious to me because the DataPixx's EDID was transmitting correctly whereas the VG248's was not.  Switching to DisplayPort to DisplayPort cables for everything but the DataPixx fixed that particular issue, and now the EDIDs are correct and the appropriate modes are recognized by X.

I also discovered that the xorg.conf file generated by nvidia-xconfig and then edited by nvidia-settings uses different names for the monitors, and perhaps as a result some of the specifications in the PTB-generated .conf file in xorg.conf.d were not applied quite correctly, or were overwritten.  So I used nvidia-settings to generate the specifications I wanted, then copied those into the PTB-generated .conf file and deleted xorg.conf.  I'm sure there's a cleaner way to accomplish this (such as writing another .conf in xorg.conf.d), but this worked for me.

In my searching I also learned that nvidia-settings writes a configuration file, .nvidia-settings-rc, in the home directory, containing the settings made through the GUI.  In particular, the control for which monitor is the primary display in a screen, and the identity of the synced display, are in there.  For example, on my X-Screen 1 the DataPixx is DP-0, and the line in the settings file indicating that it should be synced is

astaroth:0.1/XVideoSyncToDisplayID=DP-0

Here is a link for reference.

Regarding the PerceptualVBLSyncTest, I saw the flickering gray screen with the horizontal tear usually in the lower half of the screen, sometimes bouncing up into the upper half.  I also saw the yellow line below the tear.  The two bounced up and down a bit.  The control screen and subject screen were identical to the best of my ability to discern.  I tried changing the ScreenToHead preferences as you instructed.  I was unable to generate any observable difference in performance by changing the newCrtcID parameter (that's x, I believe) in Screen('Preference','ScreenToHead', 1, 0, x).  If I switched the screenID to screen 0, PTB complained that the timestamping reflected impossible stimulus onset values.

xrandr --verbose for Screen 0 revealed that the monitor DP-6 has a DisplayPort connection, is connected at position 0, and has CRTC 0 (the bottom-most connector on the card).

xrandr --verbose for Screen 1 revealed that the monitor DP-4 is connected at position 1 and has CRTC 1.  DP-0 (the subject monitor) is on connector 2 and has CRTC 0.  These correspond to the physical connections and match what's in .nvidia-settings-rc.

Then I went and set Screen('Preference','ScreenToHead',1,0,0) and compared that to Screen('Preference','ScreenToHead',1,0,1).  Best performance in terms of the variability in VBL sync was with CRTC 0, as one might expect based on the output of xrandr.
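A sketch of such a variability comparison, assuming a window opened on screen 1; the frame count and the candidate CRTC shown are arbitrary choices, and the same loop would be rerun with the other candidate:

```matlab
% Sketch: measure flip-interval variability under one candidate
% ScreenToHead assignment; repeat with the other candidate and compare.
Screen('Preference', 'ScreenToHead', 1, 0, 0);   % candidate: CRTC 0
win = Screen('OpenWindow', 1);
n = 300;                                         % arbitrary number of frames
t = zeros(1, n);
for i = 1:n
    t(i) = Screen('Flip', win);                  % collect VBL timestamps
end
sca;                                             % close the window again
fprintf('std of flip intervals: %.6f secs\n', std(diff(t)));
```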

I haven't yet looked at DataPixx timestamps.  That will be next.

LL