Multiple (3) Displays in HDR mode

To whom it may concern,

I posted my problem in plain words below and included some technical specs in a list at the bottom.

Authentication Token:
ZAFYH5S6-202212610190:683a70ee891250d94d553544ab7d5b257d9fb40cf4f33572c66371a217a02bf1

We’re running an experiment that uses three different monitors in HDR mode (10bit color depth). We do not care about specific stimulus timing, and we present stimuli to only one screen at a time (the others are left blank). However, whenever we try to display the stimuli, PTB seems to switch a single monitor, or at times all of the monitors, out of HDR mode. PTB will also occasionally darken one of the monitors. I cannot trace this back to any specific part of the code (it sometimes happens when we hit a while-loop and wait for a keypress, which seems completely random). The computer seems to be handling the multiple screens themselves alright; it’s just the HDR mode that isn’t working. Is there a way to run three monitors at 10bit depth at the same time? Is there a specific setting we need to use in order for this to work?

Alternatively, we have considered hooking each monitor up to its own computer and sending triggers back and forth, but we’d like to avoid buying extra computers if it isn’t necessary or won’t fix the problem.

Configuration Call:
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'FloatingPoint16Bit');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
(I have tried using PsychImaging('AddTask', 'General', 'EnableHDR'); as well, but that messed up all of the monitors.)

Specs:
OS: Windows 11 64-bit
Monitors: Samsung LS49AG95
GPU: NVIDIA RTX 3070Ti
Computer: MAG Z690 Codex X5 (MS-B930)

Please let me know if anyone has a solution. It appears to work sometimes in some of our testing or calibration code, but not in the experiment, and the inconsistency in how and when it breaks makes it difficult to track down what’s going wrong. If you need any more information I’m happy to supply it, just let me know.

Best,
Norick Bowers

Is PTB fully up to date?

I expect it’s by far a better idea to try this on Linux.

It sounds very much like an OS or driver issue, but don’t forget that dynamic contrast is also a “feature” of many commercial displays. When I tested several 10bit gaming monitors, they would shift the background black level based on some criterion of how many pixels were active; this causes luminance shifts that are unacceptable for vision research and could not be fixed by the company, as they wouldn’t add a switch to their monitor firmware. Does this sound like what you are describing?

Technically, the 10bit mode you are using in PTB is not HDR, by the way. You mention PTB is “switching” out of HDR mode, but I don’t really understand what this means. Have you tested 10bit luminance (1024 increment steps) with a photometer to confirm that 10bit mode is actually being disabled? I use EnableNative10BitFramebuffer regularly, at least on a single monitor, without issue.

You need a working Vulkan backend, if I’m not mistaken, for EnableHDR to work, and that is the only way to actually use real HDR mode. HDR is driven as a commercial feature, and from a vision-science perspective I also worry whether HDR modes play loose with repeatability, like the dynamic contrast trick many displays use to claim high contrast ratios in their advertising.

Linux + AMD GPU on the same computer would be where any remaining bugs could be more completely addressed…

Hi Ian,

Thank you for your response. The dynamic brightness mode is turned off via the monitor settings (the buttons on the monitors themselves). We also turn off something called ‘local dimming’. Essentially we tried to disable anything that looked suspect, and the monitor settings allow us a decent amount of control. When the monitor isn’t switched out of HDR mode, all of the settings stay the same and our calibration looked fine.

When I say that PTB switches out of HDR mode, we can see it in the monitor’s settings menu: instead of HDRStandard mode, it changes the monitor to Custom, and we can’t change it back. That is odd, because I don’t believe PTB should have access to the monitor menu settings, though I could be mistaken on that. We have used both a photometer and a spectroradiometer for calibration, and we’re pretty sure it’s giving us a full 1024-value range on colors. We are also using PTB 3.0.18-beta (I should have mentioned that originally, sorry).

It is good to know that the EnableHDR mode in PTB only works with specific hardware. I’m noticing a running theme that ‘switch to linux’ seems to be a common fix for these issues. No one on our team knows Linux but we could probably learn. Is there any obvious reason why using a separate computer for each monitor may fail that I’m unaware of?

Well, I am surprised it says it is in HDR mode to begin with; just asking for a 10bit output makes it think it is in HDR mode? What happens if you test for 1024-step luminance before and after it tells you it has switched to Custom, any difference? Personally I don’t trust the monitor firmware and settings much; these are commercial displays, and their engineering demands and interests do not align with the sort of picky requirements vision researchers have… That is why companies like VPixx and CRS exist…
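
Something like this is what I mean by a 1024-step test (just a rough sketch; the normalized 0-1 color range via PsychDefaultSetup(2) and the 0.1 s dwell time per step are my own choices, adjust for whatever your photometer needs):

% Rough sketch of a 1024-step luminance check on a 10 bpc window.
% Assumes PsychDefaultSetup(2), i.e. colors specified in the 0-1 range,
% and a photometer sampling continuously while the ramp runs.
PsychDefaultSetup(2);
screenid = max(Screen('Screens'));
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
win = PsychImaging('OpenWindow', screenid, 0);

for level = 0:1023
    Screen('FillRect', win, level / 1023); % one of 1024 nominal gray steps
    Screen('Flip', win);
    WaitSecs(0.1);                         % give the photometer time to sample
end

sca;

If the output really drops to 8 bit somewhere along the way, you would expect runs of roughly 4 requested levels collapsing onto the same measured luminance.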

What exactly is your need, just a regular 10bit+ luminance output? Or do you need to display HDR encoded material specifically?

Did you read through help PsychHDR? At least according to the PsychImaging > 'UseStaticHDRHack' docs, multi-monitor output is a very special case, only tested under specific circumstances (dual display under Linux). What do the various HDR demos do when the monitor is in its “custom” mode?

If timing precision is not important, then sure, use 3 computers. You could use TCP to trigger them all (a rough sketch follows below), and they should be roughly in sync (you could use 3 photodiodes to prove that). But cross-machine syncing will not give you frame-accurate timing across the systems, so if timing ever becomes important, you will need to factor that in…
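
For the TCP idea, a very rough sketch of what I have in mind (assuming a recent MATLAB, R2021a or later, for tcpclient/tcpserver/writeline/readline; the IP addresses and port below are made up):

% Rough sketch of TCP triggering across machines; not frame-accurate,
% just "roughly in sync". IPs and port are hypothetical.

% --- Controlling machine: connect to the three display machines ---
ips = {'192.168.0.11', '192.168.0.12', '192.168.0.13'};
conns = cell(1, 3);
for i = 1:3
    conns{i} = tcpclient(ips{i}, 30000);
end
writeline(conns{2}, "SHOW");        % e.g. trigger stimulus onset on monitor 2

% --- Each display machine: wait for a trigger, then flip ---
srv = tcpserver("0.0.0.0", 30000);  % listen on all interfaces
while srv.NumBytesAvailable == 0
    pause(0.001);                   % poll until a trigger arrives
end
if readline(srv) == "SHOW"
    Screen('Flip', win);            % present the pre-drawn stimulus
end

Measure the actual onset offsets with the photodiodes if you care about them at all.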

I’d also suggest paying for some support time from Mario (EDIT: ok, you did, good), as he clearly knows more about this than anyone else, having built the HDR support in the first place…

Hi Norick,

Token validated. First let me thank your lab for supporting PTB with a yearly auto-renewing long-term license, already supporting us in the 3rd year now, as far as I can see. It was one of the first dozen labs to do the right thing. I wish all lab leaders had the decency and long-term thinking to make such decisions. Please pass my thanks on to whom it may concern! Also, lucky for you, you posted your question 1 day before the roll-over into the new December 2022-2023 contract year, so this means up to 2 hours of work time included in your license for this specific issue (1 hour from before the roll-over, another one from after the roll-over). For further work hours, your lab would be eligible to buy extra time at the maximum discount of 40%. As a native Swabian, I’m genuinely excited that a lab finally gets the maximum reward for doing the right thing! :partying_face:

This part of my reply was for free and doesn’t count toward work time :). Back to the topic at hand…

Good. Visual timing support on MS-Windows multi-display is generally somewhere between fragile and miserable due to limitations of the Windows operating system.

And at least when I tested this last in late 2020 and again late 2021, HDR support even on single-display Windows with NVidia graphics was broken timing-wise, apparently due to bugs in the proprietary NVidia Vulkan drivers for Windows. (My notes from that time after hours of investigation say: “NVidia Vulkan timestamps report stimulus onset 1 frame too early → FAIL. Even with VK_KHR_present_wait as of driver 496.49 November 2021!”). Your mileage may vary, depending on driver version, but past experience wasn’t trust inducing.

As Ian already said, this is the wrong way, and it is surprising you got anything resembling HDR. The proper sequence would be the one shown in, e.g., SimpleHDRDemo.m:

PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableHDR');
win = PsychImaging('OpenWindow', ...);

→ The EnableNative10BitFramebuffer task is implied and selected automatically for HDR-10, therefore redundant to specify; it doesn’t hurt, but doesn’t help either.
→ The PsychImaging('AddTask', 'General', 'FloatingPoint16Bit'); is not needed, but if you specify it, you are trading potentially lower color precision for a slight performance increase. By default, PTB would use 32 bit floating point precision per color channel. This spec selects a lower 16 bit floating point precision, which cuts RAM consumption and memory bandwidth in half, would give speedups on high resolution displays, and allows higher animation framerates. 16 bit float precision is considered sufficient for display of HDR stimuli in itself, both because output to a typical HDR-10 monitor on MS-Windows is only 10 bit, and because fp16 supposedly stays under the JND thresholds of typical human observers on HDR displays - or so the reasoning goes in the HDR-10 standard, based on psychophysics done decades ago afaik. However, if your stimuli make use of alpha-blending or other intermediate processing steps, where such steps could accumulate numerical roundoff errors, it is better to use the default 32 bit floating point precision, which has plenty of headroom to avoid/reduce such numerical errors.
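
Condensed, with the choice left to you (a sketch only; picking the screen via max(Screen('Screens')) is just the usual convention, not a requirement):

% Recommended setup, default 32 bpc float precision:
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableHDR');
% Optional: trade precision headroom for memory bandwidth. Skip this if your
% stimuli use alpha-blending or multi-pass processing:
% PsychImaging('AddTask', 'General', 'FloatingPoint16Bit');
win = PsychImaging('OpenWindow', max(Screen('Screens')));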

The reason you may have observed some half-assed HDR’ish behaviour could be some quirk of the NVidia OpenGL drivers on MS-Windows; at least it was like that in 2019, if I remember correctly: These drivers do (did?) not support regular 10 bit framebuffers in SDR (NVidia wants (wanted?) you to pay extra for that feature with a pro Quadro card), but at least in driver versions from 2019, some driver heuristic (is it a bug? is it a feature? is it just really bad design?) will interpret the attempt to switch to a 10 bit framebuffer as an app’s desire to use HDR mode. It may decide to enable HDR signalling to the monitor under certain conditions, e.g., unoccluded fullscreen window, client app fully responsive and with keyboard focus, desktop compositor on standby. However, you have no control over what HDR metadata is actually sent to the monitor, and at least at the times I tested it, there was some flakiness.

Expected symptoms would be: switching in and out of HDR mode, depending on whether the app is detected as busy (e.g., during long running loops without GetMouse or Screen('Flip'), or during a KbWait or similar) or responsive, if you ALT+Tab between windows, or click in the wrong place with the mouse on a multi-monitor setup, or some random app pops up a notification window somewhere - all scenarios where the desktop compositor kicks in to ruin the day. Essentially all the same multi-display trouble as with regular display on MS-Windows, just that instead of silent timing failures you get more noticeable visual feedback, due to the switching between HDR and SDR mode.

And an AMD or Intel graphics card would behave differently, e.g., with no HDR at all.

All to say: Only PsychImaging('AddTask', 'General', 'EnableHDR'); is the right approach. Essentially follow our demos like SimpleHDRDemo, HDRViewer, etc.
This will enable PTB’s Vulkan display backend, which has proper HDR support, and perform other setup steps like setting suitable identity gamma tables, switching the color space of the onscreen window to BT2020 / Rec2020, and switching the intensity units for pixel color channels to nits. Well, stimuli are defined in units of nits instead of color values between 0 and 1, or the classic 0 - 255; what a consumer monitor actually makes out of this is highly dependent on monitor settings and firmware quirks/bugs, so in the end one still has to calibrate one’s setup. At least PTB and the graphics card send a standards-compliant signal essentially in units of nits, before the monitor may or may not mess it up :confused:
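
To make the nits units concrete, a stripped down sketch along the lines of SimpleHDRDemo (the 10 nits background and 300 nits patch are just illustrative numbers, not a recommendation):

% Sketch: with 'EnableHDR', drawing colors are interpreted in nits.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableHDR');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0);

Screen('FillRect', win, 10);    % 10 nits background
Screen('FillRect', win, 300, CenterRect([0 0 400 400], Screen('Rect', win))); % 300 nits patch
Screen('Flip', win);
KbStrokeWait;
sca;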

That said, both NVidia and AMD drivers on Windows-10 had different HDR or Vulkan related bugs, depending on driver version. In my experience stuff worked well from mid-2020 to the end of 2021, then both NVidia and AMD decided to break things in different creative ways, and many of those breakages may still be unresolved.

So if you wanted to stick to Windows, I’d advise first using the proper PsychImaging calls, or running our demos. If that doesn’t work reliably, we’d have to look closer. However, dual or triple monitor HDR on Windows was never tested by myself; our funding was only enough for one cheap 460 Euro VESA DisplayHDR-400 compliant monitor (a Samsung C27HG70 is what I have - not because it is a great HDR monitor, but it was the one affordable one which checked the minimum requirements). Single display HDR was bad enough in terms of bugs.

However, as Ian and Diederick already mentioned, the chances of fixing or resolving anything on Windows are close to zero if things don’t work out of the box. The best one can try is upgrading or downgrading the NVidia display driver for a bug exchange, and hoping that the different set of bugs in a different driver version is more compatible with one’s specific goals.

For extended reading, here are some links to commit messages painting the sorry story on both NVidia and AMD:

All my results relate to Windows-10, but I think Microsoft shuffled things around again on Windows-11, so the bugs might be slightly different, or new complications may exist, as things rarely get better on Windows…

As Ian and Dee already recommended, your chances for reliable operation should generally be better on Linux (Ubuntu 22.04.1-LTS recommended at this point in time; my primary test machine for HDR is still on Ubuntu 20.04.5-LTS, but I successfully spot-tested with 22.04.1-LTS at some point, and 22.04 has substantial improvements especially needed for multi-display HDR) with a modern AMD graphics card. HDR is currently only supported on AMD graphics, so you’d have to swap your NVidia (not recommended in general) for an AMD. Pretty much anything from the last six years (e.g., the Radeon 500 series from the AMD Polaris gpu family) should do, likely even stuff back to ~2015. It doesn’t need to be a high-end card, unless you need fast animations on high resolution displays. E.g., I run a 2560x1440 144 Hz monitor from the built-in AMD graphics chip of a Ryzen 5 2400G processor in a 499,- Euro PC from Aldi. Although your monitor probably calls for a not-too-weak graphics card, seeing it’s a 5120x1440 pixel display with 240 Hz refresh rate and up to 2000 nits peak brightness. Linux + AMD also has the unique feature of providing up to 12 bpc native color resolution per channel, whereas other operating systems top out at 10-11 bpc at most. Ofc. on a 10 bpc monitor like yours, the 12 bits would be achieved via 10 bit native output + 2 bits of dithering.

Other Linux + AMD advantages would be fine-grained stimulus timing control - not needed in your case, as you stated - and some (hackish but working) dual-display HDR stereo mode. You wouldn’t use that stereo mode, though, if I understand correctly, but simply open 3 separate onscreen windows on the three monitors for separate stimulus presentation? What kind of paradigm requires 3 HDR monitors with only one displaying at any given time, if I may ask?
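
If it is the latter, I’d picture something like this (a sketch only; the screen indices are hypothetical and depend on your OS and setup, and whether triple-display HDR actually behaves on a given system is exactly the open question):

% Sketch: one HDR onscreen window per monitor, stimulating only one at a time.
screens = [1 2 3];                  % hypothetical PTB screen indices
wins = zeros(1, numel(screens));
for i = 1:numel(screens)
    PsychImaging('PrepareConfiguration');
    PsychImaging('AddTask', 'General', 'EnableHDR');
    wins(i) = PsychImaging('OpenWindow', screens(i), 0);
end

% Present to, e.g., monitor 2 only; the others stay at their background:
Screen('FillRect', wins(2), 100);   % a nominal 100 nits field
Screen('Flip', wins(2));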

This far, so far,
-mario

[Time accounted: 35 minutes]

Hi Mario,

Thank you for your response. I’ll be honest, I’m not much of a hardware person, but it sounds like the TLDR of it all is to switch over to Linux and possibly swap out the NVidia for a decent AMD graphics card? I am currently trying to switch over to Linux and will let you know if that works out. If not, we can try switching to an AMD graphics card (we probably have one in one of our computers here somewhere we can swap in) and try that. If worse comes to worst, we may try with three computers, but I will make sure we get AMD graphics cards as you suggested.

For the experiment we’re actually using three of these monitors to examine the periphery of vision (they have a 1m circular curvature, which makes for a neat setup). We need 10bpc precision for our stimuli (8bit isn’t a fine enough scale). We only test one location at a time, so precise timing isn’t necessary. Ultimately the experiment is about as simple as you can get: we just need very simple stimuli displayed at 10bpc on one monitor at a time, but it isn’t maintaining 10bpc stably enough for the experiment to work. We also only run the monitors at 120Hz, so we’re not really asking too much of the GPU. We calibrated each monitor in HDR mode and it looked totally fine. After applying the calibration we checked that it displays the proper contrast using a photometer, and that also worked out alright. It sometimes hangs in the experiment code in a while-loop where we wait for a keypress for a little while, which sounds like it could be from the GPU based on what you said above.

I will try some of the fixes you suggested. What we’re trying to do really isn’t too complicated, so I feel like it has to be possible. If anything works, I will report back. If not, I will report back after banging my head against the wall in frustration for a few hours.

Best,
Norick

Yes, Ubuntu Linux 22.04.1-LTS would be recommended right now. An AMD graphics card is a must on Linux if you want to use HDR, or the highest possible color/contrast precision - up to 12 bpc, either natively on a high-end monitor, or via dithering on typical 10 bit HDR-capable monitors.

I see. Three of those would probably cover the human visual field horizontally, if one avoids large eye movements, I guess. Neat.

What confuses me here is that just 10 bpc doesn’t really require HDR at all? In fact, wouldn’t using a “standard dynamic range” in the low hundreds of nits provide for finer contrast discrimination here? Your monitors are quite high-end HDR-wise, with 2000 nits peak output and a locally dimming backlight with 2048 dimming zones, but if you don’t actually need high dynamic range, that just makes for more potential complications?

My experience here might be outdated, as drivers and policies change now and then, but as far as I remember from my last series of tests about a year ago, on MS-Windows with consumer graphics cards like your GeForce RTX you don’t get 10 bit mode in SDR. They usually wanted you to pay for a more expensive pro-level card from the Quadro series. On Linux you get 10 bpc for free on all modern NVidia/AMD/Intel gpu’s, regardless of consumer or pro. On AMD, even 12 bpc with the right setup and PsychImaging() tasks.

You can get 10 bpc as a side effect of HDR mode on MS-Windows even on cheaper consumer cards like yours, but that is because half-way proper HDR-10 requires it, and NVidia et al. consider that a consumer feature - video games, Netflix HDR movie playback etc. - so they are more generous with the bits there. But the side effect is - well - HDR mode, where the operating system + gpu drivers + graphics cards + monitors etc. behave quite differently, and with different bugs and traps.

That all said, Ubuntu Linux 22.04 + AMD should be the better/best choice wrt. all things multi-display, high color precision (SDR or HDR), HDR, and visual timing, but not using HDR if you don’t actually need it would reduce complexity anyway. Even on your NVidia consumer card (technically even on many onboard Intel chips) you can configure 10 bit without HDR on Linux and even simplify your script: you’d use XOrgConfCreator to create a separate PTB screen driving all three monitors as one super-wide monitor with one onscreen window - easy, for a maximum of 10 bpc (a rough sketch follows below). In true HDR mode you’d need to follow the setup steps in help PsychHDR and have 3 separate onscreen windows for the three monitors, with HDR and/or up to 12 bpc, but with higher performance consumption and a less elegant script if you technically want to treat all displays as one ultra-wide.
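
For the 10 bpc SDR route, a rough sketch, assuming XOrgConfCreator has created a dedicated X-Screen 1 spanning all three monitors as one ultra-wide surface (the 0.5 gray level and the "middle third" choice are just for illustration):

% Sketch: plain 10 bpc SDR on one X-Screen spanning all three monitors.
PsychDefaultSetup(2);                     % normalized 0-1 color range
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
win = PsychImaging('OpenWindow', 1, 0);   % X-Screen 1 = the triple-wide stimulus screen

% The window spans all monitors; draw only into the third you want active,
% e.g. the middle monitor, leaving the others black:
rect  = Screen('Rect', win);
third = rect(3) / 3;
Screen('FillRect', win, 0.5, [third 0 2*third rect(4)]);
Screen('Flip', win);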

Let me know which one it is; the optimal approach is different for 10 bpc SDR vs. 10-12 bpc SDR or HDR.
-mario

[Time spent in total: 1 hour]

Hello,

I just wanted to report back on our problem in case anyone else is having a similar issue. The good news is we’re fairly confident everything is working now (hooray!). It turns out the switch to Linux was key here, and luckily we found a colleague who is a Linux expert and helped us get things set up. We can now get 10bpc color precision stably on all monitors (confirmed with a photometer). We are using the newest Ubuntu version (Jammy Jellyfish) and we are still using our NVidia cards.

There were a few small problems we had to work around. First, important information for anyone trying to use 10bit color depth: the Matlab interface DOES NOT WORK in 10bpc color mode. We got around this by calling Matlab via the Linux terminal with the “ptb3-matlab -nosplash -nodesktop” options, and it seems to work fine. Second, and this might be on our end, the XOrgConfSelector command in ptb3 wasn’t working due to some sort of authentication/security problem. Our Linux expert simply made a command callable from the Linux terminal that moves the config file around, so we can toggle between 8bit (for using the Matlab interface) and 10bit (for the experiment itself) after a reboot. I assume it’s essentially what XOrgConfSelector is doing anyway. This means we have to reboot between editing the code (beyond simple things in a text editor) and testing it, which is annoying but not a deal breaker. Lastly, we get an error “unknown NVidia Chipset, Assuming Latest Generation” in PTB, but everything seems to be working fine in our experiment thus far, so I’m not sure if it’s a major problem.

Thank you all for your help. I apologize if I wasn’t able to give clear enough information in the original post; I’m not very technically savvy, so I was trying to explain things to the best of my ability. Your advice and assistance are appreciated, and we’re excited to finally be able to start collecting some data soon. If anyone has any questions or a similar problem, feel free to reach out and I can give you more details on what we did to fix things.

Best,
Norick


Great! Ubuntu 22.04.1-LTS, as recommended. The NVidia should indeed be fine for your case: no need for high timing precision, no need for HDR (only SDR), and a maximum of 10 bpc. More complex requirements like HDR, > 10 bpc, or more precise timing would have required an AMD, but this is fine.

Oh right, Matlab has serious bugs which make it hang or crash if displayed on a 10 bpc display, reported but unfixed for years. Their graphics team are extremely poor performers. If you want a GUI, you may want to give the open-source and free Octave a try, simply by installing the octave-psychtoolbox-3 package from NeuroDebian as well, and launching octave --gui. A handful of volunteers can do a better job at this than Mathworks’ graphics team, and Octave’s GUI should work fine at 10 bpc, like pretty much any other modern app on Linux.

However, this sounds like you are not using a separate “experimenter control monitor”, apart from the 3 HDR displays, for the desktop GUI and Matlab? If you set up a dual-X-Screen configuration with X-Screen 0 for the GUI and X-Screen 1 for the 3 HDR monitors, you can select 10 bpc on screen 1 for your stimuli and leave screen 0 at standard 8 bpc, so deficient Matlab can cope and display its GUI. I think XOrgConfCreator should be able to do that even with NVidia’s proprietary drivers, but if not, you could also use NVidia’s nvidia-settings GUI to generate such a configuration, iirc.

XOrgConfSelector needs write access to /etc/X11/xorg.conf.d/ to work. You could, and probably should, run PsychLinuxConfiguration once to set up permissions and other things to allow that. It is called automatically during DownloadPsychtoolbox etc. when getting PTB from us, but has to be run manually if you got PTB from NeuroDebian, as you apparently did.

It is a bit weird, though, as your statement below about “unknown NVidia chipset” suggests you did run PsychLinuxConfiguration successfully already, which should have set up the /etc/X11/xorg.conf.d/ subfolder for read/write access by all users in the “psychtoolbox” user group? Does that folder exist? I.e., in a terminal window, does
ll /etc/X11/
report something containing something like…
drwxrwsr-x 2 root psychtoolbox 4096 Aug 21 20:29 xorg.conf.d/

Is your user account part of the psychtoolbox group? I.e., does the output of the id command also contain the word “psychtoolbox”?
→ If not, run PsychLinuxConfiguration again and let it add you to that group when it prompts whether it should do that for you. Then XOrgConfSelector should work, for a bit of extra convenience. No reboot needed, only logout/login.

That’s because your NVidia “Ampere” family gpu from late 2020 is not yet recognized by PTB → lack of funding → falling behind on maintenance tasks, as they are lower priority, given almost nobody pays for them, except for good labs like yours. I’ve updated the detection for the upcoming PTB 3.0.19 to handle the Ampere and Ada Lovelace / Hopper gpu’s from the RTX 3000 / 4000 series. But even without the detection, your current PTB should still work near-optimally, as it assumes a Volta/Turing gpu or later if it doesn’t know what it is dealing with, which is the right decision in this case.

Thanks for providing this feedback, and for the funding via your membership, which resulted in a quick improvement.

Best,
-mario

[Work time spent on this request: 85 minutes of 120 available in total]