In our lab, we are currently testing whether we can display 10 bit color on Linux with our 10 bit monitor (Eizo), using a Quadro P6000 (Ubuntu 19.10 desktop) and a K620 (Ubuntu 18.04 LTS desktop). We have already confirmed with a spectrophotometer that the display and both cards do give 10 bit output on Windows 7 using Psychtoolbox.
To get 10 bit color on Linux, we found that only the following settings worked:
Nvidia proprietary drivers
An xorg.conf specifying a DefaultDepth of 30
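For reference, a minimal Screen section with that setting looks roughly like this (identifiers are placeholders; XOrgConfCreator can generate a complete file):

```
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 30
EndSection
```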
The open-source Nouveau driver did not work, even with the latest HWE stack as recommended, and regardless of whether we used the recommended XOrgConfCreator settings or not.
Anyway, we have successfully opened 10 bit windows, but ONLY if Matlab is running in text-only mode in a terminal. If we run Matlab as a GUI, it is not able to adapt to the enforced 30 bit display depth and then crashes (see attached image).
Has anyone experienced this before? Have people successfully used 10 bit color with Nvidia on Linux and made mention of this somewhere? We can confirm that it is unrelated to the age of the machines. The mentioned Ubuntu 19.10 machine is only about a year old and has very good hardware.
The open-source nouveau driver should also give 10 bpc on suitable displays, at least on Ubuntu 19.10. My own testing with a ColorCal2 confirmed this, at least when connecting to an 8 bit monitor and enabling spatial dithering. What model of Eizo monitor is this? Is it 10 bit native capable? What connection type? Output of glxinfo? Of course, for your type of GPUs, nouveau won't be a good performance choice anyway, so this is a bit academic for more demanding scenarios.
That said, yes, I can confirm from a quick test that Matlab's GUI seems to be broken in 10 bpc mode, also on GPUs other than NVidia. This would be a Matlab bug, which should be reported to Mathworks.
Ideally you would have bought AMD GPUs instead of NVidia GPUs, which we do not recommend at all if you can avoid them in any way. That way you'd not only get a perfect workaround for the problem, but also additional precision of 11 bpc or even 12 bpc on suitable displays, or even on high quality 8 bpc displays via spatial dithering.
So here are your remaining options:
Run Matlab without the GUI, i.e., matlab -nodesktop.
Use Octave, whose GUI works perfectly in 10 bpc mode.
On a dual-display setup, use XOrgConfCreator et al. to set up only the 2nd display / X-Screen 1 for 30 bit, so Matlab can stay happy on the other display (see the sketch after this list).
Report the Matlab bug to Mathworks and wait for a bug fix.
Switch to an AMD GPU and get extra precision at slightly reduced performance, while working around the Matlab bug, at least as far as tested so far on a Polaris GPU.
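To illustrate the dual-display option above: conceptually, the per-screen depth settings in such an xorg.conf look roughly like this (a sketch only; identifiers are placeholders and the real file from XOrgConfCreator also contains matching Device, Monitor and ServerLayout sections):

```
# X-Screen 0: operator / GUI monitor at the standard depth, so Matlab keeps working
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 24
EndSection

# X-Screen 1: stimulus monitor at 30 bit / 10 bpc
Section "Screen"
    Identifier   "Screen1"
    DefaultDepth 30
EndSection
```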
Many thanks for your response and tips! I have not had a chance to get back to this yet, but I may have some time tomorrow to give it all a try and will let you know.
So I had a chance to test a few more things. Here are the results:
We indeed have a 10 bit monitor (Eizo CG245W), and both cards are connected to the monitor via DisplayPort. We test each card one at a time; the cards are never connected to the monitor simultaneously. We have previously confirmed under Windows 7 with a spectrophotometer that the 10 bit pipeline works correctly.
The Nouveau drivers, with the latest HWE stack and Linux HWE headers and so on, report that the display is running at a 30 bit depth when the proper xorg.conf file is selected, but all OpenGL windows are 8 bit, even when 10 bit color is explicitly requested. We tried this with and without modesetting, which had no effect on the result.
We also made a simple GLFW program in C that just opens a window with a 10 Red + 10 Green + 10 Blue + 2 Alpha bits setup. With the nouveau drivers, the window always reports 8 bits for all relevant channels. With the latest Nvidia drivers, we get 10 bits for the red, green, and blue channels. Both xwininfo and glGetFramebufferAttachmentParameteriv confirm this.
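Our exact test file isn't attached here, but a minimal sketch of that kind of check, assuming GLFW 3, looks roughly as follows (file name, window size, and the legacy glGetIntegerv fallback are just illustrative):

```c
/* check10bit.c -- rough sketch of the GLFW depth check described above.
 * Build e.g.: gcc check10bit.c -o check10bit -lglfw -lGL
 */
#define GLFW_INCLUDE_GLEXT      /* pull in GL/glext.h for the FBO query enums */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit()) {
        fprintf(stderr, "glfwInit() failed\n");
        return 1;
    }

    /* Ask for a 10R + 10G + 10B + 2A framebuffer. These are only hints;
     * the driver may hand back 8 bpc instead, which is what we check below. */
    glfwWindowHint(GLFW_RED_BITS,   10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS,  10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow *win = glfwCreateWindow(640, 480, "10 bpc test", NULL, NULL);
    if (!win) {
        fprintf(stderr, "glfwCreateWindow() failed\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    GLint r = 0, g = 0, b = 0, a = 0;

    /* Preferred query on GL >= 3.0: per-channel size of the default
     * framebuffer's back buffer. */
    PFNGLGETFRAMEBUFFERATTACHMENTPARAMETERIVPROC getAttach =
        (PFNGLGETFRAMEBUFFERATTACHMENTPARAMETERIVPROC)
        glfwGetProcAddress("glGetFramebufferAttachmentParameteriv");

    if (getAttach) {
        getAttach(GL_FRAMEBUFFER, GL_BACK_LEFT, GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE,   &r);
        getAttach(GL_FRAMEBUFFER, GL_BACK_LEFT, GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE, &g);
        getAttach(GL_FRAMEBUFFER, GL_BACK_LEFT, GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE,  &b);
        getAttach(GL_FRAMEBUFFER, GL_BACK_LEFT, GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &a);
    } else {
        /* Legacy fallback for contexts that lack the framebuffer query. */
        glGetIntegerv(GL_RED_BITS,   &r);
        glGetIntegerv(GL_GREEN_BITS, &g);
        glGetIntegerv(GL_BLUE_BITS,  &b);
        glGetIntegerv(GL_ALPHA_BITS, &a);
    }

    printf("Framebuffer channel sizes: R%d G%d B%d A%d\n",
           (int) r, (int) g, (int) b, (int) a);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```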
All of this is the case for both the K620 computer and the P6000 computer. For example, I ran xdpyinfo, glxinfo, and xwininfo on the K620 computer when using the nouveau drivers. The "root" window is registered as 30 bit, but as stated above, all other windows are 8 bit. Indeed, Psychtoolbox stops and says that the system does not support 10 bpc. Output from the xdpyinfo, glxinfo, and xwininfo programs for the Nvidia drivers was also inspected for comparison. When we switch to the Nvidia drivers, Psychtoolbox is able to open a 10 bit framebuffer, and we have confirmed that it is working as expected.
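For reference, the kind of depth checks meant here can be done along these lines (the grep patterns are just one convenient way to pull out the relevant lines):

```bash
# Depth of the root window / default visual: shows "30 planes" when the 30 bit setup is active
xdpyinfo | grep -i "depth of root window"

# Depth of an individual application window: run this, then click the window to inspect
xwininfo | grep -i "depth"

# Full GLX / OpenGL info for the currently active driver, saved for later comparison
glxinfo > glxinfo.txt
```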
The GUI bug has been reported to Mathworks. From tips on the Nvidia DevTalk forum, I found out that it is a known bug with GTK programs, which MATLAB uses for its GUI on Linux (see here: https://devtalk.nvidia.com/default/topic/1070468/linux/10-bit-color-breaks-matlab-gui/). Indeed, with the Nvidia drivers set to a DefaultDepth of 30, GNOME starts to give problems on Ubuntu (lots of black regions, and certain buttons cannot be clicked). Using keyboard shortcuts we could work around these problems in GNOME, but they went away completely when we used XFCE as the desktop environment.
Mathworks said they might be able to implement a fix.
Unfortunately, Octave still has too many code paths that are very slow in comparison to MATLAB, even when Octave is configured to use OpenBLAS, and our lab tends to end up in those slow paths often, so we cannot consider a switch to Octave.
We rarely push our GPUs to the limit. Most of the time we are just drawing some simple shapes or textures, and sometimes a video. So any small performance loss on the monster GPUs from the past few years is negligible for us. Since that is not such a problem for us, we might order an AMD card soon and see how that works for us.
I have not had a chance to test the dual-display setup, but we hope to keep to our old "optimal" approach of one monitor per experimental computer.
Will let you know when we make more progress.
Many thanks again!
If you want the full output from xwininfo, xdpyinfo, and glxinfo, let me know and I can send along a zip file. Just didn’t want to flood the post here with text.
The reason why nouveau didn't give 10 bpc should be fixed by updating to the latest PTB 3.0.16 beta from last week. During UpdatePsychtoolbox, the PsychLinuxConfiguration script will run and ask you if it should install a deep color configuration file which it finds missing. After that file is auto-installed and a logout -> login, it should also work under nouveau. It turns out that somebody changed the default setting for 30 bit color support in the open-source drivers from on to off in one of the recent Mesa releases, which I didn't notice. So now an extra config file is needed to switch 30 bit on. This config file will get automatically installed by the very latest PTB beta during UpdatePsychtoolbox if you answer 'yes' when it asks whether it should do so.
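For anyone who wants to handle this manually rather than via UpdatePsychtoolbox: the Mesa setting in question should be the allow_rgb10_configs driconf option, which can be switched back on with a drirc snippet along these lines (a sketch only, e.g. saved as ~/.drirc; the auto-installed PTB file may not look exactly like this):

```xml
<driconf>
    <device>
        <application name="Default">
            <option name="allow_rgb10_configs" value="true"/>
        </application>
    </device>
</driconf>
```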
Of course nouveau, while providing much slower graphics on modern NVidia GPUs, won't help you with your Matlab problem, because it is a Matlab bug. If Octave isn't an option, then Matlab text mode will be it. I think Octave is mostly slower for non-vectorized code with lots of for/while/if-else statements, where Matlab's JIT might give it some edge.
Or have only X-Screen 1 with the stimulus display at 30 bit / 10 bpc, and X-Screen 0 with the GUI on the operator monitor at a regular 24 bit / 8 bpc.
Or modern AMD gpu’s, if you only want/have a single-display setup. On AMD, Psychtoolbox can use its own high bit depth mode, so you won’t use the special xorg.conf file, so the regular desktop will work at 24 bit and only the PTB fullscreen window will operate at high bit depth. This high bit depth mode can squeeze out almost 11 bits, or on a XFCE-4 desktop, it can squeeze out up to 12 bits with a lot of extra low trickery – at a substantial loss of framerate for high resolution displays on lower end gpu’s though. It’s important that the AMD gpu is not too modern, as i haven’t ported PTB’s low level hacks to the latest integrated gpu’s (AMD Ryzen / Raven Ridge integrated graphics) or the very new Navi gpu family which was released this last late summer. The most well tested type of gpu is currently Polaris, although Vega should also work (untested due to lack of hardware) and have some extra advantages.
From empirical experience, I find it way more straightforward to stick to AMD GPUs to get a steady 10 bpc pipeline. I have been using AMD Pro WX 7100 video cards with Eizo displays and it works really well.