11-bit luminance precision on iMac (Retina 5K, 27-inch, Late 2014)

dear mario
i just ran my test program on my iMac (Retina 5K, 27-inch, Late 2014). The graph is enclosed. This is showing 11-bit precision.
Yay!
best
denis
p.s.
Hormet tried my test program on our Linux hp and we hit a glitch, probably not important, so we don't have that result yet.

Denis Pelli
Professor of Psychology & Neural Science, New York University
+1-646-258-7524 | denis.pelli@... | http://psych.nyu.edu/pelli/ | Skype: denispelli | http://denispelli.com
---In PSYCHTOOLBOX@yahoogroups.com, <denis.pelli@...> wrote:

dear mario
i just ran my test program on my iMac (Retina 5K, 27-inch, Late 2014). The graph is enclosed. This is showing 11-bit precision.
Yay!

-> Good. I can't find mistakes in your measurement script or results, so it seems you are getting 11 bpc for luminance via dithering, assuming you got this consistently over all 2048 steps throughout the whole luminance range? Mysterious. Maybe the limitations on Southern Islands only apply to true > 10 bpc output, but not to the simulation via dithering. Or maybe artifacts would only show for color display, but testing 2^33 colors and checking for artifacts is obviously not really possible. 11 bpc is the absolute maximum that Apple's half-float framebuffer format can represent in the normal 0.0 - 1.0 range of displayable color values.
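For reference, that 11 bpc ceiling follows from the half-float format itself. A minimal sketch of the arithmetic, assuming IEEE 754 binary16 with its 10-bit mantissa:

% binary16 stores a 10-bit mantissa. In the interval [0.5, 1.0) adjacent
% representable values are 2^-11 apart, and that is the coarsest spacing
% anywhere inside [0.0, 1.0], so a half-float framebuffer cannot encode
% more than 2048 evenly spaced levels, i.e. 11 bits per channel.
mantissaBits = 10;
stepNearOne  = 2^-(mantissaBits + 1);  % 1/2048 = 4.8828e-04
maxLevels    = 1/stepNearOne;          % 2048
bpc          = log2(maxLevels)         % 11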

What I would like to see from the iMac is a test at factory settings, as Apple advertises the 10 bit mode: after rebooting the machine and then without the Screen() 'Dithering' command, so we can be sure that the iMac 5k panel really can display true 10 bpc colors at an individual pixel level. Dithering, after all, only provides higher bit depth when averaged over multiple pixels, with the expected artifacts for high spatial frequency stimuli. And does it show the same visual timing problems in 10 bit mode, or also in normal mode?
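To be concrete, such a factory-settings check could look roughly like this. This is only a sketch, assuming the usual PsychImaging 'EnableNative10BitFramebuffer' task, and deliberately not calling the Screen() 'Dithering' command, so the OS/driver defaults stay untouched:

% Request the native 10 bpc framebuffer at factory settings and leave
% dithering alone, then run the same luminance sweep as before.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
window = PsychImaging('OpenWindow', max(Screen('Screens')), 0.5);
% ... present and measure the luminance steps here ...
Screen('CloseAll');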

best
denis
p.s.
Hormet tried my test program on our Linux hp and we hit a glitch, probably not important, so we don't have that result yet.

-> Anything specific? I'd like to get this all wrapped up for the next PTB beta. The Linux laptop needs a different 'Dithering' setting for testing the 11 bpc and 12 bpc modes, as in Hoermet's test script, because that is the "Sea Islands" gpu generation instead of "Southern Islands". If your findings translate to the HP laptop, then it might be possible to get even more than 12 bpc on the 10 bit laptop panel if PTB uses the 'Native16Bit' framebuffer mode. Stimuli are stored with 16 bpc, so if dithering can squeeze out at least 3 bits on your 8 bpc Apple panels, then maybe it can also squeeze out >= 3 bits on top of the HP panel of the Linux laptop.
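In code that combination might look roughly like this. Just a sketch, assuming the task is named 'EnableNative16BitFramebuffer', and with the Sea Islands dither value left as a placeholder, since we don't have it yet:

% Linux HP laptop sketch: 16 bpc framebuffer plus a gpu specific dither
% setting, to try to squeeze extra bits out of the 10 bit panel.
screenNumber = max(Screen('Screens'));
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative16BitFramebuffer');
window = PsychImaging('OpenWindow', screenNumber, 0.5);
% Sea Islands needs its own 'Dithering' value; not yet known, hence xxx.
Screen('ConfigureDisplay', 'Dithering', screenNumber, xxx);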

Btw. you asked about how to know the gpu type: winfo = Screen('GetWindowInfo', window); provides the winfo struct.

winfo.DisplayCoreId provides the vendor name of the display gpu, e.g., 'AMD' for AMD gpus.

winfo.GPUMinorType reports some id for the display gpu family. On AMD cards that's the DCE display engine version. E.g., 60 <= winfo.GPUMinorType < 70 for the DCE 6.x engines in "Southern Islands", or 80 <= winfo.GPUMinorType < 90 for the DCE 8.x engines in the "Sea Islands" cards like in the HP laptop. So this could be used to apply the correct 'Dithering' value iff winfo.DisplayCoreId == 'AMD'
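Spelled out in MATLAB (using strcmp rather than ==, since == compares characters elementwise), a minimal sketch of that check:

% Pick the dither setting only on AMD display gpus, based on the DCE
% display engine version reported in GPUMinorType.
winfo = Screen('GetWindowInfo', window);
if strcmp(winfo.DisplayCoreId, 'AMD')
    if winfo.GPUMinorType >= 60 && winfo.GPUMinorType < 70
        % DCE 6.x: "Southern Islands"
    elseif winfo.GPUMinorType >= 80 && winfo.GPUMinorType < 90
        % DCE 8.x: "Sea Islands", as in the HP laptop
    end
end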

thanks,
-mario


dear mario
ok. thanks. i'll test the iMac tomorrow after rebooting, without touching dither.
Based on your text, I've got the following code for dither. When Hormet shares his linux hp info, i'll fill in more.

In an email I sent to you and Psychtoolbox earlier tonight, i report 10-bit performance under linux without knowing how to set up dither.

best
denis

wInfo=Screen('GetWindowInfo',window);
switch wInfo.DisplayCoreId
   case 'AMD'
      % On AMD, GPUMinorType encodes the DCE display engine version times
      % ten (60-69 = DCE 6.x, 80-89 = DCE 8.x), so use floor, not round,
      % to map it onto the family number.
      DCEDisplayEngineVersion=wInfo.GPUMinorType/10;
      switch floor(DCEDisplayEngineVersion)
         case 6
            displayGPUFamily='Southern Islands';
            % Examples:
            % AMD Radeon R9 M290X used in MacBook Pro (Retina, 15-inch, Mid 2015)
            % AMD Radeon R9 M370X used in iMac (Retina 5K, 27-inch, Late 2014)
            ditherCLUT=61696;
         case 8
            displayGPUFamily='Sea Islands';
            % Used in the HP ZBook laptop.
            ditherCLUT=xxx; % value not yet known
      end
end
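i assume the chosen value then gets applied with the Screen() 'Dithering' command, something like this (guessing that the 'ConfigureDisplay' form is the right call):

% Apply the gpu specific dither setting to the screen the window is on.
screenNumber=Screen('WindowScreenNumber',window);
Screen('ConfigureDisplay','Dithering',screenNumber,ditherCLUT);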



Denis Pelli
Professor of Psychology & Neural Science, New York University
+1-646-258-7524 | denis.pelli@... | http://psych.nyu.edu/pelli/ | Skype: denispelli | http://denispelli.com
