test for dithering?

dear mario
thanks! super interesting.

i'd love to have a test for dithering, to be sure of what we're displaying. i made a quick try at setting my retina display black, except one pixel, and testing the luminance precision of that pixel, but i didn't get enough light. i think i'd need to use a microscope objective to collect enough light from the pixel.

would apple's spatial and temporal dither be defeated by measuring one static pixel on a black background? i'd guess that zero is below threshold, so that small perturbations around zero don't change the black. In that case an isolated pixel in a black field wouldn't benefit from spatial dither. if temporal dither is confined to successive frames, then making every other frame black would similarly defeat temporal dither. if this is right, then doing photometry on a single bright pixel in a zero background, which is turned on only for every other frame, should reveal the luminance precision of the hardware, unaided by dither.
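A minimal Psychtoolbox sketch of this proposed test (untested; the pixel position, gray level, and frame count are placeholder choices, not measured values):

```matlab
% Single bright pixel on an all-black field, drawn only on every other
% frame, as a probe of hardware luminance precision unaided by dither.
screenNumber = max(Screen('Screens'));
win = Screen('OpenWindow', screenNumber, 0);      % all-black background
[w, h] = Screen('WindowSize', win);
xy = [round(w/2); round(h/2)];                    % probe pixel at center
level = 200;                                      % test gray level, 0-255
for frame = 1:600
    if mod(frame, 2) == 0                         % pixel only on even frames
        Screen('DrawDots', win, xy, 1, [level level level]);
    end                                           % odd frames stay black
    Screen('Flip', win);
end
sca;
```

The photometer would then be aimed at the probe pixel; if the reasoning above is right, any luminance step finer than the 8 bit spacing of `level` would have to come from the panel hardware rather than from dither.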

the apple document mentioned color shifting, so they might be compromising hue to enhance precision of luminance, what chris tyler dubbed "bit stealing". Again, i think we might defeat that by using just one channel, e.g. green, and setting red and blue to zero to make them reliably black for minor perturbation.

hmm. if we knew that the dither was not stochastic and extended only a certain number of pixels spatially and frames temporally, then we could show a zero black background and upon that a sparse array of identical pixels, horizontally and vertically, shown on every other frame, to produce enough light for my photometer without needing a microscope objective. however, if the dither is random across those pixels then we'd be averaging across different values and failing to defeat the dither.
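Under the assumption that the dither kernel is bounded in space, the sparse-array variant might be sketched like this (untested; the 64 pixel spacing is an arbitrary guess at "sparse enough"):

```matlab
% Sparse grid of identical probe pixels on black, shown on alternating
% frames, to collect enough light for a photometer while keeping probes
% far enough apart that a bounded spatial dither cannot couple them.
screenNumber = max(Screen('Screens'));
win = Screen('OpenWindow', screenNumber, 0);      % black background
[w, h] = Screen('WindowSize', win);
spacing = 64;                                     % guessed safe separation
[xg, yg] = meshgrid(spacing:spacing:w-1, spacing:spacing:h-1);
xy = [xg(:)'; yg(:)'];                            % 2 x n dot positions
level = 200;                                      % test gray level, 0-255
for frame = 1:600
    if mod(frame, 2) == 0
        Screen('DrawDots', win, xy, 1, [level level level]);
    end
    Screen('Flip', win);
end
sca;
```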

what do you think?

is there a way to load an image and then freeze the panel, to prevent any temporal change, to block temporal dither?

best
denis


Denis Pelli
Professor of Psychology & Neural Science, New York University
+1-646-258-7524 | denis.pelli@... | http://psych.nyu.edu/pelli/ | Skype: denispelli | http://denispelli.com

On Sat, Jun 17, 2017 at 12:36 AM, mario.kleiner@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:

Denis, as a follow up, i found this article today in my newsticker:

https://www.livescience.com/59512-apple-imac-can-display-1-billion-colors.html

Citing this, ... "

An Apple spokesperson said the new iMac will use an algorithm that employs both temporal and spatial dithering. The former takes advantage of the human eye's tendency to mix two colors in close proximity to create a blend of the two, while the latter achieves the same effect by having a pixel flash between two colors very rapidly.

This essentially tricks the eye into thinking it sees more colors than the display is capable of producing, Mantiuk said. The trick is already widely used by software like Photoshop and to get 8-bit output on 6-bit panels, he said, adding that the vast majority of users likely won't be able to distinguish the output from that of a true 10-bit display.

"The human eye is very unlikely to spot any difference," he said.

For the small number of pro users for whom it does matter, such as Mantiuk, the Apple spokesperson said it's possible to connect a third-party display to get an end-to-end 10-bit experience."


So the grand new ($5000 base price!!!) iMac 2017 display, which allows "10 Bit color", is not actually a 10 bit display at all, but a standard 8 bit panel with the same dithering trickery used to fake 10 bit as in the older Macs. But if you still have money left after buying that machine, you can go and buy an actual 3rd party 10 bit display to get 10 bits!! Wow, am i underwhelmed again!


Motivated by this, i checked whether iFixit has a teardown of the 2015 iMac 27" 5K Retina, and they do:


https://www.ifixit.com/Teardown/iMac+Intel+27-Inch+Retina+5K+Display+Teardown/30260


There you can learn about the chips used in the panel controller board, and the best guess for what timing controller (tcon) they use points to a modified 8 bpc / 24 bpp DP 663 controller:


https://www.paradetech.com/products/dp663/


The photograph of the panel shows that the actual LCD panel is an LM270QQ1-SDA2 from LG. Googling for that gives us this website with specs for that panel:

http://www.panelook.com/LM270QQ1-SDA2_LG%20Display_27.0_LCM_parameter_23491.html


Which tells us that it is an 8 bpc panel, for 16.7 million colors / 24 bpp max.


So if this info is all correct, then none of the existing or upcoming Apple iMac computers has true 10 bit color display capabilities on their built-in displays at all. All they do is fake 10 bpc via their current proprietary dithering algorithm, while completely trashing display timing precision - or maybe, on the 2017 iMac, by using the built-in dithering of the gpu's display engine, which would only achieve fake 10 bpc instead of the fake 11 bpc you and i measured, but at least with less broken display timing.


That would mean it would make zero sense for them to use a 10 bit framebuffer at all, as they do on the iMac, as it wouldn't be any better than the 8 bit framebuffer on the MacBookPros. Maybe the whole reason they run with a "permanently on" 10 bit framebuffer on the "10 bit capable" iMacs is to simplify their code in case somebody connects an actual 10 bit panel from a 3rd party vendor.


Seems your HP Linux laptop will remain, at the moment, the only machine in your lab which provides actually trustworthy stimulus reproduction at true 10 bpc, with solid timing, and without the danger of introducing low-level confounds.


-mario



---In PSYCHTOOLBOX@yahoogroups.com, <denis.pelli@...> wrote:

dear mario
thanks! super interesting.

i'd love to have a test for dithering, to be sure of what we're displaying.  i made a quick try at setting my retina display black, except one pixel, and testing the luminance precision of that pixel, but i didn't get enough light. i think i'd need to use a microscope objective to collect enough light from the pixel.

would apple's spatial and temporal dither be defeated by measuring one static pixel on a black background? i'd guess that zero is below threshold, so that small perturbations around zero don't change the black. In that case an isolated pixel in a black field wouldn't benefit from spatial dither. if temporal dither is confined to successive frames, then making every other frame black would similarly defeat temporal dither. if this is right, then doing photometry on a single bright pixel in a zero background, which is turned on only for every other frame, should reveal the luminance precision of the hardware, unaided by dither.

-> No. I tested "11 bpc" half-float framebuffer mode on my MacBookPro 2010 with NVidia GeForce 330M under latest OSX 10.12, with a CRS Bits# connected, so i can actually read back what 8 bit dithered video signal gets out to the "display". One can only read one video scan-line at a time, so assessment of temporal dithering would be difficult, but one can vary which scan-line is read back to find out about spatial patterns. See attached M-Files: one looks at one scan-line only, with the whole display filled with the same constant gray value; the other displays a single isolated pixel on black, or a row of 3 pixels on black, sweeping over 3 scan-lines to see the neighborhood of the pixel.

The M-Files translate 0 - 255 test values into equivalent 0 - 2047 values for an 11 bpc framebuffer, and use Screen('ColorRange') so PTB translates those back into 11 bit quantized floating point values between 0.0 and 1.0 for rendering. A dither-free system should thereby present the input test values unmodified to the 8 bit video input of the Bits# device. This is all with an identity gamma table loaded; i verified it is indeed an 8 bit identity gamma table when using regular 8 bpc framebuffer mode, and by readback at the end of the script before leaving "11 bpc" mode, to exclude non-linearities from there. I also tried adding different offsets o onto the j * 8 + o grayscale value, to account for bias in the float -> fb conversion. It didn't make much difference.
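Based on this description, the core of the translation might look roughly like the following (a sketch reconstructed from the text above, not the attached M-Files; i'm assuming PsychImaging's 'EnableNative11BitFramebuffer' task is how the mode is enabled):

```matlab
% Map an 8 bit test value j (plus bias offset o) to its equivalent 11 bpc
% value, and let Screen('ColorRange') scale it back to a float in
% [0.0, 1.0], so a dither-free system would emit exactly j on its 8 bit
% video output to the Bits# device.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative11BitFramebuffer');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0);
Screen('ColorRange', win, 2047);       % interpret color values as 0-2047
j = 128; o = 0;                        % 8 bit test value and bias offset
v = j * 8 + o;                         % equivalent 11 bpc value
Screen('FillRect', win, [v v v]);      % uniform test screen
Screen('Flip', win);
```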

You see that an all-black screen gives black (RGB_target_0_0_0.txt), but already a gray level of 1 for the whole screen creates very funky spatial patterns of 0's and 1's, also modulated over time, i.e. the same image creates different output on successive Flips. The whole thing becomes even weirder for higher values (_55_55_55, _128_128_128), where a constant value of [R,G,B] = [128, 128, 128] translates into [127, 118, 104]; in other words, the red channel follows more or less the target value, but green and blue are way off. Even for a maximum white of [R,G,B] = [255, 255, 255] you get weird results like [255, 236, 208], i.e. errors in some color channels of up to 18%.

Using a single pixel, you don't get dithering in its neighbourhood, just weirdly distorted color values for the single pixel, even if its test value should be representable without any need to change anything (RGB_3linesweep_55_55_55). Similar for 3 horizontal pixels, all also modulated across Flips.

The diary outputs also contain runs with other values than the ones in the filename.

I assume the same Apple proprietary dither algorithm is used regardless of whether it is an AMD or NVidia gpu, but i can only test this on that one NVidia card.


the apple document mentioned color shifting, so they might be compromising hue to enhance precision of luminance, what chris tyler dubbed "bit stealing". Again, i think we might defeat that by using just one channel, e.g. green, and setting red and blue to zero to make them reliably black for minor perturbation.

-> There isn't an Apple document, only that web page, which is written for lay people to explain in simple terms what dithering is, plus a cited statement from Apple's PR department that the future iMac will use spatial and temporal dithering. So all we seem to know is that most likely all current Macs, including the "10 Bit capable" iMacs, fake precision via dithering - at least on the iMac's internal panel. Even the future extra expensive 2017 iMac. So the 2017 iMac is not worth its money if you care about actual >= 10 bit precision without potential confounds. Whether at least display timing would be better depends on whether that thing still uses Apple's own dithering method, or one of the methods built into AMD's display hardware.

hmm. if we knew that the dither was not stochastic and extended only a certain number of pixels spatially and frames temporally, then we could show a zero black background and upon that a sparse array of identical pixels, horizontally and vertically, shown on every other frame, to produce enough light for my photometer without needing a microscope objective. however, if the dither is random across those pixels then we'd be averaging across different values and failing to defeat the dither.

what do you think?

-> I don't know where you want to go with this. What's the point? You already know that dithering will be used for > 8 bpc modes, and you can't defeat it for any meaningful stimulus. If you could, you would just be back to 8 bpc output, which you can already get by using standard 8 bpc mode.

If you want true 10 bit color precision without trouble, you can use your HP Linux laptop. If you want more than 10 bit luminance precision, you could use the Linux laptop in 10 bit mode + our "bit-stealing" style PseudoGray method, for potentially up to 12.7 bits. Even on your Apple machines the PseudoGray method would give you 10.7 bits. And the properties of bit-stealing are probably better understood and documented than Apple's proprietary algorithm; at least you know there aren't spatial or timing effects, only slight colorization. And then there is the VideoSwitcher i mentioned in my e-mails to you, for use with a CRT monitor, and rather cheap.
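The bit-depth arithmetic behind these numbers can be checked with a quick back-of-envelope calculation (an illustration of the bit-stealing idea only, not PTB's actual PseudoGray code; the substep count is approximate):

```matlab
% Bit stealing inserts intermediate luminances between adjacent gray
% levels by stepping single color channels one at a time, ordered by
% their luminance weights. With roughly 6.5 usable substeps per gray
% level, the gain in luminance resolution is log2(6.5), about 2.7 bits.
substeps  = 6.5;                 % approximate usable substeps per level
extraBits = log2(substeps);      % ~2.7 bits of added luminance resolution
fprintf('8 bpc + bit stealing: %4.1f bits\n',  8 + extraBits);   % ~10.7
fprintf('10 bpc + bit stealing: %4.1f bits\n', 10 + extraBits);  % ~12.7
```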

is there a way to load an image and then freeze the panel, to prevent any temporal change, to block temporal dither?

-> No, not in a useful way for you. Psychtoolbox's stereo-resync function (Screen('Preference', 'SynchronizeDisplays', ...)) does shut down the display engines of all displays for a second, so that would be your "freezing" for a second. But not driving a display for more than a fraction of a second will cause all kinds of funny visual artifacts and a breakdown of the image. But i doubt temporal dither is used by Apple's proprietary method - too expensive. And hardware dithering isn't used at all; that we know for sure. What they seem to do is modulate properties of the spatial dither over stimulus updates, i.e. on successive Flips, even if the same image is flipped again, so it's something like a temporally modulated spatial dither. That would stop if you don't Flip.
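A toy model of such a temporally modulated spatial dither (a made-up illustration of the principle, not Apple's actual algorithm):

```matlab
% Quantize an 11 bit target level to 8 bits with a fresh spatial error
% pattern on each stimulus update: any single frame shows one fixed
% pattern, but the average over many updates approaches the target.
% Not Flipping would freeze one such pattern in place.
target  = (128 * 8 + 3) / 2047;       % 11 bit level between 8 bit steps
nFrames = 1000; acc = 0;
for frame = 1:nFrames
    pattern = rand(8);                % new spatial pattern per update
    out = floor(target * 255 + pattern) / 255;   % 8 bit quantized frame
    acc = acc + mean(out(:));
end
fprintf('average %.5f vs target %.5f\n', acc / nFrames, target);
```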

Btw., at least AMD's hardware dithering engines have various modes of operation, e.g., randomizing the spatial dither vs. not randomizing it, or applying high-pass filters, so that dithering affects high frequency components like pixels, lines and edges differently than lower frequency components, etc. As i said, this isn't used on the Macs in 11 bpc mode, but it is possible that Apple's algorithm implements similar treatments.

All to say, this is all rather troublesome if you want to present controlled low-level stimuli. I'd rather use actual high precision displays, like the one in your HP Linux laptop, and - if at all - only tricks on top of those whose algorithm is documented.

-mario




Dear Marc
Thanks. That's appealing, except for the need to buy a Windows machine.
Denis

On Tue, Jun 27, 2017 at 3:21 AM marc.repnow@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:

@Denis
If you have access to the video signal (i.e., via DVI output), which is only 8bpc and <=1920x1080@60Hz, you could use Epiphan's DVI2USB3.0 frame grabber (if somebody knows of a different USB grabber capable of 8bpc, I am all ears). I use this for verifying that there is actually no dithering, but so far only with Windows and Linux. Epiphan dropped support for Mac (the latest driver available is for MacOS 10.10 & 10.11), but one can always use a second PC for doing the grabbing. Obviously, measurements done with such a grabber can only tell what the graphics card delivers, leaving the monitor out of the equation.
Marc