Ugly gratings: hardware issue? software?

Thanks, the license key is now activated and validated, just before the Christmas vacation.

According to my Google search, the “HP 800 G3 AiO” comes in multiple variants, with different display types and resolutions.

But the most important info is that this machine runs Windows-10 with Intel HD Graphics 630.

As a side note, a bit unrelated to your request: In general, my experience with Intel graphics on Windows-10 is that the Intel graphics drivers are very buggy, at least with respect to visual stimulation timing. On my own Windows test machine (formerly Windows-10, now Windows-11) with an Intel UHD 620 - which is basically the same chip as yours, just a few percent slower - I never managed to get reliable visual timing; PTB sync tests fail about 9 out of 10 times, so far unfixably, even on my own setup. On Linux, on the other hand, the Intel drivers are essentially perfect with respect to quality and performance. So if you require passing sync tests, you might have to upgrade or downgrade through various versions of the Intel drivers. Or upgrade to Linux if possible, for a much happier life with that graphics chip.

Back to your topic: I ran your test script on both Windows-11 and Ubuntu 20.04.3-LTS Linux, on a Microsoft Surface Pro 6 tablet, at its native 2736 x 1824 pixels resolution and also scaled to 1920 x 1080 pixels. On Linux, to my “naive observer” eye, the results looked like perfect gratings. On Windows-11, on the same hardware, some stripes of the grating looked a tiny bit weird, but only very subtly, so I wasn’t sure if the weirdness was real or just a placebo or bias on my side; neither was my girlfriend, as a naive observer, when looking at it.

Note that the “PseudoGray” method you tried to use to get up to 10.7 bits of grayscale precision is identical to BitStealing, from what people have told me. It requires a well linearized monitor, and it was designed for use on CRT monitors with 8 bpc analog VGA input. It is entirely possible that results on an LCD monitor are less good: those LCD panels only simulate a “CRT monitor like” gamma response curve, and given that subtle details matter here, it could be that the method just doesn’t work well on your computer’s LCD display, or needs more stringent calibration.
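For reference, requesting PseudoGray output in Psychtoolbox looks roughly like this (a minimal sketch; your script presumably already does something equivalent):

    % Minimal sketch: request PseudoGray / BitStealing output, for up to
    % ~10.7 bits of effective grayscale precision on a well calibrated display.
    PsychImaging('PrepareConfiguration');
    % Process stimuli in a 32 bpc floating point framebuffer, so precision
    % is not lost before the PseudoGray output formatter runs:
    PsychImaging('AddTask', 'General', 'FloatingPoint32BitIfPossible');
    PsychImaging('AddTask', 'General', 'EnablePseudoGrayOutput');
    w = PsychImaging('OpenWindow', 0, 0.5);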

Another issue could be that our GPU-accelerated gamma correction via the PsychImaging DisplayColorCorrection task applies before the PseudoGray precision boosting method, not after it. I think you should drop that method of gamma correction from the script and instead use Screen('LoadNormalizedGammaTable', w, lut); with a suitably computed linearization lut for gamma 2.2, for possibly better results, e.g., as in the sketch below.
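A minimal sketch of that, assuming a display gamma of 2.2 (a gamma value measured on your own display with a photometer would be better than this assumed one):

    % Build an inverse-gamma lookup table to linearize a display with an
    % assumed gamma of 2.2, then upload it to the onscreen window 'w':
    gamma = 2.2;
    lut = repmat(((0:255)' / 255) .^ (1 / gamma), 1, 3); % 256 rows x 3 RGB columns
    Screen('LoadNormalizedGammaTable', w, lut);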

Another thing to try would be connecting your analog VGA CRT monitor to the new machine via an active DisplayPort-to-VGA or HDMI-to-VGA adapter, to see if the different properties of the LCD panel versus the CRT monitor are the reason for the problem, given that the method was originally designed for CRT displays.

The question is also: what precision do you actually need? Only grayscale, or also color precision?

If you can use a CRT monitor, and assuming this is not an Intel graphics driver bug, the PseudoGray method is the cheapest way to get ~10.7 bits on a well calibrated setup.
For a few hundred dollars, there’s also Xiangrui Li et al.’s “VideoSwitcher” from https://lobes.osu.edu/videoSwitcher/. It only works with analog VGA CRT monitors, but has high quality built-in Psychtoolbox support via the ‘VideoSwitcher’ tasks in the AdditiveBlending… tutorial. It supposedly can provide close to 15 bits of grayscale resolution, according to the authors (“precision is increased by up to 7 bits”, says the manual).
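If you went that route, enabling the device looks roughly like this (a sketch of the simple luminance mode; see the AdditiveBlending… tutorial and the PsychImaging help text for the calibrated mode and device specific parameters like the blue-to-red ratio):

    % Minimal sketch: drive the VideoSwitcher in simple luminance output mode.
    PsychImaging('PrepareConfiguration');
    PsychImaging('AddTask', 'General', 'FloatingPoint32BitIfPossible');
    PsychImaging('AddTask', 'General', 'EnableVideoSwitcherSimpleLuminanceOutput');
    w = PsychImaging('OpenWindow', 0, 0.5);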

With a modern AMD graphics card, if adding one is an option for that machine, you might fix the problem, if the cause is an Intel graphics driver bug instead of a display problem. And with AMD graphics - on Linux only! - you can get 10 bpc, or even up to 12 bpc (if the panel is of high enough quality), grayscale or color precision via spatial dithering on an 8 bit panel; 10 bpc native or 12 bpc dithered on a 10 bit panel; or 12 bpc native on a true 12 bpc panel.

The panel of your machine is most likely only 8 bit native, so 10 bpc or 12 bpc would only be achievable via spatial dithering.

On Windows, 10 bpc can be achieved via spatial dithering on certain models of higher priced AMD or NVidia cards.
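For completeness, requesting such a deep color framebuffer looks like this (a sketch; whether you actually get 10 bpc output - native or dithered - depends on GPU, driver and operating system, as described above):

    % Minimal sketch: ask for a native 10 bpc framebuffer. On suitable AMD
    % hardware under Linux, the driver can dither this onto an 8 bit panel:
    PsychImaging('PrepareConfiguration');
    PsychImaging('AddTask', 'General', 'FloatingPoint32BitIfPossible');
    PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
    w = PsychImaging('OpenWindow', 0, 0.5);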

So things to try cheaply:

  • Does it work better with a connected CRT monitor?
  • Does an Intel graphics driver update/downgrade on Windows improve anything?

Otherwise, is upgrading to Linux an option?

More expensive:

  • VideoSwitcher, if a CRT monitor can be used.

If no CRT monitor can be used:

  • The Intel chip on Linux, to avoid the graphics driver bugs, for maybe better results with your panel, plus high quality timing.

  • An AMD graphics card on Linux, for 10 (maybe even 12) bpc dithered precision with your display, plus high quality timing.

  • Some higher end / more expensive NVidia Quadro or AMD FirePro graphics cards on Windows, for maybe 10 bpc dithered precision, and maybe ok timing, depending on luck with your Windows setup.

These are roughly your theoretical options in the fast-to-test / free / cheap / low price range.
So what are your practical options or constraints?

-mario
[Priority support nominally used up at 1 hour so far]