Failed to choose the 12-bit precision window on AMD graphics card

Good. The output suggests proper operation, as far as the software can judge this on its own.

Unfortunately, the feedback came a few hours too late for the latest PTB release, so that one now ships with warnings against RDNA GPUs. But better late than never. As my vacation starts tomorrow, I can't fix this right now.

If you have a photometer supported by PTB, the script MeasureLuminancePrecision.m can automate the task of finding the effective bit depth on SDR displays, to make it less tedious. You'd need to change line 135 from o.nBits=10; to o.nBits=16; to request 16 bpc framebuffers, for up to 12 bpc effective precision, instead of the default 10 bpc framebuffers.
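In other words, the edit would look like this (illustration only; the exact line number and surrounding parameters may differ between PTB versions):

    % MeasureLuminancePrecision.m, around line 135:
    o.nBits = 16;   % was: o.nBits = 10; request 16 bpc framebuffers for up to ~12 bpc effective precision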

The script will sample a subrange of luminances around the 50% grey point, or whatever range you set by tweaking lines 132-134. At the end, models for different bit depths are fitted to the data, and the best-fitting model suggests the effective bit depth of the display. In your case it should come out at 12 bpc.
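The underlying idea is roughly the following. This is only a minimal sketch with synthetic data and ideal quantization, not the actual code from MeasureLuminancePrecision.m, which fits a more complete model (including gain/offset) to your real photometer readings:

    % Sketch: quantize the requested values at each candidate bit depth and
    % see which quantization model reproduces the measurements best.
    requested = linspace(0.4998, 0.5002, 64);                 % hypothetical sample range around 50% grey
    measured  = round(requested * (2^12 - 1)) / (2^12 - 1);   % synthetic data from an ideal 12 bpc display
    candidateBits = 8:16;
    rmsErr = zeros(size(candidateBits));
    for i = 1:numel(candidateBits)
        levels = 2^candidateBits(i) - 1;
        model = round(requested * levels) / levels;           % what an ideal n-bit display would produce
        rmsErr(i) = sqrt(mean((measured - model).^2));
    end
    [~, best] = min(rmsErr);
    fprintf('Best-fitting effective bit depth: %d bpc\n', candidateBits(best));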

For HDR display modes there is HDRTest.m for a similar purpose.
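Its built-in MATLAB help should describe usage and requirements:

    % Show usage information for the HDR precision test script:
    help HDRTest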

Now that you may have one working driver to use in the end, it would still be good to try those other AMDVLK drivers, to figure out which driver version exactly broke things for RDNA GPUs. As one of your monitors has fewer than 64 video modes (PTB doesn't warn about the one connected to screen 2 / DisplayPort-4, which I guess is the projector), you might be able to run the other AMDVLK drivers without a crash if you disconnect the 4k HDR monitor temporarily.

Oh, and of course, once you use my custom-built driver, you should make sure that AMDVLK no longer gets updated automatically when AMD releases new AMDVLK versions, or the custom driver will be overwritten by broken drivers again on each automatic update (e.g., by putting the amdvlk package on hold in your package manager, such as sudo apt-mark hold amdvlk on Debian/Ubuntu, assuming it was installed from AMD's .deb packages).