Checking luminance persistence programmatically (is it possible?)

Hello,
I am using moving dot stimuli, and I need to check whether the monitor luminance decays fast enough that it does not leave a trace of the dot after it moves. Is there a way to "read" the monitor's physical output?
As far as I understand (please correct me if I am wrong), Screen('GetImage') would give me the image sent to the monitor, but no feedback from the monitor itself.
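For reference, this is the kind of check I was imagining (a minimal sketch, the window and dot parameters are just placeholders), which of course only reads back the framebuffer rather than what the panel emits:

```
% Minimal sketch: draw a dot, flip, and read back the framebuffer.
% This returns what was SENT to the monitor, not what the panel emits.
win = Screen('OpenWindow', max(Screen('Screens')), 128);   % mid-grey background
Screen('DrawDots', win, [400; 300], 10, 255);              % single white dot
Screen('Flip', win);
img = Screen('GetImage', win);                             % readback of the drawn frame
Screen('CloseAll');
```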
Thank you.

No, the only solution is external hardware.

And doing this properly is not easy. You will need a high-speed photometer, and your probe placement and timing analysis must be able to properly evaluate the rise and fall times of such a spatiotemporal target (most timing tests use a static flash; a moving target is harder to measure). Most displays handle motion really poorly, given the variability of the luminance onset/offset transitions. Gaming monitors apply all sorts of non-linear overshoot manipulations to get "good" GtG timing results, but this does not eliminate the artefacts. A moving dot will generate an oriented luminance smear, which can activate non-direction-selective cells in V1, so you have to be really careful with how you interpret your results…
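To make the timing analysis concrete, here is a rough sketch of the kind of rise/fall estimate I mean, assuming you have already logged a photodiode/photometer trace into a vector at a known sample rate (the variable names, the sample rate and the 10-90% criterion are my assumptions, not a standard):

```
% Sketch: estimate 10-90% rise and fall times from a photodiode trace.
% Assumes 'trace' is a luminance vector sampled at 'fs' Hz, covering one
% dark -> bright -> dark transition at the measured screen location.
fs    = 20000;                          % sample rate in Hz (assumed)
lo    = min(trace);                     % dark baseline
hi    = max(trace);                     % bright plateau
t10   = lo + 0.10 * (hi - lo);
t90   = lo + 0.90 * (hi - lo);
iOn   = find(trace > t10, 1, 'first');  % leaves the baseline
iHigh = find(trace > t90, 1, 'first');  % reaches 90% of the plateau
iDrop = find(trace(iHigh:end) < t90, 1, 'first') + iHigh - 1;  % starts falling
iOff  = find(trace(iDrop:end) < t10, 1, 'first') + iDrop - 1;  % back near baseline
fprintf('Rise ~%.2f ms, fall ~%.2f ms\n', ...
    1000 * (iHigh - iOn) / fs, 1000 * (iOff - iDrop) / fs);
```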

You should consider building a pursuit camera:

And familiarising yourself with the many articles on the Blur Busters site.

Some sites, like RTINGS, review monitors using a pursuit camera and provide full plots of rise/fall times and the GtG matrix details, e.g. Gigabyte AORUS FO48U OLED Review - RTINGS.com provides https://www.rtings.com/assets/pages/aFvviB3d/charts-max-vrr-large.jpg etc.

Thank you, that is very useful.
Reading the page you linked from the RTINGS website, I understand the crucial parameter is the response time:

Like that example, the monitor I am using (VG248QE - Tech Specs|Monitors|ASUS Global) also has a 1 ms response time. Does this mean that motion blur should be negligible on it as well, or is it not that simple?

Not that simple… That review is for the FO48U, an OLED display, whose timing is much better than that of standard display technologies. Your display is a TN panel, and the stated response time can be pretty meaningless…

I have tested 240Hz monitors that claim <1ms response times yet show very extreme motion artifacts. Many displays overshoot aggressively when trying to "improve" response time (mostly they want to show off how fast they are, at the expense of display fidelity). Watching a movie or a game you will not notice, but with a plain background and moving dots it shows up. You can get several different kinds of defect.
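If you want to see this on your own display, something as simple as the following will often reveal the smear or overshoot "ghost" trailing the dot when you track it with your eyes (a rough PTB sketch; the dot size, speed and grey level are arbitrary):

```
% Sketch: a single dot drifting across a uniform grey field.
% Track the dot with your eyes and look for a trailing smear or a
% dark/bright overshoot trail behind it.
win  = Screen('OpenWindow', max(Screen('Screens')), 128);
ifi  = Screen('GetFlipInterval', win);
rect = Screen('Rect', win);
x    = 0; y = rect(4) / 2;
speedPixPerSec = 600;                                % arbitrary speed
vbl = Screen('Flip', win);
while x < rect(3)
    Screen('DrawDots', win, [x; y], 12, 255, [], 2); % anti-aliased dot
    vbl = Screen('Flip', win, vbl + 0.5 * ifi);
    x = x + speedPixPerSec * ifi;
end
Screen('CloseAll');
```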

Even on the new OLED monitors like that AORUS, where the rise and fall times are reliably sub-millisecond across most of the GtG sample points and the display is "steady-state" for the frame duration, blur can persist: with sample-and-hold presentation, your eye tracks the moving dot while each frame stays static, smearing it across the retina. One solution is a scanning backlight, forcing black frames in a manner similar to the luminance transients that characterise CRT displays (that is what both the Display++ and ViewPixx do at 120Hz). This seems to force our visual system to "reset" to each frame (rough software sketch below).
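You can approximate that black-frame idea in software on a fast monitor, though real scanning backlights like those in the Display++ / ViewPixx do it in hardware and much better. A sketch only, assuming a high-refresh panel (at low refresh rates the inserted black frames will just look like flicker):

```
% Sketch: crude software black-frame insertion on a high-refresh monitor.
% Stimulus on odd refreshes, black on even refreshes: the effective
% stimulus rate is halved, but each frame is held for less time.
win = Screen('OpenWindow', max(Screen('Screens')), 0);
ifi = Screen('GetFlipInterval', win);
vbl = Screen('Flip', win);
for frame = 1:600
    if mod(frame, 2) == 1
        Screen('FillRect', win, 128);                    % grey field
        Screen('DrawDots', win, [frame; 300], 12, 255);  % slowly moving dot
    else
        Screen('FillRect', win, 0);                      % inserted black frame
    end
    vbl = Screen('Flip', win, vbl + 0.5 * ifi);
end
Screen('CloseAll');
```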

Another is just pushing the frame rate up as far as you can. VPixx make a projector (the ProPixx) that can display greyscale motion at 1440Hz (using a clever PTB trick with both the color planes and frame segments), and it is probably the best display available for motion stimuli, but it is really expensive.

Commercial monitors also do other things that can be a disaster for vision research. For example, they can extend the "contrast range" by dropping the backlight luminance near black, but this causes a noticeable flash across the screen when a stimulus appears.

The poor representation of motion on displays is a real disaster for vision scientists. There is a fascinating visual phenomenon, the motion streak (see Geisler's paper Motion streaks provide a spatial code for motion direction | Nature and its back citations). The idea is that spatial integration can drive orientation-tuned neurons aligned parallel to the motion direction. But probing it in more detail ultimately requires a real motion display…