I have a script that creates a certain number of black dots on a grey background. I would like to be able to set the contrast of the dots against the background to a specific value (for example a Weber contrast of -0.8). There doesn't seem to be an easy way to do this. What I have so far is:
This basically just assumes that an RGB of [0 0 0] for the dots corresponds to a contrast of 1, and [255/2 255/2 255/2] to a contrast of 0, and sets the dot colour somewhere between those values as a proxy for altering the contrast.
This, however, has no relation to the Weber Contrast formula really. Ideally I’d be able to calculate the luminance of the background and set the luminance of the dots accordingly.
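For what it's worth, the Weber formula itself is easy to invert once you have luminance under control. A minimal sketch, assuming a gamma-linearized display where a normalized colour value in [0, 1] is proportional to screen luminance (which, as the replies below note, you would have to verify with a photometer):

```matlab
% Sketch only: assumes the display has been gamma-linearized, so a
% normalized colour value in [0, 1] is proportional to luminance.
% Weber contrast: C = (L_dot - L_bg) / L_bg  =>  L_dot = L_bg * (1 + C)
bgValue  = 0.5;                     % mid-grey background (normalized)
weberC   = -0.8;                    % desired Weber contrast (a decrement)
dotValue = bgValue * (1 + weberC);  % 0.1 on a linearized display
% In 8-bit terms this would be round(dotValue * 255) = 26 -- but only if
% the monitor really is linear; on an uncorrected display the physical
% luminance ratio will not match the nominal contrast.
```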
Is there some functionality that would allow me to set the contrast?
When you say luminance, I assume you mean the actual luminance of those pixels on the screen, as measured with a photometer? No, you can't do what you want purely in code; you'd have to measure (and probably linearize, to make things simpler) your screen first to have the right information to feed into your code.
Again, what Diederick says, but to simplify the code and approach, have a look at, e.g., AdditiveBlendingForLinearSuperpositionTutorial.m on how to use a unified 0-1 color system and alpha blending. This works with all drawing primitives, not just textures.
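For reference, a bare-bones sketch of the PsychImaging setup that tutorial uses to get a unified 0-1 colour range. The task names are real PsychImaging tasks, but the gamma value below is a placeholder, not a measured one; you still need to measure your own display:

```matlab
% Sketch: unified 0-1 colour range with gamma correction in the imaging
% pipeline. The gamma of 2.2 is a placeholder -- measure your display.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'FloatingPoint32BitIfPossible');
PsychImaging('AddTask', 'General', 'NormalizedHighresColorRange');
PsychImaging('AddTask', 'FinalFormatting', 'DisplayColorCorrection', 'SimpleGamma');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0.5); % 0.5 = mid-grey
PsychColorCorrection('SetEncodingGamma', win, 1 / 2.2);        % placeholder gamma
% With this pipeline, colours are specified in 0-1, so dots at a nominal
% value of 0.1 on a 0.5 background give a Weber contrast of -0.8,
% e.g. Screen('DrawDots', win, xy, 8, 0.1);
```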
-mario
Hmm, I understand now that luminance can't be calculated purely in code. But then what do people mean when, in papers, they refer to stimuli being drawn at 80% contrast, or to using a visual mask at 90% contrast?
It means whatever they state in their methods. Ideally papers should state whether they are referring to Weber / Michelson or some other measure. Not all papers do, and we as readers are left to best guess what they likely mean.
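To make the ambiguity concrete, here are the two most common definitions applied to the same pair of (illustrative, made-up) luminances; note they give quite different numbers for the same stimulus:

```matlab
% Two common contrast definitions; which one a paper means should be
% stated in its methods, but often isn't.
Lbg  = 50;                                  % example background, cd/m^2
Ldot = 10;                                  % example dot luminance, cd/m^2
weber = (Ldot - Lbg) / Lbg;                 % Weber: -0.8 (signed, decrement)
Lmax = max(Ldot, Lbg);
Lmin = min(Ldot, Lbg);
michelson = (Lmax - Lmin) / (Lmax + Lmin);  % Michelson: ~0.67 (always 0..1)
```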