More monitor calibration issues

I've created a web page that demonstrates the 8-bit output of the ATI
Radeon 9200 Mac Edition card's DVI-I port (even with the 15-pin VGA
adapter) and the 10-bit output of the VGA port. If you'd like to see
my data, look here:

http://www.lifesci.ucsb.edu/~mrowe/MonitorIssues.html
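
In case it helps to see the idea behind the measurement, here is a
rough sketch (in Python) of the kind of test I mean: step a gray patch
through many finely spaced settings and count how many of them produce
a measurably different luminance. The set_gray_level and
read_luminance functions here are just placeholders for whatever
display and photometer code you actually use, not real library calls.

    # Sketch: estimate the effective DAC depth by counting distinct
    # luminance steps.  set_gray_level() and read_luminance() are
    # hypothetical placeholders for your own display/photometer code.
    def count_distinct_steps(set_gray_level, read_luminance,
                             n_levels=1024, tolerance=0.001):
        """Step a gray patch through n_levels settings and count how
        many give a measurably different luminance.  Roughly 256
        distinct steps suggests 8-bit output; roughly 1024 suggests
        10-bit."""
        distinct, last = 0, None
        for i in range(n_levels):
            set_gray_level(i / (n_levels - 1))   # requested level in [0, 1]
            lum = read_luminance()               # e.g. cd/m^2 from a photometer
            if last is None or abs(lum - last) > tolerance * max(lum, 1e-9):
                distinct += 1
                last = lum
        return distinct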

The other issue I mentioned previously is also spelled out there.
Perhaps this is common knowledge, but I was startled by the size of
the effect, so if it isn't common knowledge here, I want to make it
so. If you try to fill the screen at high intensity, the three
phosphors are no longer independent (at least for my hardware/software
combination; I'd be interested to hear if others can confirm this
behavior with different hardware). If one phosphor is set at a high
level and you increase the level of either of the other two phosphors,
the intensity from the first one drops. You can see this very
dramatically if you try to measure a gamma function for all three
phosphors yoked together (i.e., for every measurement, each has the
same value) across the entire screen. Where the intensity should be
rising steeply, the curve instead flattens out. At low intensities,
the gamma function does not depend on the size of the rectangle you're
filling.
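
If it's useful, here is a minimal sketch (in Python) of the comparison
I'm describing: measure a yoked (R = G = B) gamma function once for a
full-screen fill and once for a small centered patch. The fill_rect
and read_luminance functions are placeholders for your own display and
photometer code, not real library calls.

    # Sketch: yoked-gun gamma measurement for two patch sizes.
    # fill_rect() and read_luminance() are hypothetical placeholders.
    def measure_yoked_gamma(fill_rect, read_luminance, rect, n_steps=17):
        """Return (settings, luminances) with all three guns driven at
        the same value for each measurement."""
        settings, lums = [], []
        for i in range(n_steps):
            v = i / (n_steps - 1)
            fill_rect(rect, (v, v, v))    # same value on R, G and B
            settings.append(v)
            lums.append(read_luminance())
        return settings, lums

    # full  = measure_yoked_gamma(fill_rect, read_luminance, FULL_SCREEN_RECT)
    # patch = measure_yoked_gamma(fill_rect, read_luminance, SMALL_CENTER_RECT)
    # If the phosphors interact at high drive, the full-screen curve will
    # flatten near the top while the small-patch curve keeps rising.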

If you use CalibrateMonSPD and accept the default settings for the
background, your calibration will be affected by this problem. To be
safe, whenever you can measure your actual stimuli directly (rather
than assuming they are what the SettingsToPrimary or SettingsToSensor
functions compute them to be), do so.
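
As a rough illustration (in Python) of the kind of check I mean, you
could present each stimulus with the same spatial layout you'll use in
the experiment, measure it, and compare the measurement against what
your calibration predicts. Here predict_sensor, show_stimulus, and
read_sensor are hypothetical placeholders, with predict_sensor
standing in for whatever SettingsToSensor-style conversion your
calibration provides; the comparison is simplified to a single scalar
sensor value.

    # Sketch: flag stimuli whose measured values stray from the
    # calibration's predictions.  All three function arguments are
    # hypothetical placeholders for your own code.
    def check_calibration(stimuli, predict_sensor, show_stimulus,
                          read_sensor, rel_tolerance=0.02):
        """Return (settings, predicted, measured, error) for stimuli
        whose measurement deviates by more than rel_tolerance."""
        problems = []
        for settings in stimuli:
            predicted = predict_sensor(settings)
            show_stimulus(settings)      # use your real spatial layout here
            measured = read_sensor()
            error = abs(measured - predicted) / max(abs(predicted), 1e-9)
            if error > rel_tolerance:
                problems.append((settings, predicted, measured, error))
        return problems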

I hope the web page makes this more clear...

Finally, I'd like to apologize to Steve Elliott for my confusion over
his identity. Yahoo masked the address of the Steve who started the
OS9/Radeon 9200 thread, and it didn't occur to me that it was a
different Steve who answered my message after I jumped in. I also
want to thank Steve and Ed for explaining so much about DVI. I found
that helpful.

--
Mickey P. Rowe (mrowe@...)