I recently tried to get 10-bit color display accuracy using an ATI
7000 video card and OS X 10.2 on a Mac G5. I set the values in the
gamma table ranging from 0.0 to 1.0 in 1024 linear steps while
measuring the screen output with a photometer, and the results did not
show 10-bit accuracy (more like 8-bit accuracy). I then switched to an
NVIDIA GeForce FX 5200 video card. The results were different, but
still not 10-bit accuracy.
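To be concrete, here is roughly the kind of gamma-table load I mean, as a
minimal sketch using the Core Graphics call (illustrative only; my actual
measurement code differed in the details):

    /*
     * Sketch: load a 1024-entry linear ramp into the video card's
     * gamma table via Core Graphics, then measure the screen output
     * at each step with a photometer (measurement loop not shown).
     */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>

    #define TABLE_SIZE 1024

    int main(void)
    {
        CGGammaValue ramp[TABLE_SIZE];
        int i;

        /* Linear steps from 0.0 to 1.0 across 1024 entries. */
        for (i = 0; i < TABLE_SIZE; i++)
            ramp[i] = (CGGammaValue)i / (TABLE_SIZE - 1);

        /* Same table on all three channels (grayscale ramp). */
        CGError err = CGSetDisplayTransferByTable(CGMainDisplayID(),
                                                  TABLE_SIZE,
                                                  ramp, ramp, ramp);
        if (err != kCGErrorSuccess) {
            fprintf(stderr, "gamma table load failed: %d\n", (int)err);
            return 1;
        }
        /* ...display a test patch and read the photometer for each step... */
        return 0;
    }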
Has anyone actually had measurable 10-bit accuracy with a video card
that has 10-bit DACs?