Radeon card observations and questions

Hi Ben,

I have started playing with a Radeon card. Thanks for your work on this.

a) I integrated your SetClut etc. stuff into the master version of the
toolbox, so it will be distributed with the next release. The card I have
is in a new G4 Tower and was the result of asking Apple to put a Radeon
card in the PCI slot. The driver has a number between the two you check
for in your code, so I added it to the list. Preliminary checks indicate
that the driver indeed supports 10-bit stuff with whatever version of the
Radeon Apple put into my machine. I haven't yet actually measured light
output with a single step in the least significant bit to verify that it
really is 10 bits, but I will. Certainly the software behaves as if it is
correctly accepting the 10-bit input, but I don't think I could visually
distinguish the output of true 10-bit DACs from 8-bit DACs coded to work
with a 10-bit interface. I did once verify that the Radius cards really
do have 10-bit DACs.

b) My timing of the SetGamma and SetClut calls shows them to be very
slow. Indeed, the call takes very close to a whole frame time. Most of
the time it finishes before the next vertical retrace, but sometimes not.
Indeed, it is almost as if it waits for blanking both at the beginning and
end of the call but that sometimes my loop makes it back to the top before
the end of the blanking period. I'm not confident of this diagnosis and
haven't yet played with any of the optional preferences in SCREEN, nor
yet run ScreenTest to see what it thinks. (I had to go about 10 minutes
after my initial observations.) Do you have any insights or data on this
point?

c) I have some more G4s coming, and Apple will no longer build the Radeon
PCI cards in (as of this past Monday). ATI seems to sell a 7500 and an
8500. Do you know what the difference is, and whether either or both are
supported by the 10-bit drivers?

d) I wrote a routine that takes cluts in the range 0-1023 and, depending
on the hardware, converts them to the right form to pass to SetClut. For
Radius 10-bit, this means leaving them alone; for Radeon, it means
shifting 6 bits towards the high end; and for 8-bit hardware, it means
shifting the two least significant bits out. This routine eases writing
hardware-independent code. I'm not sure it's the exact right solution.
Once I understand a little more I'll send you what I've got.
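For what it's worth, the scaling rule described above can be sketched
roughly as follows. This is just an illustration in Python, not the
actual routine; the function name, the hardware tags, and the assumption
that the Radeon case targets the high end of a 16-bit word are all mine:

```python
def scale_clut(values, hardware):
    """Map CLUT entries given in the range 0-1023 to the form each
    device expects: Radius 10-bit takes the values as-is, Radeon gets
    the 10 bits shifted toward the high end of a 16-bit word, and
    8-bit hardware drops the two least significant bits."""
    if hardware == "radius10":
        return list(values)                  # already 10-bit, leave alone
    elif hardware == "radeon":
        return [v << 6 for v in values]      # shift 6 bits toward high end
    elif hardware == "8bit":
        return [v >> 2 for v in values]      # drop two least significant bits
    else:
        raise ValueError("unknown hardware: %s" % hardware)


if __name__ == "__main__":
    print(scale_clut([0, 512, 1023], "radius10"))  # unchanged
    print(scale_clut([1, 1023], "radeon"))         # [64, 65472]
    print(scale_clut([1023], "8bit"))              # [255]
```

Whether truncating (rather than rounding) is the right choice for the
8-bit case is exactly the kind of detail I'm unsure about.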

Best,

David
--- In psychtoolbox@y..., David Brainard <brainard@p...> wrote:
>
> Hi Ben,
>
> I have started playing with a Radeon card. Thanks for your work on this.

[snip]

> I haven't yet actually measured light output with a single step in the
> least significant bit to verify that it really is 10 bits, but I will.
> Certainly the software behaves as if it is correctly accepting the
> 10-bit input, but I don't think I could visually distinguish the output
> of true 10-bit DACs from 8-bit DACs coded to work with a 10-bit
> interface.

Hi Ben, David, all,

I've been setting up a 10-bit Radeon, too (thanks Ben!), and just
finished some informal measurements with a Tektronix photometer. Single
steps appeared to yield detectable changes in intensity, so good news. I
sampled a bunch of random values (extremes and near center), nothing all
that systematic, but at a first pass all was well.

-Alex

Alex Huk
Senior Research Fellow
Dept. of Physiology & Biophysics
University of Washington
web: www.shadlen.org/huk