Values from 'ReadNormalizedGammaTable'

Hi there, I would like to ask a couple of questions to try to understand the Read/Load Gamma Table functions.

Firstly: Given the non-linearity of human perception, why does the gamma table matrix produced by 'ReadNormalizedGammaTable' contain linearly increasing values? My (albeit naive) logic would lead me to expect a non-linear progression that follows the display's gamma power function. I am obviously misunderstanding what is happening here.

My assumption was that an RGB value of (10,10,10) would have a gamma table entry of, for example, (0.04, 0.04, 0.04), but RGB = 20 would not necessarily map to 0.08 — I expected something resembling (20/255)^gamma instead.

Does that make sense? What is actually happening here?
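To make the question concrete, here is a small sketch of the two interpretations (plain Python/NumPy, purely illustrative — the gamma value of 2.2 and the 256-entry table size are my assumptions, not anything read from the hardware):

```python
import numpy as np

GAMMA = 2.2   # assumed display gamma, for illustration only
LEVELS = 256  # assuming an 8-bit table with entries for RGB values 0..255

# What I actually see returned: a linear ramp from 0 to 1
linear_table = np.linspace(0.0, 1.0, LEVELS)

# What I naively expected: entries following the display's power function
expected_table = (np.arange(LEVELS) / (LEVELS - 1)) ** GAMMA

# The entry for RGB = 10 under each interpretation
print(round(linear_table[10], 3))    # ~0.039, close to the 0.04 above
print(round(expected_table[10], 3))  # ~0.001, far smaller
```

So the linear ramp gives 10/255 ≈ 0.039 for RGB = 10, which matches what I see in the matrix, whereas a power-law table would give a much smaller value — hence my confusion.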

Secondly: When I adjust the system display gamma using the 'Graphics Properties' control panel on the desktop, the appearance of the screen changes, but the values returned by 'ReadNormalizedGammaTable' do not. Nor does manipulating any of the other controls (e.g. contrast) alter them.

I should point out I am using an Acer Aspire 5630 Laptop. Would the fact that I am using a laptop with an integrated display make any difference?

Sorry if this post is a little unclear...