Disabling dithering on recent Nvidia GPUs?


I have an Nvidia RTX A4000 that seems to have two layers of dithering enabled. The first is the typical driver-level dithering, which can be disabled under Linux and, on Windows, with the application Color Control; I can verify it is off with a lossless capture card (DVI2PCIe). But there appears to be a second, lower-level layer of dithering that the capture card cannot detect. The RTX A4000 is the professional version of the RTX 3070. Does anyone have experience completely disabling dithering on these cards under Linux or Windows? It almost seems as if the dithering is part of the hardware: if I swap in an older Nvidia GPU, like a Kepler-based Quadro K4200, there is no dithering on the screen and it produces a 100% still, flat image.
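The capture-card check described above can be automated: with a static source and a lossless capture path, any frame-to-frame pixel change can only come from GPU-side dithering or noise. A minimal sketch, assuming the captured frames are already available as arrays (the capture step itself, and the function name, are my own illustration, not part of any capture-card API):

```python
# Sketch: detect temporal dithering in captures of a static test image.
# Assumes frames are numpy arrays of shape (H, W) or (H, W, C) grabbed
# from a lossless capture device; the capture step itself is not shown.
import numpy as np

def has_temporal_dithering(frames):
    """Return True if any pixel changes between consecutive frames.

    With a static source and a lossless capture path, a frame-to-frame
    difference can only come from dithering (or other GPU-side noise).
    """
    frames = np.asarray(frames)
    # Widen before differencing so uint8 subtraction cannot wrap around.
    diffs = np.diff(frames.astype(np.int32), axis=0)
    return bool(np.any(diffs != 0))

# Synthetic demo: a perfectly still 4x4 gray image over 5 frames...
still = np.full((5, 4, 4), 128, dtype=np.uint8)
print(has_temporal_dithering(still))      # False

# ...versus the same image with its least-significant bit toggling on
# alternate frames, the classic signature of temporal dithering.
dithered = still.copy()
dithered[1::2] += 1
print(has_temporal_dithering(dithered))   # True
```

In practice you would feed this a few hundred consecutive captures of a flat test pattern; zero differences across all of them is what the Kepler card produces here.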

But how do you know that there is still dithering if you cannot detect it via DVI2PCIe?

Because the colors produced on the screen look different when using the RTX A4000: there is more noise, and the colors return to normal when I switch back to the K4200. I can even run both GPUs at the same time, and how the colors are rendered depends on which GPU the monitor is connected to. I'm using a Dell UP2720Q (the UP version, a true 10-bit panel), but I suspect the RTX A4000 is trying to use dithering to achieve something beyond 10-bit color, which I don't want, nor can I find a way to control it. Whatever it's doing, I'm not able to capture this activity through the DP connection.
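If the card really is dithering to emulate extra bit depth, the mechanism is easy to sketch: toggling the least-significant bit over time makes the temporal average land between two representable output levels. A toy simulation of that idea (shown as 10-bit values sent over an 8-bit output for simplicity; the numbers and the function are purely illustrative, not a claim about what the A4000 actually does):

```python
# Toy model of temporal dithering: an 8-bit output approximates a 10-bit
# value by flickering between two adjacent codes, so the time average
# lands on the in-between level. Purely illustrative.
import random

def temporally_dither(value_10bit, n_frames, seed=0):
    """Emit n_frames of 8-bit codes whose average approximates value/4."""
    rng = random.Random(seed)
    target = value_10bit / 4.0      # ideal (fractional) 8-bit level
    base = int(target)
    frac = target - base            # fraction of frames bumped up 1 LSB
    return [base + (1 if rng.random() < frac else 0)
            for _ in range(n_frames)]

codes = temporally_dither(515, 10000)      # 515/4 = 128.75
print(min(codes), max(codes))              # flickers between 128 and 129
print(sum(codes) / len(codes))             # time average near 128.75
```

That flicker between adjacent codes is exactly the low-level noise a sensitive viewer (or a per-frame diff of captures) would pick up on a static image, even though each individual frame is a valid, in-gamut picture.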