Sync Trouble on Windows Laptop with Nvidia GeForce RTX 2060

GPUs

Mario lists the currently best-supported AMD cards in this post: Ubuntu + Nvidia sync trouble

I use Radeon Pro WX 5100 cards for all my lab machines; they work flawlessly with PTB and are fast enough for pretty complex displays (I only start to drop frames with just under ~5000 procedurally generated textures in 32-bit mode). These are GCN4 (Polaris) cards, the same family Mario uses. As they are Pro cards they support 10-bit output as standard (though PTB has special support for extending bit depth on other models). The Vega cards are newer, and Mario thinks they should be OK too.
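For what it's worth, requesting a 10-bit framebuffer on one of these cards is a one-line PsychImaging task. A minimal sketch (the screen choice and the ramp stimulus are just illustrative placeholders):

```matlab
% Sketch: request a native 10 bpc framebuffer on a supported card.
PsychDefaultSetup(2);                 % unified key names, 0-1 colour range
screenid = max(Screen('Screens'));    % placeholder: use your stimulus screen
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
win = PsychImaging('OpenWindow', screenid, 0.5);
% Draw a shallow luminance ramp to eyeball 10-bit vs. 8-bit banding:
ramp = repmat(linspace(0.49, 0.51, 1024), 256, 1);
tex  = Screen('MakeTexture', win, ramp, [], [], 2); % float precision texture
Screen('DrawTexture', win, tex);
Screen('Flip', win);
KbStrokeWait;
sca;
```

If the driver or panel can't actually do 10 bpc, PTB will complain at window-open time, which is itself a useful check.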

Displays

Regarding displays (yes, CRTs are impossible to find, and old ones pulled from storage are starting to show significant nonlinearities due to their age), it really depends on the type of task and the expectations you have for your research. If you are just showing pictures with absolutely no tight timing requirements (i.e. it doesn’t matter if something is ±50 ms), and your experimental aims will not be impacted by poor image reproduction, then any modern flat panel is going to be fine (for better or worse, many current studies are published using generic consumer flat panels these days, even ones that I would argue shouldn’t be).

If you do care about stimulus fidelity or timing, then at a minimum you will want to ensure the panel is both 10-bit (supporting finer luminance steps and wider colour gamuts) and a gaming display (where extreme and variable latency is somewhat controlled). 10-bit IPS gaming displays (example) are quite new, so no one has had much chance to study them properly (but I suspect they will be a step forward from older technologies). I develop with a 2019 144 Hz gaming display that renders motion well with low latency, though its bit depth isn’t great. But many of these displays apply dynamic, nonlinear “consumer” tweaks that are antithetical to vision-research aims. Professional graphics monitors will be more linear and colour-accurate for stimulus reproduction, but timing will be more of an issue (perhaps monitors aimed at e.g. video editors will handle this better). More exotic technologies are not really in the mix yet…

So, if you are hoping to specialise in “quality” vision research, then you really should get a professional research display: either the Display++ (the most affordable), the more expensive ViewPixx, or the gorgeous but very expensive ProPixx. These displays are very carefully calibrated, and they guarantee high bit depths and deterministic timing. They are explicitly supported by PTB, and they handle things like precise synchronisation with other equipment. They are still not perfect, but this is as good as it gets at the present moment. The higher cost is more than offset if you imagine a reviewer who may well question data from a consumer display whose defects could potentially account for some of your findings…
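As an illustration of what “explicitly supported” means: driving, say, a Display++ in its high-bit-depth Mono++ mode is again just an extra PsychImaging task, rather than something you wire up by hand. A sketch (screen choice is a placeholder):

```matlab
% Sketch: open a window on a CRS Display++ in Mono++ mode
% (high bit-depth greyscale); PTB's imaging pipeline does the encoding.
PsychDefaultSetup(2);
screenid = max(Screen('Screens'));    % placeholder: the Display++ output
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableBits++Mono++Output');
win = PsychImaging('OpenWindow', screenid, 0.5);
% ... draw luminance-defined stimuli in the normalised 0-1 range ...
sca;
```

The ViewPixx/ProPixx have equivalent `PsychImaging` tasks via the Datapixx support, so your stimulus code stays essentially the same across these devices.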

Computer + OS

If you are planning to do serious research, then you really should consider switching the OS to Linux. I use Dell workstations dual-booted with Windows 10, and I can confirm that PTB runs much better under Linux. This is a huge factor in having a stable and productive PTB system, so I really encourage everyone to follow Mario’s advice. Almost all systems can be dual-booted, so there really is no excuse to keep using Windows for PTB (unless there is something Windows-only that you have to run simultaneously on the same machine). The peace of mind of knowing all the fidelity checks are optimal is worth the cost of learning and adapting to a new OS.
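Those fidelity checks are easy to run yourself on any candidate OS/driver combination: keep PTB’s startup sync tests enabled and count missed flip deadlines over a few hundred frames. A minimal sketch (frame count and the black/white probe are arbitrary placeholders):

```matlab
% Sketch: sanity-check flip timing on the current OS/driver/GPU combo.
Screen('Preference', 'SkipSyncTests', 0);   % never skip tests on a real rig
PsychDefaultSetup(2);
screenid = max(Screen('Screens'));
win = Screen('OpenWindow', screenid, 0.5);
ifi = Screen('GetFlipInterval', win);       % measured refresh interval
nFrames = 300; missed = 0;
vbl = Screen('Flip', win);
for i = 1:nFrames
    Screen('FillRect', win, mod(i, 2));     % alternate black/white probe
    % Fourth return value of Flip is > 0 when the deadline was missed:
    [vbl, ~, ~, miss] = Screen('Flip', win, vbl + 0.5 * ifi);
    missed = missed + (miss > 0);
end
fprintf('Refresh %.3f ms, missed %d of %d deadlines.\n', 1000*ifi, missed, nFrames);
sca;
```

On a well-set-up Linux box this should report zero (or near-zero) missed deadlines; on Windows you will often see why Mario recommends the switch.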