Display++ video card

Hi all -

My lab and two departmental setups just got Cambridge Research Systems Display++ monitors (120 Hz LCD, 1080p resolution). So I have three setups to build and configure for this system.

I initially set up a Late 2014 Mac Mini running Linux (Ubuntu 20.04), but the integrated graphics (Intel HD Graphics 5000) can’t keep up at 120 Hz without dropping many frames (if I drop the monitor to 60 Hz, it is fine).

So my question is: what system should I pick for driving these three new displays? I will have to buy three of them, so I am looking for something long-lasting and cost-effective.

I have a few similar-era Mac Minis (the last generation that can run Linux, because they don’t have the security chips). I like the Mac Minis because they seem to run forever without problems; I retire them before they give out on me. I can also draw from some 2013-era Dells (Xeons), although my experience with Dells is that they often develop some sort of problem within a three-year period.

I could buy an external GPU box for the MacMinis and get an AMD card, but I don’t know if anyone has experience using these on Linux. Any thoughts?

I could also add a modern video card to one of the 2013-era Dells. Anyone want to guess whether they would keep up, given a decent AMD video card?

Lastly, is anyone using a Display++ with a video card that they like? I plan to run Linux on either older Mac hardware or PC hardware.

Thanks
Steve

I have two Display++ units and I use them with AMD Radeon Pro WX5100 cards in standard Dell workstations, and they work flawlessly (no sync errors; I log all frame drops and don’t see any at 120 Hz, even with quite complex PTB stimuli). The WX5100 is fairly cheap, is from the same GPU family Mario develops with, and doesn’t need an extra power connector, so it can fit almost any form factor.

I would use NVMe SSDs and make sure you have enough RAM, but otherwise any homebrew computer should be fine, as Linux seems to work everywhere. I dislike my Dell workstations, but we can only buy “brand” computers with grant money; if I could, I would custom-build a smaller form factor chassis for a PTB machine (I do love the Mac Mini size)…
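
By frame drop logging I just mean checking the `missed` return value of Screen('Flip') on every frame. A minimal sketch of the idea (the window setup, loop length, and drawing are illustrative placeholders, not my actual code):

```matlab
% Minimal sketch: per-frame drop logging via Screen('Flip')'s
% 'missed' return value. Loop length and drawing are placeholders.
win = Screen('OpenWindow', max(Screen('Screens')), 128);
ifi = Screen('GetFlipInterval', win);   % nominal frame duration
vbl = Screen('Flip', win);
nFrames = 1200;                         % e.g. 10 s at 120 Hz
nDropped = 0;
for frame = 1:nFrames
    % ... draw the stimulus for this frame here ...
    [vbl, ~, ~, missed] = Screen('Flip', win, vbl + 0.5 * ifi);
    if missed > 0                       % deadline missed = dropped frame
        nDropped = nDropped + 1;
    end
end
sca;
fprintf('Dropped %d of %d frames.\n', nDropped, nFrames);
```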

I have no idea how a Mac + eGPU running Linux would fare, but it sounds far more likely to run into compatibility issues!

Thanks so much Ian! This is really helpful.

How large are the stimuli that you show? The 2014 Mac Mini was able to show small stimuli (say, 100 x 100 pixels) at 120 Hz but dropped frames for larger stimuli. I often show full-screen stimuli for 2-photon imaging experiments.

I’m going to get one of these video cards to play with on one of my older Dell machines.

Best wishes
Steve

I use ProceduralGarboriumDemo(ngabors) as a quick built-in PTB benchmark, and at 120 Hz I can go up to around 2500 gabors before any frames are dropped on the WX5100 in a Dell workstation released in 2014[1]; you can easily test this on your Mac Mini, as shown below. I can certainly do full-screen gratings etc. without issue (though I use procedural stimuli wherever possible, which are more efficient), and with my object-oriented stimuli I can layer at least 11 different stimulus types (movies, pictures, bars, coherent dots, gabors, color gratings) simultaneously without dropping any frames.
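
Testing is just a matter of calling the demo with increasing gabor counts and watching for reported dropped frames (the counts below are illustrative, taken from my own machine):

```matlab
% Raise ngabors until the demo starts reporting dropped frames.
ProceduralGarboriumDemo(500);    % easy for most modern GPUs
ProceduralGarboriumDemo(2500);   % around my WX5100 limit at 120 Hz
```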

There are more expensive Polaris-era cards like the WX7100 which should be even better, and Mario thinks the GPU generation after Polaris (Vega) should also work well with PTB, but I at least have never tested them… The latest AMD GPU generation (RDNA) hasn’t been tested by Mario yet, AFAIK…

Best, Ian


[1] With a WX5100 in a 2019 Dell workstation driving a 60 Hz monitor, I can get over 5000.

Thanks Ian. This is really helpful!
Best
Steve

For the fun of it: at 1920x1080@60 Hz I can run that demo with 3000 gabors, without dropping frames, on a 2017 MacBook Pro (about the last Apple MBP where one can run Linux with modest setup pain; the 2018 models introduced the T2 security chip and other hostilities) with a Radeon Pro 560 of the Polaris GPU family.

On my early-2019’ish 500 Euro middle-class PC with an AMD Ryzen 5 2400G processor (4 physical / 8 logical cores), 8 GB RAM, and AMD Raven Ridge onboard graphics, I can do 1500 gabors at 1920x1080@60 Hz without dropping frames. At 120 Hz the same machine can do 530 gabors. In all cases the CPU load is less than 10%, so the limiting factor is GPU performance.

The Raven Ridge processor’s integrated graphics uses a Vega 11 graphics core, so it is technically more advanced than the Polaris in the MBP.

One can still see the performance advantage of a discrete GPU with its own dedicated VRAM over onboard graphics, which has to use normal system RAM.

In general, AMD is currently ahead of Intel in both absolute performance and price/performance when it comes to processors with integrated graphics. So for small (Mac Mini style) PCs, AMD is currently a much better buy than Intel.

Wrt. AMD GPUs: yes, the successor of Polaris, named Vega, is expected to work as well as Polaris GPUs. Nothing I have read in tests suggests otherwise, and my 500 Euro PC has an onboard chip with a Vega graphics core, with no problems at all with PTB so far.

There is one difference between a discrete Vega graphics card (for plugging into a PC) and this integrated Vega chip: the display engine that brings the picture onto the screen is a DCN-1 display engine in my PC, whereas discrete Vega cards have an older-generation DCE-12 display engine. So DCE-12 is untested by myself. I expect it to work perfectly fine, but PTB’s special low-level display engine bag of tricks, which has to be adapted to each display engine generation, is there for DCE-12 but untested, so it could have bugs. I’d expect bugs to be unlikely, and easy for me to fix with the cooperation of a user, should the need arise.

Also, the low-level code is of much less importance with recent AMD GPUs and modern Linux versions, as the OS now does by itself most of what that code was used for. The code is mostly like an airbag in a car: you usually don’t need it, but it is a bit of extra assurance against driver bugs that might slip through my own testing.

Starting with the latest generations of AMD GPUs, i.e. Raven Ridge integrated graphics and the new RDNA / Navi GPUs, these have a completely redesigned display engine called DCN instead of the old DCE, and DCN is not supported at all by PTB’s low-level code atm. I don’t intend to add support for the time being, because there would be little extra benefit compared to the added implementation/maintenance cost; I would do it if the need really arose. The only feature that our DCE low-level tricks currently add to AMD graphics, compared to DCN on the latest generation, is some true 12 bpc high precision color support, at a substantial performance cost and with more involved setup steps for the machine.
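
For reference, requesting that high precision mode in a script looks roughly like the sketch below; treat the task name as an assumption and verify it against 'help PsychImaging' for your PTB version:

```matlab
% Sketch: request a high bit depth framebuffer on Linux + AMD.
% Task name assumed from memory; verify with 'help PsychImaging'.
PsychDefaultSetup(2);   % unified key names + normalized 0-1 color range
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative16BitFramebuffer');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0.5);
```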

So Vega GPUs should be safe to use, and apart from higher performance than Polaris (these being the high-end cards), they have an improved FreeSync hardware implementation, which is useful for PTB’s VRR support for fine-grained stimulus onset timing (help VRRSupport) on Linux + AMD, with even better precision/robustness. But that doesn’t matter for a Display++, which is a fixed refresh rate display. You’d need a FreeSync2-compliant monitor for that, or at least a DisplayPort adaptive-sync capable monitor.
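
To illustrate what VRR buys you: with it active, the target onset time passed to Screen('Flip') can be honored at fine granularity, instead of being quantized to the fixed refresh cycle. A minimal sketch; the opt-in task name is from memory, so check 'help VRRSupport':

```matlab
% Sketch: fine-grained stimulus onset scheduling under VRR.
% Opt-in task name assumed from memory; see 'help VRRSupport'.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'UseFineGrainedTiming');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0);
tBase = Screen('Flip', win);
% Request onset 12.3 ms after tBase: on a fixed 120 Hz display this
% would round up to a multiple of ~8.33 ms, under VRR it should not.
Screen('FillRect', win, 255);
tOnset = Screen('Flip', win, tBase + 0.0123);
sca;
```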

As far as I know, there aren’t any Vega GPUs for sale with less than 80W of power consumption though, as these are rather cards for the performance-hungry. That’s why I currently don’t have a suitable PC for testing one; ~75W is the limit of my 500 Euro PC.

Navi / RDNA is untested and doesn’t have PTB’s bag of tricks supported. While I assume it would work just fine for the most part, and haven’t heard otherwise so far, the general rule of thumb is to always stay one GPU generation behind the most recent one if you want a trouble-free experience, with any graphics card vendor or operating system. These GPUs are very complex hardware (i.e. initial models of a new generation will have hardware bugs) which needs very complex device driver software (i.e. initial driver releases will have software bugs, plus lack the software workarounds for the early hardware design bugs). The stuff needs time to mature, so unless one wants to be a voluntary beta tester…

That said: beta testers are always welcome to give me feedback, just don’t complain if you run into any trouble; someone with enough patience has to be the crash test dummy :wink:

-mario

Thank you Mario!
Best
Steve