I am using PTB on Windows 10, Matlab R2020b, and a Radeon™ Pro WX 5100 GPU.
The synchronization error shows up only when I plug in my EEG device via USB; no error shows up otherwise. The weird thing is that some months back I ran the exact same code with the EEG plugged in and everything worked smoothly.
Any help would be much appreciated.
Increase the verbosity when running the PTB test code (e.g. `Screen('Preference', 'Verbosity', 5);` at the top of your script) to see if anything useful appears in the log. Try plugging in through a USB hub rather than directly; that can improve performance. Or dual boot and use Linux, where these sorts of bizarre issues are far less common…
What @Ian-Max-Andolina said. Switching to Linux is the best course of action for generally avoiding, or in rare cases minimizing/handling, such issues.
It is probably some device-driver issue with the Windows USB subsystem. Some random things to try: this is just the first link that popped up in a Google search, I have no idea about its quality, I only skimmed it, and I have no opinion about the tips or the potential unwanted side effects they may cause. If you break it, you get to keep both parts:
There was/is a free-to-download tool called “dpclat.exe” somewhere on the internet that may provide more insight into the cause of high DPC latency on Windows. Whether there is anything actionable to learn from it is a different question.
If something works for you, please report back for the benefit of others.
It might work in a Windows virtual machine inside a Linux host though, if you need it, e.g. if the EEG acquisition is USB based and you can enable USB passthrough for the device. It’s not impossible to marry the two and do most of the work on Linux. I haven’t run a VM in a while, though.
Ideally, you shouldn’t be running data acquisition on the same machine as stimulus presentation. Although modern workstations appear plenty powerful, most modern operating systems can cause small hangs as they prioritise different system tasks and deal with contention for resources. You can drop data packets and/or stimulus frames (especially at 240 Hz, ~4 ms per frame). For our EEG recording we use two workstations, ensuring the EEG system (we’re also forced to use Windows) sticks to data collection only…
It depends on the task and system. At least on Linux there is, in principle, a lot of headroom with regard to realtime behaviour. Current PTB is just scratching the surface; way more could be done if I had enough time (~ money to buy all the needed time).
You won’t get hard realtime with a standard Ubuntu install or config though, so depending on the timing requirements, running multiple machines may be easier. And most of the time you’d run a Linux VM on top of Linux, not a Windows VM, so in the case of a Windows VM the goal would be more to not disturb the timing on Linux, while keeping the Windows timing whatever Windows timing can be…
And of course, with recommended hardware and open-source drivers, even current PTB on standard Linux has ways to improve timing robustness for things like 240 Hz stimulation.
Not surprising if both machines were running Windows! A colleague was recently losing lots of data packets with a large-channel-count extracellular recording system and wasted a ton of time trying to troubleshoot. Luckily the open-source software was multi-platform, and after switching to Linux (with the same electrophysiology software build) all his packet-loss problems disappeared.
If your lab has the money, you could also consider buying custom hardware like the DataPixx, which handles digital I/O timed with microsecond precision relative to the stimulus presentation. Such specialised hardware ensures your markers are precise and reliable. However, my testing shows that even something like an Arduino can have pretty reliable millisecond-level command-response times. You should also be using a photodiode, so this combination can pretty much ensure timing issues can be quickly resolved (I record the photodiode signal as analog directly into the electrophysiology system), just in case, e.g., the latency of the display changes due to a firmware upgrade or some other such unexpected change…
That should give the Linux hard-realtime integration into the standard kernel an extra boost, given that funding should no longer be the main obstacle after Intel’s investment.
The kind of use cases I think should eventually become possible are like this one, mentioned in a discussion with the RT Linux folks:
"Setups which need to present pictures/animations on a monitor with very reliable and precise timing and timestamps, but also do some simultaneous realtime data collection or stimulation via ADC/DAC boards or digital i/o boards, e.g., realtime virtual dynamic patch-clamp techniques in electrophysiology (cf. this link for an idea: Real-Time linux dynamic clamp: a fast and flexible way to construct virtual ion channels in living cells). And a whole host of other applications, once the RT patch-set is upstreamed…"
The virtual patch-clamp use case is nice because it was already done and doable on RT-Linux over 20 years ago, as documented in that linked article. Of course, the setup at that time took more effort, and still does, compared to when the RT work is fully merged into the standard Linux kernel.
One remaining problem is interoperation with the high-precision visual stimulus onset timestamping, which has its challenges, as this discussion thread between myself and one of the RT guys shows:
Unfortunately my explanations and encouragement for resolving the issue relatively easily didn’t end in the solution I hoped for:
Instead they decided to postpone it until some later time. There go 1.5 work days for nothing… Well, at least the documentation of how to resolve it is out there for people with the skills and time…
This is another example where I could easily do the work myself to push this along, but the lack of stable funding from our users often means I cannot prioritize what is most effective for Psychtoolbox users and the general neuroscience / open-science community, but rather what brings in urgently needed money to survive. In other words, the prep work I did over 10 years ago for the marriage of high-precision-timing graphics with hard realtime remains fruitless for the time being.
Great, thanks for your messages!
I finally decided to transition to Linux since it will make life easier I suspect.
I did a dual boot with a clean installation of Ubuntu 20.04.3 LTS.
I installed PTB through NeuroDebian, but the installation didn’t ask me for the location of my Matlab installation, and it didn’t ask any questions as described on the Psychtoolbox website. Now, when I try to run a simple PTB demo, the following errors show up:
PTB-ERROR: [glXcreateContext() failed] OpenGL context creation failed!
PTB-ERROR[Low-Level setup of window failed]:The specified gpu + display + gfx-driver combo may not support some requested feature.
Also, the commands >> UpdatePsychtoolbox and >> help Psychtoolbox don’t work; not sure if everything is related.
Maybe I should create a new post for this? Thanks!
I can’t see a reason why that would not have worked if you installed “matlab-psychtoolbox-3” and then, inside Matlab, manually ran PsychLinuxConfiguration once, and afterwards logged out and back in for the full setup.
Anyway, for that case you can follow the steps on our Download page, as if you had installed Matlab after installing PTB. That should fix it. Other than that, there are forum posts that have troubleshot this kind of scenario before…
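For reference, the install-then-configure steps described above can be sketched roughly as follows. This is a sketch, not an official recipe: it assumes the NeuroDebian repository is already enabled on your Ubuntu system, and uses the `matlab-psychtoolbox-3` package name mentioned earlier in this thread.

```shell
# Install the Matlab flavor of Psychtoolbox from NeuroDebian
# (assumes the NeuroDebian apt repository is already configured):
sudo apt update
sudo apt install matlab-psychtoolbox-3

# Then, inside Matlab, run the one-time Linux setup:
#   >> PsychLinuxConfiguration
# and log out and back in so that group/permission changes take effect.
```

If the configuration step was skipped during the package install, running PsychLinuxConfiguration manually as above is the fallback.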
You have an AMD GPU that belongs to the best-tested Polaris GPU family, so your hw+sw setup should be quite optimal for PTB on Linux, with no known problems.
Btw., depending on the complexity of your script, if it doesn’t use complex GUI features, object-oriented programming features, or special toolboxes, it might run perfectly fine with the free Octave; no need for Matlab if you don’t want it. On the other hand, at 240 Hz and a 4 ms duty cycle, badly written code inside the innermost loop may execute a tad faster under Matlab than Octave, so this could be one case where Matlab would give a small performance edge.
UpdatePsychtoolbox is only for a PTB installed via DownloadPsychtoolbox, not the one from NeuroDebian, which updates via the regular OS facilities. help Psychtoolbox should work. You can launch Matlab via ptb3-matlab each time, with PTB temporarily on the path until Matlab quits, or do it once and then run savepath to make it permanent for all future regularly launched Matlab sessions.
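To illustrate the workflow just described (the `ptb3-matlab` wrapper is the one shipped by NeuroDebian, as mentioned above):

```shell
# Launch Matlab with the NeuroDebian Psychtoolbox temporarily on the path;
# the path addition lasts only until this Matlab session quits:
ptb3-matlab

# Inside that Matlab session, optionally make the path permanent for all
# future regularly launched Matlab sessions (one-time):
#   >> savepath
```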
NeuroDebian, while simpler to install from, can sometimes be a release or so behind PTB’s current version. You may not care, but personally I prefer to keep up to date and choose my own update schedule, which DownloadPsychtoolbox supports better. Indeed, if you are savvy with Git, you can clone the GitHub repo directly and use git pull to stay up to date (which is what I personally do).
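For anyone wanting to try the Git route just mentioned, a minimal sketch (the clone location `~/Psychtoolbox-3` is just an example; the repository URL is the upstream GitHub repo):

```shell
# Clone the upstream Psychtoolbox repository:
git clone https://github.com/Psychtoolbox-3/Psychtoolbox-3.git ~/Psychtoolbox-3

# Later, pull in the latest changes:
cd ~/Psychtoolbox-3 && git pull
```

After cloning, you would still add the `Psychtoolbox` subfolder to the Matlab/Octave path and run the in-place setup routine (SetupPsychtoolbox) once, as described in the PTB download documentation.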
Cool, makes sense. In the past, installing with DownloadPsychtoolbox also seemed to require installing additional libraries for PTB to work; is this still the case? If so, are there instructions anywhere on which additional libraries to install?
Yes. But if you already have PTB from NeuroDebian installed, then you also got the dependencies. That’s how I do it: I just install the NeuroDebian PTB and then do the DownloadPsychtoolbox installation on top of it. On a virgin system, a sudo apt build-dep psychtoolbox-3 may also work for pulling in all dependencies. The installer gives an overview of everything that’s needed in case of failure.
For the vast majority of “bread and butter” stuff, NeuroDebian is very convenient. For more exotic stuff, or for using the very latest functionality, it can occasionally be a tad behind, depending on the workload of the volunteers behind it, and on whether a new release from us needs more than just running a few scripts, e.g., adding new dependencies. Sometimes something more exotic is missing because of incompatibilities between the need for proprietary libraries and Debian’s “fully free and open-source software only” policies. E.g., our old Oculus VR driver for optimal use of old Oculus Rift DK1/DK2 virtual reality headsets is not in the ND distribution, because it depends on a Facebook/Oculus runtime library that is “open-source”, but with legal strings attached that are unacceptable for Debian/Ubuntu/… Or you don’t get the cute frog on our startup screen, because while the frog image is free to use in the context we use it, its licensing doesn’t allow the freedom to do absolutely anything with the image with no restrictions on redistribution and use, which is a requirement of Debian/Ubuntu…
A plus is that if you want to stay on an older Linux distribution that is no longer supported by us, the Octave Psychtoolbox may still be supported by ND, because they provide the custom-built mex files for those versions, whereas I only target the officially supported distributions, e.g., Ubuntu 20.04-LTS at the moment, with stuff continuing to work on 20.10, 21.04, and 21.10 atm. This may or may not change in two months when Ubuntu 22.04 comes out. Whether I can support more than one LTS release version depends on resources ← depends on funding ← depends on our dear users’ financial support ← which is extremely poor and disappointing so far.
Perfect, thanks for all this info! I could easily install PTB in my fresh Linux.
After a few tests, “PerceptualVBLSyncTest” and “BeampositionTest” seem to output what is expected if I set my screen refresh rate to 60 Hz. However, when I set it to my desired 240 Hz, no flickering is generated with “PerceptualVBLSyncTest”, and the period of the sawtooth in “BeampositionTest” is not always constant. Is this normal for high-refresh-rate displays?
If my memory doesn’t fail me, I recall there were some functions to run in order to get the Linux machine ready for optimal sync performance. I couldn’t find them in the documentation; could you remind me what they are?
Thanks so much!
In particular, make sure gamemode is working for you (check that it is installed using apt, and that PTB could install its modified config files).
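A quick way to run the check just suggested might look like this. This is a sketch: it assumes the Ubuntu `gamemode` package, whose daemon binary is `gamemoded` and which supports a status query flag.

```shell
# Check whether the gamemode package is installed:
apt list --installed 2>/dev/null | grep gamemode

# Query the gamemode daemon's status (prints whether gamemode is
# currently active; -s is the status-request flag of gamemoded):
gamemoded -s
```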
I also always install the low-latency kernel (sudo apt update; sudo apt install linux-lowlatency), make sure it is selected at boot time (check that it is running using uname -a), and make sure I am using the latest kernel that Ubuntu supports (search for HWE for Ubuntu). There are custom kernels that claim to be optimised for performance, like https://liquorix.net, but I don’t have experience with them.
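Spelled out, the low-latency kernel steps above are roughly:

```shell
# Install Ubuntu's low-latency kernel flavour:
sudo apt update
sudo apt install linux-lowlatency

# Reboot, selecting the lowlatency kernel in GRUB if it is not the
# default, then verify which kernel is actually running:
uname -a   # the kernel name should contain "lowlatency"
```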
Make sure things like SSH or VNC sharing are disabled. I also turn off the printer-search stuff, as it seems to constantly scan the network for new printers, which is unnecessary.
Sometimes using a more recent version of Mesa (the open-source graphics drivers) can improve performance a bit, but only tinker with this if necessary. I’ve used the kisak-mesa “fresh” PPA, but make sure you remove it before you ever upgrade Ubuntu. PPAs can be installed and then uninstalled easily.
Not performance tuning per se, but once your system is running well I’d suggest using clonezilla to clone the disk. This will allow you to quickly get back to this “optimal” config if you ever have problems down the line…
There is a nice GUI for apt called Synaptic. It allows you to browse around, search, and look at details for packages without knowing all the command-line invocations.
I use sudo systemctl stop cups-browsed; sudo systemctl disable cups-browsed, but you can also edit /etc/cups/cups-browsed.conf and set BrowseProtocols none.