Audio device issues: jack server in Ubuntu

Hello everyone,

I have an odd problem with sound output in PTB, which I think is related to intermittent issues with the Eyelink toolbox that cause Matlab to crash.

First, some details of my system:

  • Ubuntu 20.04.2 LTS
  • Psychtoolbox 3.0.17 - Flavor: Debian package - psychtoolbox-3 (3.0.17.6.dfsg1-1~nd20.04+1)
  • Matlab R2021a
  • AMD video card: Ellesmere (Radeon RX 580); driver = amdgpu
  • ViewPixx 3D screen
  • Eyelink 1000+ tracker

I first noticed that whenever I call PsychPortAudio('Open', …) or Snd('Open') in Matlab, the following error messages are printed in the terminal from which I launched Matlab:

> Cannot connect to server socket err = No such file or directory
> Cannot connect to server request channel
> jack server is not running or cannot be started
> JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
> JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock

The PTB output into the Matlab command window was this:

> PTB-INFO: Using modified PortAudio V19.6.0-devel, revision 396fe4b6699ae929d3a685b3ef8a7e97396139a4
> PTB-INFO: Choosing deviceIndex 0 [HDA Intel PCH: CX20632 Analog (hw:0,0)] as default output audio device.
> PTB-INFO: New audio device -1 with handle 0 opened as PortAudio stream:
> PTB-INFO: For 2 channels Playback: Audio subsystem is ALSA, Audio device name is HDA Intel PCH: CX20632 Analog (hw:0,0)
> PTB-INFO: Real samplerate 44100.000000 Hz. Input latency 0.000000 msecs, Output latency 9.977324 msecs.

I did some research into what the “jack server” is and surmise that it is related to the sound card. I can start the jack server with a GUI called “qjackctl” or with this command in the terminal:
jackd -d alsa -r 44100

Once that is running, starting the sound system in PTB works better: no errors appear in the terminal that’s running Matlab, and the output in the Matlab command window looks different:

> PTB-INFO: Using modified PortAudio V19.6.0-devel, revision 396fe4b6699ae929d3a685b3ef8a7e97396139a4
> PTB-INFO: Choosing deviceIndex 7 [jack] as default output audio device.
> PTB-INFO: New audio device -1 with handle 0 opened as PortAudio stream:
> PTB-INFO: For 2 channels Playback: Audio subsystem is ALSA, Audio device name is jack
> PTB-INFO: Real samplerate 44100.000000 Hz. Input latency 0.000000 msecs, Output latency 23.219955 msecs

So my first question is:

  • Does anyone know why the audio output system throws errors unless I start the “jack” server outside of Matlab? Is there any way to make it start automatically?

Next, I think the sound issue was causing problems when my Matlab code communicated with the Eyelink computer to calibrate, because the calibration routine also plays sounds. Occasionally the sounds would not play, and occasionally the program would freeze and Matlab would crash. But I’m not 100% sure. Since we started running the jack server manually (as of yesterday), we haven’t had any issues.

However, I haven’t been able to use the PsychPortAudio system and the Eyelink functions simultaneously. I read the instructions about first starting PsychPortAudio and then passing the handle of a PsychPortAudio channel or slave to Snd('Open'), but that has never worked for me on Linux systems. The Eyelink calibration routine sometimes hangs or even crashes outright. So I’ve relied only on Snd() in all my experiments that also involve eye-tracking.

So my second question is: is there a better way to use PsychPortAudio and Eyelink on a Linux machine?

Many thanks,
Alex White

Your mail appears unfinished. What is your question/what are you attempting to fix?

Hm, I don’t need to manually start a Jack server to use PsychPortAudio on a very similar system (Ubuntu 20.04, Radeon Pro WX5100, PTB 3.0.17, MATLAB 2021a, Display++, Eyelink 1000), so there is a configuration difference somewhere.

For the Eyelink I use a customised EyelinkCallback that runs my own audio manager (I make it global, called aM, from my experiment manager), and it works fine.

My audio manager is just a wrapper class around PsychPortAudio, and I call its beep method from the eyelink callback.
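A minimal sketch of what such a wrapper might look like (the class name, default values and method arguments below are illustrative, not the actual Opticka code):

classdef audioManager < handle
    % Minimal PsychPortAudio wrapper: opens one output device and plays short tones.
    properties
        pahandle            % PsychPortAudio device handle
        sampleRate = 48000  % playback sample rate in Hz
    end
    methods
        function obj = audioManager(deviceIndex)
            if nargin < 1, deviceIndex = []; end    % [] = default output device
            InitializePsychSound(1);                % request low-latency audio setup
            % mode 1 = playback only, reqlatencyclass 1 = low latency if possible
            obj.pahandle = PsychPortAudio('Open', deviceIndex, 1, 1, obj.sampleRate, 2);
        end
        function beep(obj, freq, dur, volume)
            if nargin < 2, freq = 1000; end
            if nargin < 3, dur = 0.1; end
            if nargin < 4, volume = 0.5; end
            t = 0 : 1/obj.sampleRate : dur;
            tone = volume * sin(2 * pi * freq * t);
            PsychPortAudio('Stop', obj.pahandle);                      % stop any previous tone
            PsychPortAudio('FillBuffer', obj.pahandle, [tone; tone]);  % stereo buffer
            PsychPortAudio('Start', obj.pahandle, 1, 0, 0);            % fire and forget
        end
        function delete(obj)
            PsychPortAudio('Close', obj.pahandle);
        end
    end
end

The experiment manager then makes one instance global (e.g. global aM; aM = audioManager();) and the custom Eyelink callback calls aM.beep(...) instead of Beeper()/Snd().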

Sorry about that, it took me two tries but the message is all there now.

Thanks for the code! I’ll check it out. The deeper problem seems to be something about the audio system in my Ubuntu installation, or how Matlab/PTB interacts with it. Hoping someone here knows about that.

The Jack pro-audio server isn’t needed unless one wants to share one sound card between multiple clients with high timing precision and low latency, and/or use Jack’s various post-processing, advanced mixing and routing capabilities. Those “couldn’t connect” status messages in the terminal are just noise and can be ignored if you don’t need Jack.

By default, the low-level ALSA API is used, which provides the highest timing precision and lowest latency, but only allows exclusive use by one client per audio device at a time, e.g., either one PsychPortAudio('Open', …) instance or GStreamer-based multimedia playback per sound card. You can notice this when trying to play audio with other applications, e.g., inside your web browser, after PsychPortAudio('Open'): they will fall silent or freeze playback until PsychPortAudio is closed again.

The problem here is that, for simplicity, the Eyelink TB uses Beeper() by default for simple feedback tones, which itself uses the legacy Snd() function, which in turn tries to use PsychPortAudio('Open'), and that violates the “one client per sound card” rule. This is all in the name of backwards compatibility, because the Eyelink TB’s approach predates PsychPortAudio, and there has been a lack of time to design a better replacement inside the Eyelink TB.

So there are three ways around this if one wants to use Eyelink and PsychPortAudio in parallel:

  1. Disable the auditory feedback tones from the Eyelink toolbox (see the first sketch after this list).

  2. Do what Ian’s Opticka high-level toolbox does: define a customised EyelinkCallback that uses PsychPortAudio for auditory feedback and presumably shares the audio device properly with other toolbox-internal uses of PsychPortAudio. That’s certainly the most elegant and flexible approach, but also the one with the highest coding effort.

  3. Use the Snd('Open', pahandle) method to share a PsychPortAudio pahandle (first created by the experiment script) with Snd(), and thereby with Beeper() and the Eyelink TB (see the second sketch after this list).
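For option 1, a sketch of how the feedback tones could be switched off via the Eyelink defaults struct. The el.targetbeep and el.feedbackbeep fields are my assumption about the relevant settings; check them against your Eyelink TB version:

el = EyelinkInitDefaults(window);
el.targetbeep = 0;     % no tone when the calibration target moves (assumed field name)
el.feedbackbeep = 0;   % no tone after calibration/drift-correction feedback (assumed field name)
EyelinkUpdateDefaults(el);

For option 3, a sketch of the sharing pattern described in “help Snd”, using a master device plus a slave handed to Snd(). This is untested with the Eyelink calibration loop, so treat it as an outline rather than a verified recipe:

InitializePsychSound(1);
pamaster = PsychPortAudio('Open', [], 1+8, 1, 48000, 2);  % master device, playback only
PsychPortAudio('Start', pamaster, 0, 0, 1);               % master runs until closed
paslave = PsychPortAudio('OpenSlave', pamaster, 1);       % slave channel for legacy sound
Snd('Open', paslave);   % Snd(), and thereby Beeper() and the Eyelink TB, use this slave
% The experiment script can open further slaves on pamaster for its own sounds.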

Wrt. option 3, which you say you tried with little success: after a quick skim of the code, I think our Snd() function might need a little additional enhancement wrt. the Eyelink TB auditory feedback use case. If you are interested in that, see “help PsychPaidSupportAndServices” for how you can buy a community membership with priority support, or maybe two such licenses, to get an hour or two of my work time for improving that bit, and create a suitable authentication token. That would also pay for the over half an hour I already spent on your question. If you need this, be quick! Our sales & licence key management person will be on vacation starting in the middle of next week, and then I will be on vacation, so paid support will not be provided until the 2nd week of September if you miss the deadline.

Best,
-mario