Yep, I fully agree with Dee’s and Ian’s advice (apart from Ian’s opinion about macOS’s qualities, consistency and pleasantness wrt. non-data-collection scenarios, but that is purely a matter of different personal taste).
You won’t find anything better than PTB for such tasks. But macOS in general is currently not great for data collection, and macOS on any Apple Silicon Mac is an absolute no-go if precision and trustworthiness are required, especially in the visual domain, although there seem to be various shortcomings in other areas as well.
This is due to severe design limitations of Apple’s macOS OpenGL implementation for the M1/M2/M…, and bugs and limitations in Apple’s Metal and/or CoreAnimation graphics and display APIs.
So far I have spent multiple hundred work hours, only a fraction of them sponsored by Mathworks, trying to find ways to make it work, and have discovered that all other toolkits I know of handle this incompetently. They are broken wrt. timing not only on Apple M1, but also (as opposed to PTB) on almost all older Intel Macs. I would literally not trust any results collected with other software if visual timing is of importance.
The research wasn’t fruitless though. Based on that previous work, I have some ideas and hope for how one might improve the situation on macOS + M1/M2/…. However, this will not only require me to rent an Apple Silicon Mac, but also to spend more work time on this, and this work has to be paid. So far, less than 100 of ~275 hours have been paid, and due to the disappointing lack of financial support from the vast majority of our users, we can no longer afford to give much work away for free.
Another avenue I want to explore, either as the better solution or as a backup plan, is whether we could make Asahi Linux work well enough to be suitable for data collection with PTB on Apple Silicon Macs, so users could dual-boot Linux/macOS for work vs. play. Asahi Linux is an effort by partly volunteer, partly funded developers to get Linux running well on Apple Mx Macs. They have made great progress in the last 2 years, but one of the current main limitations is wrt. precise and trustworthy visual stimulation timing and control. Their focus right now is on getting the basic use cases working well, not on special snowflakes like vision science. Also, the Apple M1/M2/… display engines seem to have some serious limitations which make proper timing difficult. This is one area where I could chime in to see if I could come up with some workable solution.
My gut feeling is that a well-working Asahi Linux implementation is more likely than a well-working macOS implementation.
All this requires substantial funding. I will probably try to get my hands on such a Mac for some preliminary tests of the feasibility of such work, once current projects are completed. But even if I find the time to do so, and conclude it might be feasible and worth a try, our dear users will have to fund it, e.g., with another crowd-funding attempt. I can guarantee that certainly nothing will happen without users paying the bill.
-mario