Ubuntu, dual screen, and touchscreen

Ubuntu 16.04 LTS

Monitor 1: 24" Dell 1920x1080 

Monitor 2: 15" eGalax projected capacitive touchscreen on a custom 1024x768 LCD

PTB-3


Using XOrgConfCreator & XOrgConfSelector, I was able to make the Dell monitor show all the application windows and the eGalax one a separate X-Screen that PTB can use.  But the touch input is messed up when I look at it with GetMouse().  Instead of the 4096x4096 resolution I had observed when using only the eGalax, Y still runs from 0 to 4095, but X runs from 0 to 4095 and then again from 0 to 3000-something; X gets reset at roughly 3/4 of the width.  Does anyone have an idea what's happening or what can be done?  Any help would be very much appreciated.
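For reference, this is roughly the kind of minimal polling I mean for reading the raw touch axes (a sketch only; GetMouseIndices lists the pointer device indices, and picking the eGalax entry is left to inspecting productNames):

[mouseIndices, productNames] = GetMouseIndices('slavePointer');
dev = mouseIndices(1); % substitute the index whose productNames entry is the eGalax touchscreen
while ~KbCheck
    [x, y, buttons, focus, valuators] = GetMouse([], dev);
    disp(valuators(1:2)); % raw touch panel axes, expected to run 0-4095 on both
    WaitSecs('YieldSecs', 0.05);
end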

Hi Mario,

Not knowing anything about this, but I see your fixed code is:
numValuators = max(dev.axes, 4);
The comment says at least 4. Shouldn't that be min instead of max?

Cheers,
Dee

On Sat, Aug 4, 2018 at 4:54 AM, mario.kleiner@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:


Ok, the bug was as trivial as it was hard to find. A real treat in terms of time wasted searching for complex low-level explanations for some dumb coding error :(

Get a fixed TouchQueueCreate.m function from this link:


Explanation here:


Oh, and for future reference for you or your common-sense-challenged colleague who started the whole eGalax touchscreen thread: If I ask somebody from your lab to help me help you, don't ignore my requests for 10 days and then start a new thread under a new subject line, pretending the previous conversation never happened. What do you think that would achieve, apart from wasting everybody's time and pissing me off enough to make sure you won't get any future help?

-mario




<dcnieho@...> wrote:

Hi Mario,

Not knowing anything about this, but I see your fixed code is:
numValuators = max(dev.axes, 4);
The comment says at least 4. Shouldn't that be min instead of max?

-> Hi Dee. Lack of coffee or lack of cooling liquid? Both are highly recommended at current temperatures. I find a mix of coffee + vanilla ice-cream + egg-nog + some wine or beer very effective. Especially a mix of water, beer and Holunderblütensirup (elderflower syrup) is very refreshing and hits the sweet spot between coolness, cost and required residual brain activity ;-) -- But yeah, I have to do the math with real numbers each time I'm confronted with the min/max problem.
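(For the record: max(dev.axes, 4) evaluates to, e.g., max(2, 4) = 4 or max(6, 4) = 6, so the result never drops below 4, i.e. "at least 4", whereas min() would cap it at 4.)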

cheers,
-mario






Haha, living in tropical southern Sweden, I luckily don't have to go to such extremes. Nonetheless, thanks for the tip!

Cheers,
Dee




zhivagoa@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:

Hey Mario,


The touch queue functions work now! Thank you very much for fixing the issue.


Good.

I still have a mapping issue though. It is as if the left and right monitors are both mapped onto the X-screen. Touching the left part of the touch screen activates windows on the left, non-X-screen monitor. Even with MultiTouchMinimalDemo.m, I can see the blobs when I touch, but they are displaced to the left.

I tried printing the various X & Y coordinates from the event structure that TouchEventGet returns, but none of them seem to work. The left monitor is non-touch and the right monitor is touch.

As I swipe from left to right, X increases from 0 to maximum X resolution of the non-touch monitor and then resets to 0 and then again increases to maximum X resolution of the touch monitor.

Similarly, when I swipe from top to bottom on the left part of the screen, Y increases from 0 to the maximum Y resolution of the non-touch monitor. And when I swipe from top to bottom on the right part of the screen, Y increases from 0 to the maximum Y resolution of the touch monitor until around 3/4 of the way down, and then just stays there until I hit the bottom.

So, it's exactly like both the monitors are juxtaposed with their tops aligned.


Get an updated TouchEventGet.m from here:


It computes the event.mappedX and event.mappedY coordinates in a more robust way for multi-X-Screen setups, which I assume you'd want to use if the touch monitor is for stimulation and the other monitor is just for the operator to see the GUI.

On a multi-display setup with multiple monitors connected to one X-Screen, e.g., X-Screen 0, you can do this in a terminal (or via the system() command in Octave/Matlab) to change the mapping of touch surface coordinates to screen space:

xinput map-to-output 'eGalax Inc. eGalaxTouch EXC3160-5332-07.00.00' HDMI-3

That would map the coordinates properly if the touchscreen is connected to video output HDMI-3 (xrandr, or in Matlab, e.g., ResolutionTest(), lists the output names).
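For example, from within Octave/Matlab the same thing can be done via system(); 'HDMI-3' is only a placeholder for whatever output name xrandr reports for the touchscreen:

% list connected video outputs first, e.g., in a terminal:  xrandr | grep ' connected'
system('xinput map-to-output ''eGalax Inc. eGalaxTouch EXC3160-5332-07.00.00'' HDMI-3');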

This should do the trick, as tested on a dual-display setup with a touchscreen simulated via your trace files.
-mario

Please let me know if you need any logs from me.

Thanks,
Zhivago...


[Attachment(s) from zhivagoa@... [PSYCHTOOLBOX] included below]

Hey Mario,

Thank you very much for the fix.

xinput map-to-output 'eGalax Inc. eGalaxTouch EXC3160-5332-07.00.00' HDMI-3

I'm supposed to substitute HDMI-3 with the touch monitor name as obtained from xrandr or ResolutionTest(), correct? I'm not seeing the touch monitor name with xrandr, and 'eDP1' is what I get from ResolutionTest(). If I use eDP1, I get an error as shown below:

visionlab@visionlab:~$ xinput map-to-output 'eGalax Inc. eGalaxTouch EXC3160-5332-07.00.00' eDP1
Unable to find output 'eDP1'. Output may not be connected.


Try this to access outputs on X-Screens other than 0, e.g., X-Screen 1:

DISPLAY=:0.1 xinput map-to-output 'eGalax Inc. eGalaxTouch EXC3160-5332-07.00.00' eDP1

However, with only one touch monitor on X-Screen 1, I don't think you need that command. Are you sure you are really using the new TouchEventGet.m function ("which TouchEventGet")?
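To double-check what the second X-Screen actually exposes, something like this in a terminal should list the video outputs on X-Screen 1 and all input devices the server knows about:

DISPLAY=:0.1 xrandr
xinput list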

I wonder if having that custom driver from eGalax installed makes things worse or creates some confusion? Your colleague said the touchscreen also worked without that custom driver, so maybe less is more here.

Tried running my test program with the latest TouchEventGet(), but I still see erroneous mapping and

If you look at the .normX and .normY fields, their coordinates should vary within a 0.0 - 1.0 interval for touches across the screen, and those get mapped into window-local coordinates in the .mappedX / .mappedY fields.
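A minimal sketch of that kind of check, assuming 'dev' is the touch device index and 'win' the onscreen window handle from the demo's TouchQueueCreate() / PsychImaging() calls:

TouchQueueStart(dev);
while ~KbCheck
    evt = TouchEventGet(dev, win);
    if ~isempty(evt)
        disp(evt); % prints all event fields, including the norm* and mapped* coordinates
    end
end
TouchQueueStop(dev);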

Bedtime here,
-mario


also see X resetting in between.

Please find the shell & MATLAB command outputs attached.

Cheers,
Zhivago...


Mario Kleiner mario.kleiner.de@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:

Hmm. I read the docs that come with the eGalax custom driver your colleague has installed, and I think you should leave it in place, as your specific model of touchscreen is not covered by the Linux default driver -- again some outdated touchscreen design. It would probably only show up as a mouse.

Also, are you using our touch demos unmodified? Because it doesn't make sense to display on screen 0, given that that is not the touchscreen. You need to change the PsychImaging('OpenWindow', 0, ...) to PsychImaging('OpenWindow', 1, ...) to display on the touch screen, and also so that PTB understands that that is the monitor for which the coordinate mappings should be done. The hard-coded screen 0 is only there because that was the only way I could test with the one touchscreen laptop I had for testing.
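For example (a sketch; on your setup screen 1 should be the touchscreen's X-Screen, and max(Screen('Screens')) would pick the highest-numbered screen automatically):

screenid = 1; % or: screenid = max(Screen('Screens'));
win = PsychImaging('OpenWindow', screenid, 0); % instead of the demo's hard-coded screen 0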

Wrt. touches also affecting the other (e.g., non-touch) monitor's UI: Normally touches get transferred to all monitors, something that we'd want to suppress for such a touch application. I added some code to do that -- grab the touch input device for exclusive use by PTB.

Modify the call to the TouchQueueCreate() function in the demo to provide the optional 'flags' parameter as 8, instead of leaving it out.
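If I read the argument order right (please double-check with "help TouchQueueCreate"), that change would look roughly like this, with 8 passed as the sixth, 'flags', argument:

TouchQueueCreate(win, dev, [], [], [], 8); % flags = 8 -> grab the touch device exclusively for PTB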
Then you need a new PsychHID.mexa64 from here:

https://github.com/kleinerm/Psychtoolbox-3/raw/master/Psychtoolbox/PsychBasic/PsychHID.mexa64

This should make sure that touch input only goes to PTB for processing once the demo is running, i.e., from the TouchQueueStart() call until TouchQueueStop() gets called.

-mario

Hey Mario,

Now the touches on the touch monitor aren't going to the non-touch one, which is a big relief!  Thank you very much for the fix.  And yes, I had changed the screen number to make the demo run in my setup.  Also, I'm leaving the driver installed.

The mapping issue still remains though, despite the changes you suggested to the demo code.  I even tried it with my test code (attached).

There's one interesting behavior I observe with multi-touch in the demo though.  Except for one touch, all the other touches get mapped correctly.  I tried it with 2-5 fingers simultaneously.

I'm attaching the events that I recorded during the following:

1) swiped my finger from the left to the right slowly along the middle of the screen
2) swiped my finger from top to bottom slowly along the left side of the screen
3) swiped my finger from top to bottom slowly along the right side of the screen

Hope this gives some clue.  I wish there were a way you could log in remotely to take a look at what's happening.

The current command outputs are also attached.

Cheers,
Zhivago...

I just wanted to add this to my reply:

If I keep one finger touching anywhere on the screen, and then touch another finger at various places, the Mapped & Norm coordinates are correct!  I observe this with your demo too.  Hope this makes some kind of sense to you.  It's like that one touch anchors the mapping to the touch screen somehow.

BTW, with just one touchscreen attached, everything was working fine after the TouchEventGet.m fix you gave.

So PTB's mapping seems to work just fine now. What happens if you do this:

xinput --disable 'eGalaxTouch Virtual Device for Single'

The first touch point is also used for mouse-pointer emulation to drive applications and GUI components that are not touch-enabled. And mouse pointers also tend to do things like select windows on first click and so on, so maybe this is some funny interaction with mouse pointer emulation. Your proprietary/custom eGalax driver exposes the physical touchscreen as 2 virtual touchscreens. PTB only uses and controls the Multitouch variant, but leaves the Singletouch variant alone, and that one still attaches to the regular GUI handling, so I wouldn't be surprised if it interfered in weird ways. The above call should disable it for the running session. One could also change the X config file to permanently disable it.
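One way to script that around an experiment (a sketch; the device name is the one reported by xinput list on your setup, and re-enabling it afterwards keeps the desktop usable):

system('xinput --disable ''eGalaxTouch Virtual Device for Single''');
% ... run the touch experiment ...
system('xinput --enable ''eGalaxTouch Virtual Device for Single''');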

In general, that proprietary eGalax driver comes with a config file (see the documentation in the archive with all the driver components) that allows you to customize the behaviour of the virtual touch devices in all kinds of ways, like changing the mapping of touch areas to screen areas, how it behaves wrt. single-touch vs. multi-touch, "mouse button" click behaviour and so on. So many things to customize, and so many ways for this to go wrong if your setup isn't what the driver vendor expected.

The only way I can test anything related to your setup is if you evemu-record more low-level traces that I can replay, but that will miss potential weird interactions with the eGalax proprietary driver and the finer details of your display setup, as my displays have different resolutions, I probably use a different desktop GUI (typically KDE, not Ubuntu's standard Unity-7 on 16.04-LTS), and probably a different X-Server version: I use the 1.19.6 server from Ubuntu's hwe hardware enablement stack of Ubuntu 16.04.5-LTS, whereas you might be using an older 1.18.x server by default.

I assume evemu-record allows recording from the different virtual devices ("multi" vs. "single"), which may behave differently as well, so in general traces with well-defined movements like your vertical left swipe, vertical right swipe, horizontal swipe, etc., help.
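For reference, recording one trace per gesture would look roughly like this (the event node /dev/input/event5 is only a placeholder; running evemu-record without arguments lists the available devices to choose from):

sudo evemu-record /dev/input/event5 > swipe_left_to_right.trace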

-mario

