Multiple touchscreens on one machine

Hi all,

I have three NEC V552-TM displays with native multitouch, and a patch that runs in three separate instances, one per screen. Now I need to send the touch data from each screen to its corresponding v4 instance.

WindowsTouch and Touch (Devices Window) won't work across multiple instances, so I made an extra patch just for touch input, which sends the values over UDP to the different instances.
Thing is, the Touch (Devices Window) node depends on the main screen's renderer size. Touch input on the other displays is recognised with different device IDs, but only within the area covered by the renderer on the main display.

I could start my three instances, with the touch-input patch in the one on the main screen, and then send the values to the others. But it only recognises input from one monitor at a time.
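For the "send the values over UDP to the different instances" part, here is a minimal sketch of the sending side, routing each touch point to the instance that owns its screen. The port numbers, device-ID mapping, and JSON message format are all assumptions for illustration, not anything vvvv prescribes; adjust them to whatever your receiving patches expect.

```python
import json
import socket

# Assumed layout: one instance listening per screen.
# Device IDs and ports are placeholders -- adapt to your setup.
PORTS_BY_DEVICE = {0: 9000, 1: 9001, 2: 9002}

def route(device_id):
    """Pick the UDP port of the instance that owns this touch device."""
    return PORTS_BY_DEVICE[device_id]

def encode(device_id, touch_id, x, y):
    """Serialise one touch point as a small JSON datagram."""
    return json.dumps({"dev": device_id, "id": touch_id,
                       "x": x, "y": y}).encode("utf-8")

def send_touch(sock, device_id, touch_id, x, y, host="127.0.0.1"):
    """Forward one touch point to the matching instance."""
    sock.sendto(encode(device_id, touch_id, x, y),
                (host, route(device_id)))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example: forward touch 0 on screen 1 at normalised coords (0.5, 0.5).
    send_touch(sock, 1, 0, 0.5, 0.5)
```

One datagram per touch point keeps the receiving patch simple: no framing, and a dropped packet only loses one sample.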

there has to be a better way to do this…
any ideas?

thanks :)

Just make your “touch instance” also display the content. This instance has the touch input – which it sends to the other instances – and the final renderer spanning all three screens. The other instances do all the calculations and share their output as texture.

I don't think multiple instances do you any good; from my tests, single-instance performance is about the same or even better.

Hm… I just get the touch input from the monitor that gets touched first.
All touch commands from the other monitors are ignored. Is this normal behaviour?

I do get the monitor ID corresponding to the touch input, but not being able to read touch commands on different monitors at the same time makes this setup quite useless…

That's a problem with Windows. We never got around it until we added rPi proxies for the touch data.

OK, thanks for the info, guys :)

A couple of mini i3 machines will do the trick.
Cheaper anyway.
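The proxy idea boils down to: each small machine sits next to one display, reads that display's touch input locally, and forwards it to the render machine over the network. A minimal sketch of the receiving side on the render machine, assuming each proxy sends one JSON object per touch point (the field names and port are illustrative assumptions, not a fixed protocol):

```python
import json
import socket

def decode(datagram):
    """Turn one JSON touch datagram back into (touch_id, x, y)."""
    msg = json.loads(datagram.decode("utf-8"))
    return msg["id"], msg["x"], msg["y"]

def listen(port):
    """Yield decoded touch points as they arrive from the proxies."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        yield decode(data)

if __name__ == "__main__":
    # Example loop: print everything arriving on an assumed port 9000.
    for touch_id, x, y in listen(9000):
        print(touch_id, x, y)
```

Because each proxy owns exactly one touch device, the Windows limitation of only seeing one touch monitor at a time never comes up; the render machine just merges the streams.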
