1PC running 2 Exported Applications on each Screen with Stride Touchscreen Input

Hey there,
I've just tested running 2 exported applications fullscreen on 1 PC. It's working fine, but I'm not really sure how to use Stride's Touch node (Touch [Stride.Input]) so that each application only outputs the touch data for the touchscreen it is running on.

Right now I'm just using the latest input for single touch and expected each application to only receive touch data from its renderer's screen, but currently the screens block each other.
At the moment I can only blind-test via AnyDesk, so any hints on how to solve this would really help.
Would it help to combine the 2 apps into 1 app with two renderers?
Is this some Windows limitation?

Update: I've tested this on a Surface today with no luck, maybe some…

TouchTest.vl (32.5 KB)
To clarify: the idea would be to run, for example, this patch as an exported application twice on one machine, using 2 touchscreens in fullscreen mode without the touchscreens blocking each other.

I've played around with some things inside the SceneWindow without any luck.
(screenshot of the SceneWindow settings)

The only workaround I can think of is to use 1 renderer spanning both screens, which would be painful.

Any other solution would also be welcome, like a Raspberry Pi proxy or something like that.
I found this NuGet package, which might be able to output raw mouse/touch data without a window: NuGet Gallery | RawInput.Sharp 0.0.4

Hi, I don't think Windows actually supports two touchscreens… Recently I've been digging through the Control Panel and there is a thing called touch calibration; I'm guessing you might want to take a look: Control Panel > Touch > Calibrate touch screens.

Hey @antokhio, thx. The touchscreens are both correctly identified and calibrated in Windows 11. Both applications work with touch; it's just that while one application is being used, the touch input of the other application is blocked.

First I thought that both applications might output both touch streams at the same time and I'd just need to filter them. Yesterday's test with the patch above showed me they only output touches for the first active (touched) touchscreen/window.

Sounds very much like a restriction by the operating system. To rule out our software, what happens if you open two Microsoft Paint instances and try to draw on both of them?

I guess it is, but I remember doing this in beta. Yes, you are right, 2 Paint instances will block each other, but I have the feeling that if the VL app is wrapped as one application with 2 renderers, the application will always be “active” for Windows, no matter which screen you touch.

Some weird thing that works: if you wrap the applications in a new VL app with 2 renderers, you can get both touches at the same time. But they will all be output by the input source that was touched first. I figured I could then sort them via GUID or device name, but these values won't update/change per touch. So I can't sort them, but I am getting touch from both screens without blocking, I just can't sort…

Here's my failed filter attempt:
(screenshot of the filter patch)
I feel like if I could lock the input source to the renderer, or get the correct screen GUID, this would work.
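The sorting described above boils down to demultiplexing touch events by the id of the device that produced them. As a plain-language sketch (not VL, and the names `TouchEvent`, `route` and the device ids are purely illustrative; VL's actual touch notification type differs), the routing would look like this:

```python
# Hypothetical sketch: route touch events to renderers by source device id.
# TouchEvent, route() and the device id strings are illustrative only,
# not the VL/Stride API.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    device_id: str   # stable id of the touchscreen that produced the event
    x: float
    y: float

# Map each touchscreen's device id to the renderer/screen index it drives.
DEVICE_TO_SCREEN = {
    "touch-left": 0,
    "touch-right": 1,
}

def route(events):
    """Group incoming touch events per screen, so each renderer
    only sees the touches coming from its own touchscreen."""
    per_screen = {screen: [] for screen in DEVICE_TO_SCREEN.values()}
    for ev in events:
        screen = DEVICE_TO_SCREEN.get(ev.device_id)
        if screen is not None:
            per_screen[screen].append(ev)
    return per_screen
```

The catch, as described in the post, is that the device id/GUID reported per touch doesn't actually change between devices, so the lookup key is missing.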

Can you span both outputs with one renderer?

@catweasel yes I can, but then I still only get touch for the first touched section.

My current escape plan is an Intel NUC PC with an HDMI dummy plug: run a VL app there and send the touch data via OSC (which breaks my heart). I was not able to get this RawInput.Sharp thing going: https://www.nuget.org/packages/RawInput.Sharp
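For what it's worth, sending the touch data via OSC doesn't strictly need a library: an OSC 1.0 message is just a NUL-padded address string, a type-tag string, and big-endian arguments. A minimal stdlib sketch (the `/touch` address and the id/x/y argument layout are my assumption, not an agreed protocol between the apps):

```python
# Minimal OSC 1.0 message encoder using only the stdlib.
# The /touch address and (id, x, y) payload layout are assumptions.
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes, as OSC 1.0 requires.
    A string whose length is already a multiple of 4 still gets 4 NULs,
    because OSC strings must be NUL-terminated."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_touch_message(address: str, touch_id: int, x: float, y: float) -> bytes:
    """Build an OSC message like: /touch <id> <x> <y> (int32, float32, float32)."""
    msg = osc_pad(address.encode("ascii"))      # address pattern
    msg += osc_pad(b",iff")                     # type tags: int32, float32, float32
    msg += struct.pack(">iff", touch_id, x, y)  # big-endian arguments
    return msg

def send_touch(sock: socket.socket, host: str, port: int,
               touch_id: int, x: float, y: float) -> None:
    """Fire one touch update over UDP to the receiving app."""
    sock.sendto(osc_touch_message("/touch", touch_id, x, y), (host, port))
```

On the receiving side, VL's existing OSC nodes should be able to unpack this directly.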


Damn, yes, sorry, didn't think that through. Yes, a NUC is probably the only option, bar reading the HID yourself and making your own driver.
