Unreal Engine and VL + OpenCV

Hello people,
I have a tech question for you all.
I would like to “stream” the output of a UE4 game to VL, run some shape / body pose / computer vision tasks on that output, and then send the results back to UE through OSC.
The question is: is it feasible?
I’m looking at this UE plugin, which lets me stream the output of the game over the network: https://docs.unrealengine.com/en-US/Platforms/PixelStreaming/PixelStreamingIntro/index.html
Do you think it would be possible to access the network stream through VL and analyse the output?
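The OSC return channel is the easy part, whichever streaming route works out. As a minimal sketch (the `/pose/confidence` address and the port are made-up examples, and UE would need its OSC plugin listening), an OSC message can even be built by hand with the standard library, since the wire format is just padded strings plus big-endian float32s:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, as the OSC 1.0 spec requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))    # address pattern
    msg += osc_pad(b"," + b"f" * len(args))   # type tag string, e.g. ",f"
    for value in args:
        msg += struct.pack(">f", value)       # big-endian float32 argument
    return msg

# Hypothetical address and port for illustration; OSC usually travels over UDP.
packet = osc_message("/pose/confidence", 0.87)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))
```

In practice a ready-made OSC node or library does this for you; the sketch is only meant to show there is no heavy machinery on that leg of the pipeline.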

I’m a long-time user of vvvv, but I’ve been “away” from it since you guys released gamma, and I’m now trying to catch up with the new stuff.

hello,

For now, gamma does not have a Spout implementation. You could use beta and feed the output of a Spout receiver into a VL plugin, but I guess that’d be pretty heavy.

Now, about this PixelStreaming thingy: I have not looked at it, but they claim it streams “rendered frames and audio to browsers and mobile devices over WebRTC”, so maybe this C# WebRTC lib could help?



Make sure you use the DX11 graphics backend in Unreal so you don’t run into unexpected trouble.

maybe @microdee has more experience with it.


You can try streaming with NDI, which has a VL implementation.


Thanks a lot for the rapid answers, I completely forgot about Spout’s existence.
I’ll do some tests and see how heavy the whole thing is and whether it’s doable on one machine.

tl;dr: Spout sharing from Unreal works fine with said plugin.

Receiving Spout in Unreal is broken and low-performance: for some reason they don’t just create a shared resource from the handle, they also copy the texture to system RAM and write it back to the GPU, ergh. Sending works fine. I made a plugin with which you can receive shared textures properly, but that’s not Spout; it works directly with DXGI shared handles (the AsSharedTexture node in vvvv gives you this handle).


Hey, I’m back.
I’ve tried NDI on the same machine. It works, but detection with YOLO slows down the process quite a lot, both running on the CPU, I assume.
I’d like to try Spout, but I’m having trouble creating a VL plugin with a YOLO detection node.
Additionally, the vvvv demos in the VL.OpenCV folder are giving me lots of red nodes; I haven’t found any hints on the forum.
(I do have the latest alpha and dx11 installed.)

Hi @ectrome, it looks to me like you do not have VL.OpenCV installed properly, since all VL.OpenCV nodes are red. Could you first verify that the pack is indeed installed correctly?

Also, regarding YOLO slowing things down: you should probably run it asynchronously to prevent it from blocking your main loop.

@ravazquez Hi, thanks for the async tip, I just found out about it and my sketch is holding frames much, much better now.

Regarding the improper installation, what do you mean? Do I have to install VL.OpenCV in vvvv/alpha? If so, how? Are there instructions around?
Everything else runs properly when it’s just on gamma/VL.
I’m a bit confused by this division.

@ectrome glad to hear the async approach is helping.

Just for clarity’s sake: alpha is no longer a term we use around here. We have two main products, gamma and beta (good old vvvv); beta comes as either the official release version or the preview version (previously known as alpha).

gamma ships with VL.OpenCV installed, but if I am not mistaken this is not the case for beta-preview. If you need to install VL.OpenCV in beta-preview, have a read here.

@ravazquez thanks for the clarification! So if I understood correctly, I have to install the VL.OpenCV NuGet in the VL version that ships with beta, right?

What confused me most was that VL.OpenCV in gamma has a folder with vvvv demos that do not work with gamma itself.

Anyway, I’ll follow the instructions you linked!

@ectrome that is correct. Basically, inside beta open a VL template, get to the NuGet command line from the VL editor, and install the VL.OpenCV nuget from there. (Don’t forget the -prerelease argument.)
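For reference, the command typed into that NuGet command line would look something like this (package name as used in this thread; double-check the exact syntax in the linked guide):

```
nuget install VL.OpenCV -prerelease
```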

If you run into any issues report back here.