
Realsense images over LAN: python --> Gamma

Hi guys,

Let’s suppose I have two PCs, a master one and a slave one.

On the slave I have an Intel RealSense camera connected and a Python script capable of grabbing the depth image and sending it over the network via TCP (I have yet to develop it, but I think I will take some inspiration from this code).

On the master I’m running a Gamma patch and I would like to get that data from the incoming TCP connection and feed the post-processing depth filters with it. I think some clues about the “crunching” of that data can be extracted from the server.py code here, but I don’t know where to start.
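For the Python sender side, a minimal framing sketch might help. Note this is a hypothetical length-prefixed protocol of my own choosing, not anything from the linked code, and a synthetic payload stands in for the real RealSense depth frame so it runs without a camera:

```python
import struct

# Hypothetical framing: 4-byte big-endian length prefix + raw frame bytes.
# On a real sender, `payload` would come from pyrealsense2, e.g.
# np.asanyarray(depth_frame.get_data()).tobytes() -- not shown here.

def pack_frame(payload: bytes) -> bytes:
    """Prefix a frame with its length so the receiver knows where it ends."""
    return struct.pack(">I", len(payload)) + payload

def unpack_frames(buffer: bytes):
    """Extract all complete frames from a byte buffer; return leftover bytes."""
    frames = []
    while len(buffer) >= 4:
        (length,) = struct.unpack_from(">I", buffer)
        if len(buffer) < 4 + length:
            break  # frame not fully received yet
        frames.append(buffer[4:4 + length])
        buffer = buffer[4 + length:]
    return frames, buffer

# Synthetic 32-byte blob standing in for a depth frame
fake_depth = bytes(range(32))
# Simulate the wire: one complete frame plus a truncated second one
wire = pack_frame(fake_depth) + pack_frame(fake_depth)[:10]
frames, rest = unpack_frames(wire)
print(len(frames), len(rest))  # → 1 10
```

The sender would then call `sock.sendall(pack_frame(payload))` per captured frame, and the receiver buffers incoming bytes and runs `unpack_frames` on them.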

Any suggestions on how to proceed on the vvvv side?
Any existing tutorials for doing something similar?

Thanks so much for your support.
na

hey moscardo

the problem with your approach is that what comes out of the RealSense node (and is expected by subsequent downstream nodes) is not an “image” that you can easily encode and send over the network the way the python script you reference does. it is rather a type called FrameSet that holds much more info than just the depth image.

is there a reason you want to do all the processing on the receiver and not on the sender already? because if you did it on the sender, you could then simply send the resulting depth image.


Thank you @joreg for your reply, and sorry for not being sufficiently clear from the outset.

I completely agree with you, in fact I could do the processing on the sender and avoid forwarding data over the network to the master.

The thing is that while the master is able to do the post-processing using Gamma, the sender PCs (potentially even several, each connected to its own sensor) run a Linux OS.

Assuming it makes sense, I’ll try another line of reasoning: probably what I intend to do is something different; maybe I could go straight in here?

You are not gonna deal with that without a custom plugin…
You need to convert the image to a byte sequence encoded as jpg, or better dds (on the sender)
Send the byte sequence over TCP
Receive the byte sequence over TCP
Using some DX method, convert the bytes back to an image (for instance this one: Texture.FromStream(Device,Stream,Usage,Pool) | Microsoft Docs), but that should work in a Stride context…
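Whichever language the receiver ends up in, one detail worth noting for the TCP steps above: TCP is a byte stream with no message boundaries, so a single receive call can return fewer bytes than a whole image. A sketch of the “read exactly n bytes” loop, shown in Python for brevity (the C#/vvvv side would need the equivalent), with a local socket pair standing in for the network:

```python
import socket

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from a TCP socket, or raise if the peer closes."""
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:
            raise ConnectionError("socket closed before full frame arrived")
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

# Demo over a local socket pair instead of a real network connection
a, b = socket.socketpair()
a.sendall(b"\x00\x00\x00\x05hello")            # 4-byte length prefix + payload
length = int.from_bytes(recv_exact(b, 4), "big")
payload = recv_exact(b, length)
print(payload)  # → b'hello'
a.close(); b.close()
```

The same two-step read (first the length prefix, then exactly that many payload bytes) is what keeps frames from being split or glued together on the receiving end.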

if you don’t want to deal with code, you have to use Spout TCP or something similar


i don’t quite understand what the fact that the sender is on linux has to do with the question of on which end you’d do the post-processing.

doing the post-processing on the master using the RealSense nodes would be more difficult because, as pointed out above, the filter nodes operate on a datatype called FrameSet, which would be harder to transport over the network.

but yes, if you do the processing on the sender already, then you only send the final image, which you can then decode depending on the encoding, e.g. using the ImageDecoder [Advanced] node that comes with skia.


Thank you for all the suggestions provided.
I think that, related to what we are talking about, we can add one more element to the discussion here.

I’ve just found a project in the official Intel RealSense documentation which uses two different tools:

  • rs-server (Linux machines only);
  • realsense2-net (available for different platforms);

to share data and images from a RealSense connected to a networked PC with a second PC.

Although transmission bandwidth issues are obvious (USB 3 is much faster than Gigabit Ethernet), I was able to “stream” data from a Linux PC (with a RealSense connected) to the RealSense Viewer on Windows 10 on the same local network.

Is there currently an option for the RealSense node in Gamma to select a remote source via IP?
Some sort of wrapper for the realsense2-net tool to connect to the remote rs-server tool?

Thank you so much for any suggestions

no. this would require the c# wrapper of the realsense sdk to support realsense2-net. if that were done, integrating it into vvvv should be simple. note that the VL.Devices.RealSense pack is open source.
