Looks like Unreal Engine with a particle emitter on a rigged mesh; they showed this feature some versions ago. ATM this is not yet possible in gamma, because there is no support for controlling rigs. You can bake an animation into a rigged model, but idk if it is possible yet to access its vertices in gamma.
I think they didn't use only gamma & FUSE.
Maybe they used move.ai to capture the people.
I saw the designer say they used many commercial tools for this work.
Now I'm trying to contact move.ai to get a test!
If it is FUSE and vvvv gamma, then I would love to know how to achieve such visuals - for me, it doesn’t look like Stride engine graphics.
As noticed before, vvvv gamma doesn’t support the loading of rigged meshes, so, if you want to go with rigging/skinning, you have to implement your own system.
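For reference, a custom skinning system usually boils down to linear blend skinning: each vertex is transformed by a weighted sum of its influencing bones' matrices. A minimal Python sketch of that idea (hypothetical helper, not vvvv gamma code, assuming the skinning matrices already include the inverse bind pose):

```python
def skin_vertex(vertex, weights, bone_matrices):
    """Linear blend skinning for one vertex.

    vertex: (x, y, z) position in bind pose.
    weights: dict mapping bone index -> influence weight (should sum to 1).
    bone_matrices: list of 4x4 skinning matrices (bone transform times
                   inverse bind matrix), as nested lists.
    """
    out = [0.0, 0.0, 0.0]
    for bone, w in weights.items():
        m = bone_matrices[bone]
        for row in range(3):
            out[row] += w * (m[row][0] * vertex[0]
                             + m[row][1] * vertex[1]
                             + m[row][2] * vertex[2]
                             + m[row][3])  # translation column
    return tuple(out)

# With identity matrices the vertex stays in its bind pose:
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
skin_vertex((1.0, 2.0, 3.0), {0: 1.0}, [identity])  # -> (1.0, 2.0, 3.0)
```

The same loop runs per vertex per frame, which is why engines do it on the GPU; for a point-cloud look in gamma, something like this on a reduced point set may be enough.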
On the other hand, it is possible to go with a procedural approach and generate random points (random distribution on a sphere, box, capsule, whatever) at runtime based only on joint data. In this case, you still need to find a way to send/receive the joint data, but that seems like a simple task for OSC, provided the tracking system you use can send just the bone/joint data.
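As an illustration of the procedural idea (my own assumption of how it could work, not taken from the patches mentioned below): sample uniform random points on a sphere around each incoming joint position. Normalizing a 3D Gaussian vector gives a uniformly distributed direction, which avoids the pole clustering you get from naive angle sampling:

```python
import math
import random

def random_point_on_sphere(center, radius, rng=random):
    """Uniform random point on a sphere surface around a joint position."""
    while True:
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        if n > 1e-9:  # reject the (practically impossible) zero vector
            break
    return tuple(c0 + radius * c / n for c0, c in zip(center, v))

# Scatter 100 points around a hypothetical joint position:
joint = (0.0, 1.5, 0.2)
points = [random_point_on_sphere(joint, 0.1) for _ in range(100)]
```

The same trick works for boxes or capsules by swapping the sampling function; the joint positions themselves would come in over OSC each frame.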
If the procedural approach sounds good to you, I could share some of my patches that solve a similar task (the only difference is that pre-recorded animation data is used).
And yes, speaking outside the univvvverse, most of the body tracking systems provide native integration with popular game engines (UE, Unity).