The Kinect Thread

I modified the Heightfield sample by vux to get proportional distances. As with my earlier attempts, it's not easy and rather an empirical approach. But it gives fairly orthogonal walls in my room… perhaps someone will find a better formula.

130.png

Highmap with correction (7.4 kB)
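For reference, one widely circulated empirical calibration for the 11-bit raw Kinect depth values (not necessarily the formula used in the attached patch, just a common starting point) looks roughly like this in Python; the function name is hypothetical:

```python
def raw_depth_to_meters(raw: int) -> float:
    """Convert an 11-bit Kinect raw disparity value to an approximate
    distance in meters, using a widely circulated empirical fit.
    Values of 2047 mean the sensor returned no reading."""
    if raw >= 2047:
        return float("inf")  # no depth at this pixel
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)

# A mid-range raw value of 840 comes out at roughly 1.33 m.
print(round(raw_depth_to_meters(840), 2))
```

Applying the inverse of such a curve per pixel is what straightens the walls: raw disparity is not linear in distance, so untreated values bow flat surfaces.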

how do I insert the depth expression? I'm using Expr to put it in, but I'm getting syntax errors about parentheses… I feel so newbie ;)
I use it like this:


I got my kinect today, but the AC adapter came with a plug that doesn’t fit in the sockets where I live.

Can someone who got it working confirm whether the adapter needs to be plugged in for it to work (or is it only needed for the motors)? I want to know if the installed drivers are working, but if it only works with the power adapter connected, I will be forced to buy an adapter first.

The thing is, at the moment I get a flashing green LED on the Kinect. So far the OpenNI and PrimeSense examples don't work…

Thanks for any input.

i bet you need the AC adapter to get it working

no way without external power. tested :)

I have been playing with raw depth data using the new OpenNIGenericNode; it performs much better than the memory-transfer method from before.

Trying to project (kind of) mapped onto thin, slowly moving/floating inflatables. (Doesn't need to be perfect, but it should somehow reflect the movement of the shapes.) I got rid of some of the noise in the depthmap by (motion-)blurring it. I could not get the formula from “Highmap with correction” into the “Displacement with Normals” shader. I guess it's this line:

PosO.z = 0.7+tan1.0-(height.z*1;

Also wondering if there is an efficient way to translate the greyscale image into a (lower-res) mesh, so I can feed it into other shaders as well.

Kinect_Smoothed_Heightmap.rar (15.6 kB)
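On the greyscale-to-mesh question: a minimal CPU-side sketch (hypothetical, plain Python rather than a vvvv node) that samples every Nth pixel of a heightmap and builds a grid of vertices plus quad indices could look like this:

```python
# Hypothetical sketch: turn a grayscale heightmap (list of rows of 0-255
# values) into a downsampled grid mesh of (x, y, z) vertices plus quad
# index tuples, so the depth image can drive geometry elsewhere.

def heightmap_to_mesh(pixels, step=4, z_scale=1.0):
    rows, cols = len(pixels), len(pixels[0])
    verts = []
    for y in range(0, rows, step):
        for x in range(0, cols, step):
            # normalize x/y into -1..1, map brightness to height
            verts.append((
                2.0 * x / (cols - 1) - 1.0,
                2.0 * y / (rows - 1) - 1.0,
                pixels[y][x] / 255.0 * z_scale,
            ))
    # number of sampled vertices per row/column
    grid_w = (cols - 1) // step + 1
    grid_h = (rows - 1) // step + 1
    quads = []
    for y in range(grid_h - 1):
        for x in range(grid_w - 1):
            i = y * grid_w + x
            quads.append((i, i + 1, i + grid_w + 1, i + grid_w))
    return verts, quads
```

The `step` parameter trades resolution for vertex count; blurring the depthmap first (as above) also keeps the downsampled mesh from aliasing.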

@Meierhans hey buddy ;) I checked your patch: there is a contrast shader node missing, it's supposed to be in your plugins\OpenNiSkeleton\ folder.
reuppp :)

Hey Sapokan :)
As far as I remember there was no contrast shader used, just constant, blur and desaxis heightmap. Hope to find time tonight to check and reupload.

One day I will make it and create a shareable patch… ;)

apparently there is a contrast node missing, check:

Ok, I copied the .fx to the project folder, dragged it into the patch and reconnected it. Should work now…

(Motion) blurred depthmap from Kinect applied to Heightmap shader (21.5 kB)

I downloaded the newer project (above) with the Contrast .fx in the same folder, but Contrast still shows up as missing.

I also added it manually, but I'm not sure how to reconnect the node. I don't know what you had connected, since all connections to Contrast were broken when it was not found… any help?

Alpay Kasal

have a look at

i hope it’s not too confusing

hey everyone,
I don't know if I am the only one, but I am having a really hard time trying to get my Kinect running… At the moment I am struggling to get it running at all, so I am not even at the point of trying it out in vvvv…
I tried several versions and installation orders of OpenNI and PrimeSense, stable and unstable, and so forth. I tried to change camera resolutions in some XMLs and every other hint and tip I could find in different tutorials and forums…
Currently I have managed to get at least the OpenNI NiViewer running on my old laptop. It runs 32-bit Win XP. And this success (as little as it is) happened with a nice auto-installer that I found. So I thought I should share this with everyone who might also have problems with her/his Kinect…

You can find it here:

It worked, as mentioned, on 32-bit Win XP, but NOT on my 64-bit Win 7. There are some issues with a DLL that is not found…

If you have similar problems or any suggestions, how I can solve these problems, please let me know!

EDIT: after some more in-depth cleaning (uninstalling drivers manually via the control panel, reboot etc.), it also worked on 64-bit Win 7.

I am sorry about the non-working Contrast; actually you can just trash it, it's spare and does nothing by default.

ahh great, but i may still need a bit of support Meierhans… and thanks for the help.

I'm not sure if I'm missing node connections as a result of the missing Contrast, but my render window shows a dark blue texture in the background with what appears to be some white polygons in the foreground, and no action. It doesn't appear that anything has crashed, the screen just isn't updating at all…

I dug through the sketch a bit but got no hint…

Also, my OpenNI install is healthy. The installed viewer works, as do Animata, Ogre3D, and some Unity3D stuff.

thanks again, very much appreciated.

alpay kasal

hey guys, I want to share an idea I'm working on, and I need some help with a few things.
The idea is to create a virtual dresser, so I need to motion-track the subject and assign some hand gestures to change clothes and color etc.
I found an easy way of doing that by measuring the velocity of the hand movement (from a Damper node).

Using Homography I was able to track the position of a plane over the subject, but I need to do this with a 3D object. What is the best way to put a box where I place my plane? I can get the position, but to track the rotation I guess I will need to run a vector2points from the middle joint to the neck joint, and some trigonometry to assign that to a Transform 3D?
Any idea better than this one?
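On the rotation question, a hedged sketch (plain Python with a hypothetical function name, not vvvv nodes) of the trigonometry: take the vector between two skeleton joints and derive lean angles with atan2, which could then drive a Transform 3D's rotation pins.

```python
import math

# Hypothetical sketch: derive the roll/pitch of the torso from two skeleton
# joints (e.g. torso/middle and neck), so a 3D box can follow the body
# instead of a flat homography plane.

def torso_rotation(torso, neck):
    """torso, neck: (x, y, z) joint positions.
    Returns (roll, pitch) in radians for the torso-to-neck up-vector."""
    dx = neck[0] - torso[0]
    dy = neck[1] - torso[1]
    dz = neck[2] - torso[2]
    roll = math.atan2(dx, dy)                    # lean left/right in the image plane
    pitch = math.atan2(dz, math.hypot(dx, dy))   # lean toward/away from the camera
    return roll, pitch

# Standing upright gives zero rotation; leaning gives a signed angle.
r, p = torso_rotation((0.0, 0.0, 2.0), (0.1, 0.5, 2.0))
```

This is only the two-angle case; a full orientation would also need a third reference joint (e.g. a shoulder) to resolve twist around the spine.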

I also need to know: can vvvv handle .x models with bones? (I mean bending the mesh at the right vertices to keep the texture in place?)
I know how to use 3ds Max and motionkinect, but I have no idea how to do that in vvvv.

a short video of the idea

@Meierhans great work on the pointcloud!

@vjc4: could you share with us how you managed to do the hand gestures?

@gmt117 you first need to find the part of the body that you want to use.
I used my wrist (slice 34 from the output of Skeleton), assigned that to a Damper to get the acceleration, from there to another Damper to get a smooth “speed”, and then I set up two limits and a counter.

Check the example file with your mouse X.
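As a hedged sketch of that pipeline (plain Python, not the actual patch; simple exponential smoothing stands in for vvvv's Damper, and all names/thresholds here are made up): smooth the position, take its speed, smooth that too, and let hysteresis between the two limits drive the counter so one swipe fires exactly once.

```python
# Hypothetical sketch of the damper-based gesture trigger described above.

class GestureCounter:
    def __init__(self, smooth=0.5, hi=0.8, lo=0.2):
        self.smooth, self.hi, self.lo = smooth, hi, lo
        self.pos = None
        self.speed = 0.0
        self.armed = True   # re-arms only after speed drops below `lo`
        self.count = 0

    def update(self, x):
        if self.pos is None:
            self.pos = x
            return self.count
        # first "damper": smoothed position; its lag gives a raw speed
        self.pos += self.smooth * (x - self.pos)
        raw_speed = abs(x - self.pos)
        # second "damper": smoothed speed
        self.speed += self.smooth * (raw_speed - self.speed)
        # two limits + counter: trigger above `hi`, re-arm below `lo`
        if self.armed and self.speed > self.hi:
            self.count += 1
            self.armed = False
        elif self.speed < self.lo:
            self.armed = True
        return self.count
```

Feeding it a fast jump in hand position increments the counter once, and the counter will not fire again until the smoothed speed has settled back below the lower limit.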

Kinect output always seems quite jittery; could you use two Kinects from different angles to get more accurate output?
And does Kinect use colour sensing to track? Maybe wearing clothing with different colours for each arm and leg would increase accuracy?

Thanks vjc4