I modified the Heightfield sample by vux to get proportional distances. As in my earlier tryouts, it’s not easy and a rather empiric approach. But it gives quite orthogonal walls for my room… perhaps someone finds a better formula. https://vvvv.org/sites/default/files/imagecache/large/images/Bild 130.png
@Marf
how do i insert the depth expression? i’m using an Expr node to put it into, but i’m getting syntax errors about parentheses… i feel so newbie ;)
i use it like this (note the decimal point, not a comma)
tan(a/1024+0.5)*(33.825+5.7)
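For reference, this kind of raw-depth-to-distance mapping can be sketched in Python. This is just a sketch of the empiric formula above; `a` is assumed to be the 11-bit raw depth value from the sensor, and the centimetre-ish unit of the result is an assumption, not something stated in the thread:

```python
import math

def raw_to_distance(a):
    """Map an 11-bit Kinect raw depth value (0..2047) to a distance
    using the empiric formula from the Expr node above.
    The unit is assumed to be roughly centimetres."""
    return math.tan(a / 1024 + 0.5) * (33.825 + 5.7)
```

The mapping is monotonic over the useful raw range, so nearer surfaces always give smaller values than farther ones.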
I got my kinect today, but the AC adapter came with a plug that doesn’t fit in the sockets where I live.
Can someone who got it working confirm whether the adapter needs to be plugged in for it to work (or is it only needed for the motors)? I want to know if the installed drivers are working, but if it only works with the power adapter connected, I will be forced to buy an adapter first.
The thing is, at this time I get a flashing green LED on the Kinect. So far the OpenNI and Prime Sense examples don’t work…
I have been playing with the raw depth data using the new OpenNIGenericNode; it performs much better than the memory-transfer method from before.
Trying to project (kinda) mapped onto thin, slowly moving/floating inflatables (it does not need to be perfect, but should somehow reflect the movement of the shapes). Got rid of some of the noise in the depthmap by (motion) blurring it. Could not get the formula from “Heightmap with correction” into the “Displacement with Normals” shader. Guess it’s the:
@Meierhans hey buddy ;) checked your patch, there is a contrast shader node missing, it’s supposed to be in your plugins\OpenNiSkeleton\ folder.
reuppp :)
Hey Sapokan :)
As far as I remember there was no contrast shader used, just Constant, Blur and the desaxis heightmap. Hope to find time tonight to check and reupload.
One day I will make it and create a shareable patch… ;)
i downloaded the newer project (above) with the contrast .fx in the same folder, but the contrast node is still showing up as missing.
i also added it manually, but i’m not sure how to reconnect the node; i don’t know what you had connected, since all connections to contrast were broken when it was not found… any help?
hey everyone,
I don’t know if I am the only one, but I am having a really hard time trying to get my kinect running… At the moment I am struggling to get it running at all, so I am not even at the point of trying it out in vvvv…
I tried several versions and different install orders of OpenNI and PrimeSense, stable and unstable, and so forth. I tried to change camera resolutions in some XMLs and every other hint and tip I could find in different tutorials and forums…
Currently I have managed to get at least the OpenNI NiViewer running on my old laptop. It runs 32-bit Win XP. And this success (as little as it is) happened with a nice auto installer that I found. So I thought I should share this with everyone who might also have problems with her/his Kinect…
ahh great, but i may still need a bit of support Meierhans… and thanks for the help.
i’m not sure if i’m missing node connections as a result of the missing contrast node, but my render window shows a dark blue texture in the background with what appears to be some white polygons in the foreground, and no action. it doesn’t appear that anything has crashed; the screen just isn’t updating at all…
I dug through the sketch a bit but i got no hint…
also, my openNI install is healthy. the installed viewer works, as does animata, ogre3d, and some unity3d stuff.
hey guys, i want to share an idea that i’m working on, and i need some help on a few things
the idea is to create a virtual dresser, so i need to motion-track the subject and assign some hand gestures to change clothes, colors, etc.
i found an easy way of getting that by measuring the velocity of the hand movement ( from a damper node )
and using Homography i was able to track the position of a plane over the subject, but i need to do this with a 3d object. what is the best way to put a box where i place my plane? i can get the position, but to track the rotation i guess i will need to run a vector2points from the middle joint to the neck joint? and some trigonometry to assign that to a transform3d?
any other idea better than this one ?
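One hedged sketch of that joint-to-joint rotation idea in Python (the joint names, the 2D simplification, and the angle convention are all assumptions; a full solution would build the rotation for the Transform from the 3D vector between the two joints):

```python
import math

def roll_from_joints(neck, torso):
    """Estimate the roll angle (radians) of the upper body from the
    2D vector between the torso and neck joints.
    Each joint is an (x, y) tuple; the result is 0 when the neck
    is straight above the torso, positive when leaning to +x."""
    dx = neck[0] - torso[0]
    dy = neck[1] - torso[1]
    return math.atan2(dx, dy)
```

The resulting angle could then drive the rotation pin of the Transform that positions the box over the subject.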
i also need to know: can vvvv handle x models with bones? (i mean, bending the mesh at the right vertices to keep the texture in place?)
i know how to use 3dmax and motionkinect, but i have no idea how to do that in vvvv
@gmt117 you first need to find the part of the body that you want to use
i used my wrist (slice 34 from the output of the skeleton node), assigned that to a damper to get the acceleration, from there to another damper to get a smooth “speed”, and then i set up 2 limits and a counter
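That chain could be sketched in Python like this (a minimal sketch: the exponential smoothing stands in for vvvv’s Damper node, and the two threshold values are made-up assumptions, not values from the patch):

```python
def damper(prev, target, alpha=0.5):
    # Simple exponential smoothing, standing in for vvvv's Damper node.
    return prev + alpha * (target - prev)

def detect_swipes(positions, hi=0.5, lo=0.1):
    """Count swipe gestures in a 1D stream of hand positions.
    A swipe fires when the smoothed speed rises above `hi`,
    and re-arms only after it falls back below `lo` (the two
    limits), so one fast movement counts exactly once."""
    speed = 0.0
    count = 0
    armed = True
    last = positions[0]
    for p in positions[1:]:
        raw_speed = abs(p - last)   # per-frame movement
        last = p
        speed = damper(speed, raw_speed)
        if armed and speed > hi:
            count += 1              # the "counter" at the end of the chain
            armed = False
        elif speed < lo:
            armed = True
    return count
```

The two limits form a hysteresis band, which is what keeps a single fast hand movement from triggering the counter several frames in a row.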
Kinect output always seems quite jittery; could you use 2 Kinects from different angles to get more accurate output?
And does the kinect use colour sensing to track? Maybe wearing clothing with different colours on each arm and leg would increase accuracy?