Hey there,
I’m sure you’ve already heard about the hacked Kinect. I was wondering if someone is planning to create a Kinect node. Or is it somehow possible to use the OpenKinect data in vvvv?
stay tuned. some folks at NODE10 are currently fiddling around with a kinect…
and get yourself a username ;)
@kalle :)… no text …
i am looking forward to getting my hands on one just for this reason
Alright, got myself a user name. :)
Hope you’re being quick. I want to use the Kinect for some VJing stuff next week.
look at this: http://vimeo.com/16945053
WOW! That was amazingly fast as well!! Do you get the image from the normal camera as well? And is it a WDM driver, or does vvvv access the camera directly? Guess I’ll buy myself one for Christmas… that’s gonna be a lot of fun. Imagine being able to affect a particle system in 3D. ;)
Hey, first level finished: my freshly bought Kinect ran out of the box with the drivers from http://nuigroup.com/forums/viewthread/11249/. Be sure not to buy the Xbox 360 bundle (as I did first), because it includes no power adapter and USB plug. Ask for the version compatible with the old Xbox.
https://vvvv.org/sites/default/files/imagecache/large/images/Bild 3.png
Just to let people know plugin will be released sometime this week, need to buy a kinect tomorrow and fix a few extra things before release.
just bought my kinect yesterday and i don’t even have an xbox. the guy at the counter was like “what type of xbox do you have?” and i was like “…uh… i’m not hooking this up to an xbox”, and he just looked confused. I already got the driver working on my machine and now eagerly await the vvvv plugin!
This week? Very cool, thanks. And as Microsoft stated, it’s totally O.K. with them if we use the Kinect the way we want to…
@vux. i think if you have to buy a kinect to finish this node then we should all buy it for you.
I’m happy to give 5 euros; that’s only a couple of beers, so come on everyone, buy vux a couple of beers for his hard work and sharing.
would be fair enough, indeed.
and i can’t wait to check out the upcoming plugin… the kinect at MESO’s office is still waiting for its very first use.
@xd_nitro: well, I already bought one; I want to use it to fiddle with some image processing anyway :)
For the moment it’s using alexp’s driver, so writing the node is really not that bad; I’m planning to move to the open-source one later.
Motor, RGB and depth image are working, as is the accelerometer data.
Now I only need to add threading so vvvv isn’t tied to the camera framerate (e.g. 30 fps); a first release with texture output will be there very soon.
I’ll also add some way to get the data as a raw image, so we can use it in stuff like OpenCV, since shaders aren’t that convenient for some uses.
Stay tuned first alpha coming vvvvery soon :)
That’s great Mr Vux!
hi!
does anybody know how accurate the depth information you get from the kinect is?
cheers
@schlonzo: Using the 11 bits delivered by the Kinect (not the 8 bits of the grayscale image often used) gives you a theoretical resolution of 2048 steps, but only about 1024 seem to be delivered by the system. Still, I could distinguish differences of 0.2–0.5 cm at a distance of 0.6–1 m. The result has some noise and irregularities, so normally the 8-bit grayscale should be sufficient for reliable depth information. I was even able to see the slope of a Mac aluminum keyboard lying on my table, with the Kinect pointing straight down from a distance of around 1 meter.

The Kinect starts working from a distance of about 65–70 cm to the nearest object, and precision seems highest there. Because the system detects distance by calculating the horizontal offset of a projected, irregular pattern in the IR spectrum, the object needs a minimum size and must not be reflective or of a dark color (though what looks black in visible light often isn’t black in IR). Very rough or finely detailed surfaces are also hard to track, because the pattern gets too distorted.

As you may have seen in the screenshot above, all objects have a ‘shadow’ border on the left, because the projected pattern is hidden in the background by the object in the foreground. But it is clearly detectable which values are valid, and the invalid ones can be filtered easily. I haven’t had the time to calculate real millimeters out of the given values, if that was your question… but because of the measuring principle used, the results should be very stable and always absolute.
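As a side note, the “filter the invalid values” step can be sketched in a few lines of Python. This assumes the driver marks unusable pixels (shadow borders, reflective or IR-absorbing surfaces) with a sentinel value; 2047, the maximum 11-bit value, is used here purely for illustration, so check what your driver actually reports:

```python
# Sentinel assumed to mark "no reading" pixels; driver-dependent.
INVALID = 2047

# Hypothetical row of raw 11-bit depth values from one scanline.
scanline = [300, 512, INVALID, 700, INVALID, 650]

# Keep only usable readings before doing any statistics on them.
valid = [d for d in scanline if d != INVALID]
average_depth = sum(valid) / len(valid)
```

The same masking idea carries over directly to a full depth frame: drop (or inpaint) the sentinel pixels before feeding the data into tracking or particle effects.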
O.K. I just found an empirical formula that converts the distance values approximately to cm (in the range between 60 and 200 cm):
DIST_CM = TAN(DEPTH_VAL/1024 + 0.5) * 33.825 + 5.7
As you can see in the values and diagram, it’s quite near the measured results:
https://vvvv.org/sites/default/files/imagecache/large/images/Bild 123.png
Feel free to correct the constants for slope (*33.825) and offset (+5.7) to your needs. If someone has a better formula, let me know…
And to get it the other way (from cm to kinect depth value):
DEPTH_VAL = (ARCTAN((DIST_CM - 5.7)/33.825) - 0.5) * 1024
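For anyone who wants to sanity-check the two formulas against each other, here is a quick sketch in Python (plain `math` module, nothing vvvv-specific); the constants are the empirical ones from above and may need tuning for your device:

```python
import math

def kinect_depth_to_cm(depth_val):
    """Empirical fit: raw Kinect depth value -> centimetres (valid ~60-200 cm)."""
    return math.tan(depth_val / 1024 + 0.5) * 33.825 + 5.7

def cm_to_kinect_depth(dist_cm):
    """Inverse mapping: centimetres -> raw Kinect depth value."""
    return (math.atan((dist_cm - 5.7) / 33.825) - 0.5) * 1024
```

Within the valid range the two functions are exact inverses of each other, so a round trip like `cm_to_kinect_depth(kinect_depth_to_cm(500))` comes back to 500 up to floating-point error.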
Thank you vux:
https://vvvv.org/sites/default/files/imagecache/large/images/Bild 124.png
Really cool work…
got it up and running. you’re awesome