I'm a little stuck with a project I'm doing right now.
The aim is pretty simple: people should be able to grab objects flying around in 3D space.
We have a stereoscopic setup at university with two projectors ("Beamer") projecting onto a polarized wall.
I built a similar setup in VVVV with two Projector nodes.
But calibrating Kinect movements to that setup seems difficult, at least for me.
Maybe somebody has done something similar already and can help me?
Or maybe my attempt is totally wrong?
I've attached the VVVV patch; pressing N moves a box to a random place in the scene.
kinect3D.zip (10.9 kB)
I think you'll have a much easier time if you use PerspectiveLookAtRect instead of Projector. Define the "Rect" as your real-world screen coordinates, set the camera position to where the viewer will be, and define your objects in that same coordinate space. Then do the offset trick with a second renderer, just as you did here, for the L and R camera positions, with both looking at the same rect. (Although I think you have your R and L views swapped here.) Finally, transform your Kinect data into the same world space and that should be it. I believe you would define your Kinect offset and orientation with a Transform3D and then do an ApplyTransform on your XYZ points from the Kinect (this is off the top of my head; I do it in a plugin).
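To make the Transform3D → ApplyTransform idea concrete, here's a rough sketch in Python of the same math: build one 4×4 matrix from the sensor's position and orientation, then push every Kinect joint through it. The function name, the yaw-only rotation, and the example numbers are all my own assumptions; extend with pitch/roll if your Kinect is tilted.

```python
import numpy as np

def kinect_to_world(points, yaw_deg, position):
    """Map Kinect-space XYZ points into the shared world space.

    Mirrors the Transform3D -> ApplyTransform idea: one 4x4 matrix
    built from the sensor's orientation and offset, applied to all
    joint positions at once. Yaw-only rotation is an assumption.
    """
    a = np.radians(yaw_deg)
    # Rotation about the vertical (Y) axis.
    rot = np.array([
        [ np.cos(a), 0.0, np.sin(a), 0.0],
        [ 0.0,       1.0, 0.0,       0.0],
        [-np.sin(a), 0.0, np.cos(a), 0.0],
        [ 0.0,       0.0, 0.0,       1.0],
    ])
    trans = np.eye(4)
    trans[:3, 3] = position           # Kinect position in world space
    m = trans @ rot                   # rotate first, then translate

    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous
    return (pts @ m.T)[:, :3]

# Example: a hand joint 2 m in front of a Kinect that sits 0.5 m
# above the world origin and faces straight at the screen.
hand = kinect_to_world(np.array([[0.0, 0.0, 2.0]]), 0.0, [0.0, 0.5, 0.0])
```

In a plugin you'd cache the matrix and only rebuild it when the calibration changes, but the per-point math is the same.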
One more note: I set my world origin (0,0,0) at the center of the screen, as vvvv likes to do, and that makes life much easier. The downside is that measuring an accurate offset to the Kinect(s) becomes a pain: I have to measure the projected image size, halve it, and then measure from a screen corner to the Kinect. But since I don't have to do that as often as placing objects, it's not too bad.
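The corner-to-center bookkeeping works out to a couple of subtractions. A tiny sketch, with made-up measurements and my own naming conventions (nothing here is a vvvv API):

```python
def kinect_offset_from_corner(image_w, image_h, corner_to_kinect):
    """Convert a measurement taken from the bottom-left screen corner
    into screen-centered coordinates (world origin at screen center).

    image_w, image_h: projected image size in meters.
    corner_to_kinect: (x, y, z) measured from the bottom-left corner,
    with z pointing away from the wall toward the viewer.
    """
    cx, cy, cz = corner_to_kinect
    # Shift by half the image size so (0, 0, 0) is the screen center.
    return (cx - image_w / 2.0, cy - image_h / 2.0, cz)

# Example: 4 m x 2.5 m image, Kinect measured 2.2 m right of the
# corner, 0.1 m above it, sitting 0.3 m out from the wall.
offset = kinect_offset_from_corner(4.0, 2.5, (2.2, 0.1, 0.3))
```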
Yesterday I had the chance to play around with the Kinect again.
Thanks for the advice, it works pretty well.
What I don't properly understand are the orientation values the Kinect outputs.
I mean, okay, they are joint orientations, but why X XYZ, Y XYZ, Z XYZ?
But maybe this is the wrong thread for it :).
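For what it's worth, my guess is that those nine numbers are the three basis vectors of a per-joint rotation matrix, i.e. where the joint's local X, Y, and Z axes point in camera space. If that's right, the three vectors should be unit length and mutually perpendicular, which is easy to sanity-check (this is my own check, not anything from the Kinect node):

```python
import numpy as np

def looks_like_rotation(x_axis, y_axis, z_axis, tol=1e-3):
    """Check whether three 3-vectors form an orthonormal basis.

    Assumption: the 'X XYZ, Y XYZ, Z XYZ' pins are the columns of a
    3x3 rotation matrix describing the joint's orientation. For a
    true rotation, M transposed times M is the identity.
    """
    m = np.column_stack([x_axis, y_axis, z_axis])
    return bool(np.allclose(m.T @ m, np.eye(3), atol=tol))

# Identity orientation: joint axes aligned with the camera axes.
print(looks_like_rotation([1, 0, 0], [0, 1, 0], [0, 0, 1]))
```

If your values pass this, treating them as a rotation matrix (or feeding them into a Transform) should be safe.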