It should be possible to track a glowing ball with the Kinect, as long as it does not emit too much IR light (which it probably does not, as it uses LEDs emitting in the visible spectrum). The Kinect, and especially the Kinect One, is able to track even small objects in 3D space.
Will there be any other objects close to the ball (such as your hands juggling the ball or poi)? How fast is the object going to move? The Kinect One performs much better when tracking fast-moving small objects than the first version.
If you do not need 3D coordinates of the object, you may also want to consider using a 2D tracking technique such as contour/blob tracking.
Thank you very much for your reply, it’s very helpful to start on this.
There will be other objects in the room, primarily people interacting with and throwing the ball. I do also need it to be 3D.
At its most basic, the project is a dark room with a lit ball. The ball is to be bounced or thrown around the room with the Kinect tracking it.
There will be multi-channel sound around the room, with sound movement relative to the position of the ball (Kinect data sent via OSC to Max/MSP).
Sounds like a complicated tracking setup; you will get a lot of occlusions, same with projections. You might also need a high frame rate to track the short moments when the ball bounces off the wall.
I suggest you tackle these challenges through the design of the interaction. One possibility I see is to let people throw the ball into the tracked area, but not let them enter that area.
Hmmm, I would really like to have people enter the area. I wish there to be as much immersion as possible.
I do have multiple Kinects available, so I will be able to track from various angles, which should eliminate the occlusion issue. This is also why I want it to be a glowing ball, so that bodies are excluded in the dark.
The sound is to go around the participants, so it wouldn’t have the same effect if they were outside the installation.
If the Kinect isn’t able to track a glowing ball in a dark room, would something like the EyeToy be better?
I know that’s not 3D, but a couple of them could be used at the same time to get the same effect.
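For what it’s worth, two horizontally offset 2D cameras can recover depth from disparity under a simple pinhole stereo model: Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the horizontal pixel disparity of the ball between the two views. A hedged sketch; the focal length and baseline numbers here are made up for illustration.

```python
def depth_from_disparity(x_left, x_right, focal_px=600.0, baseline_m=0.20):
    """Depth Z = f * B / d for a simple rectified stereo pair.

    x_left / x_right: the ball's horizontal pixel position in each view.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        return None  # ball not matched between the two views
    return focal_px * baseline_m / disparity

# Example: the ball appears at x=210 in the left view and x=180 in the right.
z = depth_from_disparity(210, 180)
print(f"estimated depth: {z:.2f} m")  # 600 * 0.20 / 30 = 4.00 m
```

In practice you would need to calibrate and rectify both cameras first (e.g. with OpenCV’s calibration tools) for the simple formula to hold.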
There is a great implementation of OpenCV made by ElliotWoods, called ImagePack (vvvv.packs.image).
Based on the demo Kinect 3D Projectile Tracker, it should be easy to track the ball, even without the light. Path prediction is even implemented, which may help you detect collisions with the wall.
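I can’t say exactly how that demo implements it, but path prediction along these lines is essentially ballistic extrapolation from the last tracked position and velocity; a hedged sketch with illustrative numbers (room geometry and throw speed are made up):

```python
def time_to_wall(p0, v0, wall_x):
    """Time until the ball's x coordinate reaches wall_x (constant x velocity)."""
    if v0[0] == 0:
        return None
    t = (wall_x - p0[0]) / v0[0]
    return t if t > 0 else None

def position_at(p0, v0, t, g=9.81):
    """Ball position after t seconds under gravity (z is up, metres)."""
    x = p0[0] + v0[0] * t
    y = p0[1] + v0[1] * t
    z = p0[2] + v0[2] * t - 0.5 * g * t * t  # gravity pulls z down
    return (x, y, z)

# Ball released at (0, 0, 1.5) m moving 3 m/s toward a wall 3 m away,
# with 4 m/s of upward velocity:
p0, v0 = (0.0, 0.0, 1.5), (3.0, 0.0, 4.0)
t = time_to_wall(p0, v0, wall_x=3.0)
print(t, position_at(p0, v0, t))  # hits the wall after 1 s, ~0.6 m up
```

Predicting the impact point a frame or two ahead like this is also a cheap way to pre-trigger a sound cue before the bounce actually lands.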
I tried isolating the ball from the people while testing yesterday. I am using HSCB and Levels on the output of the Kinect to isolate the glowing ball, and then using ColorTracker to follow the isolated blob. I assume this is what’s meant by isolating the ball.
The issue with my attempt so far is that it’s not giving me a 3D image, and depth is quite important for this project. I have used two cameras for now, but it’s not very accurate or smooth. I’ve attached the patch.
OpenCV/ImagePack is looking to be the best way to go about this, I’m just having trouble knowing where to start with the ImagePack.
Hi, yeah, that sounds about right, but instead of a ColorTracker you have to take the Pipet compute shader and make it output the position of the bright pixels. I might be mistaken, but I know that somewhere on the forum, or someone from the community, already has this shader…
Then you have to sample the depth map at those coordinates to get the depth there… This is really not that much of a problem to do… Sadly I’ll be busy for a few days…
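The two steps can be sketched like this (synthetic NumPy arrays stand in for the Kinect textures; the 512×424 resolution matches the Kinect 2 depth stream, but all the pixel values are made up):

```python
import numpy as np

# Step 0: synthetic stand-ins for the Kinect textures.
ir = np.zeros((424, 512), dtype=np.uint16)
ir[100, 300] = 65000                       # the glowing ball's bright pixel
depth_mm = np.full((424, 512), 4000, dtype=np.uint16)
depth_mm[100, 300] = 2500                  # ball is 2.5 m from the sensor

# Step 1: position of the brightest pixel
# (this is what the Pipet compute shader would output on the GPU).
y, x = np.unravel_index(np.argmax(ir), ir.shape)

# Step 2: sample the depth map at that position to get Z.
z_m = depth_mm[y, x] / 1000.0              # Kinect depth is in millimetres
print(x, y, z_m)  # -> 300 100 2.5
```

Together (x, y) from the bright-pixel search and Z from the depth map give the full 3D position to send over OSC.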
This might be a silly question, but is the Pipet compute shader working on a 2D rendered version of what is being picked up by the Kinect, such as in my Kinect patch using the ColorTracker?
I’ve actually found about three different versions of a compute Pipet shader and am not sure which to use. Would it be the same as using the Pipet node, or is it a different kettle of fish?
Also, is using the Depth (Kinect Microsoft) [DX11, texture, vux] node the correct way to get a depth map?
Thanks again and apologies for the noobness, I’m very new to all this.
Yo… if you track the ball by RGB you have to do depth alignment. Kinect 2 provides a texture for seamless alignment of RGB and depth; however, the RGB camera on the Kinect 2 is much slower in dark places…
Sorry, I still have to get my head around making this shader for you, but the specs of your final setup are quite vital.
I bought a Kinect 2 but don’t have a USB adapter for it at the moment, so I will try locking the exposure; that seems like it would be quite accurate. Would it give 3D position data? I only have X and Y axis data at the moment (from one camera) and really want to get Z in as well.