OpenNI Kinect - how can I use Pipet to detect a hand with the Kinect suspended from the ceiling (so that it faces the floor)?

Dear All,

I am working on an art installation which will be exhibited in a couple of days, and I have got stuck on one issue. Due to space constraints I will have to suspend the Kinect from the ceiling so that it faces the floor (or, in fact, the surface onto which my interactive image will be projected). People are supposed to interact with it by hovering their hands above certain elements (a kind of hot spot). I have no problem doing this with the OpenNI hand node when the Kinect is in a "normal" vertical position, but I am stuck now trying to incorporate Pipet so that the Kinect detects hand motions and the interactivity can happen based on that.

Can anyone help, or give me an example patch? (Obviously I went through all the help files, but when I try to patch it myself I still end up with patches that don't work properly - and due to the time constraints it is time to ask for help.)

Thanks in advance,

Olygamy

the hand tracker is quite an advanced algorithm. i doubt that one can patch that in a few days. even if you could, it would be pretty slow.
the hand tracker works so well because it can exploit the fact that the hand is the foremost object of the skeleton.
so… either find a way to put the kinect in front of the people, or put nice hand tracking out of your mind and do it with conventional image tracking like Trautner (gray mask for the interactive areas), or cut all lower and higher depth data from the kinect depth image and look for movement in the area and altitude range where you 'expect' hands.
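The last suggestion (cut the depth data to a band and look for movement there) can be sketched outside vvvv as well. Below is a minimal Python/NumPy illustration of the idea, assuming a downward-facing Kinect delivering depth in millimetres; the `near`/`far` band values and the motion threshold are hypothetical and would need tuning on site. The synthetic frames stand in for real Kinect data.

```python
import numpy as np

def hands_in_band(depth_mm, prev_depth_mm, near=1500, far=1900, motion_thresh=30):
    """Keep only depth pixels in the altitude band where hands are expected
    (Kinect looking down at the floor), then flag pixels that moved between
    two consecutive frames. Returns a boolean mask of likely hand motion.

    near/far are hypothetical ceiling-to-hand distances in mm - tune on site.
    """
    # cut all lower and higher depth data: keep only the expected hand band
    band = (depth_mm > near) & (depth_mm < far)
    # look for movement: simple frame differencing on the raw depth values
    moved = np.abs(depth_mm.astype(np.int32)
                   - prev_depth_mm.astype(np.int32)) > motion_thresh
    return band & moved

# Synthetic frames: floor at ~2500 mm, a "hand" patch appears at ~1700 mm
prev = np.full((240, 320), 2500, dtype=np.uint16)
curr = prev.copy()
curr[100:120, 150:170] = 1700          # 20x20 px hand-sized blob

mask = hands_in_band(curr, prev)
ys, xs = np.nonzero(mask)
print(mask.sum(), xs.mean(), ys.mean())  # blob size and centroid
```

In a vvvv patch the same thing would be done with Map/thresholding on the depth texture and a frame-difference, but the logic is identical: band-pass the depth, then threshold the change.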

Hi Tonfilm,

Thanks for that - I will have a go at Trautner. At the moment I am actually having a little bit of success with a Contour node, although there is still some way to go.
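For the contour/hot-spot approach, the remaining step is deciding which interactive area a detected blob is over. Here is a small hedged sketch in Python/NumPy (not vvvv, just the logic): it takes a binary hand mask such as the Contour node would produce, computes the blob centroid, and tests it against hypothetical circular hotspot regions. The `HOTSPOTS` coordinates are made up for illustration.

```python
import numpy as np

# Hypothetical hotspot regions in image space: (x, y, radius) in pixels.
HOTSPOTS = [(80, 60, 25), (240, 180, 25)]

def active_hotspots(mask, hotspots):
    """Given a binary hand mask (e.g. from depth thresholding or a contour
    step), return the indices of the hotspots whose circle contains the
    centroid of the mask. An empty mask activates nothing."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return []
    cx, cy = xs.mean(), ys.mean()        # blob centroid
    hits = []
    for i, (hx, hy, r) in enumerate(hotspots):
        if (cx - hx) ** 2 + (cy - hy) ** 2 <= r * r:
            hits.append(i)
    return hits

mask = np.zeros((240, 320), dtype=bool)
mask[50:70, 70:90] = True                # fake hand blob near hotspot 0
print(active_hotspots(mask, HOTSPOTS))   # → [0]
```

With more than one visitor you would run this per contour rather than on the whole mask, but for hover-to-trigger hotspots a centroid test like this is usually enough.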

O.

hey
check the examples from my friend pato

http://www.patriciogonzalezvivo.com/blog/?p=213

they're not vvvv, but they could help you in this situation