Video differential tracking & projected texturiser

Hi there,

Your comments would be most welcome and appreciated … ;-)

The project involves a projector casting an animated texture onto any person who walks into a scene watched by a camera. The person could be moving or standing still.

What I would like to do is use the camera to capture the static scene and keep it as a reference image. I would then ‘subtract’ that image from every subsequent frame coming from the camera and use the result to create one or more objects onto which one or more textures can be rendered.
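
In case it helps to see the idea outside the patch environment, here is a rough sketch of that difference step in Python with OpenCV; the camera index, threshold value and kernel size are placeholder assumptions you would tune for your own setup:

```python
import cv2

cap = cv2.VideoCapture(0)              # live camera feed (index is an assumption)

# Grab one frame of the empty scene to use as the reference image
ok, background = cap.read()
background_gray = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 'Subtract' the static scene: absolute difference, then threshold
    diff = cv2.absdiff(gray, background_gray)
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

    # Clean up noise so the person reads as one solid blob
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    cv2.imshow("mask", mask)
    if cv2.waitKey(1) == 27:           # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The same absdiff/threshold/clean-up chain should translate directly to whatever patching environment you end up using.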

Is this even possible?
Any current patches available that may give me ideas?
Any hints?

Many thanks
Grant

Me again.

Just had to add that I understand that there will be lag and that the projection itself will constitute a change to the scene.

In its final version I would want to use a pure IR camera (visible-light blocking) and flood the scene with IR light so as to negate the interaction between the projected image and the patch itself.

How large is your stage? Difference-based keying is always tricky, especially if you can't control the background and clothing colors. If it's not too large I would definitely try to do it with a Kinect.

The output delay can be kept pretty low; 2-3 frames at 60 fps isn't that much.
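
If you go the Kinect route, the keying itself becomes a simple depth threshold. A minimal sketch, assuming you can already pull depth frames as a NumPy array from whatever Kinect driver you use, and with near/far distances as placeholders to tune for your stage:

```python
import numpy as np

def person_mask_from_depth(depth_mm, near=500, far=2500):
    """Key out anything between `near` and `far` millimetres from the sensor.

    `depth_mm` is a 2-D uint16 array of depth values in millimetres,
    however your Kinect driver delivers them; 0 usually means 'no reading'.
    """
    mask = (depth_mm > near) & (depth_mm < far)
    return mask.astype(np.uint8) * 255
```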

Well, I have a few ideas in mind, and yes, they mostly involve smaller stages of 5 to 6 metres in width.

Any hints on how to use difference-based keying effectively?

And … how do I place a given texture onto the result? Contours maybe?
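
One way the contour idea could work, sketched in Python/OpenCV and assuming the mask from the difference (or depth) step above; the texture handling here (a plain resize plus a masked copy) is just a stand-in for whatever your patch actually renders:

```python
import cv2
import numpy as np

def texture_onto_mask(frame, mask, texture):
    """Paint `texture` only where `mask` says the person is.

    The texture is resized to the frame size and copied in through the
    filled contours; the rest of the frame stays black so the projector
    only lights up the person.
    """
    h, w = frame.shape[:2]
    tex = cv2.resize(texture, (w, h))

    # Find the outer contours of the keyed blobs and fill them solid
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    solid = np.zeros_like(mask)
    cv2.drawContours(solid, contours, -1, 255, thickness=cv2.FILLED)

    out = np.zeros_like(frame)
    out[solid > 0] = tex[solid > 0]
    return out
```

The returned image would be what you send to the projector output; per-contour handling also gives you a handle on multiple people or multiple textures if you want to treat each blob separately.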