I'm trying to detect movement by frame difference. I tried to do this with a shader, but I'm not sure if it's possible.
I just check whether the current frame differs from the last frame, which is stored via buffer.node. But it's not working and I get some odd behaviour:
-Is buffer.node storing a frame from my camera (i.e. 1/25 s at 25 fps), or does it store a frame from the graphics card, which probably refreshes faster than 1/25 s? If so, how can I store my camera image - is there a way? Is there a buffer.node for DirectVideo, or a frame delay?
Without having had a look at your patches… it should be quite simple using a queue set to 2 frames and then feeding the two slices of the queue into the shader on separate pins via getslice 0 and getslice 1 out of the queue.
This way you access two consecutive frames to calculate the difference from.
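The shader side of that setup could look roughly like the sketch below: a minimal DX9-style frame-difference effect with two texture pins, one per queue slice. All names (pins, threshold, technique) are illustrative, not taken from an actual patch.

```hlsl
// Frame difference: expects the two consecutive frames from the queue
// on two separate texture pins (fed via getslice 0 and getslice 1).

texture TexCurrent;   // current frame (slice 0 of the queue)
texture TexPrevious;  // previous frame (slice 1 of the queue)

sampler SampCurrent  = sampler_state { Texture = (TexCurrent); };
sampler SampPrevious = sampler_state { Texture = (TexPrevious); };

float Threshold = 0.1; // tweak to taste; suppresses camera noise

float4 PS(float2 uv : TEXCOORD0) : COLOR
{
    float3 cur  = tex2D(SampCurrent, uv).rgb;
    float3 prev = tex2D(SampPrevious, uv).rgb;

    // per-pixel absolute difference, reduced to one intensity value
    float diff = dot(abs(cur - prev), float3(0.333, 0.333, 0.333));

    // binary motion mask: white where something moved
    float mask = step(Threshold, diff);
    return float4(mask, mask, mask, 1);
}

technique TFrameDifference
{
    pass P0
    {
        PixelShader = compile ps_2_0 PS();
    }
}
```

The `step(Threshold, diff)` keeps the output a clean black/white mask; lower the threshold if slow movement gets lost, raise it if noise shows up.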
But a general question - if I queue six textures, I get a spread of 6 consecutive textures.
Can I put this spread directly into a shader and access all 6 textures? If so, how?
Or do I have to split them with getslice.node and feed them into the shader separately?
By the way, shaders are limited to 8 textures, aren't they?
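As far as I know, a single texture pin carries only one texture per shader instance, so the usual approach is what you describe: one getslice.node per slice, each feeding its own declared texture pin. (On the limit: ps_2_0 and later actually allow up to 16 samplers, though the practical pin count in a patch may differ.) A sketch with three separate pins, accumulating motion over the queued frames - all names are illustrative:

```hlsl
// One texture pin per queue slice, each fed by its own getslice.node.
texture Tex0;  // newest frame
texture Tex1;  // one frame back
texture Tex2;  // two frames back

sampler Samp0 = sampler_state { Texture = (Tex0); };
sampler Samp1 = sampler_state { Texture = (Tex1); };
sampler Samp2 = sampler_state { Texture = (Tex2); };

float4 PS(float2 uv : TEXCOORD0) : COLOR
{
    float3 f0 = tex2D(Samp0, uv).rgb;
    float3 f1 = tex2D(Samp1, uv).rgb;
    float3 f2 = tex2D(Samp2, uv).rgb;

    // accumulate differences between consecutive frames,
    // so motion anywhere in the short history shows up
    float diff = dot(abs(f0 - f1) + abs(f1 - f2),
                     float3(0.333, 0.333, 0.333));

    return float4(diff, diff, diff, 1);
}

technique TMotionHistory
{
    pass P0
    {
        PixelShader = compile ps_2_0 PS();
    }
}
```

Extending this to 6 frames just means three more texture/sampler pairs and two more difference terms.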
@00011000: There's no single right way to do it. It always depends on your goal and your setup - that's what makes it quite difficult.
I've made a background subtraction shader that gives you the foreground (hand, body, etc.) of your live feed. It's reliable for non-changing light settings and has some features to get rid of shadows. It shoots a picture at the beginning, and every pixel that differs later is interpreted as foreground. Take a look at the shaders page.
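For reference, the core of such a background subtraction can be sketched like this - a simplified illustration under my own naming, not the actual shader from the shaders page, and with the shadow handling omitted:

```hlsl
texture TexLive;        // live camera feed
texture TexBackground;  // reference frame captured at startup

sampler SampLive       = sampler_state { Texture = (TexLive); };
sampler SampBackground = sampler_state { Texture = (TexBackground); };

float Threshold = 0.15; // how different a pixel must be to count as foreground

float4 PS(float2 uv : TEXCOORD0) : COLOR
{
    float3 live = tex2D(SampLive, uv).rgb;
    float3 bg   = tex2D(SampBackground, uv).rgb;

    // a pixel that differs enough from the stored background is foreground
    float diff = length(live - bg);
    float isForeground = step(Threshold, diff);

    // pass the live pixel through where foreground, black elsewhere
    return float4(live * isForeground, 1);
}

technique TBackgroundSubtraction
{
    pass P0
    {
        PixelShader = compile ps_2_0 PS();
    }
}
```

This is why it only works for non-changing light: the background reference is captured once, so any global lighting shift pushes every pixel over the threshold.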
And attached is a shader working by frame difference. Good for changing light situations, but not for slow movement or whole-body extraction - it's being used in the hand-cursor example in patching questions. I'm still working on better results, but it should give you an idea.