Frame Difference in Shader?


I'm trying to detect movement by frame differencing. I tried to do this in a shader, but I'm not sure if it's possible.
I just check whether the current frame differs from the last frame, which is stored via buffer.node. But it's not working and I get some odd behaviour:

- Is buffer.node storing a frame from my camera (which is 1/25 sec at 25 fps), or does it store a frame on the graphics card, which probably updates faster than 1/25 sec? If so, how can I store my camera image - is there a way? Is there a buffer.node for DirectVideo, or a frame delay?

thanks frank (3.2 kB)

Without having had a look at your patches… it should be quite simple using a queue set to 2 frames and then feeding the two slices of the queue into the shader on separate pins via getslice 0 and getslice 1 out of the queue.

This way you will access two consecutive frames to calculate the difference from.
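In HLSL, that difference pass could look roughly like this (a minimal sketch; the texture, sampler, and function names are placeholders, not taken from any particular patch):

```hlsl
// two consecutive frames, fed in via slice 0 and slice 1 of the queue
texture TexCur;
texture TexLast;
sampler SampCur  = sampler_state { Texture = (TexCur); };
sampler SampLast = sampler_state { Texture = (TexLast); };

float4 PS(float2 TexCd : TEXCOORD0) : COLOR
{
    float4 cur  = tex2D(SampCur,  TexCd);
    float4 last = tex2D(SampLast, TexCd);

    // per-channel absolute difference; bright pixels = movement
    float4 col = abs(cur - last);
    col.a = 1;
    return col;
}
```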

Thanks joreg, that was the node I've been looking for.

If I want to, say, queue up 6 different textures from a video feed and get an average of the movement, how would I go about this?

create 6 texture pins and sample them each at the same position - then you have 6 colors (float4). after that just add them up and divide by 6.

float4 color = (a+b+c+d+e+f)/6;

If you want to do the same with fewer variables, you can also add the colors up in the same line where you sample them. Don't forget to divide by 6 at the end.

float4 col = 0;

col += tex2D(Samp1, In.TexCd);
col += tex2D(Samp2, In.TexCd);
// …and likewise for Samp3 through Samp6

col /= 6;

you might want to ensure that the alpha is 1 independent from the alpha values in the textures:

col = float4( col.rgb, 1 );
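Put together, the whole averaging pass might look like this (a sketch; the sampler names Samp1–Samp6 and the function signature are assumptions, following the snippets above):

```hlsl
float4 PS(float2 TexCd : TEXCOORD0) : COLOR
{
    // accumulate all six queued frames at the same position
    float4 col = 0;
    col += tex2D(Samp1, TexCd);
    col += tex2D(Samp2, TexCd);
    col += tex2D(Samp3, TexCd);
    col += tex2D(Samp4, TexCd);
    col += tex2D(Samp5, TexCd);
    col += tex2D(Samp6, TexCd);
    col /= 6;

    // force alpha to 1 regardless of the textures' alpha values
    return float4(col.rgb, 1);
}
```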

Thanks gregsn, I'm going to try that now.

But a general question - if I queue six textures, I get a spread of 6 consecutive textures.
Can I put this spread directly into a shader and access all 6 textures? If so, how?
Or do I have to split them with getslice.node and put them into the shader separately?
By the way, shaders are limited to 8 textures, aren't they?

i think you can put as many textures into the shader as you like, but only sample a limited number within one frame. (?)

yes you need to use the getslice

yep, it is exactly a maximum of 16 texture lookups you can make per pixel. this is the same for ps20 and ps30. for vs30 it is a maximum of 4.

i'd say it wouldn't work to put more textures into a shader, since there are no more than 16 sampler slots available. not sure what happens if you do, though…

does anyone have any good example of tracking using shaders and textures? I can’t get my head around it at all!


I really need to learn how to track a hand in space (i.e. not on a table or screen).

Thanks guys.

@00011000: There's no single way to do it. It always depends on your goal and your setup - that's what makes it quite difficult.
I've made a background subtraction shader that gives you the foreground (hand or body or or or) of your live feed. It's reliable for non-changing light settings and has some features for getting rid of shadows. It shoots a picture at the beginning, and every pixel that differs later is interpreted as foreground. Take a look at the shaders page.
And attached is a shader working by frame difference. Good for changing light situations; not good for slow movement or whole-body extraction - it's being used in the hand-cursor example in patching questions. I'm still working on better results, but it should give you an idea. (3.8 kB)
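The background subtraction idea can be sketched like this (not the attached shader, just an illustration; the texture names and the threshold parameter are placeholders, and TexBg stands for the "picture shot at the beginning"):

```hlsl
texture TexLive;  // live camera feed
texture TexBg;    // background frame stored at startup
sampler SampLive = sampler_state { Texture = (TexLive); };
sampler SampBg   = sampler_state { Texture = (TexBg); };

float Threshold = 0.1; // tune per setup and lighting

float4 PS(float2 TexCd : TEXCOORD0) : COLOR
{
    float3 live = tex2D(SampLive, TexCd).rgb;
    float3 bg   = tex2D(SampBg,   TexCd).rgb;

    // a pixel counts as foreground if it differs enough from the background
    float diff = length(live - bg);
    float mask = diff > Threshold ? 1 : 0;

    return float4(live * mask, 1);
}
```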

have a close look at the …:: u7 freeframe plugins.
the brand-new framebuf can be very helpful for this, i think.