Please see the attached patch - I am in need of help!
What I am trying to do is create a video sampling patch that watches a single row of input video pixels, samples it at a set interval, moves the captured pixels to the top of a colour history, and shifts all previously captured rows down one place.
From my attempt at the start of a patch you will see that I have got a videoIn > VideoTexture > Pipet chain working, both for a 12x12 ‘preview’ and for a 12x1 display of the pixels I am interested in.
I’ve got a rudimentary timer mechanism but don’t have it connected to anything yet. I’m guessing it needs to be connected to something that shifts all stored values in a spread down one row, with the first row being replaced by the current values.
The output could be either a VideoTexture or a Spread of colour values. I’ve been thinking along the lines of a Spread, as it will allow further colour-value control down the line. However, if you would like to try this in a purely Renderer-based way, that would be cool.
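For reference, the shift-down-and-replace behaviour described above can be sketched in plain Python. The names, row width, and history depth here are illustrative assumptions, not anything taken from the patch:

```python
# Sketch of the row-history logic: each sample pushes the current row of
# pixel colours onto the top of a fixed-size history, shifting older rows down.

ROW_WIDTH = 12      # pixels per sampled row (matches the 12x1 Pipet output)
HISTORY_ROWS = 12   # how many past samples to keep

def push_row(history, row, max_rows=HISTORY_ROWS):
    """Insert the newly sampled row at the top; drop the oldest if full."""
    history.insert(0, list(row))
    del history[max_rows:]
    return history

history = []
push_row(history, [(255, 0, 0)] * ROW_WIDTH)   # first sample (a red row)
push_row(history, [(0, 255, 0)] * ROW_WIDTH)   # second sample (a green row)
# history[0] is now the green row, history[1] the red row
```

The same idea maps onto a spread in vvvv: the timer bang triggers the shift, and the Pipet output replaces slice zero.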
try making a DX renderer that is only one pixel high,
then render a quad to it that has the video as a texture,
and zoom the texture in on the line in the video that you want to capture.
store the output of the DX renderer in a texture buffer,
and just keep filling the buffer with new textures as the old ones are deleted.
show all buffered textures on a spread of quads in a new renderer.
that’s what i suggest.
keep it as textures, not as values …
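The texture-buffer step above behaves like a fixed-size FIFO. A rough sketch in Python, assuming each rendered one-pixel-high frame is just an opaque handle (the buffer size and function names are illustrative assumptions):

```python
# A bounded FIFO of texture handles: collections.deque with maxlen gives
# exactly the "old ones are deleted as new ones arrive" behaviour.
from collections import deque

BUFFER_SIZE = 64  # number of 1-pixel-high slices kept on screen

texture_buffer = deque(maxlen=BUFFER_SIZE)

def on_new_frame(texture_handle):
    # newest slice goes to the front; the oldest falls off the back
    texture_buffer.appendleft(texture_handle)

for frame_id in range(100):
    on_new_frame(frame_id)
# the buffer now holds the 64 most recent slices, newest first
```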
Thank you both. I was just following Eno’s instructions when I got your modified patch through. Sounds (and now looks) as though textures are the way to go.
The second (wishlist) stage (which, incidentally, is why I had been thinking about Spreads) is that as the video slice progresses down the screen the following rules apply:
interesting colours (i.e. colours where there are strong differences between the RGB values) expand outwards (linearly) to take over pixels that are boring.
if only boring colours are being sampled, then the interesting colours ‘linger’ for longer.
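One possible reading of the first rule, sketched in Python: score a pixel by how far apart its RGB channels are, then let high-scoring pixels spread into neighbouring low-scoring ones. The threshold and function names are illustrative assumptions, not anything defined in the thread:

```python
# "Interestingness" as channel spread: 0 for grey, large for saturated colours.

BORING_THRESHOLD = 30  # channel spread below this counts as "boring"

def interest(pixel):
    """Difference between the largest and smallest RGB channel."""
    return max(pixel) - min(pixel)

def expand_interesting(row):
    """Let each boring pixel adopt its most interesting neighbour."""
    out = list(row)
    for i, px in enumerate(row):
        if interest(px) >= BORING_THRESHOLD:
            continue  # already interesting, leave it alone
        neighbours = row[max(i - 1, 0):i + 2]
        best = max(neighbours, key=interest)
        if interest(best) >= BORING_THRESHOLD:
            out[i] = best
    return out

row = [(120, 120, 120), (255, 0, 0), (118, 119, 121)]
# the saturated red pixel takes over its grey neighbours
```

Applying this once per sampled row, each frame, would give the linear outward expansion; the ‘linger’ rule could then simply skip the history shift when no pixel in the new row clears the threshold.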
also have a look at my khronos-projector vvvv solution. the brightness of the red channel of the first texture defines the position in time of the second texture.
advantage: you can control the time of each pixel! and the graphics card filters between the time slices…
disadvantage: the time/volume-texture generator patch doesn’t work on all graphics cards,
and you must have a pixel-shader-capable graphics card.
edit
ohh sorry, that timescape thing is a bit different from this khronos stuff, but i’ll leave this post here, just as advertising for my patch ;)
i hadn’t read the timescape site at first. the end result looks similar, but it’s another mechanism… i think the shader could easily be adapted so that it exchanges the x coordinate with z for each row.
Again, thanks for all the input!
I have taken the patch by tonfilm (thanks!) and added a second renderer that outputs a long time-stretched/mapped video. This is now looking very close to what I was after (see attached patch).
I still need to be able to control the video as it progresses. I want to implement the “interesting/uninteresting” colour rules discussed in my earlier post above.
Elektromeier - I really like the look of your KronosProjector. Is there a way to have it work on a buffer of live video rather than a video file? I couldn’t quite figure that one out.