Storing a queue of frames from a live webcam input as a spread of textures in real time. Possible?

Hi all.

I’ve built a patch which uses a slit scan style technique to create animations like this:

It works pretty well, but I'm limited to loading in spreads of textures from image files on my hard drive. It occurs to me that if there were a way to grab each frame from a webcam stream and add it to a spread, I could accomplish something pretty similar with live video, which would be amazing for live performance.

The Queue (EX9.Texture) node almost works for this purpose, but the performance is awful and vvvv crashes if I queue more than a few frames. Is there any technique using DX11 nodes I could use to achieve something like this with better performance?
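For clarity, here is a conceptual sketch (plain Python, not vvvv) of the idea being asked about: keeping the last N webcam frames in a fixed-size ring buffer, which is what queueing textures into a spread amounts to. The buffer depth and frame size below are illustrative assumptions.

```python
from collections import deque
import numpy as np

MAX_FRAMES = 60  # assumed buffer depth (analogous to the queue's frame count)

# deque with maxlen drops the oldest frame automatically when full
frames = deque(maxlen=MAX_FRAMES)

def push_frame(frame):
    """Add the newest webcam frame; the oldest is discarded once full."""
    frames.append(frame)

# Simulate 100 incoming 4x4 grayscale frames (frame t is filled with value t)
for t in range(100):
    push_frame(np.full((4, 4), t, dtype=np.uint8))

print(len(frames))      # 60 -- only the most recent frames are kept
print(frames[0][0, 0])  # 40 -- oldest retained frame
print(frames[-1][0, 0]) # 99 -- newest frame
```

The key point is that the buffer is bounded: memory stays constant no matter how long the stream runs, which is what a texture queue on the GPU should also give you.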

hei feather,

strange you say Queue (EX9.Texture) is not working for you, since we're not aware of any such issues with it. in fact there is a girlpower which does (in simple terms) what you're asking for: a slitscan or timewarp:
performance with it was smooth already on graphics cards more than 10 years ago, so i am not sure what could be going wrong on your end. update your graphics card drivers?

there is also this contribution: z-slitscan. ah, and another one for dx11: slitscan-ringbuffer-(dx11)
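To make the slit-scan/ringbuffer idea concrete, here is a minimal sketch (plain Python with NumPy, not the actual contribution's code): given a buffer of H frames, output row y is sampled from frame y, so each row of the composite shows a different moment in time. Frame sizes and values are illustrative assumptions.

```python
import numpy as np

H, W = 8, 8
# hypothetical ring buffer: frame t is filled with the value t,
# standing in for a webcam frame captured at time t
buffer = [np.full((H, W), t, dtype=np.uint8) for t in range(H)]

# slit-scan composite: row y of the output comes from frame y
out = np.stack([buffer[y][y] for y in range(H)])

print(out[:, 0])  # [0 1 2 3 4 5 6 7] -- each row is from a different frame
```

On the GPU the same mapping is usually done in a shader that indexes into a texture array by row, which is essentially what the dx11 ringbuffer contribution does.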

Thanks Joreg!

I’ll have a look at those examples and see if I can work something out.