I’ve built a patch that uses a slit-scan-style technique to create animations like this:
It works pretty well, but I’m limited to loading in spreads of textures from image files on my hard drive. It occurs to me that if there were a way to grab each frame from a webcam stream and add it to a spread, I could accomplish something similar with live video, which would be amazing for live performance.
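To make the idea concrete, here is a minimal NumPy sketch of the ring-buffer slit-scan effect I'm after. This is just an illustration, not a vvvv patch: the `make_frame` function is a synthetic stand-in for the webcam, and the buffer depth and image size are arbitrary.

```python
import numpy as np
from collections import deque

HEIGHT, WIDTH, DEPTH = 64, 64, 16  # DEPTH = number of frames kept in the ring buffer

def make_frame(t):
    """Stand-in for one webcam frame: a gradient that shifts over time."""
    x = (np.arange(WIDTH) + t) % 256
    return np.tile(x, (HEIGHT, 1)).astype(np.uint8)

# deque with maxlen acts as the ring buffer: oldest frames fall off automatically
buffer = deque(maxlen=DEPTH)

def slit_scan(frames):
    """Each output row is sampled from a progressively older frame."""
    out = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
    for y in range(HEIGHT):
        # map row index to a delay of 0 .. len(frames)-1 frames
        delay = (y * len(frames)) // HEIGHT
        out[y] = frames[-1 - delay][y]
    return out

# fill the buffer as if DEPTH webcam frames had arrived
for t in range(DEPTH):
    buffer.append(make_frame(t))

result = slit_scan(list(buffer))
# top rows come from the newest frame, bottom rows from the oldest
```

The same mapping (output row → delayed frame) is what I'd want to evaluate per-pixel on the GPU, reading from a stack of recent frames instead of a CPU-side list.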
The Queue (EX9.Texture) node almost works for this purpose, but the performance is awful and vvvv crashes if I queue more than a few frames. Is there a technique using DX11 nodes that would achieve something like this with better performance?