first i should say i'm new to both vvvv and hlsl.
after messing around with it for a while i had a look at the different (very impressive) GPU contributions and decided to try building a GPU particle engine myself, based on dx11.
the idea is to use dynamic textures to generate some initial values (positions and velocities for now) and then use texture effects to process/update the data. the updated positions texture then gets passed to a vertex shader that places vertices in 3d space, to be drawn as sprites. after struggling for a while (mainly with getting the right output texture formats) the basic functionality seems to be working and exploration can begin.
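to make the setup concrete, this is roughly what i have in mind for the update pass (just a sketch, untested — all the names, the sampler setup and the cbuffer are my own guesses, not vvvv conventions):

```hlsl
// current positions and velocities, e.g. as R32G32B32A32_Float textures
Texture2D PosTex;
Texture2D VelTex;

SamplerState PointSampler
{
    Filter = MIN_MAG_MIP_POINT; // no filtering: one texel = one particle
};

cbuffer cbParams : register(b0)
{
    float dt; // timestep per frame
};

// pixel shader of the texture effect: reads old state, writes the new
// position into the render target (which becomes next frame's PosTex)
float4 PS_UpdatePosition(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float3 p = PosTex.Sample(PointSampler, uv).xyz;
    float3 v = VelTex.Sample(PointSampler, uv).xyz;
    return float4(p + v * dt, 1); // simple euler integration
}
```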
now if i want an effect that confines the positions to x,y,z < +/-1 and reverses the respective velocity component when a particle leaves the allowed space, i need to update both positions and velocities.
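the bounce logic itself seems simple enough — it's just that it touches both textures. something like this for the velocity side (again just a sketch with made-up names; the position side would clamp p to the box instead):

```hlsl
Texture2D PosTex; // current positions
Texture2D VelTex; // current velocities

SamplerState PointSampler
{
    Filter = MIN_MAG_MIP_POINT;
};

float4 PS_UpdateVelocity(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float3 p = PosTex.Sample(PointSampler, uv).xyz;
    float3 v = VelTex.Sample(PointSampler, uv).xyz;
    // flip each velocity component whose position component left the +/-1 box:
    // step(1, abs(p)) is 1 per component where |p| >= 1, selecting -v there
    v = lerp(v, -v, step(1.0, abs(p)));
    return float4(v, 0);
}
```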
is it possible to create an effect that outputs 2 textures (2 pins, or maybe a texture array?), or do i have to create 2 effects, one for positions and one for velocities?
or is some sort of StructuredBuffer or RWTexture better suited for that purpose? how about shared textures? what examples could i look into for that?
or should i start looking into compute shaders? is there any working example of compute shaders in vvvv that i could study?
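from what i've read, a compute shader could sidestep the whole two-textures problem, since one buffer can hold both position and velocity and a single dispatch can update them together. is something like this (untested sketch, names made up) the right direction?

```hlsl
struct Particle
{
    float3 pos;
    float3 vel;
};

// read-write buffer holding the whole particle state
RWStructuredBuffer<Particle> Particles : register(u0);

cbuffer cbParams : register(b0)
{
    float dt;
};

[numthreads(64, 1, 1)]
void CS_Update(uint3 id : SV_DispatchThreadID)
{
    Particle p = Particles[id.x];
    p.pos += p.vel * dt;
    // bounce: flip velocity components where the position left the +/-1 box
    p.vel = lerp(p.vel, -p.vel, step(1.0, abs(p.pos)));
    Particles[id.x] = p;
}
```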