How to convert a texture to a buffer? How to emit DX11.Particles with velocities according to normals?

I’m trying to modify the Emitter (DX11.Particles.Emitter Layer) help patch that comes with DX11.Particles.

I would like to use the normals of the camera view to create forces or velocities for the emitted particles.

What I’ve got so far is a modified Emitter that outputs a world normal texture generated by DepthReconstruction.
How can I convert that texture to a buffer so I can use it as input for the Emitter?
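
From what I understand, this would mean dispatching a compute shader that reads the texture and writes into a structured buffer, something along these lines (just a sketch; the resource names, registers and thread-group size are placeholders, not taken from DX11.Particles):

```hlsl
// Rough sketch only: copy the xyz of the world normal texture into a
// structured buffer, one float3 per texel.

Texture2D<float4> NormalTex : register(t0);        // world normal texture from DepthReconstruction
RWStructuredBuffer<float3> NormalBuffer : register(u0);

cbuffer cbParams : register(b0)
{
    uint2 TexSize;   // width/height of NormalTex, set from the patch
};

[numthreads(8, 8, 1)]
void CS_TextureToBuffer(uint3 id : SV_DispatchThreadID)
{
    if (id.x >= TexSize.x || id.y >= TexSize.y)
        return;

    // flatten the 2D texel coordinate into a 1D buffer index
    uint index = id.y * TexSize.x + id.x;
    NormalBuffer[index] = NormalTex.Load(int3(id.xy, 0)).xyz;
}
```

The idea would then be to feed that buffer into the Emitter as its velocity input.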


Emitter Layer Normals.zip (17.1 KB)

I also lack in-depth knowledge of the DX11 components. For example, I would like to know the differences between the various buffer types. Do you know of any good resources?

Thanks a lot!

Emitter Layer Normals.zip (8.3 KB)

Have a look at this custom emitter. You put in the normal texture, sample it, and use the sampled xyz components as velocity.
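
Roughly, the relevant part of the emitter shader looks like this (a simplified sketch with illustrative names, not the exact code from the attached patch):

```hlsl
// Sketch: sample the normal texture at the emit position's uv and use the
// xyz of the normal as the particle's initial velocity.

Texture2D NormalTex;
SamplerState sLinear : register(s0);

// hypothetical particle struct, reduced to the relevant fields
struct Particle
{
    float3 position;
    float3 velocity;
};

void EmitWithNormalVelocity(inout Particle p, float2 uv)
{
    float3 n = NormalTex.SampleLevel(sLinear, uv, 0).xyz;
    p.velocity = n;                       // sampled normal becomes the velocity
    // scale if needed, e.g. p.velocity = n * velocityStrength;
}
```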

Thanks @tmp! Not sure if I’m doing something wrong, though. I think the uploaded patch is broken.

But using your instructions I managed to create what I was looking for.

Emitter Layer Normals.zip (18.8 KB)

Super cool. Thanks so much!
