Fuse prerelease questions

Hi all, hi Fuse team!

I checked out the latest Fuse prerelease for gamma 5. Pretty cool!
I have some questions and feedback:

DrawShaderGraph vs DrawFXGraph

I found them confusing because they connect to the mesh renderer the same way, but they take different inputs. DrawFXGraph seems to take ShaderFX inputs.

I tried building the first graph using Fuse nodes and the second using ShaderFX.
Why does it not work with DrawShaderGraph? (patch attached: shaderfxvsshadergraph.vl, 30.4 KB)

BufferToTexture and TextureToBuffer are super useful for debugging and baking data into textures! Very helpful!
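For my own understanding, this is roughly how I picture BufferToTexture at the shader level, written as a made-up HLSL sketch (the resource names, texture size and kernel are my assumptions, not Fuse's actual generated code):

```hlsl
// Hypothetical sketch: bake a linear float4 buffer into a 2D texture.
StructuredBuffer<float4> SourceBuffer;   // data produced or uploaded on the GPU
RWTexture2D<float4>      TargetTexture;  // texture to bake the data into

static const uint TexWidth = 256;        // assumed texture width

[numthreads(8, 8, 1)]
void CopyBufferToTexture(uint3 id : SV_DispatchThreadID)
{
    // Map the 2D texel position to a linear buffer index.
    uint linearIndex = id.y * TexWidth + id.x;
    TargetTexture[id.xy] = SourceBuffer[linearIndex];
}
```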

ComputeTexture2D: very easy to set up and to understand. Adaptive nodes work great when changing the vector type. The ComputeGraph2D generates the shader code, and RendererScheduler is the actual compute renderer?

Is domain distortion basically modifying UV sampling coordinates? Why is there a render error when I input a vec2 zero as the domain? I understood it as an offset to the original UVs.

How to use external buffer data:
So the DynamicBuffer is the CPU -> GPU upload, and from BufferIn on we are in compute world. Did I get this right? If so, the chain “DynamicBuffer -> BufferIn -> BufferGet” is pretty hard to grasp if you don’t know how this works under the hood.

I like everything about the particle system.
AmountEmitterThreaded and the separation between the emission and simulation stages are nicely done! It’s obvious what the elements are doing, and it creates an excellent structure.
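To check that I read the structure right, here is the emit/simulate split as I understand it, as a made-up HLSL sketch (the particle layout and kernel names are my own, not Fuse's generated code):

```hlsl
// Hypothetical two-pass particle sketch: one kernel emits, a separate kernel simulates.
struct Particle { float3 position; float3 velocity; float lifetime; };
RWStructuredBuffer<Particle> Particles;

cbuffer Params { float DeltaTime; uint EmitCount; };

[numthreads(64, 1, 1)]
void Emit(uint3 id : SV_DispatchThreadID)
{
    if (id.x >= EmitCount) return;
    Particle p;
    p.position = float3(0, 0, 0);   // emitter origin (placeholder)
    p.velocity = float3(0, 1, 0);   // constant upward velocity for the sketch
    p.lifetime = 2.0;
    Particles[id.x] = p;            // a real emitter would allocate free slots instead
}

[numthreads(64, 1, 1)]
void Simulate(uint3 id : SV_DispatchThreadID)
{
    Particle p = Particles[id.x];
    if (p.lifetime <= 0) return;    // skip dead particles
    p.position += p.velocity * DeltaTime;
    p.lifetime -= DeltaTime;
    Particles[id.x] = p;
}
```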

The “discard” patch could mention that it can save performance; since there is no text, it reads like a clipping function. Is setting an invalid vertex coordinate still a valid trick to skip the rasterization stage altogether? If so, a node for this would be nice to have.
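To make the question concrete, here is the difference between the two approaches as a rough HLSL sketch (names and thresholds are made up, this is not the Fuse patch itself):

```hlsl
// 1) Pixel-stage discard: the fragment is killed, but the triangle was already rasterized.
float4 PS_Clip(float4 color : COLOR0, float2 uv : TEXCOORD0) : SV_Target
{
    if (uv.x > 0.5)
        discard;                 // saves shading/blending cost for this pixel
    return color;
}

// 2) Vertex-stage kill: writing NaN positions makes the triangle degenerate,
//    so it is dropped before the rasterization stage.
cbuffer CullParams { float Threshold; };

float4 VS_Kill(float3 pos : POSITION) : SV_Position
{
    if (pos.y > Threshold)
    {
        float nan = asfloat(0x7FC00000);    // quiet NaN
        return float4(nan, nan, nan, nan);  // invalid clip-space position
    }
    return float4(pos, 1);
}
```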

The geometry stage nodes are also pretty intuitive. PointToBox is still a bit loaded, but I guess a for each loop is coming =)

I am pretty stoked for the next release with gamma 5! Performance is pretty good, and compile time is much better than on the old version I worked with. Thank you for all the work! It shows.


Sorry to get back to you this late, I guess I need to check the forum more often. I have added a Fuse version of the draw shader to be able to patch geometry shaders. There are two versions: a simple one that gets the position and color target as a ShaderNode:
[image]
In your case you were using the one with the GpuVoid input, which lets you set all kinds of semantics; here you have to explicitly set the values, as you already did for the vertex input:
[image]
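At the generated-shader level, explicitly setting the semantics roughly corresponds to declaring the vertex output struct yourself; a simplified sketch, not the exact code Fuse emits:

```hlsl
// Hypothetical vertex output with explicitly declared semantics.
struct VertexOutput
{
    float4 Position : SV_Position;  // must be set explicitly
    float4 Color    : COLOR0;       // must be set explicitly
    float2 TexCoord : TEXCOORD0;    // any further semantic you want to pass along
};

VertexOutput VS_Main(float3 pos : POSITION, float2 uv : TEXCOORD0)
{
    VertexOutput o;
    o.Position = float4(pos, 1);
    o.Color    = float4(uv, 0, 1);
    o.TexCoord = uv;
    return o;
}
```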

The ComputeGraph2D generates the shader code, and RendererScheduler is the actual compute renderer?

Yes, correct. We might integrate this into the ComputeGraph and make it optional to connect it explicitly.
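Conceptually the split looks like this: the graph produces a compute kernel along these lines (a simplified, made-up sketch, not the exact generated code), and the scheduler is the part that actually issues the dispatch for it:

```hlsl
// Hypothetical kernel of the kind a ComputeGraph2D could generate: one thread per texel.
RWTexture2D<float4> OutputTexture;

cbuffer GraphParams { float2 TextureSize; float Time; };

[numthreads(8, 8, 1)]
void ComputeMain(uint3 id : SV_DispatchThreadID)
{
    float2 uv = (id.xy + 0.5) / TextureSize;              // normalized texel coordinate
    OutputTexture[id.xy] = float4(uv, 0.5 + 0.5 * sin(Time), 1);
}
```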

Is domain distortion basically modifying UV sampling coordinates? Why is there a render error when I input a vec2 zero as the domain? I understood it as an offset to the original UVs.

In this case domain distortion is a function. If you look inside:
[image]
you see that domain is a function parameter; to work properly in this case it should not be set.
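In shader terms it is roughly equivalent to this (a simplified sketch, not the exact generated code): the domain is the UV argument supplied by the caller, so forcing it to a constant breaks the sampling:

```hlsl
// Hypothetical illustration of domain distortion as a function over the incoming UVs.
Texture2D    SourceTexture;
SamplerState LinearSampler;

float4 SampleDistorted(float2 domain)   // "domain" = the UV coordinate passed in by the caller
{
    // Offset the incoming domain before sampling; overriding it with a constant
    // such as float2(0, 0) would collapse every sample onto the same texel.
    float2 distorted = domain + 0.05 * sin(domain.yx * 20.0);
    return SourceTexture.Sample(LinearSampler, distorted);
}
```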

How to use external buffer data:
Yes, you are right. For most of the value inputs there is an automatic conversion that creates those nodes under the hood, but it can make sense to still use them explicitly to avoid creating too many inputs:
[image]
In this case you could also connect the value without the GpuIn, but that would create three inputs instead of one.
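Roughly, the node chain maps to something like this at the shader level (a simplified sketch, all names are placeholders):

```hlsl
// DynamicBuffer ~ a StructuredBuffer uploaded from the CPU,
// BufferIn      ~ its declaration inside the compute shader,
// BufferGet     ~ an indexed read from it.
StructuredBuffer<float3> ExternalData;     // uploaded from the CPU side

cbuffer Inputs
{
    float3 OffsetAsOneInput;               // one GpuIn-style pin
    // float OffsetX; float OffsetY; float OffsetZ;  // what three auto-created inputs would look like
};

RWStructuredBuffer<float3> Result;

[numthreads(64, 1, 1)]
void Main(uint3 id : SV_DispatchThreadID)
{
    float3 value = ExternalData[id.x];     // the "BufferGet" step
    Result[id.x] = value + OffsetAsOneInput;
}
```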

So, thanks for your comments, and we are looking forward to the beta release!
