Not sure if it is a bug, but it feels like one in actual use:
As soon as I activate any kind of transparency (Blend, Alpha, Cutout) in a PBR material, I lose all depth information:
Depth of field no longer works correctly on those objects.
Depth sorting is broken.
During Genuary I did this animation:
As you can see, the new hexagons pop up in front and move into the distance, depth of field works.
I actually liked the version with some transparency more, but everything went haywire there:
Depth of field (same settings as above) no longer works; the objects that have transparency are always blurry, even if I change the settings drastically.
So I would be unable to use DOF with transparent objects in the scene.
Object depth sorting is all over the place, things pop up more or less randomly.
Again, unusable in most cases.
Wouldn’t it make more sense to have depth sorting etc. simply be based on the first ray hit, regardless of transparency? I know that transparency is complicated in real-time systems, but this just feels wrong?
And if it isn’t a bug, are there any workarounds? I struggled with transparency all month and found it usually mindbogglingly broken or non-intuitive to use…
Additive not being additive, Cutout doing strange things…
I can document those separately if it isn’t user error or “just the way things are” with Stride.
I was using several 5.3.x and 6.x versions during the month and it was the same with all of them. This test was done today with 6.0084.
I’m using Windows 11 Home on an Asus laptop with an RTX 3070.
I also had this depth sorting trouble in a scene with particles, where I used words via flipbook textures on sprites with alpha channel and “Blend”.
In that specific case, it was quite important which words were in front and which in the back, since otherwise the meaning wouldn’t come across.
And it just didn’t work, I had to drop the idea for the time being.
Thanks, but somehow all those links seem to imply that depth sorting should still work correctly, don’t they?
All my objects were transparent and were flying away from the camera.
They weren’t large or deep so the “center of object” issue shouldn’t come into play.
Same with the words as sprites.
I reduced the scene to show the core issue.
As soon as either Blend or Additive is enabled/connected, depth sorting becomes random.
I created a colour ramp over lifetime to make it more obvious: particles start red and go through the spectrum until death while moving away from the camera, so smaller is always further away.
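For reference, the manual sort that transparency needs can be sketched like this (plain Python with made-up data, not VL/FUSE code): order the particles far-to-near from the camera before drawing, so nearer ones blend over farther ones.

```python
# Back-to-front sort for transparent particles (hypothetical data, not VL code).
# Transparent draws must happen farthest-first so nearer particles blend over
# farther ones; this is the step that is missing when a whole instanced batch
# is treated as one object.

def sort_back_to_front(positions, camera_pos):
    """Return particle indices ordered far-to-near from the camera."""
    def sq_dist(p):
        # squared distance is enough for ordering, no sqrt needed
        return sum((a - b) ** 2 for a, b in zip(p, camera_pos))
    return sorted(range(len(positions)),
                  key=lambda i: sq_dist(positions[i]),
                  reverse=True)

camera = (0.0, 0.0, 0.0)
particles = [(0.0, 0.0, 1.0), (0.0, 0.0, 5.0), (0.0, 0.0, 3.0)]
order = sort_back_to_front(particles, camera)
# order is [1, 2, 0]: the farthest particle is drawn first
```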
As simple as it gets?
Ah you mean it is seen as a single object then?
Ouch, that would be bad; it would basically rule out using FUSE for any kind of transparency…
Yes, FUSE currently has no sorting algorithms integrated; this is pretty high on the todo list. The spatial hash already has sorting working, but it is not based on camera distance and needs to be generalized.
Okay, that manual sorting seems to help with the drawing order, but DOF is still not working, or it uses the wrong depth: in your scene it seems to focus on the floor instead of the particles.
Can I write into the depth buffer manually or is it just not there for transparent objects?
Like I said, first hit would be good enough for me and should work at least for simple DOF.
I disabled all the Juju so it’s more clear to see what happens: SortedInstancingDOF.vl (66.2 KB)
This feels all way more primitive than expected. :-(
I’d have thought this is what the graphics engine is there for.
For the time being, I’ll leave transparency out of things…
Transparent objects will not write into the depth buffer, they will only read from it. This is commonly the case in real-time rendering, as they are drawn in the transparent stage, which is after the opaque stage. See also:
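A minimal sketch of why DOF then picks up the background (generic renderer logic in Python on a single "pixel", not Stride internals): opaque objects test and write depth, transparent objects only test it, so whatever was behind them remains in the depth buffer.

```python
# Generic two-stage draw loop (illustration only, not Stride's actual code):
# opaque objects test AND write depth; transparent objects only test it.

def draw_scene(opaque, transparent):
    depth_buffer = float("inf")  # one "pixel" for illustration
    drawn = []
    # Opaque stage: near-to-far, depth write ON
    for name, z in sorted(opaque, key=lambda o: o[1]):
        if z < depth_buffer:      # depth test
            depth_buffer = z      # depth write
            drawn.append(name)
    # Transparent stage: far-to-near, depth write OFF
    for name, z in sorted(transparent, key=lambda o: o[1], reverse=True):
        if z < depth_buffer:      # depth test only, no write
            drawn.append(name)
    return drawn, depth_buffer

drawn, depth = draw_scene([("floor", 10.0)], [("glass", 5.0)])
# depth stays 10.0: the transparent "glass" never wrote into the buffer,
# which is why a depth-based DOF focuses on the floor behind it
```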
And one can’t just write into the depth buffer?
Like a distance from camera greyscale value?
Like it’s done for post-DOF in compositing?
I guess the DOF effect in Stride is just doing the same.
Computing the distance to the camera as part of the material would be relatively simple, I guess.
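What I mean could be as simple as this (a sketch in Python; in a real renderer it would live in the shader, and the near/far values are assumptions, not Stride settings): map camera distance to a 0..1 grayscale value, like a depth pass for post-DOF in compositing.

```python
# Sketch of a linear "distance from camera" grayscale value, as used for
# post-DOF depth passes in compositing (illustration, not Stride shader code).

import math

def linear_depth_gray(world_pos, camera_pos, near=0.1, far=100.0):
    """Map camera distance to 0..1 grayscale (0 = near plane, 1 = far plane)."""
    dist = math.dist(world_pos, camera_pos)
    t = (dist - near) / (far - near)
    return min(max(t, 0.0), 1.0)  # clamp to [0, 1]

g = linear_depth_gray((0.0, 0.0, 50.05), (0.0, 0.0, 0.0))
# 0.5: exactly halfway between the near and far planes
```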
Like I said, I’m not looking for photorealism, but at least have automatic particle sorting and DOF working on a first-ray-hit basis.
So I dove into the Stride documentation and there it states:
FrontToBackSortMode, which renders objects from front to back with limited precision, and tries to avoid state changes in the same depth range of objects (useful for opaque objects and shadows)
BackToFrontSortMode, which renders objects strictly from back to front (useful for transparent objects)
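The difference between the two documented modes can be sketched on a list of camera-space depths (illustration only, not the engine's actual comparer): opaque wants near-first so early depth rejection saves fill rate, transparent wants far-first so blending composites correctly.

```python
# The two sort modes from the docs, sketched on camera-space depths
# (illustration only; not Stride's actual sort implementation).

objects = [("A", 2.0), ("B", 8.0), ("C", 5.0)]  # (name, depth)

# opaque: near first, so closer objects occlude and early-z rejects the rest
front_to_back = sorted(objects, key=lambda o: o[1])

# transparent: far first, so blending layers back-to-front correctly
back_to_front = sorted(objects, key=lambda o: o[1], reverse=True)

# front_to_back order: A, C, B — back_to_front order: B, C, A
```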
The setting is “Far to near” (which I can’t find in the docs, but it sounds reasonable enough) instead of the above “BackToFrontSortMode”, which is also available.
I can’t try this since it’s precompiled, but it would be interesting whether it makes any difference, or whether the instanced particles simply have no knowledge of “far” or “back” at all and are seen as one single object.
Again, I would have assumed that this basic sorting is done by the engine itself and the docs seem to imply that too, so I wonder where exactly things fail.
Would be great if it could be solved on a basic level if that should exist instead of us implementing workarounds.
Okay, so changing that setting in the “Build your own SceneWindow” help patch didn’t do any good, so I guess it’s not a solution or I lack the knowledge (very likely).
Things did flicker a lot with that custom SceneWindow, so it may be something else.
If transparent objects simply do not show up in the depth buffer, I guess we’re out of luck, and that seems to be the case: the Linear Depth patch does not work with them; they simply do not show up.
I get some sort of fake depth effect with the blur, like a Lensbaby (which can be handy on its own), but the depth buffer itself seems empty when I render it.
When I put a sphere mesh into the scene, it shows up in the depth buffer; as soon as I apply a material with Blend attached, it vanishes.