Get Opaque (Scene texture) and Depth in transparent render stage

Hey there,

Not sure if this is more of a Stride or Fuse topic.
Found DepthBase and OpaqueBase inside the Stride shader explorer; according to their descriptions, these are available as resources for materials / draw shaders that are inside the transparent render stage. This would be super cool, since we could create some nice refraction effects etc. via materials.

OpaqueBase Code:

// Copyright (c) .NET Foundation and Contributors and Silicon Studio Corp.
// Distributed under the MIT license. See the file in the project root for more information.
/// <summary>
/// Defines a texture for the output of the opaque render pass
/// and a helper function to extract the color of it.
/// </summary>
shader OpaqueBase : Texturing
{
    // -------------------------------------
    // Resources
    // -------------------------------------
    rgroup PerView.Opaque
    {
        stage Texture2D OpaqueRenderTarget;
    }

    float3 GetOpaqueColor(float2 uv)
    {
        return OpaqueRenderTarget.SampleLevel(PointSampler, uv, 0.0).xyz;
    }
};
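For illustration, here is a hypothetical sketch of the kind of refraction effect this would enable. The shader name, the Strength parameter, the sine-based distortion, and the use of ViewSize (from the Camera shader) to derive a screen-space UV are all my assumptions, not something from the Stride sources:

```sdsl
// Hypothetical sketch of a fake refraction effect built on OpaqueBase.
// Assumes ViewSize (render target size in pixels) from the Camera shader.
shader FakeRefraction_ShaderFX : ComputeColor, OpaqueBase, Camera, ShaderBase
{
    stage float Strength = 0.01;

    override float4 Compute()
    {
        // Screen-space uv of the current pixel (ShadingPosition is in pixels)
        float2 screenUV = streams.ShadingPosition.xy / ViewSize;
        // Simple procedural distortion standing in for e.g. a normal-based offset
        float2 offset = Strength * sin(screenUV * 40.0);
        return float4(GetOpaqueColor(screenUV + offset), 1.0);
    }
};
```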

The Scene Window has a toggle “Bind Depth As Resource During Transparent Rendering”. The equivalent toggle for the Opaque render stage is missing; it seems to exist inside the VL.ForwardRenderer, but the pins are not exposed:

I tried adding it myself without luck:
Screenshot 2024-02-27 104356

Inside the patch I’ve tried using Fuse Mixin nodes and also ShaderFX, trying to grab depth and opaque. Any help or pointers are greatly appreciated.


This would only work with materials. You need to connect your ShaderFX node to the GPU input of a Material or use the MaterialExtension node. Also, make sure that the transparency of the material is set.

You shouldn’t pass the depth up in the patch; that depth texture cannot be bound to the shader because the engine is currently rendering to it. If you enable one of these bools, the rendering engine will create a copy and bind the copy to the shader.

So start very simple, make a ShaderFX, inherit from DepthBase and sample the depth in it, then connect that to a material that has transparency set.

Thx, but not really sure how to wrap this. I think I need to RTFM.

DepthBase reference:

// Copyright (c) .NET Foundation and Contributors and Silicon Studio Corp.
// Distributed under the MIT license. See the file in the project root for more information.
/// <summary>
/// Defines a depth texture.
/// Various helper functions to extract information from a depth buffer.
/// </summary>
shader DepthBase : Camera, Texturing
{
    // -------------------------------------
    // Resources
    // -------------------------------------
    rgroup PerView.Depth
    {
        stage Texture2D DepthStencil;
    }

    // Sample the depth from the texture
    float GetZProjDepthFromUV(float2 uv)
    {
        return DepthStencil.SampleLevel(PointSampler, uv, 0.0).x;
    }

    float GetZProjDepthFromScreenPosition(int2 screenPosition)
    {
        return DepthStencil.Load(int3(screenPosition, 0), 0).x;
    }

    float ComputeDepthFromZProj(float depth)
    {
        // Retro-project non-linear 1/z depth to linear depth in view space
        return ZProjection.y / (depth - ZProjection.x);
    }

    float ComputeDepthFromUV(float2 uv)
    {
        return ComputeDepthFromZProj(GetZProjDepthFromUV(uv));
    }

    float ComputeDepthFromScreenPosition(int2 screenPosition)
    {
        return ComputeDepthFromZProj(GetZProjDepthFromScreenPosition(screenPosition));
    }
};
I attempted this using ShaderFX; I’m getting a CPU Vector2 as an input for uv on the node, as well as a texture input (which I’m trying to avoid needing).

shader SceneDepth_ShaderFX : ComputeFloat, DepthBase
{
    stage float2 uv;

    override float Compute()
    {
        float depthZ = GetZProjDepthFromUV(uv);
        return depthZ;
    }
};

Looks all good. Unfortunately, this seems to be a bug: the depth stencil isn’t set automatically or cannot be sampled. If you are brave, you can try to use a MaterialExtension and see if that does the trick. But your patch looks correct and it should work. I would consider this a vvvv and/or Stride bug.


Thanks for having a look.

So, trying to sum up what we’ve found out:

  • Not sure if the SceneWindow Pin “Bind Depth As Resource During Transparent Rendering” is working properly

  • The pin for the operation “Bind Opaque As Resource During Transparent Rendering” is missing; also not sure if it’s working

  • It should work in materials that are set as transparent; it would be great if RenderEntities inside the transparent render stage worked as well

  • Bug in Stride or vvvv: the depth stencil isn’t set automatically or cannot be sampled.

Steps to test this could be:

  • try achieving this as a MaterialExtension (only possible for depth, since the opaque pin is missing; no idea how to do this myself)

  • try creating such a material inside the Stride editor and adding it as an asset to test the functionality (no idea how to do this myself; it would also be great if these were selectable as resources within the material editor in Stride)

  • ShaderFX pin generation issue: it’s creating CPU pins for UV for some reason, not sure why

  • Fuse mixin with Stride shader inheritance seems to work, and the shaders do output black, but there is no way to test or check what’s happening inside

Really not sure if these are the right points of attack but wanted to sum up things if someone wants to jump in.
Personally I would mainly need this inside Fuse, but having both options would be amazing.


This is correct as is; if you want to use the model texture coordinates, you have to inherit from Texturing and use streams.TexCoord.
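A minimal sketch of that suggestion, assuming streams.TexCoord from the Texturing base shader carries the mesh’s first UV set (the shader name here is made up):

```sdsl
// Minimal sketch: sample the scene depth at the model's texture
// coordinates via streams.TexCoord instead of a uv input pin.
// DepthBase already inherits Texturing, so TexCoord is available.
shader SceneDepthTexCoord_ShaderFX : ComputeFloat, DepthBase
{
    override float Compute()
    {
        return GetZProjDepthFromUV(streams.TexCoord);
    }
};
```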

Had a first look at this, and here are my observations:

  • Our shader node factory should recognize DepthBase.DepthStencil & OpaqueBase.OpaqueRenderTarget as well-known parameters and suppress generating a pin. From then on we can be sure that the node does not overwrite whatever comes from the engine below.
  • The ForwardRenderer needs to have the respective properties set to true. We find two issues here: the default shown on BindDepthAsResourceDuringTransparentRendering is false, while internally the default is true. This should get double-checked. The second issue is, as you describe, that BindOpaqueAsResourceDuringTransparentRendering is not exposed at all.
  • The depth buffer is only bound during the transparent stage for render features inheriting from RootEffectRenderFeature. Currently only two render features inherit from that base class: MeshRenderFeature and ParticleEmitterRenderFeature. The former is about the material system we all know; the latter I don’t know about, but it sounds interesting. In any case, our RenderEntity does not make use of a render feature inheriting from RootEffectRenderFeature. Whether that would make sense or be possible to implement I don’t know; it would be a journey on its own. The conclusion here is that it will currently only work inside the material pipeline, for transparent materials.
  • Whether or not a material is considered transparent by the MeshRenderFeature is determined by its MeshTransparentRenderStageSelector which in turn looks at the MaterialPass.HasTransparency property. And that one is currently only set to true by Additive/Blend and the glass based materials.

So I’d say things on Stride side are fine. It’s just rather hard to tap into it.

In a quick hack (where I skipped generating a pin for the DepthStencil, point 1 above) I believe I managed to get your example to a state where it samples the depth buffer. But since one needs to combine it with an arbitrary glass material feature to get it rendered on the transparent stage, it doesn’t really help in understanding what’s going on.

Currently not sure how to proceed here. I think it would be good to know what we want to achieve, so it would help if you could zoom out of all these details and explain what you’re after from a top-down view.


I hope this is the same topic, but during Genuary I found that transparency basically disables depth. I got all kinds of explanations in return but stayed somehow unconvinced.
This leads to the DOF post-FX not working as soon as transparent materials are involved, and depth sorting is also broken by it, so objects render in a kind of random depth order (I was mostly using FUSE).

If any of your findings here can improve those real-world issues, it would be very helpful.

If this is another topic, feel free to remove my post.



@Elias thanks a lot for checking this issue. That already sounds like a lot of progress!

to ZOOM out for you:

  • the general motivation here is to have the depth stencil and the opaque texture or color available as a resource within the material pipeline on objects that are set to transparent (everything else inside the patch is me trying to understand how this does or doesn’t work)

  • This would allow creating a bunch of nice material effects: screen-space reflection and refraction, fake screen-space subsurface scattering, adding a fake density to materials, adding shoreline effects to intersecting objects, nicer raymarching materials.

  • Ideally this would be available as a resource within Fuse, hopefully via the mixin nodes in the patch.

  • the RenderEntity part of the patch was me trying to figure out how this would work or not. Really getting these working any way at all without using pads would be amazing.

  • my personal motivation for these features would be the ability to port a lot of Unity and Unreal material effects that use these resources.

Let me know if I can support this by providing example patches.

@Thomas_Helzle I think this is unrelated, and an issue with transparency in many game engines. Often the solution is using “cut out” as a transparency mode, then using a dithered alpha mask to set the transparency while still writing to the depth stencil. But I would suggest creating a new topic.
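For what it’s worth, a rough sketch of what such a dithered cutout could look like as a ShaderFX. The shader name, the composed Opacity input, and the use of a 4x4 Bayer matrix are my assumptions; this is untested:

```sdsl
// Sketch of screen-door ("dithered cutout") transparency: the surface
// stays in the opaque stage and writes depth, but pixels are discarded
// against an ordered 4x4 Bayer threshold based on the opacity value.
shader DitheredCutout_ShaderFX : ComputeFloat, ShaderBase
{
    compose ComputeFloat Opacity;

    override float Compute()
    {
        // 4x4 Bayer matrix, normalized to 0..1
        static const float bayer[16] =
        {
             0.0 / 16,  8.0 / 16,  2.0 / 16, 10.0 / 16,
            12.0 / 16,  4.0 / 16, 14.0 / 16,  6.0 / 16,
             3.0 / 16, 11.0 / 16,  1.0 / 16,  9.0 / 16,
            15.0 / 16,  7.0 / 16, 13.0 / 16,  5.0 / 16
        };
        uint2 p = uint2(streams.ShadingPosition.xy) % 4;
        float threshold = bayer[p.y * 4 + p.x];
        // 1 where the pixel survives, 0 where it is cut away
        return Opacity.Compute() > threshold ? 1.0 : 0.0;
    }
};
```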