Match Kinect depth image to scene window depth buffer

hi all,

I want my Kinect color image to clip correctly against objects in a Stride scene, so I'm writing a shader that uses the depth image from the Kinect and the depth buffer from the scene window. I need to do close-ups, so precision is a priority.

How can I match the R16_UNorm Kinect stream to the D24S8 depth stencil of a Stride scene window? ("A 32-bit z-buffer format that supports 24 bits for depth and 8 bits for stencil.")

I tried using linear depth, but I can't see my objects in my depth buffer; I have no clue which coordinates (or color values) they end up at.

And concerning the shader: the gamma texture standard is R16G16B16A16_Float. So will my Stride shaders automatically run at 16-bit float? Gamma lets me connect anything to the shader, and the output seems to take the maximum input resolution and the maximum input bit depth.

kinrv.zip (8.8 KB)

The following method can be used to linearize the depth value:

linearDepth = ZProjection.y / (depth - ZProjection.x); // [0.0-1.0] to [near-far]

Source code of the relevant parts of Stride:
stride/CameraKeys.cs at master · stride3d/stride (github.com)
stride/Camera.sdsl at master · stride3d/stride (github.com)
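If I read those two files correctly, ZProjection is derived from the camera's near and far clip planes, so the whole linearization looks roughly like this (a sketch with illustrative names; the helper functions and parameter names are mine, check the linked sources for the exact definitions):

float2 ZProjectionFromPlanes(float nearPlane, float farPlane)
{
    // same form as Stride's CameraKeys.ZProjectionACalculate, as far as I can tell
    return float2(farPlane / (farPlane - nearPlane),
                  nearPlane * farPlane / (nearPlane - farPlane));
}

float LinearizeSceneDepth(float depth, float2 ZProjection)
{
    // depth: non-linear [0.0-1.0] value read from the D24S8 depth buffer
    // result: view-space distance between near and far, in world units
    return ZProjection.y / (depth - ZProjection.x);
}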

To obtain a Kinect depth that matches this, the z channel of the WorldImage node can be used.
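Putting the two together, the clipping itself could then look something like this (just a sketch: the texture and sampler names are made up, and it assumes the WorldImage z and the scene camera both use the same units and are registered to the same view):

Texture2D KinectColor;
Texture2D KinectWorld;     // WorldImage: world-space position per Kinect pixel, z assumed to be in metres
Texture2D SceneColor;
Texture2D SceneDepth;      // non-linear scene depth, sampled as float [0.0-1.0]
SamplerState LinearSampler;
float2 ZProjection;        // built from the camera near/far planes, as above

float4 Composite(float2 uv)
{
    float kinectZ = KinectWorld.SampleLevel(LinearSampler, uv, 0).z;
    float sceneZ  = ZProjection.y / (SceneDepth.SampleLevel(LinearSampler, uv, 0).r - ZProjection.x);

    // whichever surface is closer to the camera wins
    return kinectZ < sceneZ
        ? KinectColor.SampleLevel(LinearSampler, uv, 0)
        : SceneColor.SampleLevel(LinearSampler, uv, 0);
}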

So will my Stride shaders automatically run at 16-bit float?

I think that no matter what texture format is input, as long as single precision (float) is used in the shader, the computation will be done in 32 bits and the result will only be truncated/converted to the output format on write.
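A rough illustration of what I mean (not from the patch; the millimetre mapping is an assumption about how the depth was written into the texture):

Texture2D<float> KinectDepth;   // stored as R16_UNorm, but Load/Sample still returns a 32-bit float in [0.0-1.0]

float4 PSMain(float4 pos : SV_Position) : SV_Target   // render target could be R16G16B16A16_Float
{
    // the read is promoted to float, so everything below runs at 32-bit precision
    float d = KinectDepth.Load(int3(pos.xy, 0));
    float metres = d * 65.535;                          // only valid if one unorm step really is 1 mm
    // the result is converted down to the target's 16-bit floats only on write
    return float4(metres, metres, metres, 1);
}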

kinrv_2.zip (11.5 KB)
