I want my Kinect color image to clip correctly against objects in a Stride scene, so I want to write a shader that compares the depth image from the Kinect with the depth buffer of the scene window. I need to do close-ups, so precision is a priority.
I tried using linear depth, but I can't see my objects in the depth buffer, and I have no clue which coordinates (or color values) they map to.
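One likely reason objects seem invisible in a raw depth buffer: a standard perspective projection stores depth non-linearly, compressing almost the entire visible range toward 1.0, so everything looks near-white. A minimal Python sketch of that mapping and its inverse (the near/far plane values here are hypothetical, not taken from any actual scene):

```python
# Sketch: why a raw (non-linear) depth buffer looks "empty".
# A standard D3D-style perspective projection writes
#   d = (far / (far - near)) * (1 - near / z)
# which maps eye-space z in [near, far] to [0, 1], but crams
# most of the range toward 1.0. near/far here are hypothetical.

def hardware_depth(z, near=0.1, far=100.0):
    """Non-linear depth as written to a typical depth buffer."""
    return (far / (far - near)) * (1.0 - near / z)

def linearize(d, near=0.1, far=100.0):
    """Invert the mapping back to eye-space z."""
    return (far * near) / (far - d * (far - near))

for z in [0.5, 1.0, 5.0, 50.0]:
    d = hardware_depth(z)
    print(f"eye z = {z:5.1f} -> buffer value = {d:.4f} -> linearized = {linearize(d):.2f}")
```

Note how an object at z = 0.5 already lands around 0.80 in the buffer, and everything beyond a few meters sits above 0.98, which is why the buffer looks uniformly bright unless you linearize it before inspection.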
And concerning the shader: gamma's standard texture format is R16G16B16A16 float. So will my Stride shaders automatically run at 16-bit float? Gamma lets me connect anything to the shader, and the output seems to take the maximum input resolution and the maximum input bit depth.
To obtain a Kinect depth that matches this, the Z channel of the WorldImage node can be used.
So will my Stride shaders automatically run at 16-bit float?
I think that, no matter what texture format is fed in, as long as single precision (float) is used in the shader, the arithmetic will be done in 32 bits and the result will be truncated to the output texture format on write.
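The precision loss described above can be illustrated outside the GPU: do the math at 32-bit, then store the result at 16-bit. In this sketch numpy's float16 stands in for a 16-bit float render-target channel (an assumption for illustration; the actual GPU rounding mode may differ slightly):

```python
import numpy as np

# Shader-style arithmetic happens in 32-bit float; the result is
# truncated when written to a 16-bit float target. numpy's float16
# stands in here for an R16 float channel.

depth32 = np.float32(0.123456789)   # value as computed in the shader
depth16 = np.float16(depth32)       # value after writing to a 16-bit target

print(f"float32 value : {float(depth32):.9f}")
print(f"float16 stored: {float(depth16):.9f}")
print(f"abs. error    : {abs(float(depth16) - float(depth32)):.2e}")
```

For close-up depth comparisons this matters: a 16-bit float carries roughly 3 decimal digits of precision, so if precision is the priority it is worth rendering into a 32-bit float (R32F) target rather than relying on the 16-bit default.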