Shader to encode a 16-bit channel into 2x 8-bit channels

I'm trying to record the depth buffer of a Kinect. That's obviously in L16, but I need to record as close to realtime as possible, and HDR saving drops the fps too much, so I want to store the 16 bits of grey in two channels of an 8-bit image.
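Roughly the packing I have in mind, as a simplified sketch (this is not the exact attached code; DepthSamp, PS_Encode and PS_Decode are just placeholder names, and the pixel shaders would go into the usual fullscreen-quad template):

```hlsl
texture DepthTex;
sampler DepthSamp = sampler_state
{
    Texture   = (DepthTex);
    MinFilter = POINT;   // point sampling: never interpolate depth values
    MagFilter = POINT;
    MipFilter = NONE;
};

// Encode: split the 16-bit grey value into a high byte (R) and a low byte (G).
float4 PS_Encode(float2 uv : TEXCOORD0) : COLOR
{
    float d  = tex2D(DepthSamp, uv).r;     // L16/R16 sample, normalized to [0,1]
    float v  = floor(d * 65535.0 + 0.5);   // back to the 0..65535 integer range
    float hi = floor(v / 256.0);           // high byte, 0..255
    float lo = v - hi * 256.0;             // low byte, 0..255
    return float4(hi / 255.0, lo / 255.0, 0.0, 1.0); // normalized for an 8-bit target
}

// Decode: reassemble the two bytes into the original 16-bit value.
float4 PS_Decode(float2 uv : TEXCOORD0) : COLOR
{
    float2 c = tex2D(DepthSamp, uv).rg;
    float v  = floor(c.r * 255.0 + 0.5) * 256.0 + floor(c.g * 255.0 + 0.5);
    return float4(v / 65535.0, 0.0, 0.0, 1.0);
}
```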
I think the attached shaders should do this, but when I do a == compare I only see something when the texture is R16, not L16, and even then it's not pure white, so it isn't working.
Any ideas what I can do to make this work, or where my concept is amiss?

Thanks

Cat

Shader to encode 16bit channel to 2x 8Bit channels (6.3 kB)

L16 is a weird format…

There is a problem in your compare shader: it won't get col==col2 exactly (there will be a micro difference between these values). Better to “highlight” the difference instead, like abs(col.r-col2.r)*1000 or something, as in the sketch below.
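Something like this (untested sketch; OrigSamp and DecodedSamp are placeholder samplers for the source texture and the round-tripped result):

```hlsl
texture OrigTex;
texture DecodedTex;
sampler OrigSamp    = sampler_state { Texture = (OrigTex); };
sampler DecodedSamp = sampler_state { Texture = (DecodedTex); };

// Amplify the per-pixel error instead of testing for exact equality.
float4 PS_Compare(float2 uv : TEXCOORD0) : COLOR
{
    float a    = tex2D(OrigSamp, uv).r;         // original 16-bit depth
    float b    = tex2D(DecodedSamp, uv).r;      // value after the encode/decode round trip
    float diff = saturate(abs(a - b) * 1000.0); // scale tiny differences into view
    return float4(diff, diff, diff, 1.0);
}
```

Anything that lights up marks pixels where the round trip lost precision; a black image means the encoding is working.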

Also, why would saving an 8-bit texture be faster in the first place? Have you tried saving an R16F .dds?