How to map colors from an HDR image into SDR range?

hi all,

this topic is related to UV coordinates in prerendered footage - SRGB issues

I use 16-bit footage with baked-in UV maps and a shader to map the faces onto it. This is what the Windows preview looks like (I guess it somehow remaps the tone values), and what I want the final output to look like.

When playing back the DDS R16G16B16A16 footage you can actually see that a lot of the image is outside the SDR monitor range. I want to remap the colors back into SDR range. (The faces are marked in the alpha channel.)

[image]

So, what’s an easy way to do this? I don’t need perfectly matching colours, just a good approximation that brings back the saturation and dynamic range.

is it HDR or is it just missing gamma?

that’s the last message you posted on the linked thread.

@woei that comment was just about how Blender displays the 16-bit image in its viewport. It seems to remap to SDR, which is why it looks more washed out. When exporting, everything is fine with the 16-bit UV map.

The animation was not done in-house; I believe they worked in After Effects in SDR space, swapped the face textures for my 16-bit UV map, then reset the composition settings and exported a 16-bit PNG sequence. I could get more information if that helps.

When mapping the R8G8B8A8 images from my camera into the R16G16B16A16 footage, the faces look like they are in SDR range. I want to shift the dynamic range of the “background” to match the faces.

My composited final output image is R8G8B8A8.

it looks like they stored sRGB values. so if you use a texture format without sRGB in the name, you need to convert the values to linear color space yourself, i.e. sRGB to linear RGB.

but make sure to extract/treat the UV values differently, you most likely don’t want to apply the curve to them.

but the real issue is that the texture contains the wrong values. if you can, tell them to store the values in linear color space.
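
as a minimal HLSL-style sketch of that idea (the texture name, alpha threshold and function signature are my assumptions, not an existing shader): linearize only the sRGB-encoded color pixels and pass the baked-in UV values through untouched.

Texture2D Footage;
SamplerState FootageSampler;

float4 PS(float2 texCoord : TEXCOORD0) : SV_Target
{
    float4 src = Footage.Sample(FootageSampler, texCoord);

    // alpha marks the face region that stores raw UV coordinates -> keep as-is
    if (src.a > 0.5)
        return src;

    // rough 2.2 de-gamma; the exact piecewise sRGB curve appears further down the thread
    return float4(pow(src.rgb, 2.2), src.a);
}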

The AE project is set to 16-bit linear, my UV texture is 16-bit linear (because the mapping works correctly), and the animated images are 8-bit sRGB.

They convert the 8-bit images to linear RGB and export in linear HDR; then I can map that back to SDR more easily and do the gamma correction myself afterwards. Did I get this right?

you don’t need to do anything special if the values are already in linear space because the Stride pipeline is HDR in linear color space. all conversions will happen automatically on the GPU at the right moment unless you do something special with the raw values.

where do you think you have to convert to SDR?

can you post one of your 16bit images?

the preview you posted doesn’t really look like HDR data rendered directly to me; i’d expect white clipping and excessive contrast.
in case it really is, you’d need to tonemap the HDR range down to SDR (see the sketch after the formulas below).
if it’s only about linear vs gamma rgb, here are the simple approximations for conversion:

Gamma(x) = pow( x, 1.0/2.2 )
DeGamma(x) = pow( x, 2.2 )
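
in case it really is HDR, here’s a minimal sketch of that tonemapping step, using a simple Reinhard-style curve as one common choice (my pick for illustration, not something prescribed here; function names are made up):

float3 TonemapReinhard(float3 hdr)
{
    return hdr / (1.0 + hdr);   // squeezes [0, inf) into [0, 1)
}

float3 ToSDR(float3 hdr)
{
    // tonemap first, then re-apply the display gamma approximation from above
    return pow(TonemapReinhard(hdr), 1.0 / 2.2);
}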

DeGamma did the trick!

@tonfilm good to know for next time, but that’s the quicker fix for the rendered footage

These are not the correct ones in this case. They convert the screen gamma, but not sRGB to linear; these are two different things, although quite similar. The right ones for sRGB are here:

you can use them as a TextureFX, or in your own shader by adding ColorUtilityTemp to it or by copying the shader function.


that’s why i said

the even less accurate formulae, for saving even more GPU cycles, would be

Gamma(x) = sqrt(x)
DeGamma(x) = x*x

for anyone too lazy to read the links mentioned above:
the reason the precise sRGB conversion is more complex is that it has a linear portion at the dark end, and the majority of the curve has gamma 2.4
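
written out in the same style as the approximations above, that piecewise curve looks like this (the standard sRGB constants; for actual use take the ColorUtilityTemp functions linked earlier):

float LinearToSRGB(float x)
{
    // linear toe near black, then a 2.4-exponent curve with offset
    return x <= 0.0031308 ? 12.92 * x
                          : 1.055 * pow(x, 1.0 / 2.4) - 0.055;
}

float SRGBToLinear(float x)
{
    return x <= 0.04045 ? x / 12.92
                        : pow((x + 0.055) / 1.055, 2.4);
}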
