While working on a shader I came across an issue with the depth buffer. In some games it doesn't matter whether RESHADE_DEPTH_INPUT_IS_LOGARITHMIC is set to 1 or 0; DisplayDepth.fx shows the depth buffer correctly either way. But when it comes to using ReShade::GetLinearizedDepth(), the results are quite different.
Here is an example (The Evil Within, d3d11). The shaders used are DisplayDepth.fx, MeshEdges.fx and qUINT_mxao.fx.
In the top images: RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=0
In the bottom images: RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=1
What is going on there? Is it just that the values in this game's (apparently linear) depth buffer fall in a range where they remain fairly linear even after the logarithmic conversion is applied?
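
For reference, this is roughly how I understand the logarithmic path inside ReShade::GetLinearizedDepth(). It is only a sketch paraphrased from the ReShade.fxh I have locally; the helper name, the constant C and the near-plane value N are my assumptions, and the exact code in your ReShade version may differ:

// Sketch of my understanding of the linearization path in ReShade.fxh.
// Names and constants (C, N, the *_FAR_PLANE macro) are from memory and
// may not match the exact code shipped with your ReShade version.
float GetLinearizedDepth_Sketch(float2 texcoord)
{
    float depth = tex2Dlod(ReShade::DepthBuffer, float4(texcoord, 0, 0)).x;

#if RESHADE_DEPTH_INPUT_IS_LOGARITHMIC
    // Undo an assumed logarithmic encoding before linearizing.
    const float C = 0.01;
    depth = (exp(depth * log(C + 1.0)) - 1.0) / C;
#endif
#if RESHADE_DEPTH_INPUT_IS_REVERSED
    depth = 1.0 - depth;
#endif

    // Map hardware depth to a linear 0..1 range between near and far plane.
    const float N = 1.0;
    depth /= RESHADE_DEPTH_LINEARIZATION_FAR_PLANE - depth * (RESHADE_DEPTH_LINEARIZATION_FAR_PLANE - N);
    return depth;
}

If that is roughly right, then my question boils down to how the exp()-based conversion behaves when it is fed a depth buffer that is already linear.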