I think the current workflow for finding the right depth input settings - loading DisplayDepth, changing the preprocessor definitions, and reloading a couple of times - is a bit clumsy. To make that a bit more user-friendly, I had the following ideas:

- Have a shader that lets you change the depth input options (reversed, logarithmic, upside down) in real time
- By default the shader shows the derived normals rather than the raw depth values - so you actually have a clue what you're looking at, and you can see that the logarithmic setting actually makes a difference
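The normals view could be reconstructed from neighboring depth samples, roughly like this (a sketch only - `GetLinearizedDepth` is a placeholder for whatever depth fetch the shader ends up using, and `BUFFER_PIXEL_SIZE` assumes ReShade.fxh is included):

```hlsl
// Sketch: derive a displayable normal from screen-space depth differences.
// GetLinearizedDepth() stands in for the shader's actual depth fetch.
float3 NormalFromDepth(float2 texcoord)
{
    float2 offset = BUFFER_PIXEL_SIZE; // one pixel step, from ReShade.fxh

    float depth_c = GetLinearizedDepth(texcoord);
    float depth_x = GetLinearizedDepth(texcoord + float2(offset.x, 0.0));
    float depth_y = GetLinearizedDepth(texcoord + float2(0.0, offset.y));

    // Screen-space tangent vectors, with the depth delta as the z component
    float3 dx = float3(offset.x, 0.0, depth_x - depth_c);
    float3 dy = float3(0.0, offset.y, depth_y - depth_c);

    // Cross product gives the surface normal; remap to [0,1] for display
    return normalize(cross(dy, dx)) * 0.5 + 0.5;
}
```

If the depth settings are wrong, these normals come out visibly broken, which is exactly what makes them a better diagnostic than the raw depth values.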

Here's a rough draft of what I came up with:

DisplayDepth
You just have to pick the 'Depth Settings' value that makes the output look the best.

The tooltip shows what to put into the preprocessor definitions:

0: RESHADE_DEPTH_INPUT_IS_REVERSED=0
   RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=0
1: RESHADE_DEPTH_INPUT_IS_REVERSED=1
   RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=0
2: RESHADE_DEPTH_INPUT_IS_REVERSED=0
   RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=1
3: RESHADE_DEPTH_INPUT_IS_REVERSED=1
   RESHADE_DEPTH_INPUT_IS_LOGARITHMIC=1
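Internally the 'Depth Settings' value could just be decoded bit-wise (bit 0 = reversed, bit 1 = logarithmic), so the linearization branches at runtime instead of at compile time. A rough sketch, assuming ReShade's usual linearization (log constant C = 0.01, far-plane divide); the uniform name is made up:

```hlsl
// Sketch: runtime version of the compile-time depth options.
uniform int iUIDepthSetting <
    ui_type = "combo";
    ui_label = "Depth Settings";
    ui_items = "0: normal\0"
               "1: reversed\0"
               "2: logarithmic\0"
               "3: reversed + logarithmic\0";
> = 0;

float LinearizeDepth(float depth)
{
    // Decode the combo value: bit 0 = reversed, bit 1 = logarithmic
    bool reversed    = (iUIDepthSetting & 1) != 0;
    bool logarithmic = (iUIDepthSetting & 2) != 0;

    if (logarithmic)
    {
        const float C = 0.01; // same constant ReShade.fxh uses
        depth = (exp(depth * log(C + 1.0)) - 1.0) / C;
    }
    if (reversed)
        depth = 1.0 - depth;

    // Standard far-plane linearization (near plane assumed = 1.0)
    const float FAR = RESHADE_DEPTH_LINEARIZATION_FAR_PLANE;
    return depth / (FAR - depth * (FAR - 1.0));
}
```

Once the user has found the value that looks right, the tooltip mapping above tells them exactly what to set permanently.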

And what it then looks like:

Your thoughts?