16-bit texture LUT not working

#1 by pneumatic · 6 years 7 months ago
I am trying to make a slight modification to the CustomFX ColorCorrection shader so that instead of reading an 8-bit CFX_lut.png, it reads a 16-bit CFX_lut.png. The purpose is to get greater-precision LUT values, which would let CeeJay's dither shaders work properly with them to produce smooth gradients after the LUT is applied. CeeJay's dither shader effectively works by only dithering pixels that fall between 8-bit values, and such values don't exist in an 8-bit texture LUT.

I have created the 16-bit CFX_lut.png and set Format = RGBA16 (also tried RGBA16F) in the texture declaration. The shader compiles without errors, but the entire raster is just black pixels. Here is the 16-bit CFX_lut.png file, and below is the shader. Any help appreciated.
NAMESPACE_ENTER(CFX)
#include CFX_SETTINGS_DEF

#if (USE_CUSTOM == 1)
texture texLUT < string source = "ReShade/CustomFX/Textures/CFX_lut.png"; > {Width = 256; Height = 1; Format = RGBA16;};

sampler2D SamplerLUT
{
	Texture = texLUT;
	MinFilter = NONE;
	MagFilter = NONE;
	MipFilter = NONE;
	AddressU = Clamp;
	AddressV = Clamp;
};

float4 PS_Custom(float4 vpos : SV_Position, float2 texcoord : TEXCOORD) : SV_Target
{
	float4 col = tex2D(RFX_backbufferColor, texcoord);
	// Remap each channel through the 256x1 LUT
	col.r = tex2D(SamplerLUT, float2(saturate(col.r), 0)).r;
	col.g = tex2D(SamplerLUT, float2(saturate(col.g), 0)).g;
	col.b = tex2D(SamplerLUT, float2(saturate(col.b), 0)).b;
	return col;
}

technique Custom_Tech <bool enabled = RFX_Start_Enabled; int toggle = Custom_ToggleKey; >
{
	pass CustomPass
	{
		VertexShader = RFX_VS_PostProcess;
		PixelShader = PS_Custom;
	}
}

#endif

#include CFX_SETTINGS_UNDEF
NAMESPACE_LEAVE()
Last edit: 6 years 7 months ago by pneumatic.


#2 by Marty McFly · 6 years 7 months ago
ReShade can't read that format for external textures. The R/RG/RGBA 16- and 32-bit formats are for internal textures only. Also, the output of each shader is rounded to 8 bits per channel, because that's the format of the backbuffer, so applying any dither afterwards that relies only on values between 8-bit steps is pointless. You'd need to integrate the dither shader into the LUT one.
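A combined pass might look roughly like this. This is only a sketch of the idea, not Marty's or CeeJay's actual code; the noise function is a common screen-space hash standing in for a real dither, and it reuses the texLUT/SamplerLUT/RFX_backbufferColor names from the snippet above:

```hlsl
// Sketch: do the LUT lookup and the dither in the SAME pass, so the
// noise is added to the full-precision float result before it is
// truncated to the 8-bit backbuffer.
float4 PS_LutDither(float4 vpos : SV_Position, float2 texcoord : TEXCOORD) : SV_Target
{
	float4 col = tex2D(RFX_backbufferColor, texcoord);
	// LUT lookup happens at float precision inside the shader
	col.r = tex2D(SamplerLUT, float2(saturate(col.r), 0)).r;
	col.g = tex2D(SamplerLUT, float2(saturate(col.g), 0)).g;
	col.b = tex2D(SamplerLUT, float2(saturate(col.b), 0)).b;

	// Cheap per-pixel noise in [0,1) as a placeholder dither source
	float noise = frac(sin(dot(texcoord, float2(12.9898, 78.233))) * 43758.5453);
	// Nudge by up to +/- half an 8-bit step before the rounding to 8 bit
	col.rgb += (noise - 0.5) / 255.0;
	return col;
}
```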
Last edit: 6 years 7 months ago by Marty McFly.


#3 by pneumatic · 6 years 7 months ago
Thanks Marty, understood.

Marty McFly wrote: And the output of each shader is then rounded to 8 bit per channel because that's the format of the backbuffer, applying any dither afterwards that only relies on values between 8 bit ones is senseless.


Indeed, CeeJay's dither shader is applied to the high-precision floating-point values *before* the final output (truncation to 8-bit) stage. The way he's done it is actually quite clever: the final 8-bit output only receives noise on pixels that were in between 8-bit values inside dither.h (method 2). I believe this is what keeps the visible noise level so low, which in my testing is needed to keep the noise acceptable on 6-bit/FRC monitors.

Shame that we cannot read in a 16-bit LUT .png, because it's the only way to take advantage of CeeJay's exact method. The method I am using currently is inferior: first apply random dither noise to the image (by increasing the dither_shift numerator in dither.h method 2; a value of ~3.75 seems to work well on my 8-bit monitor), then apply your 1D texture LUT shader afterwards. The gradient becomes free of banding artefacts, but the noise is much more visible than in CeeJay's original implementation, to the point where it doesn't look great on a 6-bit monitor. I have also adapted Martin's grain.h shader to do the same thing temporally, and the result is better (especially using CeeJay's subpixel trick of dithering red and blue separately; very interesting, and I have no idea why it works so well).
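The "only dither in-between values" behaviour can be sketched like this. This is my own illustration of the idea, not the actual dither.h code, and the function name is hypothetical:

```hlsl
// Sketch: stochastic rounding to 8-bit levels. A channel that already
// sits exactly on an 8-bit step has a fractional part of 0, so it
// never receives noise; only in-between values are dithered.
float3 DitherBetweenSteps(float3 col, float noise) // noise in [0,1)
{
	float3 scaled   = col * 255.0;
	float3 fracpart = scaled - floor(scaled);
	// step(noise, f) is 1 when f >= noise, so each channel rounds up
	// with probability equal to its fractional part
	float3 dithered = floor(scaled) + step(noise, fracpart);
	return dithered / 255.0;
}
```

Averaged over many frames (or neighbouring pixels), the rounded-up fraction matches the true value, which is what smooths the gradient without adding noise to exact 8-bit pixels.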
Last edit: 6 years 7 months ago by pneumatic.

