Get bit depth of backbuffer?
TreyM wrote: The current TriDither shader is apparently useless because of this, and I'm going to do a pull request to make it a callable function instead of a shader as it is now. Once that's done, we can add the dither function call to problematic effects, then we can (in theory if crosire is in favor) optionally toggle "dithering" globally via preprocessor definition.
You can make TriDither work by moving it to the start of the pipeline and increasing the strength of the noise:
color.rgb += triDither(color.rgb, texcoord, Timer.x) * 4;
But this method of adding noise first and then applying effects to the noisy image before rounding again is not as good: it needs more noise to work well and is less accurate (10-bit instead of 16-bit, according to screenshot analysis).
Once you see how well the high-precision method works, you won't care about this crappy method anymore! Especially if you do it on green/magenta subpixels and make it temporal by adding frac(RFX_Timer/1000) to the seed; then it's totally invisible even on 6-bit+FRC monitors, and on an 8-bit monitor you effectively have 16-bit depth with no side effects. It's pretty amazing, tbh.