Nvidia antialiasing - compatibility bits (DX1x) with ReshadeEffectShaderToggler
7 months 3 weeks ago #1
by Darkloke
Nvidia antialiasing - compatibility bits (DX1x) with ReshadeEffectShaderToggler was created by Darkloke
Greetings.
Question: has anyone tried to find the antialiasing compatibility bits for DX11 games using the ReShade Effect Shader Toggler addon? As I understand it (correct me if I'm wrong), this addon exposes the different rendering stages at which you can enable or disable shader effects (for example, you can find a stage that doesn't render the game UI, so in that case ReShade shaders won't affect your GUI). So I wonder: can these stages be used as AA bits? I ran several tests in Mass Effect 2 LE, since that game has a lot of aliasing and none of the ReShade AA shaders I tried were good enough for me.
imgur.com/a/5eOJyAm
It seems these values at least affect the game, but regretfully I didn't find an "ideal" one. I'm no expert here, so I hope the addon's creator or anyone else can shed some light and say whether this kind of search is possible at all.
--- Why all this? ---
As you are probably aware, it's now almost impossible to force different Nvidia antialiasing presets via Nvidia Profile Inspector for games on DX11 and above. In theory it should be possible using compatibility bits, but Nvidia itself doesn't provide them, and I'm not sure there is any way to discover them manually other than the method I described above. There is a Google sheet that lists such bits for DX9 games, but there is none for DX11.
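For anyone unfamiliar with what these "bits" actually are: they are just a DWORD value stored in the game's driver profile, and Nvidia Profile Inspector reads and writes them through NVIDIA's driver-settings (DRS) API. Below is a minimal sketch of writing such a value from code, in case anyone wants to script the trial and error. The profile name, the setting ID and the bit value are placeholders, not things I'm asserting: the setting ID has to be taken from the "Antialiasing compatibility (DX1x)" entry that Profile Inspector shows, and the right bit value is exactly what nobody knows for DX11. Error handling is stripped down.

/* Sketch: write an AA compatibility DWORD into an NVIDIA driver profile via NVAPI DRS.
 * SETTING_ID_AA_COMPAT_DX1X, COMPAT_BITS_TO_TEST and the profile name are placeholders. */
#include <stdio.h>
#include <nvapi.h>

#define SETTING_ID_AA_COMPAT_DX1X 0x00000000u /* replace with the ID Profile Inspector shows */
#define COMPAT_BITS_TO_TEST       0x00000000u /* replace with the bits you want to try */

int main(void)
{
    NvDRSSessionHandle session = NULL;
    NvDRSProfileHandle profile = NULL;
    NvAPI_UnicodeString name = { 0 };
    const wchar_t *profileName = L"Mass Effect Legendary Edition"; /* hypothetical profile name */
    NVDRS_SETTING setting = { 0 };
    unsigned i;

    if (NvAPI_Initialize() != NVAPI_OK) return 1;
    if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK) return 1;
    if (NvAPI_DRS_LoadSettings(session) != NVAPI_OK) return 1;

    /* NVAPI profile names are UTF-16; copy the wide literal into the NvU16 buffer. */
    for (i = 0; profileName[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        name[i] = (NvU16)profileName[i];

    if (NvAPI_DRS_FindProfileByName(session, name, &profile) != NVAPI_OK) return 1;

    setting.version         = NVDRS_SETTING_VER;
    setting.settingId       = SETTING_ID_AA_COMPAT_DX1X;
    setting.settingType     = NVDRS_DWORD_TYPE;
    setting.u32CurrentValue = COMPAT_BITS_TO_TEST;

    if (NvAPI_DRS_SetSetting(session, profile, &setting) != NVAPI_OK) return 1;
    if (NvAPI_DRS_SaveSettings(session) != NVAPI_OK) return 1;

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    printf("Compatibility bits written - restart the game to test.\n");
    return 0;
}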
4 months 1 week ago - 4 months 1 week ago #2
by BONKERS
Replied by BONKERS on topic Nvidia antialiasing - compatibility bits (DX1x) with ReshadeEffectShaderToggler
Hey, I'm the one who has been maintaining that document for the last decade.
I haven't tried it (I did try to use that tool once to disable problematic shader effects, but it was too unpredictable), and there isn't much point, because Nvidia never fleshed out this functionality in the driver for DX10+. There *are* some functions in there, but they just don't really work, and it's hard to tell whether what is there does anything outside of DX10. Even in the DX10 titles I have tried historically, no combination of flags will work, despite there being flags to look for certain buffer formats (e.g. RGBA8, 16, 32F).
The one flag that is absolutely crucial for it to work in DX9 at all is completely missing: "FORCE_OFFSCREEN_SUPERBUFFERS_NOT_TO_DOWNFILTER" (combined with looking for RGBA8 buffers, in DX9 flags it's 0x000000C0). I'm not a programmer, so I don't know whether this is the crucial difference for DX10+ or whether it's irrelevant, or whether the driver even attempts to hook rendering in DX11 at all rather than just DX10. But in DX8/9, forcing SGSSAA doesn't work in most games without this function, especially games with deferred lighting or deferred rendering. (MSAA can work without it; the value becomes 0x00000040. But in a lot of the cases I've tested, the performance hit for MSAA forced this way is nearly the same as SGSSAA, so it makes little sense to use MSAA unless you really hate SGSSAA.)
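If it helps, here is the bit arithmetic those two values imply, spelled out (the labels are just my informal reading of the flags above; only the hex numbers come from this post):

0x00000040 = look for RGBA8 offscreen buffers (the MSAA-only value)
0x00000080 = FORCE_OFFSCREEN_SUPERBUFFERS_NOT_TO_DOWNFILTER (the difference between the two)
0x00000040 | 0x00000080 = 0x000000C0, the combination that lets SGSSAA work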
Just for reference, this is the list of functions the driver has for DX9 when trying to force AA:
u.cubeupload.com/MrBonk/AAflags.jpg
And this is DX10
u.cubeupload.com/MrBonk/AAflags2.jpg
Believe me, this is something I have been frustrated about for nearly a decade now. And nothing frustrated me more than when Sega moved Phantasy Star Online 2 from DX9 to DX11. They downgraded several visual effects and the entire way characters are lit, on top of the fact that there was no longer any way to get good AA. Their TAA is low quality, if inoffensively so (adding FXAA to it doesn't help much), and when they added FSR1 and DLSS it didn't change things. FSR1 is a joke unless you have good AA underneath it, and their implementation of DLSS is one of the worst I've seen; the quality is offensively bad compared to SGSSAA. In addition, DLSS doesn't even work in cutscenes in base PSO2: the game switches it off automatically and turns on TAA+FXAA instead. I emailed support about a dozen times after it was added; they didn't consider it a bug, and it wouldn't be fixed.
The literal only person and game studio that hears my pain and cries is Durante and his studio, with the releases they do.
AND EDIT: Sorry, I am stupid and didn't read all of your post. If you want to try: 0x81101FFF is the combination value that enables all the DX10+ bits available for forcing AA.
0x0000007F will enable looking for RGBA8, R16F, R32F, Z-buffer, RG11B10 and R24X8 formats, but it only targets textures the same size as the primary flip chain, so anything smaller than your output resolution likely won't be caught.
0x0000017F does everything the second value does, but also "Force multisampling on in the rasterizer state", whatever that actually means for hooking into the rendering.
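For completeness, the way those three values relate to each other (again, the labels are informal; the numbers and the quoted descriptions are the only things taken from the driver):

0x0000007F = look for RGBA8 / R16F / R32F / Z / RG11B10 / R24X8 buffers at primary-flip-chain size
0x00000100 = "Force multisampling on in the rasterizer state" (0x0000017F minus 0x0000007F)
0x0000007F | 0x00000100 = 0x0000017F

Both of those are contained in the everything-on value 0x81101FFF, which is why that is the broadest thing to try first.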
Last edit: 4 months 1 week ago by BONKERS.
3 months 11 hours ago #3
by nVTi
Replied by nVTi on topic Nvidia antialiasing - compatibility bits (DX1x) with ReshadeEffectShaderToggler
What software are those screenshots from? I would like to learn about the bits, but nVIDIA Profile Inspector does not have any description of them.