Need help optimizing FPS / adjusting shaders for a complex preset
- Yundaz
- Topic Author
I am currently working on a rather complex preset that uses a lot of the great shaders out there to achieve the best-looking result in-game.
However, stacking plenty of shaders on top of one another causes FPS drops, especially with heavy ones like qUINT_mxao, qUINT_ssr, ReflectiveBumpMapping, RadiantGI and a few others.
Now I know what you are thinking: that must look horrendous with all those shaders together, especially when you read that I want to use qUINT_ssr live in-game!
But I often apply them very subtly, combined with a CanvasMask shader so I can restrict a specific shader to a specific area only. For example, I use the SSR shader to make it look like light is bouncing off distant environment objects. So I am using most of these shaders in tricky, hacky ways.
It started out as a concept to see how many shaders I could use. A lot of Guild Wars 2 players really like the look, but the large FPS drop, from 130 FPS down to 50 and lower, makes them reluctant to use the preset for now. Guild Wars 2 already performs poorly FPS-wise, so a preset that drags performance down even further is a big no-no for the community.
The question is: is there any way I can win back some FPS here? Since I use these shaders in a subtle, not-your-usual way, I often do not need all the detail, accuracy or resolution most of them provide.
I am not familiar with code, but I was thinking there might be a way to lower the resolution and the amount of calculation these shaders need in order to achieve what I want. I am a 30-year-old VFX artist who found a hobby in creating complex preset files and sharing what I have been creating with the tools provided. Any help is welcome! I would really like to make this work. Could anybody help me with this endeavour? Thank you for your time.
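For what it's worth, ReShade FX does let a pass render into a smaller intermediate texture, which cuts the number of pixels an expensive effect has to process to a quarter at half resolution. This is only a hedged sketch of the general technique, not a patch for any of the shaders named above; every name here (HalfTex, PS_Expensive, etc.) is hypothetical:

```hlsl
#include "ReShade.fxh"

// Half-resolution intermediate target: the expensive pass runs on 1/4 as many pixels.
texture HalfTex { Width = BUFFER_WIDTH / 2; Height = BUFFER_HEIGHT / 2; Format = RGBA8; };
sampler sHalfTex { Texture = HalfTex; };

float4 PS_Expensive(float4 vpos : SV_Position, float2 uv : TEXCOORD) : SV_Target
{
    // ...imagine the costly sampling loop of an SSR/AO-style effect here...
    return tex2D(ReShade::BackBuffer, uv);
}

float4 PS_Upsample(float4 vpos : SV_Position, float2 uv : TEXCOORD) : SV_Target
{
    // Bilinear upsample back to full resolution; fine for soft, subtle effects.
    return tex2D(sHalfTex, uv);
}

technique HalfResEffect
{
    pass { VertexShader = PostProcessVS; PixelShader = PS_Expensive; RenderTarget = HalfTex; }
    pass { VertexShader = PostProcessVS; PixelShader = PS_Upsample; }
}
```

For soft, diffuse effects used subtly (as described above), the quality loss from half-resolution rendering is often hard to notice. Many shaders also expose quality/sample-count settings or preprocessor definitions that are a code-free way to get a similar trade-off.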
My discord name: Yundaz#4269
ps: I can send you some example images if interested.
- robgrab
Reshade High Contrast Filmic LUT
Obviously, depth-based shaders for things like depth of field and ambient occlusion are the costliest and will hurt performance regardless. If you want to know how to create a LUT, here's a link to that thread:
LUTs: Powerful Color Correction - The Guide
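As a rough illustration of why a LUT is nearly free: the whole color grade collapses into a single texture fetch per pixel, no matter how many color/contrast shaders were baked into it. This is a simplified sketch loosely modeled on the stock LUT shader; the 1024x32 strip layout, tile size and names are assumptions, and the real shader interpolates between neighboring tiles:

```hlsl
// Hypothetical 32-tile LUT strip: blue picks the tile, red/green index within it.
texture LutTex < source = "lut.png"; > { Width = 1024; Height = 32; };
sampler sLutTex { Texture = LutTex; };

float3 ApplyLUT(float3 color)
{
    const float tileSize = 32.0, tileCount = 32.0;
    float2 uv;
    uv.x = (floor(color.b * (tileCount - 1)) * tileSize
            + color.r * (tileSize - 1) + 0.5) / (tileSize * tileCount);
    uv.y = (color.g * (tileSize - 1) + 0.5) / tileSize;
    return tex2D(sLutTex, uv).rgb; // one fetch replaces an entire shader stack
}
```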
- Viper_Joe
robgrab wrote: One tip is to combine shaders that affect things like color and contrast into a single LUT file, which has no performance impact. I'd created a Resident Evil 3 Reshade preset, consisting of about 10 different shaders, that was killing my performance. Once I combined everything into a LUT file I only needed to load one shader (LUT.fx). While it didn't look 'exactly' the same it was close enough.

I've noticed this as well whenever I convert multiple color grading effects to a LUT image. Is there actually a particular reason why that happens? I was under the impression that it should look exactly the same.
- canceralp
One other reason might be the shader itself. Take the Contrast Stretch shader, for example: it is a smart filter that adjusts its parameters according to the image at that moment. So if you take a screenshot at any moment, it only captures that moment's values, and the LUT will not contain the shader's adaptive behavior.
Vibrance, Clarity, anything that blurs or sharpens, and Lightroom's Gamma are all subject to errors in the LUT file.
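To illustrate the point: a LUT can only encode a fixed mapping from input color to output color, while an adaptive filter also depends on per-frame statistics. A hedged sketch of an adaptive contrast stretch (names and formula are hypothetical, not the actual shader's code):

```hlsl
// frameMin/frameMax would be measured from the current frame's histogram,
// so the mapping changes every frame. A LUT captured from one screenshot
// freezes a single frame's values forever.
float3 ContrastStretch(float3 color, float frameMin, float frameMax)
{
    return saturate((color - frameMin) / max(frameMax - frameMin, 1e-5));
}

// A LUT, by contrast, is a pure function of the input color alone:
// ApplyLUT(color) can never react to frameMin or frameMax.
```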