- Posts: 2
4.4 causes FPS issues.
- Tempus Lux
- Topic Author
I could literally be standing AFK in-game doing nothing, and the game would lose 10-20 FPS for a few seconds and stutter atrociously. Any reason why this might be?
I'm playing Transformers: Fall of Cybertron, in case anyone was wondering.
- Posts: 1
Didn't mean to steal your thread; just sharing my experience too, in case it encourages the devs to take a look at any potential fixes.
I personally haven't seen what xtremeheat is noticing above, but I want to report an odd performance issue that's present in 4.4:
The repository EyeAdaption shader, and the older Pirate's Eye Adaption shader, which presumably work in similar ways, both cause FPS to quite literally halve when enabled, even as the only shader selected. Looking at the code it doesn't appear to be a complex shader, but I've tested this across a multitude of games. I don't remember this performance impact on older versions of ReShade; I suspect it started with the 4.4 series (I vaguely remember using it prior to 4.4 at times). It's very bizarre -- I mean, it slices FPS far more than MXAO, for example, which has to be the most demanding shader in the repository currently.
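For readers following along, here's a minimal Python sketch of what eye-adaption effects typically do -- a hypothetical illustration, not the actual EyeAdaption.fx source; the `adapt_luminance` helper and its numbers are made up:

```python
# Hypothetical sketch (not the actual EyeAdaption.fx source) of the usual
# eye-adaption technique: average the scene luminance, then ease the adapted
# value toward it a little each frame so exposure changes gradually.

def adapt_luminance(prev_adapted, pixel_lums, speed=0.1):
    """Blend last frame's adapted luminance toward this frame's average."""
    frame_avg = sum(pixel_lums) / len(pixel_lums)
    # Temporal smoothing: the result depends on the PREVIOUS frame's value,
    # so a real shader keeps a small persistent texture between frames.
    return prev_adapted + (frame_avg - prev_adapted) * speed

# Eyes adapted to a bright scene (0.8) seeing a dark frame (average 0.2):
print(adapt_luminance(0.8, [0.1, 0.2, 0.3]))  # ~0.74, drifting toward 0.2
```

The point of the sketch is the per-frame state: a shader like this isn't arithmetically heavy, so a large FPS hit would suggest something around it (driver behavior, resource residency) rather than the math itself.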
Can you investigate Crosire?
Edit: If it matters, I'm running at 4K, though I don't imagine Eye Adaption is impacted by resolution the way shaders like MXAO are, for example.
- Posts: 302
I'm the author of EyeAdaption and just tested it with a DX9 and a DX11 game (on Windows 7), and I could not reproduce your issue. The FPS hit is the same as in older versions. On what OS and API do you experience this?
I'm running Windows 10 1809. The API for my most recent tests is DX11. If it matters, this is also while using SLI (2 x 1080 Tis).
I noticed this reduction in FPS most recently in Dying Light and now The Outer Worlds. I tried to take screenshots showing the ReShade UI and FPS counter, but the screenshot tool doesn't capture the ReShade interface. In short, however, here's my most recent example:
The Outer Worlds - 4K @ 60fps (vsync capped by screen) Ultra details except shadows (to maintain 60fps) --
No shaders -- 60 fps
Only enable Eye Adaption -- 44 fps.
Only enable MXAO -- 60 fps (and considering this is MXAO at 4K!)
I can't imagine why Eye Adaption would have a more significant impact than MXAO (which is usually the most demanding shader). I don't remember this happening with older Reshade versions and Eye Adaption.
I checked the log in ReShade and there are no compile errors or warnings, but I can paste the log if you need it.
EDIT: OK, further testing. I tried some other games:
Deus Ex: MD
Deep Rock Galactic
Rise of the Tomb Raider
All of these worked fine with EA -- no performance issues -- and all of the above are with SLI too. Looking at the stats page, EA was using on average 0.5 ms of GPU time in these games on my system. Loading The Outer Worlds again, EA uses 9.8 ms, while MXAO uses 2.1 ms. Maybe the issue is very engine-specific?
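As a rough sanity check on those timings, GPU time per frame converts to FPS directly, assuming the shader's cost simply adds to the base frame time. The `fps_with_shader` function below is a hypothetical helper for this thread, not part of ReShade:

```python
# Back-of-the-envelope: convert a shader's per-frame GPU time (as shown on
# ReShade's statistics tab) into an estimated FPS, assuming its cost adds
# directly to the base frame time.

def fps_with_shader(base_fps, shader_ms):
    """Estimated FPS after adding a shader costing `shader_ms` per frame."""
    base_frame_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_ms + shader_ms)

print(round(fps_with_shader(60, 9.8), 1))  # EA in The Outer Worlds: ~37.8
print(round(fps_with_shader(60, 2.1), 1))  # MXAO there: ~53.3
print(round(fps_with_shader(60, 0.5), 1))  # EA in the other games: ~58.3
```

The 9.8 ms figure alone predicts a drop into the high 30s, the same ballpark as the observed 44 fps (the gap is plausibly measurement noise or work overlapping other passes), while 0.5 ms barely dents a 60 fps budget.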
- Posts: 3785
- Tempus Lux
- Topic Author
- Posts: 2
Basically, my laptop's fans were clogged with dust and I had to get them cleaned, because I noticed my system was getting hot just from browsing the internet.
I just got my fans cleaned and some new thermal paste applied, and now all my games are running fine at 60 FPS again, with as many shaders as I desire.
The only "dips" I'm getting are during quick loading zones as I enter a new BSP, and they last about 0.1 seconds -- but that's a given, seeing as many games freeze as you enter a new area in campaigns anyway.
Other than that, I'm getting no FPS drops anymore.
MORAL OF THE STORY: Keeping fans clean grants a healthy screen.
Sorry for delayed reply.
crosire wrote: It does matter a lot. SLI does not work well with ReShade, because the driver has to do game-specific magic behind the scenes for that to work, which greatly messes up ReShade: ReShade expects all textures to reside on a single GPU, so if the driver moves those allocations around or duplicates them, things get bad (loading a texture over the PCIe bus is slow). This is not specific to 4.4, you just either get lucky or not when enabling SLI. The stats may change just from enabling/disabling a few other effects, depending on what the driver is doing.
Well as usual @Crosire you are right. I tested The Outer Worlds with SLI disabled and the Eye Adaption shader used a fraction of its GPU usage, and performance was back on par.
That said, I've been using SLI since the Voodoo days, which means I've always had SLI enabled since the inception of SweetFX and everything that came after, including, of course, ReShade -- and this is the first time I've noticed any performance issues (or, if they have been present, they haven't been noticeable or impactful).
I have 112 shaders in my Shaders directory and I've been using ReShade with SLI across what must by now be hundreds of games over the years. It appears only this shader technique and this game are an issue (I re-checked Dying Light and it was fine; it's only Outer Worlds).
I also tried Pirate's Light Adaption as an extra test, with the same result (that is, a massive performance hit in Outer Worlds with SLI). So I'd say ReShade with SLI is 99% fine, with the exception of this particular technique and this particular engine.