I have what appears to be a not-so-uncommon issue where my laptop will not run a .exe if there is a dxgi.dll in the same folder as the executable. I know this is somehow due to the fact that my laptop has both a dedicated NVIDIA graphics card and an integrated Intel graphics card. So just disabling the integrated card should solve the issue, right? Only it doesn't. Because when I disable the Intel graphics card, my laptop falls back to Microsoft Basic Display Adapter and will not run *anything* on the NVIDIA card, because it thinks there are no displays attached to that card. My thought is, if I could figure out how to set the NVIDIA card as the actual default adapter for my laptop monitor, I could get this working. But I can't figure out how to change the default adapter. All the information I can find just keeps showing me how to change the preferred card *for certain programs* through the NVIDIA Control Panel, which is not helpful at all and not the same as actually changing the default adapter for the laptop monitor itself. Is there any chance someone could provide or point me toward some instructions on how to do this? I'd be very grateful.
(Also, I am already aware that renaming the dll to d3d11.dll is a much easier workaround that sometimes works. The problem is that for a few games, specifically anything from BioWare on the Frostbite engine, this doesn't work. With the dll renamed, the game no longer crashes the way it does with the dll named dxgi.dll, but ReShade still does not actually load.)
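(For clarity, the rename workaround I'm describing is just a file rename in the game's install folder. Here's a sketch using a temporary directory as a stand-in for the actual game path, which will vary per game:)

```shell
# Stand-in for the game's install folder; substitute your real path.
GAME_DIR="$(mktemp -d)"

# The ReShade DLL that triggers the crash on my setup.
touch "$GAME_DIR/dxgi.dll"

# Rename it so the game loads ReShade as a D3D11 wrapper instead.
mv "$GAME_DIR/dxgi.dll" "$GAME_DIR/d3d11.dll"

# The folder should now contain d3d11.dll and no dxgi.dll.
ls "$GAME_DIR"
```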
Last edit: 3 years 11 months ago by aboutthe1910s. Reason: oops