I'm not a programmer, so what I'm going to say may be incorrect, but I'm afraid your idea isn't possible to achieve with ReShade, since it would require real-time dynamic rendering at a specific position in the 3D scene.
This isn't really a post-processing problem.
While it's possible to make volumetric clouds in ReShade, such an effect would only be believable if you stood still and did not look around.
If you moved or looked around, the illusion would quickly break, because ReShade cannot detect where the camera is or where it is looking.
So the effect would be like a video playing on a pair of glasses you were wearing.
It is possible to create a tool for one specific game that reads the camera position and orientation by figuring out where and how that game stores them. But since very few games store this data in the same place, and none in exactly the same way, it's practically impossible to do generically for all games, and we are not about to include a feature that works for just one game. That is how ENB does things, but we want features that work for all games.
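To illustrate why this is per-game work, here is a minimal sketch of what such a tool would have to do: extract the camera position from a raw memory snapshot at a byte offset that differs from title to title. The game names, offsets, and memory layout below are entirely invented assumptions for illustration, not real game data.

```python
import struct

# Hypothetical per-game layouts. The byte offset where the camera position
# lives inside the game's memory differs from title to title, which is why
# a generic, works-for-all-games solution is impractical.
CAMERA_OFFSETS = {
    "GameA": 0x10,  # assumption: 3 little-endian floats (x, y, z) here
    "GameB": 0x40,  # same data, but at a completely different location
}

def read_camera_position(memory: bytes, game: str) -> tuple:
    """Extract the camera position from a raw memory snapshot,
    using the game-specific offset (three 32-bit floats)."""
    offset = CAMERA_OFFSETS[game]
    return struct.unpack_from("<3f", memory, offset)

# Fake memory snapshot for "GameA": padding, then the camera at offset 0x10.
snapshot = bytes(0x10) + struct.pack("<3f", 1.0, 2.5, -4.0) + bytes(0x30)
print(read_camera_position(snapshot, "GameA"))  # (1.0, 2.5, -4.0)
```

A real tool would read from the live game process rather than a byte string, and the offset typically changes between game patches, so even a single-game tool needs ongoing maintenance.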
It's also why we currently cannot do believable rain or snow: we need to know where we are looking, and we need to know our motion vector.
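To show why the motion vector matters for rain specifically: the streak direction you see on screen is the rain's fall direction minus the camera's own velocity, so without that velocity the streaks slant wrong the moment you move. This is a toy sketch with made-up numbers, not ReShade code.

```python
def motion_vector(prev_pos, curr_pos, dt):
    """Camera velocity estimated from two consecutive frame positions."""
    return tuple((c - p) / dt for p, c in zip(prev_pos, curr_pos))

def apparent_rain_dir(fall_dir, cam_velocity):
    """Rain direction relative to a moving camera (fall minus motion)."""
    return tuple(f - v for f, v in zip(fall_dir, cam_velocity))

# Camera moved 10 units along +x over half a second -> velocity (20, 0, 0).
cam_v = motion_vector((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), dt=0.5)
# Straight-down rain now appears to slant backwards against our motion.
print(apparent_rain_dir((0.0, -9.0, 0.0), cam_v))  # (-20.0, -9.0, 0.0)
```

Since ReShade only sees the final frame, neither `prev_pos` nor `curr_pos` is available to it, which is exactly the problem described above.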
I did speculate that if we supported VR headset input, we could detect where the headset was looking, which would solve one of the issues, but only for VR games.
And it still doesn't solve the other one. Sure, you can also track the headset's position, but many games let the VR gamer sit or stand still while in motion inside the game, like in a flight or driving simulator.
So yeah - we want it too! But we can't figure out how to make it possible.
Does the same apply to fog volumes and layers in titles, or is there a possible solution where the bloom effect affects the HUD, while fog and other layers mostly affect the Ambient Occlusion and GI shaders?