Fallout 4 PC discussion

I got the 970 today and played for a few hours. Went back to the Corvega plant as well. That area runs several times faster on the 970 than on the 6970, and on all Ultra settings too.

I haven't played on any GCN hardware, but maybe the game isn't running very well on AMD in general if an R9 290 doesn't handle the Corvega plant well.
 
That's what I'm starting to wonder... My 980 Ti chews through this game pretty much like it did any of the previous Fallout games. When I see someone with an R9 290 hitting 45fps and lower, that just seems "off" to me.

I did notice a few occasions where I was getting <60fps, so I've gone back to native 1440p res instead of DSR. All other settings remained Ultra except godrays at High, plus TAA. As far as MSI Afterburner can tell me, my peak recorded GPU utilization was 78% running stock voltage (1.199v) with peak clocks of 1467 core / 3758 mem (~7.5GHz effective). I haven't changed uGrids just yet.
 
I got the 970 today and played for a few hours. Went back to the Corvega plant as well. That area runs several times faster on the 970 than on the 6970, and on all Ultra settings too.

I haven't played on any GCN hardware, but maybe the game isn't running very well on AMD in general if an R9 290 doesn't handle the Corvega plant well.
This seems to be "hit'n'miss": works fine on some setups, terrible on others.
Also, this game seems to really like fast memory (especially on Intel): http://www.techspot.com/review/1089-fallout-4-benchmarks/page6.html
I just bought another 2x4GB set, but my memory is only DDR3-1600; I'm wondering if I should sell all 4x4GB and buy faster sticks just for this game.
 
I should also mention I use a Phenom II 965 @ 4GHz; I could check my CPU and GPU usage in Corvega and see if my 290 is being massively bottlenecked by my old CPU.
Seems to be a good game to test memory controller overclocking too. I can get mine up to about 2700MHz, but there's no performance increase in anything besides memory benchmarks, just more power consumption.
 
@Moloch see the TechSpot benchmark article above; it's quite likely your CPU is capping performance. The whole AMD line, both CPU and GPU, is seriously constrained in FO4.
 
Yes, my initial projections about performance may have been way off. The first hour of the game seems to perform much better than the rest; I was seeing maxed-out GPU utilization then, but in Corvega I just checked and my GPU utilization fell way down while the game was chugging.
Oh, and my Catleap 1440p monitor appears to be broken. I rebooted to test out some memory overclocking and now it doesn't want to turn on, so I'm using an old Dell TN panel now... looks like shit.
 
This seems to be "hit'n'miss": works fine on some setups, terrible on others.
Also, this game seems to really like fast memory (especially on Intel): http://www.techspot.com/review/1089-fallout-4-benchmarks/page6.html
I just bought another 2x4GB set, but my memory is only DDR3-1600; I'm wondering if I should sell all 4x4GB and buy faster sticks just for this game.
That's an insane difference. I run 16GB 1600MHz 9-9-9-24 CR1 which is pretty good for latency. This is on Ivy Bridge as well, which was shown a while back to not be as sensitive to DDR3 speeds as Haswell. Wonder if that's changed. Performance is great as previously noted, so I wonder which matters more: latency or bandwidth.
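Quick back-of-the-envelope on the latency side (rough illustrative numbers only): first-word latency is roughly CAS cycles divided by the I/O clock, so DDR3-1600 CL9 works out to about 9 / 800MHz ≈ 11.3ns, while a typical DDR3-2400 CL11 kit is about 11 / 1200MHz ≈ 9.2ns. The faster kits tend to end up ahead on both latency and bandwidth, so it's hard to tell from that TechSpot chart which of the two the game actually cares about.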

I've never even tried to overclock my RAM, and TBH even now I don't think I will since my monitor only does 60Hz anyway.
 
I'm "only" running DDR3-1666MHz, but I also get the quad channel goodness from the X79 platform. My memory really can't do anything faster reliably, no matter the timings.
 
I'm running DDR3-1600 CL8. I have a feeling GPU power is more beneficial than RAM speed, as usual. Going from a 6970 to a 970 probably quadrupled my framerate in some cases.

I noticed that the god rays aren't quite perfect even on NV though. Still some blockiness sometimes.



I should also mention I use a Phenom II 965 @ 4GHz; I could check my CPU and GPU usage in Corvega and see if my 290 is being massively bottlenecked by my old CPU.
Seems to be a good game to test memory controller overclocking too. I can get mine up to about 2700MHz, but there's no performance increase in anything besides memory benchmarks, just more power consumption.
The Corvega area tends to peg the 970 at 100% on my 4.3GHz 3570K. I'm running 2560x1440 and Ultra everything though. I didn't check framerate but it's probably around 30fps there. I can see some hesitation as I look around.
 
The TechSpot results show that the engine seems to be capped at four threads (Intel's 6-core + Hyper-Threading models seem to gain almost nothing over the 4-core i5s), and it's very dependent on single-threaded performance:

[TechSpot benchmark chart: Intel CPU scaling results]


I don't know how they managed to get the game running on 1.6GHz Jaguars. Notice how badly it runs on a dual-module Vishera at 4GHz:

[TechSpot benchmark chart: AMD CPU results]


I guess it's a completely different build between PC and consoles.
 
Just a heads up:

Stuttering and Memory patch ENBoost:
http://www.nexusmods.com/fallout4/mods/332/?

Clean Light:
This is a visual enhancement preset that uses ReShade & Sweetfx to give the game a clean cinematic look.
http://www.nexusmods.com/fallout4/mods/176/?


Fallout 4 Wasteland ENB:
The goal of the ENB is to enhance the colors, shaders, and overall visuals of the game world while minimizing FPS loss.
http://www.nexusmods.com/fallout4/mods/49/?

Vogue ENB - Realism:
This ENB and ReShade preset aims to make Fallout 4 more immersive, real looking, and atmospheric. It is designed for gameplay, so FPS hit should be minimal. Nights are slightly darker, textures are crisper, and the overall color palette has been adjusted to be more realistic.
http://www.nexusmods.com/fallout4/mods/42/?
 
I turned off godrays; they make the colors look washed out to me. Without godrays I can run 4K VSR on my 1080p TV.

Edit: Once you turn on godrays, the only way to turn them off is to select Low in the global settings, then individually set the other settings back to Ultra. You can select "off", but it just sets them to Low.
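Worth noting (an assumption on my part, I haven't verified it in this build): god rays can reportedly also be toggled from the in-game console, which sidesteps the launcher quirk, e.g.

gr off
gr quality 0

where "gr off" is supposed to disable them outright and "gr quality 0" drops them to the lowest level if you'd rather leave them on. I'm not sure the console change persists between sessions, so it may need re-entering after a restart.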
 
I do notice while my wife is playing it that it has a very washed-out look, which also seems to be the focus of the initial ENB/SweetFX presets on Nexus so far. I think I might apply them without her knowing lol, see if she notices.
 
I am running with everything set to Ultra (godrays too) at 1440p with G-Sync enabled and there are no hiccups. Corvega performed really well. I am considering upping the uGridsToLoad setting so the draw distance can be improved a bit, but I don't want the AI going bonkers off-screen all the time either.
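For reference, a rough sketch of that edit based on how the same setting worked in Skyrim and Fallout 3, so treat the exact behavior in FO4 as an assumption: in Documents\My Games\Fallout4\Fallout4.ini, under [General],

[General]
uGridsToLoad=7
uExterior Cell Buffer=64

where uGridsToLoad defaults to 5 and uExterior Cell Buffer should be kept at (uGridsToLoad+1)^2, i.e. 64 for a value of 7. The usual caveats from the older games presumably carry over: script/AI load goes up noticeably, and saves made at a higher uGrids value may not load cleanly if you later drop it back down.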
 