Digital Foundry Article Technical Discussion [2023]

Great Dead Space PC breakdown video!

With the way 2022 ended and 2023 started with PC ports, I am worried for Starfield.

Currently, I have a PS5, so I can play some of the titles I'm interested in over there, where the issues are noticeably lessened and I don't have to wait 1-4 minutes for shaders to compile or caches to be checked at boot whenever I want to start a game.

That said, I don't own a Series X. In 2020, I went with an X570/3800X-based PC, and in 2021 I upgraded my older GPU to an RTX 3060. It was already expensive and hard to get, and I wasn't willing to spend more (I got it at retail, but prices back then were above MSRP).

So the PC is good, but it doesn't have anywhere near enough power to brute-force its way past optimization issues... and I don't plan on upgrading again anytime soon.

...I am worried for Starfield. 🙁
I would not worry. Considering they only have to worry about Xbox, they should have a much better time optimizing for it. For my part, I hope the Series S doesn't have performance issues, because that's what I'll eventually be playing it on.
 
I would not worry. Considering they only have to worry about Xbox, they should have a much better time optimizing for it. [...]
I'd say a lot of the issues aren't due to multi-platform development.
It's core engine design or usage. Getting stutters like that just from walking up to a mechanical door is crazy - see the sketch below for what's likely going on.
And that's not even running the game from an HDD.

Forspoken may have targeted different APIs (PS5 & PC), but again, engine design/usage.
It wasn't looking too bad until it came out.

All you can do is wait with fingers crossed and see how a game turns out. Don't pre-order, and don't believe in a mythical day-one patch that may or may not fix 'everything'.
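
To make the door example concrete: hitches like that usually mean the level chunk behind the door is being loaded synchronously on the game thread. A toy Python sketch of the difference (names and timings are made up purely for illustration):

```python
import threading
import time

# Toy illustration (made-up names/timings) of why a door can hitch:
# if the level chunk behind it is loaded synchronously on the game
# thread, one frame eats the entire load. Prefetching on a worker
# thread as the player approaches hides the cost.
def load_chunk(name):
    time.sleep(0.2)  # stand-in for disk I/O + decompression
    return f"{name} loaded"

# Bad: blocks the frame loop for ~200 ms -> a visible stutter.
def open_door_blocking():
    return load_chunk("room_behind_door")

# Better: start the load early, so the wait at the door is brief or zero.
class Prefetcher:
    def __init__(self, name):
        self.result = None
        self._thread = threading.Thread(target=self._work, args=(name,))
        self._thread.start()

    def _work(self, name):
        self.result = load_chunk(name)

    def wait(self):
        self._thread.join()
        return self.result

prefetch = Prefetcher("room_behind_door")  # kicked off as the player approaches
# ...the game keeps rendering frames while the chunk loads...
print(prefetch.wait())  # door opens; data is already there
```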
 
Great Dead Space PC breakdown video! [...] ...I am worried for Starfield. 🙁

Well, good that you have a PS5 then. lol.
 
Great video. It would have been nice to see how RDNA 2 cards fared on some of these issues with their larger frame buffers.
In this case it just was not included due to time - I have to move on to other work, and the video is already 24 minutes long without them... and the video was already "late" (not really "late", but after release day).

A big issue with my work, I think, is that it requires a lot more time and effort than a console video - and people in our audience are extremely fickle. If a video covering PC comes out at a normal pace after a game's release, people just do not watch it, as the next new shiny thing is there for them to latch onto. That happened with Dead Space and, to a degree, Forspoken. I got Dead Space review code just at release time, but I was still working on Forspoken because I likewise got Forspoken code just at release time. That means I work on my reviews sequentially while consumers already have both games in their hands. And every day a video is not released, the fewer viewers it gets and the less money it makes, because of how fickle our audience is.

That is why, last year on Twitter, I put out a poll about which things people care most about in my PC reviews. I was trying to gauge what is most important to viewers versus less important, so I could cut production time by axing the thing people care about least. "NV vs. AMD GPU Head2Heads" scored as least important among the options I gave users, so it is the one I will most readily leave out of a video if I am pressed for time (which I always am).

 
Interesting talk about DLSS3. Contrary to the video, I found DLSS3 to be unusable in Witcher 3 but fine in Portal. I need to play around with W3 some more, though, as I think I can get it running better. I was getting a pretty poor framerate without FG (30s to 40s, I think, from memory), and turning DLSS3 on increased things into the 50s or so, but the latency and pacing were horrendous.

Cheers, but I'm wanting to avoid a full platform upgrade at the moment. I don't like to swap the mobo out too often (nothing's gone wrong before, but I'm always conscious of it!)

I'm toying with the idea of picking up a used 5800X3D on eBay, which is the fastest CPU my current system will support and would likely give me an easy 50%+ improvement. But for now I'm not hitting any real issues on this CPU outside of The Witcher, and I can live with those with frame generation in place, tbh. I'm not actually playing the game anyway, just playing with settings :D



Yeah, I can get it a fair way slower than that too. That said, it's less the city I had a problem with (which, outside of the flickering HUD elements, was much better with FG, albeit far from ideal) and more an area in a small village I was testing, which was likely a mix of CPU- and GPU-bound. There, when I turned on FG, the frametimes went crazy and input latency went through the roof - unplayably so (and I'm pretty tolerant in that respect). Eyeballing it, I'd say 200-300 ms! However, after restarting the game that settled down a lot, so it was probably just a bug related to changing the settings, perhaps in combination with RTSS, which at that point I think may still have had a frame cap engaged, albeit a generous one.
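
For what it's worth, if you want to quantify pacing instead of eyeballing it, here's a rough Python sketch that pulls average FPS and 1% lows out of a PresentMon-style frame-time capture (the column name is PresentMon's; the capture filename is hypothetical, and other tools like CapFrameX use similar fields):

```python
import csv
import statistics

# Rough frame-pacing stats from a PresentMon-style CSV capture.
# "MsBetweenPresents" is PresentMon's per-frame time column; other
# tools (CapFrameX, FrameView) use similar fields - rename to match.
def pacing_stats(csv_path, column="MsBetweenPresents"):
    with open(csv_path, newline="") as f:
        frametimes = sorted(float(row[column]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(frametimes)
    # "1% low": the frame time that only 1% of frames are slower than.
    one_pct_low_fps = 1000.0 / frametimes[int(len(frametimes) * 0.99)]
    return avg_fps, one_pct_low_fps

avg, low = pacing_stats("witcher3_capture.csv")  # hypothetical capture file
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")
```

A big gap between the average and the 1% low is exactly the "feels worse than the FPS counter says" situation.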

So I have to set the record straight here, as it seems I screwed up pretty badly with my PC config and, amusingly, may have been screwing it up for the last 2.5 years.

I recently upgraded my RAM from 16GB of DDR4-3200 to 32GB of DDR4-3600. The thing is, when I installed the 3200MHz kit originally, I must have failed to seat it properly, which led me to believe one of my DIMM slots (one of the two primary/optimal slots) was inoperable, so I installed the 16GB in the non-optimal slots instead - still dual channel, as far as I'm aware. Clearly that had a big impact on my RAM performance (I'm not certain, but it may be that it wasn't even using XMP to clock up to its full speed).

I discovered this when installing the new RAM in the same slots and being unable to get past 2666MHz. Anyway, I tried the new RAM in the optimal slots - properly seated this time - and hey presto, it worked! Now the RAM runs at its full 3600MHz.
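
If anyone wants to sanity-check theirs without rebooting into the BIOS, here's a quick Windows-only Python sketch that shells out to PowerShell and compares each DIMM's rated speed against what it's actually configured to run at (the property names come from WMI's Win32_PhysicalMemory class; ConfiguredClockSpeed needs a reasonably recent Windows):

```python
import subprocess

# Windows-only: compare each DIMM's rated speed (Speed) against what
# it's actually running at (ConfiguredClockSpeed). If the configured
# value is lower, XMP isn't applied or the board has fallen back -
# e.g. because of which slots the sticks are in.
cmd = (
    "Get-CimInstance Win32_PhysicalMemory | "
    "Select-Object BankLabel, DeviceLocator, Speed, ConfiguredClockSpeed | "
    "Format-Table -AutoSize"
)
out = subprocess.run(
    ["powershell", "-NoProfile", "-Command", cmd],
    capture_output=True, text=True,
)
print(out.stdout)
```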

The massive surprise, though, is just how huge a performance boost this gave me. I mean HUGE. I noted above that I was running in the 50s with FG on in Witcher 3, but after this change, in the same area, I'm running in the low 80s! The game is smooth as butter now, and FG makes a massive positive difference. The HUD flickering is also gone entirely now, which is a huge benefit (obviously related to a recent patch and not my upgrade). I tried another city area where standing in a specific spot gave me around 21-22fps without FG; now it's about 31fps in the same spot! With FG it's in the low 60s and plays beautifully. So 30->60fps with FG is absolutely playable and a very obvious advantage over not using it at all.

It's worth pointing out that my previously reported horrible frame pacing was probably in large part also down to running the game from an HDD; moving it to my NVMe drive cleared up a lot of that. But the performance boost has absolutely come from the RAM swap. I was able to confirm this in NFS Heat, where previously in a specific race I was dropping into the 40s and now it holds a solid 60fps. Absolutely crazy. I don't really feel the need to get that 5800X3D now!

The maddest thing is that I've probably been running at this CPU performance deficit since I got the 3700X over two years ago (which I have at times suspected, tbh), but it's only become an issue recently, now that I'm no longer bottlenecked by the GTX 1070 and newer games are really starting to push CPUs.
 
"Having to check game caches every time you load a game"?

I've been PC gaming for over 30 years and not only have I never done it, I don't even know what it is.

It's apparently a thing with COD: almost every patch necessitates a shader cache rebuild, and patches are frequent, as they are with most popular multiplayer games - so you'll be seeing it often. Better than the alternative, of course; frankly, I wish more games had this 'problem', but I can understand how annoying it is if you're waiting 5-10 minutes every time you boot up the game.
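
For what it's worth, the reason a patch forces a rebuild is usually that the cache is keyed on everything that affects compiled shader output, so any change invalidates it wholesale. A minimal Python sketch of the idea (the key fields, versions, and file layout here are hypothetical, not COD's actual scheme):

```python
import hashlib
import json
from pathlib import Path

# Hypothetical scheme: fold everything that affects compiled shader
# output into one key. If any input changes (game patch, driver
# update, different GPU), the old cache can't be trusted and the
# whole thing gets rebuilt - hence the wait after every patch.
def cache_key(game_build, driver_version, gpu_name):
    blob = json.dumps([game_build, driver_version, gpu_name]).encode()
    return hashlib.sha256(blob).hexdigest()

def cache_is_valid(cache_dir, key):
    marker = cache_dir / "cache_key.txt"
    return marker.exists() and marker.read_text().strip() == key

def rebuild_cache(cache_dir, key):
    cache_dir.mkdir(parents=True, exist_ok=True)
    # ...compile every pipeline here: this is the 5-10 minute wait...
    (cache_dir / "cache_key.txt").write_text(key)

key = cache_key("1.0.42", "531.29", "RTX 3060")  # made-up versions
cache_dir = Path("shader_cache")
if not cache_is_valid(cache_dir, key):
    rebuild_cache(cache_dir, key)
```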
 
Witcher 3 stutters like crazy when I enable ray tracing on my 6700 XT. It's not just bad FPS; it hitches, so it feels worse than the FPS average would indicate. Performance is great with RT off, but it really does look a lot better with it on.
 
The discussion of VRAM kind of tells me that we are still in cross-gen. Developers are still not making use of some of the current-gen technologies (DirectStorage, sampler feedback streaming) that are supposed to help mitigate the need for 16GB of VRAM. Many games still have a minimum spec from the Xbox One/PS4 generation.
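
To put rough numbers on why the streaming side matters: with BC7 compression (1 byte per texel), a single 4K texture plus its full mip chain is about 21 MiB, but if sampler feedback tells you only the lower mips are actually being sampled, far less has to stay resident. A quick back-of-envelope sketch:

```python
# Back-of-envelope texture memory: BC7 block compression is 1 byte
# per texel, and a full mip chain adds roughly 1/3 over the base level.
def texture_bytes(width, height, bytes_per_texel=1.0):
    total = 0
    while True:
        total += int(width * height * bytes_per_texel)
        if width == 1 and height == 1:
            break
        width, height = max(1, width // 2), max(1, height // 2)
    return total

full = texture_bytes(4096, 4096)      # everything resident
streamed = texture_bytes(1024, 1024)  # only mip 2 and below resident
print(f"full chain: {full / 2**20:.1f} MiB, streamed: {streamed / 2**20:.1f} MiB")
# -> full chain: 21.3 MiB, streamed: 1.3 MiB
```

Scale that gap across the thousands of textures in a modern game and the difference in resident VRAM is substantial.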
 