Digital Foundry Article Technical Discussion [2023]

Nvidia's RTX 4060 has excellent efficiency, can run Cyberpunk 2077 RT Overdrive very nicely and DLSS 3 has its uses - but that's set against benchmark results that vary drastically versus the RX 7600, RTX 3060 and Arc A750, where it struggles to be competitive. Rich runs the gamut of the various benchmarks, then moves the RTX 4060 over to the mainstream gaming PC it was intended for. With a look at the future of gaming in mind, can the RTX 4060 keep up with Sony's PlayStation 5?

Edit: This video, among other DF videos, just proves once again how far behind AMD is with RT performance. Heck, at times Intel's Arc is outperforming AMD in RT.
 


Edit: This video, among other DF videos, just proves once again how far behind AMD is with RT performance. Heck, at times Intel's Arc is outperforming AMD in RT.
Don't understand the comparison to the PS5. Here in Europe the disc version costs 220€ more. The Xbox Series S would have been a better comparison because the price difference is smaller.
To be fair, they aren't stopping everyone from using DLSS. Starfield's system requirements only call for an nVidia 1070. That means all of the 10-series and 16-series nVidia users won't have it, nor the AMD users, nor the Xbox users. And not because of mean ol' AMD, but because those cards simply don't support it. I might be remembering it wrong, but I do believe the most popular graphics card on Steam's hardware survey is a 16-series GPU.
The minimum supported GPU on nVidia's side is the 1070 Ti. Only three Pascal GPUs are officially supported, and they make up less than 2.2% on Steam. On the other hand, ~37% of users have RTX GPUs faster than the 1070 Ti. So nearly 80 million PC gamers will have a worse PC experience.
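Just to put those shares into rough absolute numbers (the total Steam user count below is my own assumption for illustration, not a figure from the survey):

```python
# Back-of-envelope sketch: turn Steam survey shares into rough user counts.
# steam_users is an assumed figure for illustration only, not survey data.
steam_users = 200_000_000    # assumed number of Steam users (hypothetical)
pascal_share = 0.022         # ~2.2% on the three officially supported Pascal cards
rtx_share = 0.37             # ~37% on RTX cards faster than a 1070 Ti

print(f"Supported Pascal owners: ~{steam_users * pascal_share / 1e6:.0f} million")
print(f"RTX owners missing out on DLSS: ~{steam_users * rtx_share / 1e6:.0f} million")
# With these assumptions the RTX group comes out to ~74 million, which is the
# ballpark of the "nearly 80 million" figure above.
```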
 
Don't understand the comparison to the PS5. Here in Europe the disc version costs 220€ more. The Xbox Series S would have been a better comparison because the price difference is smaller.
Either Xbox would have been a more relevant comparison point because the consoles also use a variation of the DirectX API. I may be imagining it, but it feels like over the last few videos the PS5 has become the default comparison console ¯\_(ツ)_/¯
 
Don't understand the comparison to the PS5. Here in Europe the disc version costs 220€ more. The Xbox Series S would have been a better comparison because the price difference is smaller.

You need more than a GPU to actually play games. Nvidia is the one arguing that an equivalent GPU should cost the same as an entire console, which is ridiculous. Two and a half years into a console's lifespan, a $300 GPU should certainly be comparable to a $400 console.
 
You need more than a GPU to actually play games. Nvidia is the one arguing that an equivalent GPU should cost the same as an entire console, which is ridiculous. Two and a half years into a console's lifespan, a $300 GPU should certainly be comparable to a $400 console.

Most people already have a system though and just need a GPU upgrade.
 
The minimum supported GPU on nVidia's side is the 1070 Ti. Only three Pascal GPUs are officially supported, and they make up less than 2.2% on Steam. On the other hand, ~37% of users have RTX GPUs faster than the 1070 Ti. So nearly 80 million PC gamers will have a worse PC experience.
Well, I'm one of the people with an RTX GPU, and I love DLSS, but I think I'll wait until the game comes out before I start crying about how bad my experience is. And then I'll wait until a DLSS mod comes out before I compare what we got to what could have been.
 
Nice video. My favourite thing about ray tracing is that it depends on the GPU to work, not on the device you connect the GPU to.

Currently I'm playing on my 1080p 32" TV from 2013, which has great image quality (allegedly 12-bit, though I don't believe that claim, but it looks neat). I play on it 'cos I love to play in silence, and at 1440p and above the GPU works harder and can get noisy. The TV doesn't have HDR (it wasn't a thing 10 years ago), but I love that tradeoff as long as I can play in silence and the GPU rests on its laurels.
 
So I have to set the record straight here, as it seems I screwed up pretty badly with my PC config and, amusingly, may have been screwing up with it for the last 2.5 years.

I recently upgraded my RAM from 16GB of DDR4 3200MHz to 32GB of DDR4 3600MHz. The thing is, when I installed the 3200MHz kit originally I must have failed to seat it properly, which led me to believe one of my DIMM slots (one of the two primary/optimal slots) was inoperable, and thus I installed the 16GB in the non-optimal slots instead, but still in dual channel as far as I'm aware. Clearly that had a big impact on my RAM performance (I'm not certain, but it may be that it wasn't even using XMP to clock up to its full speed).

I discovered this when installing the new RAM in the same slots and being unable to get past 2666MHz. Anyway, I tried the new RAM in the optimal slots - properly seated this time - and hey presto, it worked!! Now the RAM runs at its full 3600MHz.
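For anyone who wants to sanity-check this kind of thing without opening the case, here's a rough sketch of how you could query it from the OS (assuming Windows with PowerShell available; the WMI class Win32_PhysicalMemory reports both a module's rated speed and the speed it's actually configured to run at):

```python
# Rough sketch (Windows + PowerShell assumed): list each DIMM's slot, capacity,
# rated speed and configured speed. If XMP isn't applied, ConfiguredClockSpeed
# will typically show the lower JEDEC speed (e.g. 2666) rather than the rated 3600.
import subprocess

query = (
    "Get-CimInstance Win32_PhysicalMemory | "
    "Select-Object DeviceLocator, Capacity, Speed, ConfiguredClockSpeed | "
    "Format-Table -AutoSize"
)
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", query],
    capture_output=True, text=True,
)
print(result.stdout)
```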

The massive surprise though is just how huge a performance boost this gave me. I mean HUGE. I noted above I was running in the 50s with FG on in Witcher 3, but after this change, in the same area I'm running in the low 80s! The game is smooth as butter now and FG makes a massive positive difference. The HUD flickering is also gone entirely now, which is a huge benefit (obviously related to a recent patch and not my upgrade). I tried another city area where standing in a specific spot gave me around 21-22fps without FG. Now it's about 31fps in the same spot! With FG it's in the low 60s and plays beautifully. So 30->60fps with FG is 100% absolutely playable and a very obvious advantage over not using it at all.
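To put rough numbers on why that 30->60 case still feels fine to me (my own back-of-envelope, purely illustrative):

```python
# Back-of-envelope sketch of frame generation at a ~31fps rendered base.
# All numbers are illustrative assumptions, not measurements.
rendered_fps = 31                  # what the CPU/GPU actually produce
presented_fps = rendered_fps * 2   # FG roughly doubles the displayed frame rate

print(f"Rendered:  {rendered_fps} fps (~{1000 / rendered_fps:.1f} ms per real frame)")
print(f"Presented: {presented_fps} fps (~{1000 / presented_fps:.1f} ms between displayed frames)")
# Motion looks ~60fps smooth, but input is only sampled on the real frames,
# so responsiveness stays closer to the 31fps base - hence it feels playable
# rather than native-60 snappy.
```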

It's worth pointing out here that my previously reported horrible frame pacing was probably in large part also down to running the game from an HDD. I found moving it to my NVMe cleared up a lot of that. But the performance boost has absolutely come from the RAM swap. I was able to confirm this in NFS Heat, where previously in a specific race I was dropping into the 40s and now it holds a solid 60fps. Absolutely crazy. I don't really feel the need to get that 5800X3D now!

The maddest thing is I've probably been running at this CPU performance deficit since I got the 3700X over two years ago (which I have at times suspected, tbh), but it's only become an issue recently now that I'm no longer bottlenecked by the GTX 1070 and newer games are really starting to push CPUs.

So I bit the bullet and finally ordered a 5800X3D to drop in my AM4 motherboard. Spiced it up with a Thermalright Peerless Assassin 120 SE to replace the AMD stock cooler too so I'm hoping for a much quieter system under load.

Can't wait to try out the various games that my 3700X was struggling to hit 60fps on!
 
It's quite apropos that Richard talked about this concept of hotspots in the 3700X review video (at 24m15s).


imo these hotspots are impossible to see in normal reviews, because they'd just have a single runthrough of the place. Meanwhile, if you're playing a single-player game and have to revisit this kind of hotspot again and again, you'd tear your hair out. Swan's Pond and the adjacent areas in Fallout 4 were notorious for this, and I saw a 50% increase going from a 3600 to a 9900K.

Then there are some specific optimization issues for Ryzens: my 12700K was almost double a 5800X in some GTAV areas, which was curiously due to the water settings, despite no water being visible on screen.

 
So I bit the bullet and finally ordered a 5800X3D to drop in my AM4 motherboard. Spiced it up with a Thermalright Peerless Assassin 120 SE to replace the AMD stock cooler too so I'm hoping for a much quieter system under load.

Can't wait to try out the various games that my 3700X was struggling to hit 60fps on!

I've been thinking about doing the same. It's tough, because there are games that really perform amazingly well with the bigger cache, and you don't have to worry about memory tuning so much. Then you get those games that are massively single-thread bound, and the lower clock of the 5800X3D means it's more limited than other CPUs. Overall it's a massive win in the majority of cases, and I already have an AM4 board so it's my cheapest path. But I'm also not sure I'm happy enough with the games that are coming out to really care. I think Alan Wake 2 will probably be the determining factor for me.
 
It's quite apropos that Richard talked about this concept of hotspots in the 3700X review video (at 24m15s).


imo these hotspots are impossible to see in normal reviews, because they'd just have a single runthrough of the place. Meanwhile, if you're playing a single-player game and have to revisit this kind of hotspot again and again, you'd tear your hair out. Swan's Pond and the adjacent areas in Fallout 4 were notorious for this, and I saw a 50% increase going from a 3600 to a 9900K.

Then there are some specific optimization issues for Ryzens: my 12700K was almost double a 5800X in some GTAV areas, which was curiously due to the water settings, despite no water being visible on screen.


I bought my AM4 board right around when the Ryzen 2000 series was coming out, but I got a 1700X on sale for dirt cheap. The gaming performance was... not great. Then when all of the 3000 series stuff came out, a ton of reviews talked about how they'd really solved the gaming problem, so I grabbed a 3600X on sale at some point while I was heavy into Apex Legends. At first I was really happy with it, but I noticed there were spots on the Apex maps where my fps tanked, and my friends with older Intel CPUs weren't having any issues. The performance of the 3600 just seemed kind of all over the place. It was generally good, but when it was bad it was very bad. Eventually I got a 5600X on sale, and I'd say it's considerably more stable than the 3600, but I think it's still true that if you run an Intel CPU for gaming you just won't suffer the same performance issues in demanding areas. The 5800X3D is probably the first really incredibly good gaming CPU that AMD put out. Unfortunately I don't think the 7800X3D, or whatever it is, has the same great value. If I don't abandon AAA games, I might go Intel for my next platform if current trends continue.
 

Interesting, but I think Oliver could have been a little clearer at the start in emphasizing that this toolkit is really meant to be used by developers as a bridge to get native, or pseudo-native, games on the Mac. He mentions it's similar to Proton, and it kind of is in how it operates, but the goals are decidedly different. Proton is meant for Linux/Steam Deck users to play Windows games natively OOTB; this is not that... yet.

Also not mentioned in the video title, but at the end there's a quick look at the native No Man's Sky port on Mac, particularly how its implementation of MetalFX Temporal Upsampling fares against DLSS and FSR when reconstructing to 1080p (spoiler: Very close to DLSS in quality, FSR2 far behind).
 