Digital Foundry Article Technical Discussion [2020]

Status
Not open for further replies.
They did have 4 GB 680s back then, I think, from the aftermarket!

Indeed they did! But I looked at the price, looked at benchmarks showing no real difference at 1080p, and assumed I'd replace it before too long.

Have to say, it's been a really good card. But with driver optimisations a thing of the past and 2GB just being too little for some games (warnings, texture streaming bugs, the odd crash) it's taught me a lesson!
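Just to put rough numbers on how quickly 2 GB disappears, here's a back-of-envelope sketch (all the sizes and counts below are illustrative assumptions, not measurements from any real game):

```python
# Back-of-envelope VRAM estimate for a 2 GB card. All figures are
# illustrative assumptions, not measurements from any particular game.

def texture_bytes(width, height, bits_per_pixel, mips=True):
    """Bytes for one texture; a full mip chain adds roughly one third."""
    base = width * height * bits_per_pixel // 8
    return base * 4 // 3 if mips else base

# Assume block-compressed material textures at 8 bits per pixel.
one_material = 3 * texture_bytes(2048, 2048, 8)   # albedo + normal + roughness
streamed_pool = 100 * one_material                # ~100 materials resident

# Half a dozen 1080p render targets at 8 bytes per pixel (e.g. RGBA16F).
framebuffers = 6 * texture_bytes(1920, 1080, 64, mips=False)

total_gb = (streamed_pool + framebuffers) / 2**30
print(f"~{total_gb:.1f} GB before meshes, shadow maps and driver overhead")
```

Even with fairly conservative assumptions you're brushing up against 2 GB before geometry and shadow maps enter the picture.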

This is predicated on PC gamers keeping the same GPU for an entire generation though, which I think in most instances won't be the case, at least not for those gamers who would be concerned with at least matching console graphics for the whole generation. I imagine most people who more than casually game on the PC and concern themselves with graphics quality upgrade every 2-4 years. And on those timescales you're probably okay with a more mainstream level of RAM.

Yeah, this is a fair point. But I'm finding the older I get, the longer my graphics cards stick around. Resale prices for larger RAM quantities seem to be better too - at least as far as I can tell from looking at eBay.

To see out the rest of this gen with console matching graphics I imagine you'd need at least 12GB. But 8GB should last at least another 2 years, and 10GB a year or 2 more than that (at console settings). It'd certainly be interesting to see how GPUs with 4GB compare to consoles in the latest games with equal settings. Do they have to sacrifice on texture quality? What about 6GB GPUs?

Yeah, I too think 8GB is going to be good for a few years - particularly at 1440p or 1080, but ultimately more would be a safer bet for lasting out next gen. Just in time delivery of assets on consoles might put a strain on PC versions until Direct Storage and SF are widely adopted. I'd guess that current gen games with console settings will be fine on 4GB, but I think there are already signs that 3GB is feeling the strain.

6GB is an interesting one, isn't it. Ample for current gen at high resolutions, but I think it'll fall by the wayside relatively quickly, with the far more common 8GB becoming the baseline as we move past cross gen games. The real kicker about the 2060 (non-S) is that it's almost really good value, but I know if I bought one it'd last me for years and I'd end up in the same position I am now. :D

It's the same with the inevitable drop off in performance of PC GPUs over long periods of time. It happens because architectures cease to be supported and optimized for, both in drivers and by games developers. But while they're still in that support period they tend to stand up reasonably well on a spec to spec comparison with consoles. GCN fared better than most for the obvious reason that developers are still optimizing their core game engines for it even today, although AMD likely stopped giving it any love in drivers a while ago, and PC specific features and settings likely don't accommodate for it much.

Commonality with consoles definitely helped GCN out. Async compute really gave it a boost compared to the likes of the GTX 6xx series. The fact that AMD stuck with it so long is probably a factor as well. Nvidia are constantly making big changes and that's cool to see every couple of years, but I guess there can be a down side for long term keepers like me. It's funny how quickly things go south once Nvidia stop supporting your card, it's like being dumped. But I guess that's a testament to the skill of their driver teams.

RDNA2 should hold up well this gen for the same reasons, although I do think Kepler fell back more than the average amount due to its very deliberate scaling back of compute capabilities, which were obviously a hallmark of GCN for its day and for this whole console generation. It'd be interesting to see how Fermi holds up today in modern games vs Kepler (Kepler being scaled back from Fermi in compute capabilities). According to TPU the GTX 580, GTX 660 Ti, GTX 1050 and HD 7950 all perform within 1% of each other at 1080p (presumably tested in games contemporary with the architectures). Seeing how that translates to more modern games could be really interesting.

That's an interesting question about Kepler. We're long past the days of driver optimisations for either, so it'd be a real battle of the architectures to see how modern compute heavy games fared. One for DF perhaps?

I also think that RDNA 2+ will have a pretty long life. Not only is it the basis of both consoles and new PC cards, but it's yet to work its way into APUs. GCN is still living strong in the latest - and quite superb - Renoir APUs. All said and done the GCN family will end up having been in production for a decade in one form or another. I can see the RDNA family lasting a long time too. Though I'm not too sure about RDNA 1, as it seems like a kind of transitional step. When (if?) RT becomes a core component of games it's going to be in a tough spot...
 
Yup, for sure. He should be proud; in his couple of months with the team I do think his videos have helped increase the appeal in the PC space at the very least. Very successful work.

He covers the platform with the biggest user base, the PC. He is the best thing that could happen to DF, I like his technical analysis, and he's always in a good mood too ;)

All said and done the GCN family will end up having been in production for a decade in one form or another.

Yes, I should have gone with a 7970 GHz Edition 6 GB back in mid 2012 instead of the 670 2 GB, lol. They were not too far apart back then, but now they're on a completely different level. The 7970 GHz lets you play anything if you can live with console settings etc. Maybe I should go for a Navi 2 next year... :p
 
That's an interesting question about Kepler. We're long past the days of driver optimisations for either, so it'd be a real battle of the architectures to see how modern compute heavy games fared. One for DF perhaps?

One thing that's been a benefit of AMD not being able to optimize as heavily for the latest games as NV is that unoptimized games tend to run better on AMD hardware. Not always, but I've found that to generally have been the case when I still had the R9 290 to test alongside the GTX 1070.

The R9 290 had significantly fewer visual bugs in games that didn't get driver-level optimizations (basically anything that wasn't AAA and wouldn't potentially be used in a review).

And since then I have to be careful about which NV drivers to install for the 1070, because optimizations for the newest AAA games sometimes have the unintended side effect of introducing graphical anomalies in non-AAA games (especially the one I played the most, Warframe).

Thankfully, now that the 1070 is getting old enough that NV likely won't be making driver optimizations for it anymore, perhaps I won't have to be so scared about installing a newer driver? Although it also means I don't really have a reason to install a newer driver. :p

Granted, the last AMD card I used heavily was the R9 290, so it's certainly possible that AMD's drivers have gotten as bad as NV's drivers of the past few years.

Regards,
SB
 

That's interesting. Come to think of it, lower budget and indie stuff does seem to be more stable on the graphics side once optimisation ends. Anecdotally, of course.

I guess they just work to the final game ready driver, and don't try to do anything too clever with leet hack optimisation.

Plus it's kind of nice not to be able to remember the last time you checked driver versions. It's the console experience, on PC. :p
 
He covers the biggest user base platform, the PC. He is the best thing that could happen to DF, i like his technical analysis, always in a good mood too ;)

I think DF has become a really well rounded group who have areas of particular interest, but are happy to bounce ideas off each other. I really enjoyed the Halo Infinite gameplay reveal discussion. It was basically a better verbalised version of three people saying "...wat?".

Yes, I should have gone with a 7970 GHz Edition 6 GB back in mid 2012 instead of the 670 2 GB, lol. They were not too far apart back then, but now they're on a completely different level. The 7970 GHz lets you play anything if you can live with console settings etc. Maybe I should go for a Navi 2 next year... :p

Yeah, a 6 GB 7970 would have been a golden card to see you through the generation. Hindsight is 20:20 and all that.

In terms of performance I don't think AMD will catch Nvidia, but in terms of performance per dollar... I mean, lacking DLSS isn't so bad if a card costing about the same can pretty much do the same thing native. And if I had to choose between a 16 GB big Navi and an 8 GB 3070? Well, that might take a bit of thought.

Would like to see a DF video specifically on RT once there are enough games out there to pit RDNA2 and Ampere against each other.
 
Yeah, a 6 GB 7970 would have been a golden card to see you through the generation. Hindsight is 20:20 and all that.

Yes, never saw that one coming. The 670 served me well anyway; I played 2500 hours of BF4 with that GPU, everything Ultra at 1080p, between 60 and 110 fps depending on whether it's a 64-player CQ map or a TDM match. But yes, I should have gone with the 7970 GHz 6 GB OC, almost PS4 Pro performance in some cases. If I remember correctly that GPU was about the same price as a 670/680. Oh well :p
Got a 7950 used last year just to test and compare games against the Kepler 670 and base PS4. It's possible to mod this one into a 7970 with a bit of luck. Haven't looked into it yet.

In terms of performance I don't think AMD will catch Nvidia, but in terms of performance per dollar... I mean, lacking DLSS isn't so bad if a card costing about the same can pretty much do the same thing native. And if I had to choose between a 16 GB big Navi and an 8 GB 3070? Well, that might take a bit of thought.

Would like to see a DF video specifically on RT once there are enough games out there to pit RDNA2 and Ampere against each other.

Guess NV will have the RT/DLSS advantage, which is a huge thing. Anyway, I'm planning on a Zen 3/Navi PC for early next year or mid 2021. Perhaps even waiting it out to late 2021. Later this month AMD has their show on Navi 2, which DF will surely cover; a 25+ TF RDNA2 GPU with 16 GB VRAM is much more interesting than an 8 GB 3070 indeed.
 
Worth noting that dynamic res was disabled for the tests, so it's not quite representative against consoles.

So it's a murky comparison.
 
Keep watching, it's turned on later.

Ah k, thanks. So it looks like there's some non-resolution bottleneck kicking things down to the high 40s in the 1% lows - doesn't seem that bad? Wonder if it's RAM-related. I don't know how much RAM the game needs at low settings @ 1080p though.

It's a tough thing to compare for Doom Eternal though since it has so many different graphics settings, and I forget what the console versions look like comparatively.
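For anyone curious what dynamic res is actually doing under the hood, a toy controller looks something like this (purely illustrative; real engines use more sophisticated prediction and smoothing):

```python
# Toy dynamic resolution controller: scale render resolution to chase a
# frame-time target. Purely illustrative; real engines are fancier.

TARGET_MS = 16.7                # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0 # per-axis resolution scale limits

def next_scale(scale, frame_ms):
    # GPU cost is roughly proportional to pixel count, i.e. scale squared,
    # so estimate the scale that would have hit the target exactly...
    ideal = scale * (TARGET_MS / frame_ms) ** 0.5
    # ...then move only part of the way there to avoid oscillation.
    new = scale + 0.5 * (ideal - scale)
    return max(MIN_SCALE, min(MAX_SCALE, new))

scale = 1.0
for frame_ms in (20.0, 19.0, 17.5, 16.0):   # a brief spike, then recovery
    scale = next_scale(scale, frame_ms)
    print(f"{frame_ms:4.1f} ms -> render at {scale:.0%} per axis")
```

The key point for benchmarking is that with this on, frame rate stops being a clean measure of GPU load, which is why DF disable it for like-for-like tests.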
 
The 7950 has almost 60% more compute than the GPU in the PS4 and 40% more bandwidth than the entire console, and sometimes performs only comparably to it in more recent titles. Buying one in its timeframe was also more expensive than a PS4. It is not correct to say you could buy hardware comparable to a console and play games fine throughout the generation. Again, this is the best case scenario. Those with Nvidia GPUs generally get what you see in the video linked by Jawed.
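Putting numbers on that, using the public spec-sheet figures (7950: 2.87 TFLOPS, 240 GB/s; PS4: 1.84 TFLOPS, 176 GB/s for the whole console):

```python
# Paper specs from public spec sheets: HD 7950 vs the PS4's GPU.
tf_7950, tf_ps4 = 2.87, 1.84    # FP32 TFLOPS
bw_7950, bw_ps4 = 240.0, 176.0  # GB/s (the PS4 figure covers the whole console)

print(f"compute:   +{tf_7950 / tf_ps4 - 1:.0%}")   # "almost 60% more"
print(f"bandwidth: +{bw_7950 / bw_ps4 - 1:.0%}")   # close to the 40% quoted
```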
 

Timestamped for some sub-30fps misery.

With resolution scaling enabled the 570 8GB has a 17% performance advantage over the 570 4GB.

And that's at low settings.

This is not an issue with power or drivers, it's an issue with id being bloody terrible about optimising for memory now they've dropped virtual texturing. No wonder they were crying about Xbox Series [edit] S!, they can't even make current gen low settings run right on a system with VRAM at 80% of the entire memory available to games on the PS4.

Fucking hell, a 15% performance penalty for "only" having 4GB of vram instead of 8GB, when running LOW settings at 1080p. I mean, that's terrible.
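For what it's worth, the 17% advantage and the ~15% penalty are the same gap viewed from opposite ends:

```python
# A 17% advantage for the 8 GB card over the 4 GB card is the same gap
# as a ~15% penalty for the 4 GB card, just measured from the other side.
advantage = 0.17
penalty = 1 - 1 / (1 + advantage)
print(f"{penalty:.1%}")
```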
 
It's a tough thing to compare for Doom Eternal though since it has so many different graphics settings, and I forget what the console versions look like comparatively.

Think it was quite a few settings on low and medium on the base consoles, with dips below 60 here and there as well. No idea about resolution scaling, but I think both base and mid gen consoles are doing that for Eternal. Don't forget that the base consoles often run low and medium settings in modern games, as well as having terrible performance in some cases. They are really ready for a new generation.
It's basically only Sony's first party games that don't suffer on the base PS4 console.

The 7950 was released in Jan 2012 though... I mean, that's almost two years before the PS4 and One. The same timeframe would be the R9 270, R9 280, R9 290X etc.
I didn't get the 7950 in 2012, but if I had, I could have enjoyed just about any game at better than PS4/One performance for most games.
The 7870 is not far off matching either, although with larger dips when they do happen, as opposed to the PS4.

Maybe DF/Alex can do a DF Retro article on this; it's rather time consuming, but when they have time :p It's already tiring for me testing and comparing between the 7950, 670 and PS4.
 
With resolution scaling enabled the 570 8GB has a 17% performance advantage over the 570 4GB.

And that's at low settings.

This is not an issue with power or drivers, it's an issue with id being bloody terrible about optimising for memory now they've dropped virtual texturing. No wonder they were crying about Xbox Series X, they can't even make current gen low settings run right on a system with VRAM at 80% of the entire memory available to games on the PS4.

Fucking hell, a 15% performance penalty for "only" having 4GB of vram instead of 8GB, when running LOW settings at 1080p. I mean, that's terrible.

I guess you mean Series S?

The new consoles are even more like a PC in some respects now, their SSDs (when using compression) allow for transfer speeds within the bandwidth range of old RAM (like DDR3). It's really not that far from being a second (and gigantic) pool of RAM.

Flushing data in and out of VRAM should make up for any deficits in the 16GB I would have thought.
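Putting the quoted figures side by side (the raw speeds are the official ones; the "typical compressed" numbers are the rough figures both vendors have given):

```python
# Publicly quoted console SSD figures vs one channel of DDR3-1600.
# The "typical compressed" numbers are rough vendor estimates.
ssd_gbs = {
    "PS5 raw": 5.5,
    "PS5 compressed (typical)": 9.0,
    "Series X raw": 2.4,
    "Series X compressed (typical)": 4.8,
}
ddr3_1600_single_channel = 12.8  # GB/s: 64-bit bus at 1600 MT/s

for name, gbs in ssd_gbs.items():
    frac = gbs / ddr3_1600_single_channel
    print(f"{name:28s} {gbs:4.1f} GB/s  ({frac:.0%} of DDR3-1600, one channel)")
```

So "within the bandwidth range of old RAM" holds up for the compressed figures, with the obvious caveat that latency is still orders of magnitude worse than actual RAM.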
 

Microsoft estimate that, with the SSD, the memory is effectively multiplied by 2.5 compared to a system without one. With 13.5 GB available to the game, that's comparable to 33.75 GB of unified RAM. I think the consoles will run better at the end of the generation than the base PS4/XB1 did, because of the 16 GB of unified memory coupled with the SSD and a good Zen 2 CPU rather than the Jaguar horror.

It will be better on the GPU side too, with an architecture releasing the same year as the consoles, and maybe more competitive against Nvidia GPUs this time. We will have the answer on the 28th of October.

Going for a price of 499 dollars/euros was the right decision; it will help the consoles last longer with a good level of performance.
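The arithmetic behind those numbers, for anyone checking:

```python
# Microsoft's claim as quoted: the SSD makes the memory behave as if
# there were ~2.5x as much of it.
game_visible_ram_gb = 13.5  # of the Series X's 16 GB, ~13.5 GB is for games
multiplier = 2.5
effective_gb = game_visible_ram_gb * multiplier
print(f"{game_visible_ram_gb} GB x {multiplier} ~= {effective_gb} GB effective")
```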
 
Probably a good thing the west hasn't been given PS5s, DF's workload would be crazy.

Although it's a shame MS didn't send them 2 or 3, so they could spread the work?
Then they could be ready with relevant articles/vids when each embargo gets lifted.

Maybe the next embargo is next week?
 