Well I know Alex at least frequents the forum, so good job guys.
Yup, for sure. He should be proud; just a couple of months with the team and I do think his videos have helped increase the appeal in the PC space at the very least. Very successful work.
They did have 4 GB 680s back then I think, from aftermarket partners!
This is predicated on PC gamers keeping the same GPU for an entire generation though, which I think in most instances won't be the case, at least not for those gamers who are concerned with at least matching console graphics for the whole generation. I imagine most people who game on PC more than casually, and who concern themselves with graphics quality, upgrade every 2-4 years. And on those timescales you're probably okay with a more mainstream amount of VRAM.
To see out the rest of this gen with console-matching graphics I imagine you'd need at least 12 GB. But 8 GB should last at least another 2 years, and 10 GB a year or 2 more than that (at console settings). It'd certainly be interesting to see how GPUs with 4 GB compare to consoles in the latest games at equal settings. Do they have to sacrifice texture quality? What about 6 GB GPUs?
It's the same with the inevitable drop-off in performance of PC GPUs over long periods of time. It happens because architectures cease to be supported and optimized for, both in drivers and by game developers. But while they're still in that support period they tend to stand up reasonably well in a spec-to-spec comparison with consoles. GCN fared better than most for the obvious reason that developers are still optimizing their core game engines for it even today, although AMD likely stopped giving it any love in drivers a while ago, and PC-specific features and settings likely don't accommodate it much.
RDNA2 should hold up well this gen for the same reasons, although I do think Kepler fell back more than the average amount due to its very deliberate scaling back of compute capability, which was obviously a hallmark of GCN in its day and for this whole console generation. It'd be interesting to see how Fermi holds up today in modern games vs Kepler (Kepler being scaled back from Fermi in compute capabilities). According to TPU, the GTX 580, GTX 660 Ti, GTX 1050 and HD 7950 all perform within 1% of each other at 1080p (presumably in games contemporary with the architectures). Seeing how that translates to more modern games could be really interesting.
All said and done the GCN family will end up having been in production for a decade in one form or another.
That's an interesting question about Kepler. We're long past the days of driver optimisations for either, so it'd be a real battle of the architectures to see how modern compute heavy games fared. One for DF perhaps?
One side benefit of AMD not being able to optimize as heavily for the latest games as NV is that unoptimized games tend to run better on AMD hardware. Not always, but I've found that to generally be the case back when I still had the R9 290 to test alongside the GTX 1070.
The R9 290 had significantly fewer visual bugs in games that didn't have optimizations in the drivers (basically anything that wasn't AAA and wouldn't potentially be used in a review).
And since then I've had to be careful about which NV drivers to install for the 1070, because optimizations for the newest AAA games sometimes had the unintended side effect of introducing graphical anomalies in non-AAA games (especially the one I played the most, Warframe).
Thankfully, now that the 1070 is getting old enough that NV likely won't be making driver optimizations for it anymore, perhaps I won't have to be so scared about installing a newer driver? Although it also means I don't really have a reason to install a newer driver.
Granted, the last AMD card I used heavily was the R9 290, so it's certainly possible that AMD's drivers have gotten as bad as NV's drivers of the past few years.
Regards,
SB
He covers the platform with the biggest user base, the PC. He is the best thing that could happen to DF. I like his technical analysis, and he's always in a good mood too.
Yes, I should have gone with a 7970 GHz 6 GB edition back in mid 2012, instead of the 670 2 GB lol. They were not too far off back then, but now they're on a completely different level. The 7970 GHz lets you play anything if you can live with console settings etc. Maybe I should go for a Navi 2 next year....
Yeah, a 6 GB 7970 would have been a golden card to see you through the generation. Hindsight is 20/20 and all that.
In terms of performance I don't think AMD will catch Nvidia, but in terms of performance per dollar.... I mean, lacking DLSS isn't so bad if a card costing about the same can pretty much do the same thing natively. And if I had to choose between a 16 GB big Navi and an 8 GB 3070? Well, that might take a bit of thought.
Would like to see a DF video specifically on RT once there are enough games out there to pit RDNA2 and Ampere against each other.
Timestamped for some sub-30fps misery.
Keep watching, it's turned on later. Worth noting that dynamic res was disabled for the tests, so it's not quite representative against consoles.
So it's a murky comparison.
Ah k, thanks. So it looks like there's some non-resolution bottleneck kicking things down to high 40s in the 1% lows - doesn't seem that bad? Wonder if it's RAM-related. I don't know how much RAM the game needs at low settings @ 1080p though.
It's a tough thing to compare for Doom Eternal though since it has so many different graphics settings, and I forget what the console versions look like comparatively.
With resolution scaling enabled the 570 8GB has a 17% performance advantage over the 570 4GB.
And that's at low settings.
This is not an issue with power or drivers, it's an issue with id being bloody terrible about optimising for memory now they've dropped virtual texturing. No wonder they were crying about Xbox Series X, they can't even make current-gen low settings run right on a system with VRAM at 80% of the entire memory available to games on the PS4.
Fucking hell, a 15% performance penalty for "only" having 4 GB of VRAM instead of 8 GB, when running LOW settings at 1080p. I mean, that's terrible.
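(Side note: the 17% advantage quoted above and the ~15% penalty here aren't conflicting numbers, they're the same gap viewed from opposite sides. A minimal sketch, using a made-up fps figure purely for illustration; only the 17% ratio comes from the benchmark quoted above:)

```python
# Hypothetical fps numbers for illustration; only the 17% ratio is from
# the benchmark discussed above.
fps_8gb = 70.0
fps_4gb = fps_8gb / 1.17   # the 8GB card is 17% faster than the 4GB card

advantage = fps_8gb / fps_4gb - 1    # gap from the 8GB card's side
penalty = 1 - fps_4gb / fps_8gb      # same gap from the 4GB card's side

print(f"8GB advantage over 4GB: {advantage:.1%}")   # 17.0%
print(f"4GB penalty vs 8GB:     {penalty:.1%}")     # 14.5%, i.e. the quoted ~15%
```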
I guess you mean Series S?
The new consoles are even more like a PC in some respects now, their SSDs (when using compression) allow for transfer speeds within the bandwidth range of old RAM (like DDR3). It's really not that far from being a second (and gigantic) pool of RAM.
Flushing data in and out of VRAM should make up for any deficits in the 16 GB, I would have thought.
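(For a rough sense of that claim: the raw SSD speeds below are the public console specs, and the compressed figures are the approximate "typical" numbers the platform holders have quoted for Kraken/Oodle on PS5 and BCPack on Series X, not guarantees. A quick back-of-the-envelope comparison against a single channel of DDR3-1600:)

```python
# Public raw SSD specs; "typical" compressed throughput as quoted by the
# platform holders (approximate and workload-dependent).
ssd_gbps = {
    "PS5":      {"raw": 5.5, "compressed": 8.5},   # Sony quotes 8-9 GB/s typical
    "Series X": {"raw": 2.4, "compressed": 4.8},
}

DDR3_1600_CHANNEL = 12.8  # GB/s, standard JEDEC figure for one channel

for console, speeds in ssd_gbps.items():
    frac = speeds["compressed"] / DDR3_1600_CHANNEL
    print(f"{console}: {speeds['raw']} GB/s raw, ~{speeds['compressed']} GB/s "
          f"compressed (~{frac:.0%} of one DDR3-1600 channel)")
```

So "within the bandwidth range of old RAM" holds up as an order-of-magnitude claim, though latency is a very different story, so it's more a streaming reservoir than a true second pool of RAM.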
Yes indeed. Thanks.