Digital Foundry Article Technical Discussion [2020]

Regardless of settings, if as a developer you don't plan for a given amount of VRAM, you're going to hit problems if you go under it. 2GB is no longer enough, and hasn't been for a while for AAA games.

Never settle for a merely acceptable amount of VRAM today if you're a PC builder. Be like a fat bastard at an all-you-can-eat buffet: get as much as you can.
 
Anyway, you didn't and don't really need that 2013 Titan to outmatch the 2013 base consoles. An early-2012 HD 7950 3GB will do.

Still playing this game? At least we're moving to a better card each time! The 7950 is also behind:


Lower settings than PS4 and dropped frames. We're getting closer now though, hey. ;)

Edit: COD Warzone
 
Regardless of settings, if as a developer you don't plan for a given amount of VRAM, you're going to hit problems if you go under it. 2GB is no longer enough, and hasn't been for a while for AAA games.

Never settle for a merely acceptable amount of VRAM today if you're a PC builder. Be like a fat bastard at an all-you-can-eat buffet: get as much as you can.
Well, I guess at this point in time the number to look for is 16?
 
Still playing this game? At least we're moving to a better card each time! The 7950 is also behind:


Lower settings than PS4 and dropped frames. We're getting closer now though, hey. ;)

Edit: COD Warzone

Surprised by this one, but it comes down to VRAM count. I'm curious to see the same comparison with VRAM and SSDs in the mix a few years from now.

And it is an HD 7970.
 
Regardless of settings, if as a developer you don't plan for a given amount of vram you're going to hit problems if you go under it. 2GB is no longer important. Hasn't been for a while for AAA games.

Never settle for an acceptable amount of vram today if you're a PC builder. Be like a fat bastard at an all you can eat buffet. Get as much as you can.

This is predicated on PC gamers keeping the same GPU for an entire generation though, which I think in most instances won't be the case, at least not for those gamers who would be concerned with at least matching console graphics for the whole generation. I imagine most people who more than casually game on the PC and concern themselves with graphics quality upgrade every 2-4 years. And on those timescales you're probably okay with a more mainstream amount of VRAM.

To see out the rest of this gen with console-matching graphics I imagine you'd need at least 12GB. But 8GB should last at least another 2 years, and 10GB a year or 2 more than that (at console settings). It'd certainly be interesting to see how GPUs with 4GB compare to consoles in the latest games at equal settings. Do they have to sacrifice texture quality? What about 6GB GPUs?

It's the same with the inevitable drop-off in performance of PC GPUs over long periods of time. It happens because architectures cease to be supported and optimized for, both in drivers and by game developers. But while they're still in that support period, they tend to stand up reasonably well in a spec-to-spec comparison with consoles. GCN fared better than most for the obvious reason that developers are still optimizing their core game engines for it even today, although AMD likely stopped giving it any love in drivers a while ago, and PC-specific features and settings likely don't accommodate it much.

RDNA2 should hold up well this gen for the same reasons, although I do think Kepler fell back more than the average amount due to its very deliberate scaling back of compute capabilities, which were obviously a hallmark of GCN in its day and for this whole console generation. It'd be interesting to see how Fermi holds up today in modern games vs Kepler (Kepler being scaled back from Fermi in compute capabilities). According to TPU, the GTX 580, GTX 660 Ti, GTX 1050 and HD 7950 all perform within 1% of each other at 1080p (presumably in tested games contemporary with the architectures). Seeing how that translates to more modern games could be really interesting.
 
Some of the games in that video above actually show the VRAM usage. I'm a little surprised by it; COD Warzone is showing the 7970 using 1771MB (out of 3GB). Looks like they should be able to increase the texture resolution a fair amount. Performance is all over the place.

I'm not sure what to make of it.

Doom Eternal is maxing out VRAM at 29xxMB (can't remember the exact figure) at low texture resolution. That goes to show how much memory modern games need.
 

It'd certainly be interesting to see how GPUs with 4GB compare to consoles in the latest games at equal settings. Do they have to sacrifice texture quality? What about 6GB GPUs?

Since the base PS4 can only allocate 3.5GB to games, yes, 4GB will be more than enough.


I think the 2013 consoles are the first ones you can match or outmatch for the rest of their generation with a year-older 2012 PC build.
 
Since the base PS4 can only allocate 3.5GB to games, yes, 4GB will be more than enough.


I think the 2013 consoles are the first ones you can match or outmatch for the rest of their generation with a year-older 2012 PC build.

You're like a broken record. The past two pages of this thread have comprehensively proven that you're wrong on this. It's also been explained why you're wrong.

Even the video that you've just posted runs contrary to what you're saying. Did you even watch it?

RDR2 1080p low settings.

My recommendation would be to stop with this weird agenda.
 

Some of the games in that video above actually show the VRAM usage. I'm a little surprised by it; COD Warzone is showing the 7970 using 1771MB (out of 3GB). Looks like they should be able to increase the texture resolution a fair amount. Performance is all over the place.

I'm not sure what to make of it.

Doom Eternal is maxing out VRAM at 29xxMB (can't remember the exact figure) at low texture resolution. That goes to show how much memory modern games need.

I think many games just use more VRAM if it's available, without actually needing it. Although I certainly wouldn't be surprised if Doom Eternal and many other games required more than 3GB to match console settings and performance.

System RAM does complicate the comparison though, since it can be used to pre-cache for VRAM, where current-gen consoles don't have that option (this changes somewhat with the new generation and their SSDs).

So a PC with say 4GB VRAM and 8GB RAM may work just fine with a game that on PS4 requires 5.5GB VRAM.
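To make that concrete, here's a back-of-the-envelope sketch in Python. All the overhead figures are assumptions picked purely for illustration, not measured values:

```python
# Rough memory-budget comparison between a unified-memory console and a
# split-memory PC. Every overhead figure below is an assumed round number.

def console_gpu_budget(unified_gb=8.0, os_reserved_gb=2.5, cpu_side_gb=2.0):
    """On a unified-memory console, the GPU's share is whatever remains
    after the OS reservation and the game's CPU-side data."""
    return unified_gb - os_reserved_gb - cpu_side_gb

def pc_gpu_reachable(vram_gb, ram_gb, os_overhead_gb=2.0, cpu_side_gb=2.0):
    """On a PC, dedicated VRAM can be topped up by spare system RAM
    acting as a pre-cache/overflow pool for streamed assets."""
    spare_ram = max(0.0, ram_gb - os_overhead_gb - cpu_side_gb)
    return vram_gb + spare_ram

print(console_gpu_budget())        # 8.0 - 2.5 - 2.0 = 3.5 GB
print(pc_gpu_reachable(4.0, 8.0))  # 4.0 + (8.0 - 2.0 - 2.0) = 8.0 GB
```

Of course, spare system RAM isn't equivalent to VRAM (it sits on the far side of a PCIe bus), so this only suggests why the 4GB+8GB PC isn't automatically outclassed, not that it matches the console byte for byte.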
 
My recommendation would be to stop with this weird agenda.

I have no idea why in the world anyone would have an agenda regarding 2012 GPUs and 2013 consoles. It's almost, if not fully, retro hardware by now. How many people are actually still on 2012 PC hardware?
If we look back at previous generations, say the 6th and 7th gen, even top-of-the-line GPUs from the time of those consoles' release couldn't keep up for more than a couple of years. With the 2013 ones, if you had, say, a 7950, R9 270, R9 280, or even a 7850/7870, you could hang on for an entire generation without needing to upgrade to enjoy those games. Yes, on some you'll have to lower settings, particularly with 7870 cards that sported only 2GB.
It goes without saying that it's impressive such old hardware can play those games at comparable visual quality.

Yes, you can start list wars, showing videos where even a 6GB 7970 performs very badly, but then I can find a video where it outmatches the PS4. The thing with PCs is that configurations differ and settings can vary wildly (in many games, PS4 is actually closer to low than medium). PS4 performance isn't that great any more either, by the way: many games dip well below 30fps, and settings are often below medium. Even resolutions take hits.
Yes, 2012 GPUs like the 7950 don't have perfect performance, but neither does the PS4.

Hmm, no, it's 5.5GB for games on the base PS4, I think. And 0.5GB more for the PS4 Pro.

I meant for VRAM. The (base) PS4 is actually limited (most games, anyway) to about 3GB (±500MB) for VRAM. Most games don't even touch that. I have no source other than what a GG dev told me.
 
I meant for VRAM. The (base) PS4 is actually limited (most games, anyway) to about 3GB (±500MB) for VRAM. Most games don't even touch that. I have no source other than what a GG dev told me.

This link says they're using closer to 4.7GB of RAM for the tech demo. That doesn't mean that's the only memory available to them; they could simply not be using it all.
 
This link says they're using closer to 4.7GB of RAM for the tech demo. That doesn't mean that's the only memory available to them; they could simply not be using it all.

And this is for a launch game...
I have no idea why in the world anyone would have an agenda regarding 2012 GPUs and 2013 consoles. It's almost, if not fully, retro hardware by now. How many people are actually still on 2012 PC hardware?
If we look back at previous generations, say the 6th and 7th gen, even top-of-the-line GPUs from the time of those consoles' release couldn't keep up for more than a couple of years. With the 2013 ones, if you had, say, a 7950, R9 270, R9 280, or even a 7850/7870, you could hang on for an entire generation without needing to upgrade to enjoy those games. Yes, on some you'll have to lower settings, particularly with 7870 cards that sported only 2GB.
It goes without saying that it's impressive such old hardware can play those games at comparable visual quality.

Yes, you can start list wars, showing videos where even a 6GB 7970 performs very badly, but then I can find a video where it outmatches the PS4. The thing with PCs is that configurations differ and settings can vary wildly (in many games, PS4 is actually closer to low than medium). PS4 performance isn't that great any more either, by the way: many games dip well below 30fps, and settings are often below medium. Even resolutions take hits.
Yes, 2012 GPUs like the 7950 don't have perfect performance, but neither does the PS4.



I meant for VRAM. The (base) PS4 is actually limited (most games, anyway) to about 3GB (±500MB) for VRAM. Most games don't even touch that. I have no source other than what a GG dev told me.

There is no VRAM limit. It's unified memory: if someone wants to use only 1GB for the CPU side, that's possible, and they can allocate more memory to the GPU.
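A trivial sketch of that point. The 8GB total and ~2.5GB OS reservation are the rough figures discussed in this thread; the per-game splits are invented for illustration:

```python
# Toy illustration: on a unified-memory console, the CPU/GPU split is a
# per-game software decision, not a hardware partition. Totals are the
# rough figures from this thread; the splits below are invented.
TOTAL_GB = 8.0
OS_RESERVED_GB = 2.5
GAME_BUDGET_GB = TOTAL_GB - OS_RESERVED_GB  # ~5.5 GB left for the game

# Two equally valid ways for different games to carve up the same pool:
sim_heavy     = {"cpu_gb": 3.0, "gpu_gb": GAME_BUDGET_GB - 3.0}  # 2.5 GB GPU
texture_heavy = {"cpu_gb": 1.0, "gpu_gb": GAME_BUDGET_GB - 1.0}  # 4.5 GB GPU

for name, split in (("sim-heavy", sim_heavy), ("texture-heavy", texture_heavy)):
    # Whatever the split, the pool itself is fixed.
    assert split["cpu_gb"] + split["gpu_gb"] == GAME_BUDGET_GB
    print(name, split)
```

So any "3GB VRAM limit" would be a per-game choice (or a common convention), not something enforced by the hardware.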
 
I have no idea why in the world anyone would have an agenda regarding 2012 GPUs and 2013 consoles. It's almost, if not fully, retro hardware by now. How many people are actually still on 2012 PC hardware?
If we look back at previous generations, say the 6th and 7th gen, even top-of-the-line GPUs from the time of those consoles' release couldn't keep up for more than a couple of years. With the 2013 ones, if you had, say, a 7950, R9 270, R9 280, or even a 7850/7870, you could hang on for an entire generation without needing to upgrade to enjoy those games. Yes, on some you'll have to lower settings, particularly with 7870 cards that sported only 2GB.
It goes without saying that it's impressive such old hardware can play those games at comparable visual quality.

Yes, you can start list wars, showing videos where even a 6GB 7970 performs very badly, but then I can find a video where it outmatches the PS4. The thing with PCs is that configurations differ and settings can vary wildly (in many games, PS4 is actually closer to low than medium). PS4 performance isn't that great any more either, by the way: many games dip well below 30fps, and settings are often below medium. Even resolutions take hits.
Yes, 2012 GPUs like the 7950 don't have perfect performance, but neither does the PS4.



I meant for VRAM. The (base) PS4 is actually limited (most games, anyway) to about 3GB (±500MB) for VRAM. Most games don't even touch that. I have no source other than what a GG dev told me.

Yes, so you're in contact with Guerrilla Games and they're feeding you incorrect information on the VRAM allocation of the PS4. /s

It's a real shame that the interesting technical breakdowns and discussions with people like pjbliverpool and Iroboto are diluted by the cyclical and erroneous nature of your messages. It's honestly exhausting having to fact-check your messages constantly.

I'm yet another person to get swept up in feeling the need to correct your omissions and misleading comments. It's a shame because there are times where you're remarkably coherent, but those moments are usually undermined by the nonsensical just minutes later.
 
This link says they're using closer to 4.7GB of RAM for the tech demo. That doesn't mean that's the only memory available to them; they could simply not be using it all.

It's hard to find any information on the VRAM usage, but I'd guess around 3GB for most seems correct. If there's 5.5GB available to games (more for the Pro, or was that the Pro's figure?), I don't think most games are going to use much more than the 3-4GB VRAM figure.

It's a shame because there are times where you're remarkably coherent, but those moments are usually undermined by the nonsensical just minutes later.

Well, it's not remotely my intention to irritate you or anything. I don't really care about 2012 GPU hardware; I don't (seriously) game on it, and neither does anyone else here (I hope :p).

The findings are that mid/higher-range GPUs from the same era as the consoles keep up very well, considering RAM and driver limitations (and perhaps optimizations from devs). Don't forget either that the base consoles don't perform too well in many modern games, often dipping into the low 20s, with many settings equal to low and medium on PC. In those cases, the mid-gen consoles fare much better in performance, and even settings.

Therefore I stand by the claim that a 7950-level GPU (R9 270, R9 280) can hang on just fine if you can live with settings matched to the base consoles and are willing to adjust them. A 7850/7870 is severely limited by its 2GB of VRAM in some titles; I think in 2012, 2GB was seen as enough. In 2013, 3GB and 4GB cards began to be more common.
I don't see the nonsense in this.
 
Don't forget either that the base consoles don't perform too well in many modern games, often dipping into the low 20s, with many settings equal to low and medium on PC.

But what if... the mid/low of today is the high/highest of 2012? There are too many factors to really judge. What would a closed platform with a 7950 (or whatever) of that time look like if fully utilised?
BTW, PCs from that era weren't coupled with a sh*tty Jaguar CPU, though.
 