Digital Foundry Article Technical Discussion [2020]

Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps. Some areas are still in the 40s.
 
Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps.

They mentioned in the video that a full Pro patch still wouldn't be able to hit 1080p60, which I found an odd statement. Why not? And then in the next sentence Linneman was talking about a 4K60 mode for PS5... :)
 
Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps. Some areas are still in the 40s.

Yeah, there's a reason it's a 30fps game; it's not possible at 60, otherwise they would have done that.
 
Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps. Some areas are still in the 40s.
As much grief as people give this generation's CPUs, most games on console are GPU limited. Otherwise, dynamically dropping the rendering resolution wouldn't steady out the framerate, a technique used quite often this gen.
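
This is roughly how those dynamic resolution systems work, as I understand them; a minimal sketch of the feedback loop, where the controller name, constants, and the fake GPU times are all made up for illustration:

Code:
#include <algorithm>
#include <cstdio>

// Toy sketch of a dynamic resolution controller: measure the previous
// frame's GPU time and nudge the render resolution toward a target budget.
// All names and numbers here are invented for illustration.
struct DynamicResController {
    float scale    = 1.0f;   // fraction of native resolution (minScale..maxScale)
    float targetMs = 16.6f;  // 60fps frame budget
    float minScale = 0.5f;
    float maxScale = 1.0f;

    // Call once per frame with the last measured GPU frame time.
    void update(float gpuMs) {
        // Simple proportional step: over budget -> shrink, under budget -> grow.
        float error = (targetMs - gpuMs) / targetMs;
        scale = std::clamp(scale + 0.1f * error, minScale, maxScale);
    }

    int renderWidth(int nativeW) const  { return static_cast<int>(nativeW * scale); }
    int renderHeight(int nativeH) const { return static_cast<int>(nativeH * scale); }
};

int main() {
    DynamicResController drs;
    float fakeGpuTimes[] = {22.0f, 21.0f, 18.5f, 16.0f, 15.5f};  // pretend GPU load easing off
    for (float ms : fakeGpuTimes) {
        drs.update(ms);
        std::printf("gpu %.1f ms -> render %dx%d\n", ms, drs.renderWidth(1920), drs.renderHeight(1080));
    }
    return 0;
}

A real implementation would smooth the GPU timings and snap to resolutions the engine actually supports, but the feedback idea is the same.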
 
As much grief as people give this generation's CPUs, most games on console are GPU limited. Otherwise, dynamically dropping the rendering resolution wouldn't steady out the framerate, a technique used quite often this gen.

But that's the whole thing, my guy... engines and games BECAME GPU dependent early on this gen because they were optimized and built around taking advantage of the consoles. If the consoles had stronger CPUs, that would not have been the case.

So it's not a matter of the CPUs being underestimated, but the fact that devs intentionally design their games not to put too much stress on the CPU because of how weak it is.

If you don't do that, you get games like Assassin's Creed Unity and Just Cause 3: really terrible affairs. Not to mention all the games that end up performing badly because devs didn't parallelize workloads enough. With such low clock speeds and IPC, spreading work across all the cores with parallel operations is pretty much mandatory, so the only real options devs had were lowering CPU demands and parallelizing for the multi-core format to get as much working performance as possible under those limitations.
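
To illustrate what I mean by spreading work across the cores, here's a minimal sketch; Entity and update() are stand-ins I made up, not any real engine's API:

Code:
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Simplified sketch of splitting a per-frame workload (updating a big array
// of entities) across every available core instead of one fat thread.
struct Entity { float x = 0.0f, vx = 1.0f; };

void update(Entity& e, float dt) { e.x += e.vx * dt; }

int main() {
    std::vector<Entity> entities(100000);
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;

    std::vector<std::thread> workers;
    const std::size_t chunk = (entities.size() + cores - 1) / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end   = std::min(entities.size(), begin + chunk);
        if (begin >= end) break;
        // Each worker updates its own contiguous slice; no shared writes.
        workers.emplace_back([&entities, begin, end, dt] {
            for (std::size_t i = begin; i < end; ++i) update(entities[i], dt);
        });
    }
    for (auto& t : workers) t.join();

    std::printf("updated %zu entities across %u threads\n", entities.size(), cores);
    return 0;
}

Obviously real engines use job systems rather than spinning up raw threads every frame, but the splitting idea is the same.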

That's why next gen is so good... the CPUs in the consoles are a world away from Jaguar, and games can start doing real CPU crunching again.
 
But that's the whole thing, my guy... engines and games BECAME GPU dependent early on this gen because they were optimized and built around taking advantage of the consoles. If the consoles had stronger CPUs, that would not have been the case.

So it's not a matter of the CPUs being underestimated, but the fact that devs intentionally design their games not to put too much stress on the CPU because of how weak it is.

If you don't do that, you get games like Assassin's Creed Unity and Just Cause 3: really terrible affairs. Not to mention all the games that end up performing badly because devs didn't parallelize workloads enough. With such low clock speeds and IPC, spreading work across all the cores with parallel operations is pretty much mandatory, so the only real options devs had were lowering CPU demands and parallelizing for the multi-core format to get as much working performance as possible under those limitations.

That's why next gen is so good... the CPUs in the consoles are a world away from Jaguar, and games can start doing real CPU crunching again.
While I don't disagree with anything you said, this has been basically true of most consoles since the beginning. Consoles have been traditionally bound by their graphics processors, and almost all of them shipped with CPUs that were well below what would be considered high end in the PC space. The fact that developers had to optimize around the limitations of the platform isn't new, either.

I guess a case could be made that the weaker CPUs this gen held back gameplay innovations, but you cited two examples that seemed to be designed without regard for the limitations of current-generation systems, and they both have sequels that perform better without compromising gameplay. So maybe gameplay wasn't held back so much, and developers just needed a bit more time optimizing for the platform. Also, in Unity and JC3's defense, they run better than many 360 and PS3 games, so maybe they thought they were doing well enough. I know Unity was glitchy at launch, but that mostly got sorted out with patches.

In the PC space, the shift from CPU to GPU reliance had been happening well before 2013 when PS4 and Xbox One released. PhysX is a great, somewhat recent example of an initiative to move work from the CPU to the GPU. But remember that in the early days of 3D, all of the transform and lighting was done on CPUs while graphics processors mostly just drew pixels. T&L moved to the GPUs, most famously with nVidia's GeForce, and I think that launched the same year as Dreamcast.
 
Actually, I always thought it was the PC that pushed games to use the GPU for almost everything, because communication between the CPU and GPU has traditionally been far less efficient there; it was when games were optimized for consoles that we saw more CPU and GPU integration in the pipelines, etc.
 
Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps. Some areas are still in the 40s.
He still had to enable Boost Mode to hit 60, which is interesting, because in the snippets they showed at base clocks it looked like ~40fps (~3:14 in the video). Boost only gives a ~30% CPU uplift, which would only get you to ~52fps, so I guess it cleared some stalls? I wonder how much just adding the specialized de/compression hardware MS and Sony are touting for next gen would help current-gen consoles, even with HDDs. Would it help the 99th percentile more than the average frame time?
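
For what it's worth, the average-versus-99th-percentile distinction is basically the stutter question; here's a toy example with made-up frame times (not DF's actual tooling):

Code:
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <vector>

// Average vs. 99th percentile frame time: the average can look fine
// while the worst 1% of frames (the stutters) tell a different story.
// The sample data below is invented purely for illustration.
int main() {
    std::vector<double> frameMs(100, 16.0);   // mostly a clean 16ms...
    frameMs[42] = 50.0;                       // ...with a couple of
    frameMs[87] = 80.0;                       // nasty hitches mixed in

    const double avg =
        std::accumulate(frameMs.begin(), frameMs.end(), 0.0) / frameMs.size();

    std::sort(frameMs.begin(), frameMs.end());
    const std::size_t idx = static_cast<std::size_t>(0.99 * (frameMs.size() - 1));
    const double p99 = frameMs[idx];

    std::printf("average: %.1f ms, 99th percentile: %.1f ms\n", avg, p99);
    return 0;
}

The average barely moves (about 17ms) while the 99th percentile jumps to the hitch (50ms), which is why faster I/O or dedicated decompression could plausibly help the percentiles more than the mean.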

It would be nice if Sony and From would work together (for their mutual benefit, as surely what they learn could be applied to other titles) to give this a 4K60 (maybe even 2K120!) mode for PS5 and, while they're at it, throw in a 720p60 option for PS4 and 4K30-upscaled/1080p60 PS4 Pro modes. It would be a cool teaser for the rumored Demon's Souls remaster. Heck, make it a preorder bonus if you must; not that it would take much to get a Soulsborne fan to pay for either.

Will next-gen games all default to at least dynamic resolutions, with an eye to seamless (no patches required) future hardware upgrades? How much performance would a game sacrifice by including a dynamic 4K that only averages 1440p versus a fixed 1440p upscaled to 4K? As long as I'm dreaming: if the next gen will be more future-proof, will it also be more flexible with regard to output (supporting, say, a 1440p PC monitor natively), or are we going to stick with basic TV targets (i.e., 1080p, 4K, and 8K)?
 
The problem also stems from the DX12 branch they are using; it's not optimized, and it performs badly compared to DX11.
There will come a day when the traditional T&L model is tossed out and engines support only RT lighting. I suspect that when that day arrives, efficiency will be much greater than it is today.
But I think we're looking at huge rebuilds of engines. Or new engines from scratch.
 