Digital Foundry Article Technical Discussion [2024]

The problem is this new trend we're witnessing where certain games increase LOD and shadow draw distance as the resolution increases, putting more burden on the CPU.
Wait, how is that a "problem"? That is how it *should* work. Your LOD selection *should* be relative to pixel density in both the primary view and shadow rendering (not so much the draw distance, but the resolution of the shadow maps). It would naturally work that way with any raytracing path, so why would you not want it to with rasterization?

I'm pretty confident in saying that conventional games that did not scale these things with screen resolution were doing it wrong, mostly out of convenience, and because in the past the range of aspect ratios and resolutions was much narrower than it is today.
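
To make the idea concrete, here's a minimal sketch of resolution-relative LOD selection; the structure, names, and error metric are my own illustration, not any particular engine's implementation. Each LOD's world-space geometric error is projected into pixels, so a higher output resolution naturally pulls in finer LODs with no hand-tuned distance cutoffs (the same pixel-density reasoning applies to sizing shadow maps).

#include <cmath>
#include <vector>

// Hypothetical per-LOD data: the world-space geometric error of this level.
struct Lod { float geometricErrorWorld; };

// Pick the coarsest LOD whose projected error fits within a pixel budget.
// Screen height is an explicit input, so rendering at a higher resolution
// automatically selects more detailed LODs.
int selectLod(const std::vector<Lod>& lods,  // index 0 = finest level
              float distanceToCamera,        // meters along the view axis
              float verticalFovRadians,
              float screenHeightPixels,
              float maxErrorPixels = 1.0f)
{
    // Pixels covered by one world-space meter at this distance.
    float pixelsPerMeter = screenHeightPixels /
        (2.0f * distanceToCamera * std::tan(0.5f * verticalFovRadians));

    for (int i = static_cast<int>(lods.size()) - 1; i > 0; --i)
        if (lods[i].geometricErrorWorld * pixelsPerMeter <= maxErrorPixels)
            return i;  // coarsest level that still meets the budget
    return 0;          // nothing coarser is acceptable: use the finest
}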

I haven't had a chance to listen to the DF discussion yet, though. With Nanite and VSMs these increases in shadow resolution and LODs barely impact the CPU at all, so I'm not entirely sure where that part would be coming from with UE5/Hellblade specifically.
 
@Andrew Lauritzen
I think what is being referenced is how Hellblade 2 recommends a 10700K for 1440p 30 fps at high settings and then a 12600K for 4K 30 fps at high settings, as if that extra bit of resolution magically requires so much more CPU grunt.

It is probably an oversight in the recommended specs.
 
You see this in like 90% of 'requirement' specs for games. It goes to show they shouldn't be taken that seriously; they're rough guesses for the most part, not something that a bunch of thorough testing went into.
 
It's probably something as simple as whoever wrote the requirements having one PC with a 10700K and a certain GPU and another PC with a 12600K and a much more powerful GPU, and simply listing the specs of the machines they tested on as the requirements.
 
Wow, Fallout 4 is almost 10 years old. It’s getting a lot of air time for such an old game. The shadow draw distance on console looks pretty horrible in the video. I guess people just get used to it.
 
Starfield had to drop to 900p with DRS to maintain 60fps in the new performance mode, but it can't hold 60fps in big cities, which suggests a clear CPU bottleneck, as the game drops to 40fps or lower in those sections.

The big question is: how much was the CPU bottleneck actually improved through all this?
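
For context on why DRS alone can't rescue those city sections: a dynamic-resolution controller reacts to GPU frame time only. Here is a rough sketch of the idea, with all names and constants invented for illustration:

#include <algorithm>

// Timings sampled from the previous frame.
struct FrameTimings { float gpuMs; float cpuMs; };

// Proportional controller that nudges the render-resolution scale toward a
// GPU frame-time budget. When the CPU alone blows the budget (dense crowds,
// simulation-heavy cities), no resolution scale can restore the frame rate.
float updateResolutionScale(float scale, const FrameTimings& t,
                            float targetMs = 16.6f,  // 60fps frame budget
                            float minScale = 0.5f,   // e.g. a 900p floor
                            float maxScale = 1.0f)
{
    if (t.cpuMs > targetMs)
        return scale;  // CPU-bound: lowering resolution frees no CPU time

    float error = (targetMs - t.gpuMs) / targetMs;
    return std::clamp(scale + 0.1f * error, minScale, maxScale);
}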
 

Like Final Fantasy from last year.

This is why I think MS could capitalize with a Zen 5/6 CPU console next year.
 

How is this even possible? We've been hearing the exact opposite for years now lol…

What was MS's strategy, having journalists spread FUD about superior streaming competitors?
Again, very short-sighted.
 
Because it was. Sony bought a company last year that uses AI to upscale. Even if not in use for PS5 streaming, MS may have just not upgraded their tech where Sony has.
There's no need for a conspiracy theory or MS bashing to explain it. Please just stick to intelligent, technical discussion. If something confuses you, ask and maybe you'll get a sensible or informed answer.
 
The Digital Foundry article contradicts everything ever written about this subject, so…
 
I don't understand what this page of search results means.
The DF article is looking at the service now, as opposed to what it's been since Sony acquired Gaikai. Sony have greatly improved PS streaming since their earlier Gaikai days. That doesn't contradict everything said about the two services over the years.

Edit: Older DF article:


"However, the specs don't alter when streaming PlayStation 4 games, meaning that the standard 1080p framebuffer gets a 720p downscale before encoding, then it's blown up to the display output set on the client."

i.e. it used to be a 720p stream for every game.
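
As a toy model of that old path (types and functions invented purely for illustration): render at 1080p, downscale to 720p before the encoder, then upscale on the client.

// Toy sketch of the scaling path described in the quote above.
struct Image { int width, height; };

Image downscale(const Image&, int w, int h)  { return {w, h}; }
Image codecRoundTrip(const Image& img)       { return img; }  // encoder + decoder stand-in
Image upscale(const Image&, int w, int h)    { return {w, h}; }

Image streamFrame(const Image& rendered /* 1920x1080 */, int clientW, int clientH)
{
    Image pre  = downscale(rendered, 1280, 720);   // pre-encode downscale
    Image sent = codecRoundTrip(pre);              // 720p over the wire
    return upscale(sent, clientW, clientH);        // blown up on the client
}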
 
The PS premium service is only meant to be used for playing before you install. It does not have high concurrent availability, and DLC is not supported, meaning it’s not a high-availability cloud streaming service. High-fidelity streaming is only available for select PS5 games, and only to PS5 console owners. All other titles use the older Gaikai service.

It’s extremely limited.
 