Digital Foundry Article Technical Discussion [2024]

The upside to this idea is clarity. The downside is loss of brevity, which people naturally gravitate towards. "1080p" or similar explains what's happening for 99% of titles; "1920x1080" adds nothing but doubles the communication bandwidth. When your audience is very much the layman rather than an engineer reading a technical document, the choice of language will tend towards the vernacular.

This is why, despite my petitioning, 4K is always presented as 4K and not 2160p, even though "4K" isn't precisely defined; it's just assumed to mean 2160p. No one targeting the average Joe is going to choose '3840 x 2160' over '4K' when describing an output resolution.

If you want accurate data in an article like this, you need to wrap it in a language-friendly, meaning concise, presentation. You could, for example, give numbers as percentages of a 3840 x 2160 output: pixels calculated and pixels drawn.

"Hellblade 2 renders 0.20 work and 0.25-0.35 drawn"

Brief and informative, meaning 20% as many pixels as a 3840x2160 display are calculated, and 25% to 35% of the pixels are drawn. However, that doesn't read nicely without being 'trained' to read it, whereas "xxxxp" is now naturalised.
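The mapping from those fractions back to pixel counts is simple arithmetic. A quick sketch, using 3840 x 2160 as the reference frame for the hypothetical "work / drawn" notation above:

```python
# Converting the proposed "work / drawn" fractions back to pixel counts,
# with 3840 x 2160 (UHD) as the reference resolution.
REF_W, REF_H = 3840, 2160
REF_PIXELS = REF_W * REF_H  # 8,294,400 pixels

def pixels_from_fraction(fraction):
    """Pixel count implied by a fraction of the UHD reference."""
    return round(REF_PIXELS * fraction)

# "0.20 work" -> pixels actually calculated per frame
work = pixels_from_fraction(0.20)
# "0.25-0.35 drawn" -> pixels present in the output
drawn_lo = pixels_from_fraction(0.25)
drawn_hi = pixels_from_fraction(0.35)

print(f"work:  {work:,} px")  # 1,658,880 px
print(f"drawn: {drawn_lo:,} - {drawn_hi:,} px")
```

Notably, the 0.25 "drawn" figure works out to exactly 1920x1080, which is part of why the "1080p" shorthand covers the common cases so well.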

I doubt a site making something up on its own and trying to normalise it will get very far, and I doubt it can be considered worth the effort to try. The negatives of short-form reporting using the current language are fairly niche, and most people don't care. The fact that console warriors weaponise data isn't DF's responsibility, so long as they aren't deliberately (mis)representing the data to gain hits from eager console warriors looking for ammo.
I agree with your point. But sometimes, when a game has a massive letterbox across the screen, I see them write out the full resolution. I find that helpful, because then I'm not doing mental math as I watch the video when the resolution isn't a standard one.
 
As a suggestion, perhaps DF could add a callout box with some key Top Trumps-style specs. It would sit outside the body of the content, so it wouldn't slow down the narrative, but it would add precision to the data for those who care.
 
The original Alone in the Dark, hailing all the way back from 1992, is re-imagined with Unreal Engine 4. The story and puzzle elements are dramatically overhauled, and PS5, Series X and Series S strive for 60fps gameplay in a decent performance mode. However, there are technical rough points to this release which detract from what could be a much better game.
 
I wonder if using the retro character models results in higher dynamic resolution or a reduction of frame drops in like for like scenes.
Not that anyone would want to play that way. But it would make for an interesting comparison.
 

I've tried the PC version of this, which performs fine... except it's UE4 from a small studio. So, you guessed it: no shader precompilation, plus traversal stutter. 🙄

The traversal stutter is annoying but relatively brief; the shader stutter is the key culprit, though. It will have rooms halt and skip when you enter them and just pan around. Jesus, man.
 
The traversal stutter is annoying but relatively brief; the shader stutter is the key culprit, though. It will have rooms halt and skip when you enter them and just pan around. Jesus, man.
Everybody knows about the traversal stutters, but almost no one bothers doing a damn thing about it.
 
30fps gaming is dead, and the few 30fps-only games released this gen bombed, bar maybe one (Starfield).
Current consoles have shown that 60fps could be a standard on consoles; people have been waiting for that for many years, with the same classic discussion every gen. It's not without tradeoffs, or without 30fps quality modes, but now the CPUs are better. Next gen should make 4K 60fps the standard, but who knows. Console specs can still improve dramatically, e.g. the VRAM amount.

PS2 had 32MB of RAM
PS3 had 16 times that: 512MB of RAM
PS4 had 16 times that: 8GB of RAM
PS5 has 2 times that: 16GB

This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.
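For what it's worth, those numbers do follow from the historical pattern. A quick arithmetic check, continuing the x16-per-generation multiplier from the PS4's 8GB:

```python
# Checking the x16-per-generation extrapolation.
# Actual RAM per generation: PS2 32MB, PS3 512MB (x16), PS4 8GB (x16),
# PS5 16GB (only x2).  If the x16 trend had continued past the PS4:
extrapolated = {}
ram_gb = 8  # PS4 baseline
for gen in ("PS5", "PS6", "PS7"):
    ram_gb *= 16
    extrapolated[gen] = ram_gb

print(extrapolated)  # {'PS5': 128, 'PS6': 2048, 'PS7': 32768} (in GB)
```

So a x16 PS5 would have had 128GB, and the quoted 2TB (2048GB) and 32TB (32768GB) figures are what the PS6 and PS7 would need to get back on the x16 curve.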
 
This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.
I assume this is a typo. PS6 won't have anywhere near 2TB of RAM, and a scenario where a PS7 gets 32TB is even less plausible.
 
PS2 had 32MB of RAM
PS3 had 16 times that: 512MB of RAM
PS4 had 16 times that: 8GB of RAM
PS5 has 2 times that: 16GB

This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.

This is just an unrealistic expectation. PCs at the time of the PS2 (around 2000) typically had around 128MB to 256MB of SDRAM, so the PS2 actually had too little main memory compared to the PC. Similarly, PCs around the time of the PS3 (2006-2007) typically had 2GB to 4GB of RAM. The consoles were just catching up with PC memory sizes.
If you expect the same jump in memory sizes, that would mean consoles having much more main memory than a typical PC, which would make them extremely expensive.
 
This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.

You must be referencing Cerny's new "more RAM on demand" patent. Download as much as you want!

 
This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.
The SSDs were largely a response to the inability to keep increasing RAM quantity, since price-per-GB improvements have slowed to a crawl this past decade or so. If you can stream large amounts of data into RAM quickly, you don't need as much spare data sitting around 'just in case', so you get a huge RAM-usage efficiency boost. MS likes to talk about a 'memory multiplier', which is basically a different way of saying the same thing: doing more with less. MS also brought out Sampler Feedback Streaming for the same purpose (though to what extent it has actually been used is largely unknown at this point).

More memory/memory efficiency is more important for giving headroom for developers to do more rather than push framerates. Just being able to have more data that you can throw at the screen is important in order to have much higher fidelity or detailed characters and worlds.

I also have no idea how MS/Sony plan to address this next gen, since the low-hanging fruit of switching to SSDs has already been picked. Ever-faster SSDs will only go so far here. Perhaps a cheaper, higher-capacity SSD combined with something like 128GB of ReRAM in front of it might be the boost they need.
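The "stream instead of hoard" idea above can be shown with a toy model. This is purely an illustrative sketch: the asset names and budget are invented, real streaming systems are vastly more sophisticated, but it captures why fast storage reduces the resident RAM needed, since only recently-used assets stay in memory and everything else is fetched on demand.

```python
from collections import OrderedDict

class StreamingPool:
    """Toy LRU pool: resident assets within a fixed RAM budget."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset -> size_mb, in LRU order

    def request(self, asset, size_mb):
        if asset in self.resident:            # already resident: mark as hot
            self.resident.move_to_end(asset)
            return "hit"
        # Evict least-recently-used assets until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[asset] = size_mb        # "stream in" from fast storage
        return "streamed"

# Hypothetical 1GB texture budget with 400MB assets:
pool = StreamingPool(budget_mb=1024)
pool.request("castle_rock_4k", 400)
pool.request("castle_wood_4k", 400)
pool.request("forest_moss_4k", 400)  # over budget: evicts castle_rock_4k
```

With slow storage, a real game would have to keep all three assets resident "just in case"; with fast enough streaming, the eviction is cheap to undo, so the effective budget behaves as if it were larger, which is the gist of the "memory multiplier" framing.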
 
This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.

The move to HD broke progression and the consoles have never recovered from that leap.

The amount of RAM they had was only a small part of the equation.

PS2 had a lot of 60fps games because it was stupidly powerful for the resolution it was targeting (as were Xbox and GC).

And with how CRT TVs worked back then, you could drop the resolution and still have a crisp, sharp image, whereas modern flat panels (especially in the PS360 era) looked terrible with anything that wasn't native.

Consoles will always have to choose between 60fps and prettier pixels at 30fps, and I'm not sure how the console market as a whole would react to not getting the leap in visuals they expect if developers go for 60fps.
 
This has broken the progression, but PS6 should have 2TB of RAM, and PS7 should have 32TB of RAM.
Yeah, but no. Above a certain amount it's a pointless, wasted resource. 96GB of RAM this gen would have made zero difference yet cost the earth. Same with framerate: above a certain point, you won't gain any benefit. Same with resolution. 16GB is a pretty good fit for what the rendering can do, especially with the massive improvement in storage, which isn't shown in your comparison. What was the storage speed-up and latency decrease from PS1 to PS2 to PS3 to PS4 to PS5? PS4 to PS5 is what, 3 Gbps to 44 Gbps? More RAM used to be the only option to increase data availability; PS5 moved that need to a huge storage upgrade.
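A back-of-the-envelope check on those quoted link rates (treating them as sustained, and ignoring compression, which inflates the PS5's effective rate further):

```python
# Time to fill the PS5's 16GB of RAM at the quoted link rates:
# 3 Gbps for a SATA-era drive vs 44 Gbps (~5.5 GB/s) for the PS5 SSD.
def seconds_to_fill(ram_gb, link_gbps):
    gbytes_per_s = link_gbps / 8  # 8 bits per byte
    return ram_gb / gbytes_per_s

slow = seconds_to_fill(16, 3)    # ~42.7 s at 3 Gbps
fast = seconds_to_fill(16, 44)   # ~2.9 s at 44 Gbps
print(f"{slow:.1f}s vs {fast:.1f}s to refill all of RAM")
```

Going from "most of a minute" to "a few seconds" to turn over the entire contents of RAM is exactly why the storage jump substitutes for a bigger RAM jump.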
 
I also have no idea how MS/Sony plan to address this next gen, since the low-hanging fruit of switching to SSDs has already been picked. Ever-faster SSDs will only go so far here. Perhaps a cheaper, higher-capacity SSD combined with something like 128GB of ReRAM in front of it might be the boost they need.
Why do they need to? We shouldn't be looking for upgrades for upgrades' sake. What is the work that can be done, and what are the resources needed to enable it? Given x amount of compute and processing power, so-and-so costs to produce content, and xyz streaming tech, what amount of RAM and what storage specs are needed to supply the right amount of data? Or, more realistically, what's the optimal cost/benefit: probably a non-ideal solution, but one that gets 80% of the ideal resources for 25% of the cost, the same way we balance clock speeds with power draw to get a better-balanced design than just clocking as high as possible.
 
Why do they need to? We shouldn't be looking for upgrades for upgrades' sake. What is the work that can be done, and what are the resources needed to enable it? Given x amount of compute and processing power, so-and-so costs to produce content, and xyz streaming tech, what amount of RAM and what storage specs are needed to supply the right amount of data? Or, more realistically, what's the optimal cost/benefit: probably a non-ideal solution, but one that gets 80% of the ideal resources for 25% of the cost, the same way we balance clock speeds with power draw to get a better-balanced design than just clocking as high as possible.

And ray tracing next generation should allow developers to toss out a whole heap of things, like cube maps, from their games, which should itself save a decent chunk of memory.

I wonder how much RAM a game like Spider-Man 2 uses for reflections, and what using RT only would reduce that to.
 
Yeah, but no. Above a certain amount it's a pointless, wasted resource. 96GB of RAM this gen would have made zero difference yet cost the earth. Same with framerate: above a certain point, you won't gain any benefit. Same with resolution. 16GB is a pretty good fit for what the rendering can do, especially with the massive improvement in storage, which isn't shown in your comparison. What was the storage speed-up and latency decrease from PS1 to PS2 to PS3 to PS4 to PS5? PS4 to PS5 is what, 3 Gbps to 44 Gbps? More RAM used to be the only option to increase data availability; PS5 moved that need to a huge storage upgrade.
Imagine if the XB1/PS4 era had been limited to just 4GB instead of 8GB. Games would necessarily have been limited in their detail and fidelity, especially in the second half of the generation, when almost all of the most impressive games properly using the potential of the consoles came out.

XSX/PS5 are still limited by this, and while a doubling of RAM still raises the ceiling, it would likely not be enough on its own to provide a full 'generational leap' in fidelity and ambition. If not for the help of SSDs, this gen would likely be far more limited.

Devs will always want more memory. It's headroom: a higher ceiling to push higher-fidelity assets and/or more detailed scenery. It's always, always useful. And because better assets require exponentially larger file sizes, we similarly need a lot more memory to store them all. And if you want to do this for the largest possible number of assets, you really want a very large increase in memory capacity.

We are in the very opposite situation from having 'overkill' memory quantities. It's going to be a fight to make better use of the limited memory it's economically possible to include.
 