Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
Series X and PS5 both received a $50 price hike today, three years into the console cycle. Normally by that point they would have gotten $50 cheaper, so in effect they are now $100 more expensive than their comparable predecessors were at the same stage.

Manufacturing is getting more expensive and silicon is not getting cheaper this time, so a more powerful PS5 Pro would cost significantly more than past mid-gen refreshes did.
 
If you wait another year or so and aren't letting yourself be tied to previous consoles, then you could end up with Zen 5/6 with big.little cores. Like I said above, the little cores can be used for the OS while the big cores can be reserved only for gaming. With Zen 5 or 6 we should see some really large gains in performance over Zen 2.

If we're talking 2025/2026 then it's largely pointless to define it as 'mid gen'. You're talking about an entirely new console. I mean, Zen6?
 
That was the other aspect that I didn't mention. There does not seem to be any acknowledgement that AMD may be designing unconventional APU configurations for consoles that differ from what is known of AMD's disclosed GPU roadmap.

Consoles are not PCs; the architecture and bottlenecks are different. Inserting higher-clocked CPUs and increasing core GPU resources can deliver a significantly better performance profile than an equivalent PC.

Sure, but if you deviate too significantly from the current architecture, then you run the risk that realizing any of the benefits requires extensive coding changes to existing titles. The changes from the PS4 to the PS4 Pro and Xbox One X were relatively trivial in architectural terms, but there were still plenty of games that could have used Pro/One X upgrades that developers never bothered with. The QA process alone can be quite a time sink.

Part of the anticipation of the rumored PS5 Pro upgrade is that it could immediately translate this increased potential power to games automatically, as the PS5 did for PS4 games. People see the proliferation of uncapped framerate/resolution modes and think "Man, can't wait to slap in my PS5 Pro and play RE4 in res mode at 60fps!" If you change the architecture substantially, such as perhaps using chiplets for the actual GPU cores, then yes, you could potentially get a solid performance increase that's more economically viable, and one better able to be utilized in a closed system vs. the PC, but the management would pretty much have to be invisible to the developer.

My feeling though is that a midgen refresh will have to make some, if not most, of that potential performance available either automatically, or so trivially that it's barely a thought for the developer. The more esoteric you make the architecture in the name of optimization, the more it behaves like an entirely new generation and not a midgen update. If you want developers to be managing big.little cores and spreading workloads across separate GPU dies, then it kind of defeats the purpose of it being a simple tiered product line - I mean, people don't have to consider getting iPad Pro optimized versions of their software vs. iPad Air. They just pay more for the better iPad and get the benefits.
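For what it's worth, the "OS on little cores, game on big cores" split discussed above is essentially a CPU-affinity policy. A minimal sketch of the idea, assuming a Linux-like scheduler and a purely hypothetical core numbering (a real console SDK would expose its own API for this):

```python
import os

# Hypothetical core layout for illustration only: eight "big" cores
# reserved for the game, two "little" cores for OS/background work.
BIG_CORES = {0, 1, 2, 3, 4, 5, 6, 7}
LITTLE_CORES = {8, 9}

def pin_current_process(cores):
    """Restrict the calling process to the given set of CPU ids.

    Linux-only (os.sched_setaffinity); shown here just to make the
    big/little split concrete, not as console-accurate code.
    """
    os.sched_setaffinity(0, cores)

# A game process would call pin_current_process(BIG_CORES), while
# system services would call pin_current_process(LITTLE_CORES), so
# OS housekeeping never steals cycles from the game's reserved cores.
```

The point is that the platform holder can enforce this split system-wide and invisibly, which is exactly why it needn't burden developers the way exotic GPU topologies would.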
 
Series X and PS5 both received a $50 price hike today, three years into the console cycle. Normally by that point they would have gotten $50 cheaper, so in effect they are now $100 more expensive than their comparable predecessors were at the same stage.

Wait, PS5? I see only the SX got the price increase (which does indeed reinforce your point, no doubt), but are you referring to the hike the PS5 already got in 2022 for outside the US? I don't think the PS5 got an additional increase.
 
If we're talking 2025/2026 then it's largely pointless to define it as 'mid gen'. You're talking about an entirely new console. I mean, Zen6?

I think it's better than a Zen 2/RDNA 2 2024 refresh. 2025 would place the console generation at five years vs. seven for last gen, and you'd get access to Zen 5, maybe even Zen 6, along with RDNA 4 or 5, depending on when those are releasing.
Wait, PS5? I see only the SX got the price increase (which does indeed reinforce your point, no doubt), but are you referring to the hike the PS5 already got in 2022 for outside the US? I don't think the PS5 got an additional increase.
I believe the PS5 got hikes last summer, and now MS is following.
 
But if they're limited to 30fps by CPU performance, there's no way they are getting a 2x faster CPU in a mid-gen upgrade. Also, the prospect of a reasonably priced mid-gen upgrade even just focused on GPU improvements seems unlikely given that both Sony and now Microsoft have been forced to increase the prices of their current-gen consoles. Clearly there are no cost savings with them, so more performance is certainly going to come with an appreciably higher price tag.

Not to mention that being CPU limited to 30fps might even be a limitation of single core performance in some cases, especially with engines that aren't spectacularly well multi-threaded, which is still quite a lot of them.
You aren't getting 2x single thread performance at any price in 2023. You might get pretty close with a hypothetical Zen5 APU with 128MB of 3D stacked V-Cache stuck on top of it, but from a cost and yield perspective that's a complete non-starter.
 
There is a limitation to a Zen2 CPU + ~2080 GPU performance.

There are limitations of a Zen2 CPU + ~2080 GPU but the new consoles created a lot of headroom for performance over the 4 and One. I doubt most devs eagerly took advantage of that performance increase while doing so as efficiently as possible. Given the pandemic and the length of development of the last round of games, I think there is room for improvement and more effective use of the hardware.

Devs spent years just taking advantage of low-hanging fruit (things like higher fps and resolution) on gaming hardware that was above the baseline performance offered by last-gen consoles, so I doubt they were ready to fully take advantage of the performance offered by the new baseline (XSX and PS5).
 
For the 'average gamer', is there a significant difference between bilinear upscaling and any form of reconstruction at all? We can hand-wave away a ton of image quality improvements over the years by referring to some hypothetical 'average gamer' who only turns on their console for CoD and Madden.
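To make the comparison in question concrete: bilinear upscaling is just a weighted average of the four nearest source pixels, using no information from previous frames, whereas reconstruction techniques (FSR 2, DLSS) add frame history and motion vectors on top. A minimal single-channel sketch of the bilinear case (illustrative only, not any engine's actual implementation):

```python
def bilinear_upscale(src, out_w, out_h):
    """src: 2D list of floats (rows of pixels). Returns an out_h x out_w
    image where each output pixel blends its four nearest source pixels."""
    in_h, in_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map the output row back into source coordinates.
        fy = y * (in_h - 1) / max(out_h - 1, 1)
        y0 = int(fy); y1 = min(y0 + 1, in_h - 1); wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (in_w - 1) / max(out_w - 1, 1)
            x0 = int(fx); x1 = min(x0 + 1, in_w - 1); wx = fx - x0
            # Horizontal blends on the two source rows, then a vertical blend.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# Upscaling a 2x2 gradient to 3x3 fills in the midpoints:
print(bilinear_upscale([[0.0, 1.0], [0.0, 1.0]], 3, 3)[0])  # [0.0, 0.5, 1.0]
```

Everything a reconstruction technique does beyond this weighted average is exactly the image-quality gap the thread is debating whether the 'average gamer' can see.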

Hypothetical average gamer? Come on, most people are looking for a good time and do not care if the game stutters in a transition, or if it's 4K or sub-4K or 1440p and whatnot.
For the sake of simplification, if we put all users on a normal distribution curve, the Madden and CoD type of players are a larger part than the people reading DF as holy scripture.
 
Hypothetical average gamer? Come on, most people are looking for a good time and do not care if the game stutters in a transition, or if it's 4K or sub-4K or 1440p and whatnot.
For the sake of simplification, if we put all users on a normal distribution curve, the Madden and CoD type of players are a larger part than the people reading DF as holy scripture.
I've seen people think Mario Odyssey has better graphics than TLOU2 because it has a brighter color palette, so I'd say yes, many people don't care as much as us tech nerds here.
 
Hypothetical average gamer? Come on, most people are looking for a good time and do not care if the game stutters in a transition, or if it's 4K or sub-4K or 1440p and whatnot.
For the sake of simplification, if we put all users on a normal distribution curve, the Madden and CoD type of players are a larger part than the people reading DF as holy scripture.
I think you might be arguing against your own point here -- CoD games are technical standouts, and recent Madden games ship with 60/120fps modes. I don't think normal gamers obsess over this stuff, but good tech helps games feel good -- all else being equal, a game with better tech is a better game.
 
Wait, PS5? I see only the SX got the price increase (which does indeed reinforce your point, no doubt), but are you referring to the hike the PS5 already got in 2022 for outside the US? I don't think the PS5 got an additional increase.
Sorry, I meant that both have gotten a price hike after all; the PS5 didn't get another one after the 2022 hike.
 
Not to mention that being CPU limited to 30fps might even be a limitation of single core performance in some cases, especially with engines that aren't spectacularly well multi-threaded, which is still quite a lot of them.
You aren't getting 2x single thread performance at any price in 2023. You might get pretty close with a hypothetical Zen5 APU with 128MB of 3D stacked V-Cache stuck on top of it, but from a cost and yield perspective that's a complete non-starter.

More cache might have disproportionate gains. It's a problem with CPU performance gains in general: the actual performance of CPUs has greatly outpaced improvements in how quickly they can access data in memory. The PS5's Zen 2 APU only has a relatively paltry 8MB of L3 (last-level cache) before having to go to system memory, and GDDR6 wasn't exactly engineered with latency in mind.
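Rough numbers make the "disproportionate gains" point clear. All figures below are illustrative assumptions for a back-of-envelope model (the ~3.5 GHz clock matches the PS5's Zen 2, but the latency and miss rates are made up for the example):

```python
CPU_GHZ = 3.5            # PS5's Zen 2 runs at up to ~3.5 GHz
MEM_LATENCY_NS = 150     # assumed GDDR6 round-trip latency, not a measured figure

# ns * (cycles per ns) = cycles the core sits stalled on one LLC miss
stall_cycles = MEM_LATENCY_NS * CPU_GHZ
print(f"One last-level-cache miss stalls ~{stall_cycles:.0f} cycles")

def effective_cpi(base_cpi, miss_rate, miss_penalty):
    """Average cycles per instruction once memory stalls are counted in."""
    return base_cpi + miss_rate * miss_penalty

# Suppose 1% of instructions miss the LLC, and a much larger cache
# (V-Cache style) halves that miss rate:
before = effective_cpi(1.0, 0.010, stall_cycles)
after  = effective_cpi(1.0, 0.005, stall_cycles)
print(f"CPI {before:.2f} -> {after:.2f}, a {before / after:.2f}x speedup")
```

In this toy model, halving the miss rate on a memory-bound workload yields roughly a 1.7x speedup with no clock increase at all, which is why extra cache can punch so far above its weight in games.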
 
For the 'average gamer', is there a significant difference between bilinear upscaling and any form of reconstruction at all? We can hand-wave away a ton of image quality improvements over the years by referring to some hypothetical 'average gamer' who only turns on their console for CoD and Madden.

I think the better perspective is to look at it in terms of something else one might have an interest in but isn't exactly an enthusiast of.

Let's take burgers. Most people are probably fine with a McDonald's Big Mac. They aren't going to compare in detail how it stacks up against another, better burger, and for the most part they aren't going out of their way to search for one. But if you let them try something better, chances are they'll be able to tell the difference (especially in an A/B test) and they'll prefer it going forward if you can offer it at equal convenience (access and cost), while some might still prefer the Big Mac due to familiarity and subjective preference.

And the above can be applied to any type of gaming/hardware technology or really any other field/area.
 
Hypothetical average gamer? Come on, most people are looking for a good time and do not care if the game stutters in a transition, or if it's 4K or sub-4K or 1440p and whatnot.
For the sake of simplification, if we put all users on a normal distribution curve, the Madden and CoD type of players are a larger part than the people reading DF as holy scripture.

By 'hypothetical' I mean that the term 'average gamer' morphs into whatever definition the user promoting the term wants at the time. Neither of us really has any idea what standards the 'average' gamer actually holds. I mean, I could say that since CoD and Madden have been 60+ fps for years, the 'average gamer' really values 60fps.

Regardless, you don't really seem to be arguing against my point? Using my out-of-my-own-ass definition of the average gamer, I feel they would not notice the difference between bilinear upscaling and reconstruction, or at least not care that significantly. You're seemingly in agreement. I'm saying if that demographic is the one that defines what is 'worth it' with respect to acceptable reconstruction quality, we probably wouldn't have bothered with reconstruction at all. So basically I just don't see the point of invoking them with respect to what is acceptable in terms of reconstruction quality. Yeah, they probably wouldn't immediately notice the difference between FSR and DLSS. But they also wouldn't object that much to the difference between bilinear and DLSS either, so why use them in an argument about reconstruction quality at all?

It's not like FSR 2.0 has been lauded by the gamers who have been exposed to it, from what I've seen. Yes, again - maybe 'average' gamers are fine with it, but I can only go by what I've seen, which is tech reviews and posters on internet forums. Even without DLSS as a benchmark, FSR 2 in console titles is not exactly blowing people away atm. I think there's room for improvement.
 
In my experience, FSR does a fine job considering its performance cost. It's not perfect and does cause some visual instabilities, of course, and I don't think I've ever seen it enhance image quality in a game the way DLSS can, but as a performance-enhancing feature it offers a tangible uplift with acceptable image quality at the higher settings in most cases. Its near-universal compatibility is also a plus in its favor; its inclusion in games and in apps like Lossless Scaling makes it a great tool for GPUs that can't take the performance penalty of XeSS or don't support DLSS. Yes, it's the worst of the three upscalers mentioned in terms of image quality, but it's still usually better than running at its internal resolution.

I totally agree. I think FSR is pretty competitive when the source resolution is high enough, e.g. 1440p -> 4K or 1080p -> 1440p.
But it really starts to fall away when the input resolution is significantly lower than the final output resolution.

I sorta wish they had restricted it so that it would force devs to at least maintain a somewhat decent input res. However, that would restrict its usefulness too :(
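The input-resolution point is easy to make concrete with FSR 2's published per-axis quality-mode scale factors (the mode table reflects AMD's FidelityFX documentation; the helper function itself is just for illustration):

```python
# FSR 2 per-axis scale factors by quality mode, as documented by AMD.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_res(out_w, out_h, mode):
    """Internal render resolution FSR 2 upscales from for a given output."""
    s = FSR2_MODES[mode]
    return round(out_w / s), round(out_h / s)

# The same mode gives very different starting points at different outputs:
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_res(2560, 1440, "Performance"))  # (1280, 720)
```

At 4K output even Performance mode starts from a reasonably dense 1080p image, but at a 1440p output the same mode is reconstructing from 720p, which is exactly where the artifacts people complain about start piling up.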
 
I don’t imagine Sony could offer the bare minimum 2x general performance improvement required to even begin to justify a pro console without losing over one hundred dollars per unit at $600.
I don't think the PS5 and XSX need a 2x performance improvement to avoid 30fps games; they need whatever boost is required to reliably hit 60. If games are reliably hitting 40fps or 50fps, that's a far smaller boost to clocks or increase in CUs. This is really going on Todd Howard's comments on Starfield's performance on XSX: "It is running great," even sometimes at 60fps, "but on the consoles, we do lock it because we prefer the consistency, where you're not even thinking about it." Starfield runs at native 4K on XSX and 1440p on XSS.

I don't think it'll take long for mods to fix the situation on XSX. There were 60fps mods for Fallout 4 on XSX long before the title qualified for FPS boost.
 
I don’t imagine Sony could offer the bare minimum 2x general performance improvement required to even begin to justify a pro console without losing over one hundred dollars per unit at $600.

If you look at the PC space, you're basically looking at 6950XT/3090 Ti performance to offer a 2x jump in raster.

There's no way they can pack that level of performance in to a Pro console while keeping prices reasonable any time soon.
 
I don't think the PS5 and XSX need a 2x performance improvement to avoid 30fps games; they need whatever boost is required to reliably hit 60. If games are reliably hitting 40fps or 50fps, that's a far smaller boost to clocks or increase in CUs. This is really going on Todd Howard's comments on Starfield's performance on XSX: "It is running great," even sometimes at 60fps, "but on the consoles, we do lock it because we prefer the consistency, where you're not even thinking about it." Starfield runs at native 4K on XSX and 1440p on XSS.

I don't think it'll take long for mods to fix the situation on XSX. There were 60fps mods for Fallout 4 on XSX long before the title qualified for FPS boost.
It depends on what the FPS is in the "worst case" situation. If the FPS is close to 30 there, then you will need a near-2x boost. I can see FPS being around 60 on the barren planets and then tanking in the cities, like we saw in The Witcher 3 and like Alex showed with Star Citizen.
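The "how big a boost" question in this exchange comes down to worst-case frame time rather than average fps. A quick back-of-envelope:

```python
def required_speedup(worst_case_fps, target_fps=60):
    """Ratio of current worst-case frame time to the target frame budget.
    Assumes the bottlenecked work scales linearly with hardware speed,
    which is a simplification."""
    return target_fps / worst_case_fps

for fps in (30, 40, 50):
    print(f"{fps}fps worst case -> {required_speedup(fps):.2f}x needed for a locked 60")
# 30fps -> 2.00x, 40fps -> 1.50x, 50fps -> 1.20x
```

So both posts are right: a game dipping to 50fps only needs ~1.2x more headroom, but a city scene that tanks to 30fps needs the full 2x, and it's the tanking scene that sets the lock.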
 