Digital Foundry Article Technical Discussion [2023]

The REAL failure comes from Microsoft's own studios not supporting the device with AAA games which push production values. Yes, MS has games which look awesome.. Forza Motorsport/Horizon, Gears, MS Flight Sim, and there's some upcoming ones like Hellblade 2.. but they don't have a ton of studios making cinematic action adventure open world games the way Sony does. If Microsoft had games really raising the bar for production values, then at least they could point to their own games and say, look how amazing our hardware is.. instead of relying on 3rd parties to push out somewhat higher resolutions and framerates to underscore the power.

Great point, first party exclusives should be standard bearers for what a console can do.
 
Great point, first party exclusives should be standard bearers for what a console can do.

True, but it's still possible to read too much into them.

Ratchet and Clank looked/looks great, and a lot was made of its inability to run on anything but a PS5 (Insomniac were a little coy but never outright dishonest).

Turns out it can run fine on just about any old SSD, even a SATA chud.
 
I wouldn't call burning lots more power on the GPU to achieve more or less the same results "more efficient".

Smarter, perhaps, but not more efficient in the physical sense.

Go for the low hanging fruit first if you're making a console. If you don't have market momentum behind you, you're not going to be a priority for engine optimisation, even if you have more features and more compute.
I don’t think he means power efficiency, but rather die space efficiency: higher perf per mm² (rough numbers below). Also, PS5 is at about the norm for console power usage and fits well within its thermal design; Xbox seems to have gone wider but is reluctant to clock higher, probably due to yield concerns on its larger die.

Also if you’re arguing about better optimization on one vs the other, then to me that’s indicative that the two consoles are pretty much equivalent. It’s not like One X and PS4 Pro where it would be fairly difficult to not have some sort of performance advantage on One X.
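
On the die-space point: rough numbers below, using the official TFLOPS figures and the commonly reported (third-party, not official) die sizes of roughly 308 mm² for PS5 and 360 mm² for Series X:

```python
# Back-of-envelope perf-per-area. TFLOPS are official; die sizes are
# commonly reported third-party measurements, so treat as approximate.
consoles = {
    "PS5":      {"tflops": 10.28, "die_mm2": 308},
    "Series X": {"tflops": 12.15, "die_mm2": 360},
}

for name, c in consoles.items():
    print(f"{name}: {c['tflops'] / c['die_mm2'] * 1000:.1f} GFLOPS/mm^2")
```

Paper FLOPS per area come out nearly identical, so if real-world results are roughly equal, the smaller, higher-clocked die is getting the same performance out of about 15% less silicon.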
 
We’re almost three years into the gen, it’s getting harder to accept that there isn’t some hardware explanation for the PS5 overperforming or the Series X underperforming. Series X has a clear computational and memory bandwidth advantage, but the PS5 is the one with higher average framerate in Cyberpunk 2077 despite not using VRS and at essentially equal resolutions.
Yea, that’s fairly valid, though I disagree about essentially equal resolutions. It’s not reasonable for someone to count by hand, but using tooling I’ve discovered that resolution fluctuations happen significantly more often than people suspect, and they happen very quickly. Frame rate issues may be indicative of a slightly misaligned DRS system, but it all comes down to where the bottleneck is.

I would say that without knowing the actual numbers on resolution, it’s a bit inaccurate to say they’re running equal and that frame rate is therefore the only metric that matters. It’s certainly easier, though.

But I don’t disagree with your point of view. I’d just say that upper and lower bounds alone are not that useful: if we’re going to be precise about frame rate, we should also be precise about resolution and settings. A rough sketch of why is below.
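
To make that concrete, here’s a minimal sketch in Python, with made-up per-frame numbers, of why bounds alone undersell what DRS is doing:

```python
import statistics

# Hypothetical per-frame vertical resolutions recovered by automated
# pixel counting over a short capture -- illustrative values only.
heights = [1728, 1728, 1620, 1512, 1512, 1620, 1728, 1440, 1512, 1620]

# A spot check of a few frames reports only the bounds...
print(f"bounds: {min(heights)}p - {max(heights)}p")

# ...but two consoles with identical bounds can spend very different
# amounts of time at each step, so the distribution is what matters.
print(f"mean:   {statistics.fmean(heights):.0f}p")
print(f"median: {statistics.median(heights):.0f}p")
```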
 
We’re almost three years into the gen, it’s getting harder to accept that there isn’t some hardware explanation for the PS5 overperforming or the Series X underperforming. Series X has a clear computational and memory bandwidth advantage, but the PS5 is the one with higher average framerate in Cyberpunk 2077 despite not using VRS and at essentially equal resolutions.
We're talking about the PS5, which has a higher GPU clock that gives it higher geometry throughput, faster cache, faster processing of the command buffer and higher fillrate, and has ROPs that in some situations can outperform the Series ROPs 2 to 1 at the same clock, but, again, are clocked higher.

PS5 has hardware advantages over Series X, just like Series X has advantages over PS5. I think what Sony achieved with PS5 is a set of hardware that addresses common bottlenecks for most game workloads, making it easier to extract performance from it.
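
To put rough numbers on those claims, here’s a back-of-envelope sketch in Python using the official clocks and CU counts; the 64-ROP and 4-triangle-per-clock figures are commonly cited from die analyses rather than official specs, so treat them as assumptions:

```python
# Back-of-envelope throughput from public specs. PS5's variable clock
# is taken at its 2.23 GHz ceiling; 64 ROPs and a 4-triangle/clock
# front end are assumed for both GPUs (not officially confirmed).
specs = [("PS5", 2.23, 36), ("Series X", 1.825, 52)]  # (name, GHz, CUs)
ROPS, TRIS_PER_CLK = 64, 4

for name, clk, cus in specs:
    tflops = cus * 64 * 2 * clk / 1000  # 64 FP32 lanes/CU, 2 ops per FMA
    fill   = ROPS * clk                 # Gpixels/s
    geom   = TRIS_PER_CLK * clk         # Gtriangles/s
    print(f"{name}: {tflops:.2f} TF, {fill:.0f} Gpix/s, {geom:.1f} Gtri/s")
```

Series X wins the compute column (12.15 vs 10.28 TF), but the clock advantage puts PS5 ahead on fillrate (~143 vs ~117 Gpix/s) and raw triangle throughput, which is exactly the split described above.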
 
We're talking about the PS5, which has a higher GPU clock that gives it higher geometry throughput, faster cache, faster processing of the command buffer and higher fillrate, and has ROPs that in some situations can outperform the Series ROPs 2 to 1 at the same clock, but, again, are clocked higher.

PS5 has hardware advantages over Series X, just like Series X has advantages over PS5. I think what Sony achieved with PS5 is a set of hardware that addresses common bottlenecks for most game workloads, making it easier to extract performance from it.
Developers are certainly going to get more performance out of PS5 if they’re using lots of baking and streaming technologies.

But despite some massive advantages for PS5, as you listed above, the XSX can often hold parity or step above in resolution. The 20% compute and bandwidth advantage is doing its work; there’s no other reason XSX would be able to keep up.
 
Developers are certainly going to get more performance out of PS5 if they’re using lots of baking and streaming technologies.

But despite some massive advantages for PS5, as you listed above, the XSX can often hold parity or step above in resolution. The 20% compute and bandwidth advantage is doing its work; there’s no other reason XSX would be able to keep up.
Each console has an advantage in different workloads. I think it's clear at this point, though, that they usually come out about the same, unless a game really leverages the few parts of the systems that have sizable advantages or disadvantages.
 
There's nothing amazing about this. Sony's console sells better, and it gets more love and care from developers. Not to say devs don't love and care for the Xbox versions, but at the end of the day, they make their games around the Playstation.

The REAL failure comes from Microsoft's own studios not supporting the device with AAA games which push production values. Yes, MS has games which look awesome.. Forza Motorsport/Horizon, Gears, MS Flight Sim, and there's some upcoming ones like Hellblade 2.. but they don't have a ton of studios making cinematic action adventure open world games the way Sony does. If Microsoft had games really raising the bar for production values, then at least they could point to their own games and say, look how amazing our hardware is.. instead of relying on 3rd parties to push out somewhat higher resolutions and framerates to underscore the power.

Playstation 5 could lose every single 3rd party head-to-head, and yet they could still point to games like Horizon FW or Ratchet and Clank to show how capable their hardware is.

It's not easy for MS to get the edge over Sony now, considering how ingrained both of the main platforms they support (Xbox and PC) are when it comes to APIs and tools. They've got to make the lives of developers easier in any way possible.. but they've got to start showing everyone what their most powerful console is capable of on their own. Redfall and Starfield aren't going to do it... I'm hoping Hellblade 2 is a lot deeper and more fleshed out gameplay-wise and that they really stick the landing with the visuals. I hope the next Gears looks amazing, and I'm sure the next DOOM game will probably be very technically proficient.

They gotta lead the way before things will get better for them. I think if they started selling more and doing better in other parts of the world, quality of ports would increase as well.
It also does not help to overpromise in advance. Minecraft with RT support ....
Still waiting...
There was so much that went wrong this generation. I guess Sony delivered more by better hiding the compromises of RT features (e.g. very low poly objects in reflections). It also seems like MS is missing out on playable tech demos to bring a bit of fun into developers' and players' homes, like the little robots Sony uses to demonstrate new features. Something like that is totally missing on the platform. They made a demo last gen for HDR, but that was it.
They really did implement a few cool hardware-based features, but they're unable to demonstrate them (only with some not-so-fun demos, and not on the console).
 
We're talking about the PS5, which has a higher GPU clock that gives it higher geometry throughput, faster cache, faster processing of the command buffer and higher fillrate, and has ROPs that in some situations can outperform the Series ROPs 2 to 1 at the same clock, but, again, are clocked higher.

PS5 has hardware advantages over Series X, just like Series X has advantages over PS5. I think what Sony achieved with PS5 is a set of hardware that addresses common bottlenecks for most game workloads, making it easier to extract performance from it.
Basically what Cerny patiently and thoroughly explained in his first presentation. Almost everything he said was eventually borne out by the games.
 
Basically what Cerny patiently and thoroughly explained in his first presentation. Almost everything he said was eventually borne out by the games.
You give him way too much credit. You are looking at a near carbon copy of the 5700XT.

When the XSX was announced at 12 TF, it was assumed that all these bottlenecks were properly paired up to support that amount of compute as well. PS5 didn’t do anything particularly innovative outside of boost clocking; XSX just ended up fairly lopsided in how it operates.

Having said that, I do agree entirely with others that PS5 still enjoys a solid optimization advantage from being the lead platform. I am legitimately curious to see how well future Call of Duty releases run on it compared to Xbox.
 
Can't post these in the other thread so I'll post them here.

Native 4k vs FSR Native AA

FSR 3 FG Quality vs DLSS 3 FG Quality

FSR 3 FG Performance vs DLSS 3 FG Performance

Too dark to really see the differences but it's all I could find.
Saw some benches and video here: https://www.dsogaming.com/pc-perfor...k-vs-nvidia-dlss-3-vs-amd-fsr-3-0-benchmarks/

But I can't tell anything from watching the video. Performance seems very good, but they say DLSS is much better in motion.

From the image and responses I assume FSR uses a sharpening filter. If so, AMD should make this optional/tweakable ASAP imo.
Hoping for some DF analysis with magnified and slowed-down footage, and some focus on expected issues, e.g. on moving shadows...

Also hoping for support on older GPUs. If software features are the only selling point these days to upgrade a GPU, that's really lame.
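
For what it's worth, making the sharpening tweakable should be cheap, since these filters boil down to a single strength knob. Here's a minimal unsharp-mask sketch in Python; FSR's actual pass is RCAS (contrast-adaptive sharpening), so this is only an illustration of the idea, not AMD's code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, strength: float = 0.5,
                 sigma: float = 1.0) -> np.ndarray:
    """Boost the difference between a grayscale float image in [0, 1]
    and a blurred copy of itself.

    `strength` is the one knob a settings menu would need to expose;
    0.0 disables the pass entirely.
    """
    blurred = gaussian_filter(img, sigma=sigma)
    return np.clip(img + strength * (img - blurred), 0.0, 1.0)
```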
 
Saw some benches and video here: https://www.dsogaming.com/pc-perfor...k-vs-nvidia-dlss-3-vs-amd-fsr-3-0-benchmarks/

But I can't tell anything from watching the video. Performance seems very good, but they say DLSS is much better in motion.

From the image and responses I assume FSR uses a sharpening filter. If so, AMD should make this optional/tweakable ASAP imo.
Hoping for some DF analysis with magnified and slowed-down footage, and some focus on expected issues, e.g. on moving shadows...

Also hoping for support on older GPUs. If software features are the only selling point these days to upgrade a GPU, that's really lame.

Thanks for the link, I've used the two main menu screenshots they have to make another comparison.

FSR 3 FG vs DLSS 3 FG
 
Having said that, I do agree entirely with others that PS5 still enjoys a solid optimization advantage from being the lead platform. I am legitimately curious to see how well future Call of Duty releases run on it compared to Xbox.
Do we know this for a fact or are we just guessing?
 
Do we know this for a fact or are we just guessing?
There’s been a specific pattern of Xbox coming out of the gate with frame rate issues that get ironed out over time.

Specifically, when it comes to optimization I’m referring to frame rate issues. Resolution isn’t something that typically improves with patches.

It’s very difficult to compare upper and lower bounds during DRS. There are tools that can do it, but setting up reproducible test results is very difficult.

The culmination of my frame rate work can be found here, with a test run with DF where we tried to benchmark the various differences in upscaling technology. Higher values in the bar graph can be strongly correlated with more unique pixels (each pixel is individually computed and should therefore show fewer upscaling artifacts).

I have left some benchmarks around here on spectrum analysis that show how quickly DRS can change resolution. But in my general analysis, even when the upper and lower bounds come out the same in pixel counts, I’ve found that isn’t the case in real gameplay. It’s just one of those things where measuring 60-odd frames very accurately doesn’t line up with measuring the entire population less accurately.
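
For the curious, something like this (with synthetic numbers standing in for a real capture) is the shape of that spectrum analysis:

```python
import numpy as np

# Synthetic per-frame vertical resolutions from a 60 fps capture.
# A DRS controller that reacts every few frames shows up as energy
# away from DC in the spectrum; a slow controller stays near DC.
rng = np.random.default_rng(0)
frames = 600
heights = (1620 + 108 * np.sign(np.sin(np.arange(frames) / 4))
                + rng.normal(0, 10, frames))

spectrum = np.abs(np.fft.rfft(heights - heights.mean()))
freqs = np.fft.rfftfreq(frames, d=1 / 60)  # cycles per second
print(f"dominant DRS oscillation: {freqs[spectrum.argmax()]:.2f} Hz")
```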
 
Only some of it. I’m not privy to that kind of information, so answering me with "it’s obvious, that’s the way it works" isn’t an answer. It’s some nice guesswork, but we can do better than that.

It's always been like this, and various developers have confirmed it over the years.

There was a developer in here years ago who said PS2 got the most time and effort as the install base was huge, and the Xbox and GC dev time went to getting the games to a stage where they were 'good enough' - I'll have to try and dig that post out for you.

The platform with the largest install base, and thus the largest potential for sales, is the one that gets the most attention in the development process.

PS5's install base is touching 2:1 now, so it's in developers' best interests to ensure that version is the best it can be, as it gives them access to the most sales potential, and thus the most money to be made.
 