Digital Foundry Article Technical Discussion [2019]

I honestly think the PS5 specs, as they're presented here, look like a winner if the price comes in at $400 and the Xbox is $500+. I'm not sure where the cheaper Xbox would even fit unless it's about $250.

As for clocks, the 5700 XT sustains around 1.8 GHz during gaming, so 2 GHz is a pretty big jump; I'm assuming a newer process node can get it there. The speculated Xbox clock should not be an issue at all.

At the right price PS5 is very attractive.

They just need to avoid a performance gap big enough to be noticed (checkerboard rendering, wink wink). Some magic from ND and a good price and they are set.
 
Honestly, I doubt checkerboard rendering is even needed at this point. I don't think the artifacts are really worth it. There will be newer and better methods of temporal reconstruction and AA, especially now with contrast adaptive sharpening doing such a good job.
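If you want to see the basic idea behind contrast adaptive sharpening, here's a toy sketch in plain NumPy (not AMD's actual FidelityFX CAS shader; the kernel and the strength falloff are made up for illustration): sharpen everywhere, but back the strength off where local contrast is already high so you don't get halos.

```python
import numpy as np

def cas_like_sharpen(img, max_amount=0.8):
    """Toy contrast-adaptive sharpen: a 4-neighbour unsharp kernel whose
    strength is scaled down where local contrast is already high."""
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]          # neighbours above / below
    w, e = p[1:-1, :-2], p[1:-1, 2:]          # neighbours left / right
    blur = (n + s + w + e) / 4.0
    contrast = (np.maximum.reduce([n, s, w, e, img])
                - np.minimum.reduce([n, s, w, e, img]))   # 0 = flat area
    amount = max_amount * (1.0 - contrast)    # back off near hard edges
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# Gentle gradients get crisped up; full-range edges are left mostly alone.
test = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
print(cas_like_sharpen(test).round(2))
```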

Assuming the specs are close to the final product (roughly 9 TFLOPS PS5 and 12 TFLOPS Xbox), if the Xbox version is true 4K (3840x2160) and the PS5 version is running at 75% resolution (3328x1872), is anyone really going to notice without having Digital Foundry point it out to them? If the Xbox version is 1440p, the PS5 version can be 2224x1251. That would probably be more noticeable, but dropping resolution is the crudest way to scale if you're ALU-limited. With sharpening and smart algorithms for upscaling, reconstruction and AA, I really think there will be diminishing returns on FLOPS. A 33% FLOPS advantage this gen won't have as big a visual impact as a 33% advantage last gen.
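For anyone checking the arithmetic, those resolutions just fall out of scaling the pixel count by the FLOPS ratio (a quick Python sketch; the cost-is-linear-in-pixels assumption is mine):

```python
import math

def scaled_resolution(width, height, flops_ratio):
    """Scale a resolution so the pixel count matches a compute ratio,
    assuming per-frame cost is linear in the number of pixels."""
    s = math.sqrt(flops_ratio)            # each axis scales by the square root
    return round(width * s), round(height * s)

ratio = 9 / 12                            # the rumoured 9 vs 12 TFLOPS, i.e. 75%

print(scaled_resolution(3840, 2160, ratio))  # (3326, 1871) ~ the 3328x1872 above
print(scaled_resolution(2560, 1440, ratio))  # (2217, 1247) ~ the 2224x1251 above
```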
 
Don't forget better targeted use of resources through VRS (variable rate shading). Combine that with better reconstruction techniques and anti-aliasing and we're going to have better quality than we've ever had on consoles.

Work smarter, not harder.
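To make the "targeted use of resources" concrete, here's a toy sketch of the VRS idea in Python (not the actual D3D12/Vulkan API; the tile size and contrast threshold are invented): spend full shading rate only on tiles busy enough to show it.

```python
import numpy as np

def pick_shading_rates(luma, tile=16, threshold=0.02):
    """Toy VRS heuristic: flat tiles get shaded at 2x2 (one shader
    invocation per 2x2 pixel quad), busy tiles at the full 1x1 rate.
    `luma` is per-pixel luminance in [0, 1]."""
    th, tw = luma.shape[0] // tile, luma.shape[1] // tile
    rates = np.empty((th, tw), dtype=object)
    for ty in range(th):
        for tx in range(tw):
            block = luma[ty*tile:(ty+1)*tile, tx*tile:(tx+1)*tile]
            # Local contrast as a cheap proxy for "would coarse shading show?"
            rates[ty, tx] = "1x1" if block.std() > threshold else "2x2"
    return rates

# A flat sky half ends up at 2x2; a noisy foliage half stays at 1x1.
frame = np.zeros((64, 64))
frame[:, 32:] = np.random.rand(64, 32)
print(pick_shading_rates(frame))
```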
 
Better quality via techniques that objectively offer worse quality. I get what you are saying, that those techniques will allow developers to push overall image quality because they have built-in techniques that stabilize performance, but it's also sort of counter to pushing no-compromise image quality.
 
It'll basically come down to how noticeable they are and whether that is a distraction while gaming.

For example, fixed foveated rendering (where the edges of the screen get less precision/are more blurry) is hugely distracting, so hopefully no developer goes in that direction.

VRS could be good or could be bad depending on how the developer uses it. We're just going to have to wait and see. As always, the better developers are likely to find the best way to use the compromises in order to achieve higher fidelity. Granted, not everyone will find some of these compromises pleasing. For example, I hated the smoke effects in KZ2, but other people really liked them, so it was overall a good way to mitigate one of the major weaknesses of the PS3.

Basically this generation is all about providing developers with lots of options to explore in order to attempt to overcome the fact that Moore's law is basically dead: VRS, RT, SSDs, etc. And then we have heavily beefed-up CPU cores (compared to the outgoing generation) that should allow for higher framerates, which in turn allow for better use of temporal reconstruction.

Regards,
SB
 
Better quality via techniques that objectively offer worse quality. I get what you are saying, that those techniques will allow developers to push overall image quality because they have built-in techniques that stabilize performance, but it's also sort of counter to pushing no-compromise image quality.

I mean, they don't offer objectively worse quality. They offer subjectively worse quality. There's no future in gaming without stochastic rendering, importance sampling, and temporal data. Brute-force native rendering comes with huge compromises, and the number of titles that do that will be next to none.
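For anyone wondering what importance sampling actually buys you, here's a minimal Python sketch (toy numbers, nothing from any real engine): estimating the light arriving at a surface where most of the energy comes from one narrow bright spot. Concentrating samples where the integrand is large gives a far less noisy estimate for the same sample count.

```python
import random

# Toy integrand: most of the incoming light comes from one narrow bright spot.
def radiance(x):                                   # x in [0, 1]
    return 10.0 if 0.48 < x < 0.52 else 0.1
# Exact integral: 10*0.04 + 0.1*0.96 = 0.496

def uniform_estimate(n):
    return sum(radiance(random.random()) for _ in range(n)) / n

def importance_estimate(n):
    total = 0.0
    for _ in range(n):
        # Mixture pdf: half the samples from the peak, half uniform.
        if random.random() < 0.5:
            x = 0.48 + 0.04 * random.random()
        else:
            x = random.random()
        pdf = 0.5 * (25.0 if 0.48 < x < 0.52 else 0.0) + 0.5
        total += radiance(x) / pdf                 # weight each sample by 1/pdf
    return total / n

random.seed(1)
print(uniform_estimate(256))     # noisy: std dev of the mean ~0.12
print(importance_estimate(256))  # same sample count, std dev ~0.02
```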
 
"Testing data from AMD recently leaked giving us some hint of the technical make-up of the PlayStation 5 and the Xbox Series X. So what has been revealed, how accurate is the leak likely to be and how could the new Sony and Microsoft console possibly compare? The leak looks genuine, but take any analysis with just a pinch of salt. Regardless - here's what we think"
Not very interested in next-gen consoles, but if I could find a cheap Lockhart mid-generation to complement Game Pass PC with some Game Pass console games, then I'd consider it.

My best friend since childhood brought his Xbox One S to my house, and while we mostly play sports and fighting games, there are some jewels in there, like Mortal Kombat XL (not an MK fan, fatalities and all, but this game is so well done: the most perfect rhythm in a fighting game I've ever played, plus you clearly see who hit whom, no blurry frames), Jump Force, Overcooked 2... so I might consider getting a Lockhart console to play those games.
 
Eh, price matters more.

$450-500 for the PS5 vs $600 at least for Xbox? The PS5 will sell circles around it. M$ would be ten times smarter to put that 1080p machine out for at least $100 less than the PS5 ASAP. But who says they're smart? I'd have thought the PS5 was playing it way too safe this gen, what with the Switch's success, but at the moment it's like Sony is running circles around Microsoft, if only because MS yelled "Watch this trick, noob!" then shot itself in the foot.
 
I'm still wondering how RT is implemented on PS5, if it is; even with the declarations about it, I still think it won't have dedicated RT hardware... If it's a different solution than Xbox Series X, could it be a weird situation where Series X is more powerful in non-RT tasks, but slower in RT tasks?
 
Eh, price matters more.

$450-500 for the PS5 vs $600 at least for Xbox? The PS5 will sell circles around it. M$ would be ten times smarter to put that 1080p machine out for at least $100 less than the PS5 ASAP. But who says they're smart? I'd have thought the PS5 was playing it way too safe this gen, what with the Switch's success, but at the moment it's like Sony is running circles around Microsoft, if only because MS yelled "Watch this trick, noob!" then shot itself in the foot.

Bizarre statement tbh. We have no information about how much these new consoles will cost yet, other than guesstimates based on the presumed die-size, based on leaked performance metrics, which we don't even know are accurate and/or final. And somehow you've managed to come up with a $150 price differential and determined that Sony are "running circles around Microsoft".
 
Assuming the specs are close to the final product (roughly 9 TFLOPS PS5 and 12 TFLOPS Xbox), if the Xbox version is true 4K (3840x2160) and the PS5 version is running at 75% resolution (3328x1872), is anyone really going to notice without having Digital Foundry point it out to them?
However, what if XBSX also targets 3328x1872 and upscales, and uses the saved pixel power to draw more and better stuff? We shouldn't assume it'll just be used for resolution.

That said, PS4 already does a great job of geometry detail and foliage and whatnot. PS3 versus XB360, less grass and the like was pretty noticeable. The amount of detail and quality deltas will be getting a bit lost in the noise by now. Human perception is logarithmic, so you need exponential increases in stimulus for linear perceived gains: a doubling of light brightness is about a 10% increase in perceived brightness, and likewise sound. Ergo, as graphics improve towards reality, I expect a doubling of power to equate to a 10% improvement in perceived quality (using totally unquantifiable numbers!), which is where we hit diminishing returns.
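Putting the (admittedly hand-wavy) model in code: a throwaway Python sketch of a log-response curve, where the baseline of 10 "quality units" is picked purely so that one doubling lands near the 10% figure above. Each successive doubling of power then buys a smaller and smaller relative gain:

```python
import math

# Toy model: perceived quality = baseline + log2(power). The baseline of 10
# "quality units" is chosen only so one doubling lands near the 10% above.
def perceived(power, baseline=10.0):
    return baseline + math.log2(power)

prev = perceived(1)
for power in (2, 4, 8, 16):
    cur = perceived(power)
    print(f"{power:2d}x power -> +{(cur - prev) / prev:.1%} perceived quality")
    prev = cur
# 2x +10.0%, 4x +9.1%, 8x +8.3%, 16x +7.7% -- diminishing returns every step
```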
 
We were already seeing diminishing returns with the launch PS4 versus launch XBO in most multiplatform titles at typical living room distances. If we have a similar gap in power between consoles this gen, it'll be almost impossible to see at typical living room distances on 4K TVs at greater-than-1080p resolutions (I expect 1080p output will be identical for the most part). You'll have to get pretty close to be able to easily see differences, IMO.

Of course, the assumption being that equal effort is put into both versions of the games (something that doesn't always happen).

Regards,
SB
 
A normal TV setup should have you sitting close enough that the screen fills your vision, but not so close that you need to look left and right. If you are set up like that, there is a monumental difference between 1080p and 4K.
 
Then the PS5 will render at a lower res and draw more and better stuff... and so on and so forth.
You can't keep doing that forever, though; eventually the resolution will be too low to effectively upscale from without introducing artefacts. There's also only so far you can draw stuff in the distance, and stuff in the distance doesn't need crazy detail, so I imagine the 'sweet spot' on resolution vs detail and draw distance would be hit pretty quickly with consoles this powerful.
 
A normal TV setup should have you sitting close enough that the screen fills your vision, but not so close that you need to look left and right. If you are set up like that, there is a monumental difference between 1080p and 4K.

From a marketing perspective it wouldn't work, but MS should do a "which console is right for me" calculator. Enter viewing distance, screen size and age to get an X/S response.

Us oldies would always get S as an answer... :(
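Half-seriously, that calculator is a few lines of Python. A sketch under assumed rules of thumb (20/20 vision resolves roughly 1 arcminute, i.e. about 60 pixels per degree; a lower acuity figure could stand in for the age input): if 1080p already saturates the eye at your screen size and distance, the cheaper machine is the answer.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=16/9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    px_per_inch = horizontal_px / width_in
    # Pixels per inch times the inches one degree subtends at this distance.
    return px_per_inch * 2.0 * distance_in * math.tan(math.radians(0.5))

def which_console(diagonal_in, distance_in, acuity_ppd=60.0):
    # If 1080p already saturates the eye, 4K buys nothing visible.
    return ("S" if pixels_per_degree(diagonal_in, 1920, distance_in) >= acuity_ppd
            else "X")

print(which_console(55, 108))  # 55" TV from 9 feet -> S
print(which_console(55, 48))   # 55" TV from 4 feet -> X
```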
 
Assuming the specs are close to the final product (roughly 9 TFLOPS PS5 and 12 TFLOPS Xbox), if the Xbox version is true 4K (3840x2160) and the PS5 version is running at 75% resolution (3328x1872), is anyone really going to notice without having Digital Foundry point it out to them?

Oh you optimistic logical fool. People will know in their little hearts!! :love:
 
From a marketing perspective it wouldn't work, but MS should do a "which console is right for me" calculator. Enter viewing distance, screen size and age to get an X/S response.

Us oldies would always get S as an answer... :(
I agree, it's a bit like hearing frequency. The human ear hears up to about 16 kHz, and that ceiling only drops as you age. If you can't hear above 16-18 kHz, is there any point in getting anything in the 20 kHz range?
I guess 4K is like offering those higher frequencies, say the 9-18 kHz range, and 1080p is like 25 Hz-9 kHz. Most people won't notice it unless you have the right equipment to highlight those higher frequencies: the correct viewing setup, a calibrated screen, the right room lighting.
But 4K alone is not worth discussing; without HDR, 4K is pretty pointless. Pixels as small as these need to be able to shine so that detail is seen and not lost. So it's absolutely critical that 4K and HDR are paired together, combined with a calibrated screen, good lighting conditions, the right viewing distance, and a game with a very good HDR implementation and high-quality assets and effects to take full advantage. You will definitely see the difference then. The more FPS the better.

So I don't blame anyone who can't see the difference. There is definitely a difference to be seen, but not everyone has the setup to see it.
 