Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

Status
Not open for further replies.
No, not really. Sony's gaming efforts are totally dedicated to one ecosystem on a narrow set of hardware. MS's efforts are not.

Plus, we are not talking about a huge disparity, and it's at the time when it's least relevant. Launch software tends to be the least expressive in terms of showing off the capabilities and performance of the hardware. By the time we get to real next-gen games on consoles, most of the issues will have been worked out.

Yes, you are correct... But then we are speculating about visible performance gains over the PS5. They may exist, they may be huge, or they may not exist at all, or performance may even stay below the PS5's.
The fact is, Microsoft talks... every day! Marketing is something they try to put to good use! We saw that with the Power of the Cloud, and with the claim that DX12 would not let Sony keep a 40% advantage with the PS4... But did we get a Power of the Cloud console? Did DX12 allow the Xbox One to match the PS4's performance? No!
So, this generation we don't know what to expect. Microsoft keeps talking about what it did with the Series X, but on the other hand... we know nothing about Sony!
VRS, Mesh Shaders, SFS... are those really things that will get Microsoft an advantage over the PS5? I don't think anyone, except someone with hands on the hardware and knowledge of Sony's changes, can say that for sure!

According to Matt Hargett, VRS doesn't hold a candle to Sony's GE, since although it works, the system will still be rendering unnecessary triangles, and the gains from culling them at an earlier stage beat VRS. Is he right? Will the GE allow for better performance than VRS? We do not know!
Mesh shaders are the same thing... They are crippled on AMD, and have a lot less performance than Nvidia's version. According to some, Sony's changes to the GE modified the primitive shaders to allow them to do exactly the same as mesh shaders, but with increased performance, since primitive shaders have AMD's native support. Is this true??? We don't know. Sony is silent!
And SFS... Well, I do not doubt SFS is an evolution of PRT+, but the reality is that all the talk about memory gains is made in comparison with games that make no use of partially resident textures. They never showed the performance gains against PRT+.
As such, the gains on the SSD are not really measurable, since the PS5 can do the same in a different manner, but with a faster SSD. Apparently SFS also allows for fine-grained fetching of textures, and the PS5 may not allow exactly the same thing. But the Xbox only has 10 GB of fast RAM, and the PS5 can access more than that...

And why am I talking about this? Because when people talk about performance gains in the GDK, they are not really talking about the software improvements. They are talking in the expectation that the Xbox can pass and leave the PS5 behind... And I do not know if anyone, without hands on both pieces of hardware, and knowledge of the changes made by Sony, can say for sure whether that will happen.
 
Or this is par for the course for MS, who doesn't believe it needs anything close to a perfect XDK at the onset of a hardware release. This practically happened last gen with the Xbox One.
You're talking about Kinect?

Yes it got talked up and the games were coming and the people were waiting

and waiting

and waiting

and then it got dropped

Or are you talking about 'the power of the cloud'?
Demos were shown, like Crackdown 3; people gasped and were in awe and waited

and waited

and waited

and then Crackdown 3 released and it was full of 'aw'

Sure, past performance doesn't necessarily mean squat, but I wouldn't get my hopes up.
 
You're talking about Kinect?

Sure, past performance doesn't necessarily mean squat, but I wouldn't get my hopes up.

I think they're referring to the underperforming DX11 driver initially used on the Xbox One. It wasn't until later that they refined it into a thinner, more performant layer. Or was it even a DX10 layer to begin with? So long ago, I can't recall the exact details.
 
I think they're referring to the underperforming DX11 driver initially used on the Xbox One. It wasn't until later that they refined it into a thinner, more performant layer. Or was it even a DX10 layer to begin with? So long ago, I can't recall the exact details.

It was a special version of DX11 with low-level Xbox-specific extensions. You could say the Xbox had DX12 before DX12 even existed. DX12 was created based on the work done on the console's DX11.
 
According to Matt Hargett, VRS doesn't hold a candle to Sony's GE, since although it works, the system will still be rendering unnecessary triangles, and the gains from culling them at an earlier stage beat VRS. Is he right? Will the GE allow for better performance than VRS? We do not know!

He said VRS doesn't hold a candle to the performance savings possible from the GE (MS have an AMD GE too, of course). He didn't say XSX doesn't have the ability to do similar things. For example, mesh shaders should allow developers to cull and shade in a very efficient, parallel, manner and for all we know Xbox might have AMD primitive shaders too, which could be exposed at some point.

There seems to be this misconception that you either cull more efficiently or use VRS. The Xbox 3D pipeline is perfectly capable of doing highly efficient early culling in mesh shaders and then also using VRS when you get to the pixel shading part.
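The two savings apply at different pipeline stages, so they stack rather than compete: culling removes triangles before rasterization, and VRS then reduces pixel-shader invocations on whatever survives. A rough back-of-envelope model (the fractions are hypothetical example values, not measured data from either console):

```python
# Illustrative model: geometry culling and VRS save work at different
# pipeline stages, so their benefits compose.
# All fractions below are hypothetical example values.

def pixel_shader_invocations(total_pixels, vrs_fraction, coarse_rate=4):
    """Pixels shaded when `vrs_fraction` of the screen uses coarse
    shading (e.g. 2x2 VRS shades one sample per 4 pixels)."""
    fine = total_pixels * (1 - vrs_fraction)
    coarse = total_pixels * vrs_fraction / coarse_rate
    return fine + coarse

pixels_4k = 3840 * 2160
shaded = pixel_shader_invocations(pixels_4k, vrs_fraction=0.3)
saving = 1 - shaded / pixels_4k
print(f"{saving:.1%} fewer pixel-shader invocations")  # 22.5%
```

The pixel-stage saving is independent of how many triangles were culled earlier, which is the point of the post: nothing about efficient culling makes VRS redundant.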

I think they're referring to the underperforming DX11 driver initially used on the Xbox One. It wasn't until later that they refined it into a thinner, more performant layer. Or was it even a DX10 layer to begin with? So long ago, I can't recall the exact details.

Yeah, if I remember right there was a DX11 version, followed by an improved lower overhead DX11 implementation some time after launch (Mono?), followed by DX12 which shifted to a very different model.

Making things worse, the system reservations on X1 were initially quite high. Something like two cores + 10% of the six game cores. That got slimmed down to two cores and eventually 1 and a little bit as they rolled back Kinect support and presumably made other optimisations. GPU reserve got trimmed back too.
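Taken at face value, those reservations imply a meaningful amount of CPU returned to games over the generation. A quick sketch using the figures recalled above (which are recollections, not official numbers):

```python
# CPU budget freed as the X1 system reservation shrank, using the
# rough figures recalled above: 2 cores reserved plus a ~10% system
# slice on the 6 game cores at launch, down to roughly 1.2 cores
# later. These are recollections, not official numbers.
TOTAL_CORES = 8

def game_core_budget(reserved_cores, system_slice_on_game_cores):
    game_cores = TOTAL_CORES - reserved_cores
    return game_cores * (1 - system_slice_on_game_cores)

launch = game_core_budget(2, 0.10)   # ~5.4 core-equivalents
later = game_core_budget(1.2, 0.0)   # ~6.8 core-equivalents
print(f"{later / launch - 1:.0%} more CPU for games")  # ~26%
```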
 
According to Matt Hargett, VRS doesn't hold a candle to Sony's GE, since although it works, the system will still be rendering unnecessary triangles, and the gains from culling them at an earlier stage beat VRS. Is he right? Will the GE allow for better performance than VRS? We do not know!
VRS and Sony's GE have nothing in common. VRS is there to reduce the work for shaders, not the triangle setup. So after all, if Sony could magically cull more, they would also benefit from VRS (if correctly implemented). VRS also works in different areas: it works in "visible" areas that are too dark to really see, or where the pixels are essentially all the same. I really don't get why he even compared them, as the GE would be more comparable to what mesh shaders do, not VRS.
Btw, according to the developers of the latest Call of Duty (if I remember correctly), even software VRS is a good solution.

Mesh shaders are the same thing... They are crippled on AMD, and have a lot less performance than Nvidia's version. According to some, Sony's changes to the GE modified the primitive shaders to allow them to do exactly the same as mesh shaders, but with increased performance, since primitive shaders have AMD's native support. Is this true??? We don't know. Sony is silent!
Mesh shaders are not really crippled on AMD hardware. They even show higher gains on AMD hardware than on Nvidia's, because AMD never "optimized" the "old" path enough. They see >1000% gains with the latest drivers. So mesh shaders can really make things possible that were not possible before.
Btw, mesh shaders do not really do more; they just cull parts that do not get rendered, and are specialized exactly for that, so only the data that is needed gets loaded. So they should not only reduce what is rendered (and therefore get effectively more work done), they should also reduce what gets loaded into memory, saving memory and IO bandwidth.
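Concretely, this per-cluster culling is usually done with a precomputed normal cone per meshlet: the mesh (or amplification) stage rejects a whole cluster when every triangle in it faces away from the camera, before any vertex work happens. A minimal CPU-side sketch of that test (the data layout and values are illustrative, not actual shader code):

```python
def backface_cull_meshlet(cone_axis, cone_cutoff, view_dir):
    """Reject a whole cluster if every triangle in it faces away.

    cone_axis:   unit vector, average facing of the cluster's triangles
    cone_cutoff: cosine threshold precomputed offline from the cone's
                 half-angle (wider cones get a lower cutoff)
    view_dir:    unit vector from the camera toward the cluster
    Returns True if the cluster can be skipped entirely.
    """
    dot = sum(a * v for a, v in zip(cone_axis, view_dir))
    return dot >= cone_cutoff

# A cluster whose normals point away from the camera gets culled...
assert backface_cull_meshlet((0, 0, 1), 0.5, (0, 0, 1))
# ...one facing the camera does not.
assert not backface_cull_meshlet((0, 0, -1), 0.5, (0, 0, 1))
```

The same idea extends to frustum and occlusion tests per cluster, which is where the large gains quoted above come from: rejected clusters never touch the vertex pipeline at all.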

And SFS... Well, I do not doubt SFS is an evolution of PRT+, but the reality is that all the talk about memory gains is made in comparison with games that make no use of partially resident textures. They never showed the performance gains against PRT+.
As such, the gains on the SSD are not really measurable, since the PS5 can do the same in a different manner, but with a faster SSD. Apparently SFS also allows for fine-grained fetching of textures, and the PS5 may not allow exactly the same thing.
Yes, such benchmarks would be really interesting. But as far as I know, early PRT also had problems other than the HDD when running on old hardware. Those got solved with SF, and SFS additionally accelerates more steps than before.
The difference between the PS5 and Series S/X is that, because of features like SFS, MS concluded they had more than enough bandwidth, while Sony solved the "bandwidth" problem with simply more bandwidth. Different approaches, same goal. The conceptual difference is that with many small read operations (SFS and mesh shaders have this in their concept), you only load into memory the data you really need for rendering at point x (at least in theory). Sony's approach (as far as we know) is to prefetch everything that might be needed into memory and then only render what is needed. If you need other data, that can easily be compensated for by more than enough bandwidth.
Both approaches have their advantages and flaws. The PS5 will use memory more effectively because it can swap the data inside it a bit faster, and the Xbox will try to get the same done by (hopefully) effectively reducing the data that is loaded.
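The memory-side difference can be made concrete with a toy tiled-texture model: with 64 KB tiles (the standard tiled-resource tile size), a feedback-driven approach keeps only the tiles actually sampled resident, while a prefetch approach keeps the whole mip resident and leans on bandwidth to swap it quickly. All numbers below are illustrative, not measurements from either console:

```python
import math

TILE_BYTES = 64 * 1024          # standard tiled-resource tile size

def resident_bytes(texture_bytes, sampled_fraction):
    """Memory held for one texture under feedback-driven streaming:
    only tiles actually touched by sampling stay resident."""
    total_tiles = texture_bytes // TILE_BYTES
    needed = math.ceil(total_tiles * sampled_fraction)
    return needed * TILE_BYTES

mip0 = 4096 * 4096 * 1          # 16 MiB, e.g. one 4K mip at 1 B/texel
full_prefetch = mip0            # whole mip kept resident
feedback = resident_bytes(mip0, sampled_fraction=0.25)
print(full_prefetch // feedback)  # 4x less memory for this texture
```

Whether that 4x shows up in practice depends entirely on how much of each texture a frame actually samples, which is exactly why the PRT+ vs SFS benchmarks the post asks for would be interesting.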

But the Xbox only has 10 GB of fast RAM, and the PS5 can access more than that...
The GPU can also access the memory above 10 GB, and it is not really slow, just slower than the 10 GB. I really don't see a problem with this concept except for the additional optimization effort. Most of the time you have many things in memory (e.g. world data, or data for the next frame) that don't get overwritten often. So stuffing those into the "higher memory" (this just reminds me of DOS memory management ^^) would not hurt performance, while the stuff that changes constantly gets the fast memory "pool".
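For reference, the Series X pools are 10 GB at 560 GB/s and 6 GB at 336 GB/s (with roughly 3.5 GB of the slower pool game-usable). A toy model of the blended bandwidth for a given traffic mix; the 80/20 split below is an arbitrary example, not a measured workload:

```python
FAST_GBPS, SLOW_GBPS = 560, 336   # Series X memory pool bandwidths

def blended_bandwidth(fast_traffic_fraction):
    """Average bandwidth seen by a workload sending the given
    fraction of its memory traffic to the fast 10 GB pool."""
    f = fast_traffic_fraction
    return f * FAST_GBPS + (1 - f) * SLOW_GBPS

print(round(blended_bandwidth(0.8), 1))   # 515.2 GB/s at an 80/20 mix
print(round(blended_bandwidth(1.0), 1))   # 560.0 GB/s if slow pool is idle data
```

This matches the post's point: if the slow pool mostly holds rarely-touched data, the traffic mix skews toward the fast pool and the blended figure stays close to 560 GB/s.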
 
It was a special version of DX11 with low level xbox specific extensions. We can say Xbox had DX12 before DX12 even existed. DX12 was created based on the work done on the console DX11.
XBO only shipped with DX11.
They later had to release DX11 Fast Extensions to make up some ground, but it's still largely a DX11 structured API with some lower level access.

Then they released DX12 a year after or so, and made it mandatory the year after and deprecated DX11 Fast extensions.
AFAIK DX11 FE and DX12 are not the same.
DX12 on Xbox also still contains X1 extensions, in particular for memory management and features that are specific to the console.
But on the whole, the API structure is completely different from DX11.
I would fully disagree that DX12 was based on the work done with DX11 Fast Extensions as 12 was made in conjunction largely with Nvidia, AMD, Intel, and other vendors.

As for the gap, they did make up a gap. In the end the XBO held at a 30% resolution differential, as per their hardware choices. It started off with the XBO being a 720p console and the PS4 a 1080p console. Almost all launch titles bar Forza 5 launched at 720p on Xbox, if I recall correctly. Ghosts, BF and Dead Rising come to mind, with Ryse: Son of Rome at 900p.

Both consoles have largely kept this gap through the rest of the generation despite the increasing complexity and maturity of the titles.
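The ~30% figure matches the pixel-count math: 900p renders about 31% fewer pixels than 1080p, while the launch-window 720p gap was far larger. A quick check:

```python
def pixel_deficit(res_a, res_b):
    """Fraction of pixels res_a renders short of res_b."""
    (wa, ha), (wb, hb) = res_a, res_b
    return 1 - (wa * ha) / (wb * hb)

print(f"{pixel_deficit((1280, 720), (1920, 1080)):.0%}")  # 56% fewer pixels
print(f"{pixel_deficit((1600, 900), (1920, 1080)):.0%}")  # 31% fewer pixels
```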
 
So Series X uses mesh shaders and PS5 uses primitive shaders?
Whats the difference between the two?
It comes down to how they are invoked, how they do it, and what areas of the geometry pipeline they oversee.
I would say mesh shaders can be invoked in several ways and can touch a larger part of the geometry pipeline, but they require explicit programming; the mesh shader pipeline is entirely separate.
I would say primitive shaders touch less of the geometry pipeline, and their invocation options are fewer, but they do not require explicit programming (although they can be), and they run on the traditional vertex pipeline.

I would say having both is ideal, I suppose. For teams that can't, maybe you'd lean on the geometry engine to do the conversion to primitive shaders for you. For teams that can and are looking for more, mesh shaders have some additional available options.
 
XBO only shipped with DX11.
They later had to release DX11 Fast Extensions to make up some ground, but it's still largely a DX11 structured API with some lower level access.

Then they released DX12 a year after or so, and made it mandatory the year after and deprecated DX11 Fast extensions.
AFAIK DX11 FE and DX12 are not the same.
DX12 on Xbox also still contains X1 extensions, in particular for memory management and features that are specific to the console.
But on the whole, the API structure is completely different from DX11.
I would fully disagree that DX12 was based on the work done with DX11 Fast Extensions as 12 was made in conjunction largely with Nvidia, AMD, Intel, and other vendors.

As for the gap, they did make up a gap. In the end the XBO held at a 30% resolution differential, as per their hardware choices. It started off with the XBO being a 720p console and the PS4 a 1080p console. Almost all launch titles bar Forza 5 launched at 720p on Xbox, if I recall correctly. Ghosts, BF and Dead Rising come to mind, with Ryse: Son of Rome at 900p.

Both consoles have largely kept this gap through the rest of the generation despite the increasing complexity and maturity of the titles.

Yes... only in 2014, with the Mono driver, was low-level access added. The first batch of games was pure DX11.
 
Yes... only in 2014, with the Mono driver, was low-level access added. The first batch of games was pure DX11.
Right, the Mono driver.

Very far from DX12.


https://www.eurogamer.net/articles/digitalfoundry-2014-directx-12-revealed-coming-to-xbox-one
While the broad feature-set of DX12 looks very much like a response to the Mantle initiative, Nvidia claims that it began discussions with Microsoft on the subject four years ago, with direct work on the API starting last year. Nvidia itself has provided an initial driver to allow developers to get to grips with the new technology. In theory, any DX11 graphics card should work with DX12 - Nvidia itself has confirmed that anything from the "Fermi" 400 series onwards should work.

The Forza 5 demo was the star of the presentation, designed to show how console-level efficiency is possible on the PC. Bizarrely, according to Nvidia, the demo ran on Titan Black hardware - the most powerful single-chip graphics card on the market, and possibly not quite the best hardware to demonstrate an efficient console port. According to Turn 10, the conversion from Xbox One's DX11.x API to an alpha version of DX12 took four man-months to achieve, with some features of the existing console API migrating across to PC, while other elements (thanks PC Perspective for the shot) - such as "pipeline state objects" and the "resource binding model" will make their way across to Xbox One.
Those are pretty key elements that define DX12.
 
Are you denying that Microsoft stated that DX12 was based on the work done on the Mono driver for the Xbox One?
Are you serious?

While the broad feature-set of DX12 looks very much like a response to the Mantle initiative, Nvidia claims that it began discussions with Microsoft on the subject four years ago, with direct work on the API starting last year. Nvidia itself has provided an initial driver to allow developers to get to grips with the new technology.

I just need to be clear on one thing.
How do you think MS was able to put microcode supporting PSO swapping with ExecuteIndirect (which is a DX12 feature and doesn't work with DX11) into the command processor of the XB1, without penalty, well in advance of deploying the XBO? This particular feature is found in Nvidia cards from the Kepler architecture forward.

I hope the assumption is not that everything MS does revolves around the console. There are other roadmaps MS must follow to support the other vendors like Nvidia and AMD; the work on DirectX is likely not done or complete. You are likely to see even more evolution of DX12 Ultimate and beyond in the coming years, and it's more than possible that the Series consoles will support it, because they've been planning these things for years. Nvidia has been DX12U compatible since Turing. It's not new for them; just because the consoles baseline at DX12U doesn't mean Nvidia will stop there.

They definitely will keep pushing and lobbying for more support of what nvidia and amd are doing. They can't stop at the console space.
 
Are you serious?

While the broad feature-set of DX12 looks very much like a response to the Mantle initiative, Nvidia claims that it began discussions with Microsoft on the subject four years ago, with direct work on the API starting last year. Nvidia itself has provided an initial driver to allow developers to get to grips with the new technology.

I just need to be clear on one thing.
How do you think MS was able to put microcode supporting PSO swapping with ExecuteIndirect (which is a DX12 feature and doesn't work with DX11) into the command processor of the XB1, without penalty, well in advance of deploying the XBO? This particular feature is found in Nvidia cards from the Kepler architecture forward.

I hope the assumption is not that everything MS does revolves around the console. There are other roadmaps MS must follow to support the other vendors like Nvidia and AMD; the work on DirectX is likely not done or complete. You are likely to see even more evolution of DX12 Ultimate and beyond in the coming years, and it's more than possible that the Series consoles will support it, because they've been planning these things for years. Nvidia has been DX12U compatible since Turing. It's not new for them; just because the consoles baseline at DX12U doesn't mean Nvidia will stop there.

They definitely will keep pushing and lobbying for more support of what nvidia and amd are doing. They can't stop at the console space.

All I have to say is that Microsoft created the DirectX 11.x API for Xbox with low level extensions. At the time they claimed that the work on those extensions was the foundation of the work created for DX12. DirectX 11.x was not DX12, and was limited on what it could do, but it was the starting point for DX12.
 
All I have to say is that Microsoft created the DirectX 11.x API for Xbox with low level extensions. At the time they claimed that the work on those extensions was the foundation of the work created for DX12. DirectX 11.x was not DX12, and was limited on what it could do, but it was the starting point for DX12.
I'm not going to say that some of that work on 11.x didn't find itself in 12 because that would likely not be true and just a matter of perspective not worth discussing.

But I'm saying they already knew they were moving towards DX12. They can market it any way they'd like for the sake of giving it a positive vibe. From my perspective, you don't make, deploy, and then turn over an API all in the span of a single year. At best the DX11.x Mono driver was a milestone release on their way to deploying DX12, with the latter two features (PSO and resource binding changes) making up the bulk of the major changes that identify DX12.

The XBO was a rushed product and nearly didn't make it out the door on time; this I know from my personal connections. There's no doubt that ideally they would have shipped with the Mono driver as some interim step to DX12, but they couldn't manage that either. They'd completely fallen behind on so much because the OS and Kinect reservation was not working for them at all; all demos prior to release day were entirely faked. Nothing worked until about a week or two before, if I recall my timelines right. They had project-planned everything under the assumption that there would be no way the PS4 would have 8 GB of GDDR5. They were counting on 4 GB.

DX12, and their BC support had been baked into their hardware well before the software followed on xbox. They reprioritized the entire strategy after getting their ass kicked.
 
Weird that they did not point out that the lighting seems to be different on PS5 in comparison to Xbox. Not the first time they've skipped such differences.
PS5 games are often too dark; I remember DF mentioning it in Control. And back to the Outriders video, it seems a pattern is emerging: even though the XSX has higher resolution, the PS5 has more stable fps.
 


*XBSS performance is quite bad in most combat scenes.
*XBSX has a higher internal dynamic resolution but at the cost of performance.
*PS5 is pretty much locked at 60fps and has a bug-related issue with flag physics.

Article: Outriders first look: 60fps is the key upgrade for PS5 and Xbox Series consoles
They missed the visibly higher resolution of some textures on the PS5. It's not AF, because the wood walls also look higher-res on the PS5.

What's the point of using a slightly higher resolution if the game actually looks lower-res than on the PS5? I do wonder if this game is using VRS on the Xbox Series. Those blurry textures on the XSX remind me of Dirt 5.

hKkt3k3.png


Source: VGTech screenshots; here is his YouTube video.
 
They missed the visibly higher resolution of some textures on the PS5. It's not AF, because the wood walls also look higher-res on the PS5.

What's the point of using a slightly higher resolution if the game actually looks lower-res than on the PS5? I do wonder if this game is using VRS on the Xbox Series. Those blurry textures on the XSX remind me of Dirt 5.

hKkt3k3.png


Source: VGTech screenshots; here is his YouTube video.
That's a really awkward picture. I almost see a pure line of delineation between sharp and unsharp on the XSX beginning right at the bushes. I don't think that's a VRS feature, at least not typically how it would be done. I also see the same thing on the PS5, but it's not as strong. So I'm going to go out on a limb and say their AF system or texture resolution differs between the two.
 
That's a really awkward picture. I almost see a pure line of delineation between sharp and unsharp on the XSX beginning right at the bushes. I don't think that's a VRS feature, at least not typically how it would be done. I also see the same thing on the PS5, but it's not as strong. So I'm going to go out on a limb and say their AF system or texture resolution differs between the two.
It's not AF. The non-oblique textures also look lower-res on the XSX. The fact that it's visible only in some textures and not others makes me think it's VRS.

PFkH3Ae.png


By the way, some textures also looked lower-res in Watch Dogs. It's now the third game showing these blurry textures on the XSX: Dirt 5, Watch Dogs and now Outriders.
 
It's not AF. The non-oblique textures also look lower-res on the XSX. The fact that it's visible only in some textures and not others makes me think it's VRS.

PFkH3Ae.png


By the way, some textures also looked lower-res in Watch Dogs. It's now the third game showing these blurry textures on the XSX: Dirt 5, Watch Dogs and now Outriders.
Watch Dogs doesn't use VRS (confirmed by the graphics config files for all platforms), so it's probably something else. WDL also uses image reconstruction, like Outriders.

They missed the visibly higher resolution of some textures on the PS5. It's not AF, because the wood walls also look higher-res on the PS5.

What's the point of using a slightly higher resolution if the game actually looks lower-res than on the PS5? I do wonder if this game is using VRS on the Xbox Series. Those blurry textures on the XSX remind me of Dirt 5.

hKkt3k3.png


Source: VGTech screenshots; here is his YouTube video.

IMO the XSX looks blurry due to the reconstruction; see the additional sub-pixel breakup in the grass. Perhaps the screenshot was taken in motion on the XSX and not on the PS5.

Outriders doesn't have a VRS option on PC, so I doubt they implemented it only on the Series consoles.
 
It's not AF. The non-oblique textures also look lower-res on the XSX. The fact that it's visible only in some textures and not others makes me think it's VRS.

PFkH3Ae.png


By the way, some textures also looked lower-res in Watch Dogs. It's now the third game showing these blurry textures on the XSX: Dirt 5, Watch Dogs and now Outriders.
There's no indication of VRS here; the boundary is too straight and uniform. Tier 1 was deprecated. Blurrier textures may be a sign of lower-quality textures, but I wouldn't go with VRS. It doesn't match the profile here.
 