Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Until somebody who actually knows the details shares them, we do not really know what the PS5 lacks from or adds to RDNA2. It's fun to speculate, but it is what it is.
 
You should read the full interview with DF. DF rightly said there are legal issues with making a false statement, so Xbox, being the only console with full RDNA2, has to respect the law, and perhaps AMD even had a say in that messaging. Xbox didn't lie and just hoped Sony wouldn't talk about it. In fact, Sony could sue them if it were a lie.

Please quote the user claiming that Microsoft is lying instead of me.
I don't want to be associated with such claims.
 
Ahh.. VRS vs GE. The debate that never dies.
In time, when devs are more willing to talk and less afraid of NDAs, I will make sure I get off-the-record quotes from them regarding the actual feature set of the consoles. We have already heard quite a bit; it is just a matter of finding when to talk about it, I guess.

Regardless of what devs say, I think it makes more sense if the XSX GPU has features from RDNA 2 that the PS5 GPU does not (VRS, SFS, and mesh shaders being an advancement on primitive shaders). Xbox Series X having a bigger, more feature-rich GPU would align with the timetables we know about for this gen: that the PS5 was originally going to launch in 2019, and that the Xbox Series X was waiting on silicon and hardware for longer and as a result sent its dev kits out much later. It is also presumably why the PS5 development environment and SDK are much more performant, stable, and user-friendly than those for the XSX, as they have had more time with near-final hardware.
 
PS5 not having VRS would not result in them missing much, I think. In the age of dynamic resolution and checkerboard rendering, you can just lower the resolution by 5% and you're good to go.
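(Rough math: a 5% cut on each axis means shading 0.95² ≈ 90% of the pixels, so about a 10% saving, which is the same ballpark as the gains usually quoted for Tier 2 VRS.)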

Mesh shaders as well; again, I'm not sure Sony wanted to wait for that if the GE does its job well enough. Geometry processing is a smaller part of the rendering budget anyway, and I'm certain the difference between the two is negligible.

In any case, they had to lock the design at some point, and these two features probably don't warrant a delay.

Now maybe Dictator is referring to some other surprise when he says "we have already heard quite a bit". :)
 
VRS can be done in software, and was even done on PS4/One, so really we are splitting hairs here. Mesh vs Geometry yadda yadda, if the results are that games look and run fantastic on both, which they will, then really who the fk cares!

The whole Checkbox Feature Wars are sooooo 2010!
 
VRS can be done in software, and was even done on PS4/One, so really we are splitting hairs here. Mesh vs Geometry yadda yadda, if the results are that games look and run fantastic on both, which they will, then really who the fk cares!
It can be done in software, as shown in one very game-specific example in Call of Duty: Modern Warfare, but as Drobot's excellent presentation on it showed, that implementation took a lot of time to get working and had many engine-design prerequisites to make it possible. So that software implementation is not generalisable to other game engines and was labour-intensive. A reason why a hardware implementation with an API is such a benefit for VRS is that it is engine-agnostic for the most part, and the VRS API means very little implementation time is required. In Tier 2 the dev essentially just feeds the API a VRS texture in a call and that is it; the hw does the rest.
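On the PC side it looks something like this (a minimal sketch; my local names are illustrative, but the ID3D12 calls are the actual API):

```cpp
// Sketch only: assumes a command list already recording and a rate image
// created elsewhere (R8_UINT, one texel per hardware tile, transitioned to
// D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE beforehand).
#include <d3d12.h>

void SetTier2VRS(ID3D12GraphicsCommandList5* cl, ID3D12Resource* rateImage)
{
    // Combiner 0 merges the base rate with any per-primitive rate;
    // combiner 1 lets the screen-space rate image override the result.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_OVERRIDE,
    };
    cl->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cl->RSSetShadingRateImage(rateImage);
    // ...then record draws as normal; per tile, the hardware coarsens
    // pixel-shader invocations (e.g. 2x2 = one shade shared by 4 pixels).
}
```

That really is most of it; the interesting work is in deciding what goes into the rate image.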
 
You should read the full interview with DF. DF rightly said there are legal issues with making a false statement, so Xbox, being the only console with full RDNA2, has to respect the law, and perhaps AMD even had a say in that messaging. Xbox didn't lie and just hoped Sony wouldn't talk about it. In fact, Sony could sue them if it were a lie.

Well, the PS5 specs read RDNA2 without any qualifications, so better sue them about it ;).

In all seriousness, we're going to know soon enough if there's a difference and what it is. There are so many factors at play here that we still barely have a clue about.

The comment was very clear that Sony made changes, and that we would see them in AMD cards releasing around the PS5 launch if AMD thought they were good. But we still don't know what those changes are. Or do we?

Then there is the overhead of using Microsoft's new cross-gen development platform, which has been rumoured to be around 15%, whatever that means. We also don't know how much memory use or LOD detail depends on load times from the SSD, versus the clock differences, versus the much higher CU count...

It is fun to speculate, but we are probably lucky if we know everything even after a year of head-to-head comparisons. :D
 
In time, when devs are more willing to talk and less afraid of NDAs, I will make sure I get off-the-record quotes from them regarding the actual feature set of the consoles. We have already heard quite a bit; it is just a matter of finding when to talk about it, I guess.
Please! :yes: Or this will become this generation's version of the PS4 CU madness. :runaway:
 
I mean, he does....
And again, regarding wasting cycles, each cycle on Series X is capable of more compute than each cycle on PS5. So if you are trying to optimize for Series X, it seems reasonable that the more cycles you save, the more compute you get back. Well, I guess that's true on PS5 also, since compute is the width of your pipeline multiplied by your clock rate, so you would still be saving compute. I don't know, maybe I'm missing something. Maybe the GE is something completely new.

You say he knows what he's talking about and then you go on to explain how he doesn't. LOL
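For reference, the arithmetic being gestured at, using the public specs:

FLOPS = CUs × 64 lanes × 2 ops per lane (FMA) × clock
Series X: 52 × 64 × 2 × 1.825 GHz ≈ 12.15 TF
PS5: 36 × 64 × 2 × 2.23 GHz ≈ 10.28 TF

A saved cycle is indeed worth more FLOPs on the wider chip, but as the quoted post concedes halfway through, both machines get compute back for every cycle saved, so the point doesn't actually distinguish them.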
 
I've seen lots of people saying that the dev kits on the Xbox side arrived much later than the PlayStation ones. Reading between the lines, the Xbox ones arrived early this year and the PlayStation ones sometime last year? Does that seem right, or is it more drastic than that?

EDIT:
Thinking about it, the Xbox devkits must have been much later, like May/June? In the Bloomberg article 'Xbox Series X is the first video game console born in a pandemic' (link at end), they interview Alan Hartman, who runs Turn 10, and he said that they developed a tool to run early versions of the new Forza on the current Xbox One systems. And that was in the last couple of months! I find it hard to believe that Microsoft's premier tech studio would have trouble getting dev kits.

link:
https://www.bloomberg.com/news/arti...e-first-video-game-console-born-in-a-pandemic
 
PS5 has RDNA5. That's just as realistic as RDNA3 in 2022. I think those who believe in RDNA3 should rewatch the deep dive by Cerny and what he says determines a successful partnership with AMD.

Quoting myself:

Okay, so this is what I think has actually been the fruit of Sony and AMD's partnership. It's nothing to do with RDNA3; it's the variable-frequency setup. I don't know if people noticed, but AMD has a very similar variable power-budget-sharing setup that PC users will be able to use with their Ryzen CPUs and RDNA 2 GPUs. The Boost and Game clocks are similar to Sony's PS5 GPU clocks, while the Base clock is similar to MS's Series X GPU clock. Sony apparently had been aiming for high(er) clocks for a long time, and the one persistent thing across the Ariel and Oberon revisions that kept increasing was the GPU clock speed (just as an example).

I think all of those things fit together pretty well. Also, there's a chance that the RDNA 2 PC GPUs have cache scrubbers for the Infinity Cache? That's a slight possibility; I think I recall some folks saying they could serve a purpose there. Then again, I've also seen others pointing out the complications/drawbacks of cache scrubbers; both sides likely have their points of merit, but it's a bit beyond me. Anyhow, seeing as it's RDNA2 GPUs launching around the time of these consoles, I think Sony's involvement with AMD revolving largely around some sort of advanced variable-frequency implementation (and possibly cache scrubbers for Infinity Cache, to a lesser extent) makes the most sense.

I've seen lots of people saying that the dev kits on the Xbox side arrived much later than the PlayStation ones. Reading between the lines, the Xbox ones arrived early this year and the PlayStation ones sometime last year? Does that seem right, or is it more drastic than that?

It's a bit more than that. The reason MS's devkits have been late is that they're integrating everything into Gamecore. Apparently Series S didn't even have its own devkit for a while; devs were just making do by setting Series S profiles on Series X devkits, but those likely weren't optimal.

By now I believe those transitional issues have been taken care of, but it's been cutting things very close. I assume they're mostly sorted because Capcom, for example, were saying DMC5 on Series X wouldn't even have RT at launch, yet they managed to get it in, so they might've gotten some kind of devkit or SDK update within the past month or so.

Yeah this guy doesn't know what he's talking about. Blast processing indeed.

This might surprise some, but Blast Processing was actually real! The problem is that for most commercial Mega Drive games, figuring out the timing for DMA access to memory was incredibly difficult, and if the timings were off your game would end up looking like a hot mess. So it wasn't really used in the vast majority of commercial titles.

Some homebrew efforts have utilized it, however; I think it was maybe a couple of years ago that some homebrew hackers actually figured out the timings. It'd be cool to see some new games on the platform target that untapped potential and implement "blast processing" for real.
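For anyone curious, kicking off one of those DMAs looks roughly like this. This is a sketch from memory of the public VDP docs, not verified against hardware, and it assumes DMA has already been enabled in VDP register 1 (the M1 bit):

```cpp
// Hypothetical Mega Drive 68k-to-VRAM DMA kickoff. Register numbers are the
// standard VDP ones, but treat the whole thing as illustrative.
#include <stdint.h>

#define VDP_CTRL (*(volatile uint16_t*)0xC00004)  // VDP control port

static void vdp_reg(uint8_t reg, uint8_t val) {
    VDP_CTRL = 0x8000 | ((uint16_t)reg << 8) | val;  // register-set command
}

// Copy `len` words from 68k address space to VRAM at `dest` via DMA.
void vdp_dma_to_vram(uint32_t src, uint16_t dest, uint16_t len) {
    vdp_reg(0x0F, 2);                     // autoincrement: 2 bytes per write
    vdp_reg(0x13, len & 0xFF);            // DMA length, low byte
    vdp_reg(0x14, len >> 8);              // DMA length, high byte
    src >>= 1;                            // source address is in words
    vdp_reg(0x15, src & 0xFF);
    vdp_reg(0x16, (src >> 8) & 0xFF);
    vdp_reg(0x17, (src >> 16) & 0x7F);    // top bit clear = 68k->VRAM copy
    VDP_CTRL = 0x4000 | (dest & 0x3FFF);  // command word 1: VRAM write
    VDP_CTRL = 0x80 | ((dest >> 14) & 3); // command word 2: CD5 set, DMA fires
}
```

The hard part the post describes isn't this setup; it's knowing when in the frame it's safe to fire it (vblank, hblank, mid-line) and how much you can move in that window.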
 
I've seen lots of people saying that the dev kits on the Xbox side arrived much later than the PlayStation ones. Reading between the lines, the Xbox ones arrived early this year and the PlayStation ones sometime last year? Does that seem right, or is it more drastic than that?

It all depends on what may or may not have an impact.

The closest-to-retail devkits came sometime after the June 2020 GDK release. Earlier hardware had differences in some areas; the last revision, set for summer 2020, was to address some aspects that prevented decompression speeds from being what they should be.

However, Microsoft provided remote devkits that developers could use earlier in the year, so instead of requiring a devkit for every dev working from home, a company could make use of devkits hosted in Azure.

Generally speaking, though, both Microsoft and Sony have in the past given devs PC target specs to aim for during early next-gen development, so the industry is used to not having final hardware to develop against until later.
 
From a raw TF standpoint, the XSX is 20% faster than the PS5. If the PS5 doesn't have VRS (which I suspect it doesn't, because they never talk about it), then it loses about another 10%. That's a 30% deficit, which is not nothing. It's also not enough to make PS5 games look bad, though. I'm sure HZD2 will impress, for instance.

X fans think Sony has RDNA 1.5 and Sony fans think they have RDNA 2.5. IMO Sony has RDNA 1.8 or 1.9. Time will tell.
 
Perhaps the shading rate in the PS5 depends directly on geometry LOD, and both depend on Z values.

Couldn't that cause issues, though? You'd still have to calculate and cull the geometry before determining the shading rate, right? Not only that, but geometry LOD may not always correlate 1:1 with texture quality. A surface could have very simple geometry but still require high-quality shaded output: maybe the object in question is purely decorative and no player collision is expected, but it's close enough to the viewer in the frame to need a higher resolution, even if the geometry it's mapped over is much lower detail.

So wherever VRS falls in the pipeline, my own personal guess is that it can't be that early. On top of that, just asking here: does RT BVH traversal testing use the actual geometry of the model being tested, or is it literally just a bounding box, and that's as far as the collision volume goes for running the traversal test? I assumed the object geometry would play a part, but then again I'm not really read up on how RT intersection tests work at the technical level beyond some of the basics (or how AMD in particular does them, though that's an aside).
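EDIT: partially answering my own question after some reading: it's both. In a typical BVH the interior nodes are just boxes; a ray walks down the tree via cheap box tests and only intersects actual triangles once it reaches a leaf (RDNA 2's ray accelerators apparently do both the box and the triangle tests in hardware, with the traversal loop itself running on the shaders). A generic CPU-side sketch of the idea, nothing vendor-specific:

```cpp
// Illustrative BVH traversal, not any vendor's actual implementation.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    float operator[](int i) const { return i == 0 ? x : (i == 1 ? y : z); }
};
struct Ray  { Vec3 o, d; };              // origin + direction
struct Tri  { Vec3 a, b, c; };
struct AABB { Vec3 lo, hi; };
struct Node {                            // left < 0 marks a leaf
    AABB bounds;
    int left = -1, right = -1;
    int firstTri = 0, triCount = 0;
};

static Vec3  sub(Vec3 p, Vec3 q)   { return {p.x - q.x, p.y - q.y, p.z - q.z}; }
static Vec3  cross(Vec3 p, Vec3 q) { return {p.y*q.z - p.z*q.y, p.z*q.x - p.x*q.z, p.x*q.y - p.y*q.x}; }
static float dot(Vec3 p, Vec3 q)   { return p.x*q.x + p.y*q.y + p.z*q.z; }

// Slab test: one cheap box check can reject an entire subtree.
static bool hitBox(const Ray& r, const AABB& b) {
    float t0 = 0.f, t1 = 1e30f;
    for (int i = 0; i < 3; ++i) {
        float inv = 1.f / r.d[i];
        float ta = (b.lo[i] - r.o[i]) * inv, tb = (b.hi[i] - r.o[i]) * inv;
        if (ta > tb) std::swap(ta, tb);
        t0 = std::max(t0, ta); t1 = std::min(t1, tb);
        if (t0 > t1) return false;
    }
    return true;
}

// Moller-Trumbore: the only place real triangle geometry gets tested.
static bool hitTri(const Ray& r, const Tri& t, float& tHit) {
    Vec3 e1 = sub(t.b, t.a), e2 = sub(t.c, t.a);
    Vec3 p = cross(r.d, e2);
    float det = dot(e1, p);
    if (std::abs(det) < 1e-8f) return false;   // ray parallel to triangle
    float inv = 1.f / det;
    Vec3 s = sub(r.o, t.a);
    float u = dot(s, p) * inv;
    if (u < 0.f || u > 1.f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.d, q) * inv;
    if (v < 0.f || u + v > 1.f) return false;
    float tt = dot(e2, q) * inv;
    if (tt <= 0.f || tt >= tHit) return false;
    tHit = tt;                                 // keep the nearest hit
    return true;
}

static bool traverse(const std::vector<Node>& nodes, const std::vector<Tri>& tris,
                     int idx, const Ray& r, float& tHit) {
    const Node& n = nodes[idx];
    if (!hitBox(r, n.bounds)) return false;    // box-only test at interior nodes
    if (n.left < 0) {                          // leaf: test the triangles
        bool hit = false;
        for (int i = 0; i < n.triCount; ++i)
            hit |= hitTri(r, tris[n.firstTri + i], tHit);
        return hit;
    }
    bool hit = traverse(nodes, tris, n.left, r, tHit);
    hit |= traverse(nodes, tris, n.right, r, tHit);
    return hit;
}
```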
 
Just a question about Tier 2 VRS. I watched the Digital Foundry video on Gears 5 and it left me wondering: if I understood it correctly, Tier 2 VRS uses an edge-detection algorithm in part to determine which regions need full resolution before the image is shaded. But how does it do edge detection on an unrendered image? Or is the edge detection against the geometry only? And if it only edge-detects geometry, how does it account for complex images on a piece of flat geometry, like a poster on a wall? It wouldn't know not to decrease the shading resolution there.
I think I might be missing something.
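EDIT: I think I found my answer, so leaving it here. As I understand it, the usual trick (and what Gears appears to do) is to run the edge detect on the previous frame's output rather than the frame about to be rendered, since consecutive frames are similar enough. That also covers the poster-on-a-wall case: the poster's detail shows up as luma contrast in last frame's image, so those tiles keep full rate even though the geometry is flat. A toy CPU sketch of the idea (tile size and thresholds are made up; the rate codes follow the D3D12 encoding):

```cpp
// Toy sketch: build a Tier 2 shading-rate image from the *previous* frame's
// luma. High-contrast tiles keep full rate; flat tiles get coarsened.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

constexpr int TILE = 8;  // assumed tile size; real hardware reports 8 or 16

// D3D12 shading-rate encodings ((log2 x << 2) | log2 y):
enum : uint8_t { RATE_1X1 = 0x0, RATE_2X2 = 0x5, RATE_4X4 = 0xA };

// w and h are assumed to be multiples of TILE for brevity.
std::vector<uint8_t> BuildRateImage(const std::vector<float>& luma, int w, int h)
{
    const int tw = w / TILE, th = h / TILE;
    std::vector<uint8_t> rates(static_cast<size_t>(tw) * th);
    for (int ty = 0; ty < th; ++ty)
        for (int tx = 0; tx < tw; ++tx) {
            float maxDelta = 0.f;  // crude stand-in for a Sobel response
            for (int y = ty * TILE; y < (ty + 1) * TILE - 1; ++y)
                for (int x = tx * TILE; x < (tx + 1) * TILE - 1; ++x) {
                    float dx = std::abs(luma[y * w + x + 1] - luma[y * w + x]);
                    float dy = std::abs(luma[(y + 1) * w + x] - luma[y * w + x]);
                    maxDelta = std::max({maxDelta, dx, dy});
                }
            // Made-up thresholds: edges/text keep 1x1, flat areas drop rate.
            rates[ty * tw + tx] = maxDelta > 0.10f ? RATE_1X1
                                : maxDelta > 0.03f ? RATE_2X2 : RATE_4X4;
        }
    return rates;
}
```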
 
it's 10% of 20%, not 20% + 10%.

Uhm ... The VRS percentage would apply across the entire TF range, not only the extra performance range. But the way they arrived at their number wasn't correct either.

Showing the work:

12.155 TF (Series X) × 1.10 (10% VRS benefit) ≈ 13.37 TF effective
13.37 TF ÷ 10.28 TF (PS5) ≈ 1.30, i.e. ~30% higher
 
However, Microsoft provided remote devkits that developers could use earlier in the year, so instead of requiring a devkit for every dev working from home, a company could make use of devkits hosted in Azure.
I thought they made it so you could develop remotely on dev kits in the office, not on ones in Azure.

From what has been said, the first XSX blades only came off the assembly line a couple of months ago, I believe.

I agree that devs don't necessarily need the hardware; it's one of the reasons I lean towards GDK and Gamecore being the issue. Especially considering DX12 isn't new, only the Ultimate additions and DirectStorage are, so DX12 by default shouldn't cause a problem in itself.
 