Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
Hmmm. What exactly in Navi10 is GCN that'll help compatibility? It's using RDNA compute units, so that part is already completely different.

From the whitepaper:

The RDNA architecture has two modes of operation for the LDS, compute-unit and workgroup-processor mode, which are controlled by the compiler. The former is designed to match the behavior of the GCN architecture and statically divides the LDS capacity into equal portions between the two pairs of SIMDs. By matching the capacity of the GCN architecture, this mode ensures that existing shaders will run efficiently.

I don’t know what the implications are for how they schedule instructions as Navi issues every other cycle versus every 4 for GCN, unless they can add some NOPs in the BC modes ( @BRiT :p)?

The rest of the changes to the cache hierarchy may be moot - more cache hits/bandwidth?
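As a back-of-the-envelope sketch of the LDS split the whitepaper describes (assuming the public RDNA figure of 128 KB of LDS per workgroup processor; the function name is mine, purely for illustration):

```python
# Sketch of the two RDNA LDS operating modes from the whitepaper.
# Assumes 128 KB of LDS per workgroup processor (WGP), i.e. per CU pair.

LDS_PER_WGP_KB = 128

def lds_capacity_kb(mode: str) -> int:
    """LDS capacity visible in each mode (illustrative only)."""
    if mode == "cu":
        # Compute-unit mode: LDS statically divided into equal portions,
        # one per pair of SIMDs -- matching GCN's 64 KB per CU, which is
        # why existing (GCN-targeted) shaders keep running efficiently.
        return LDS_PER_WGP_KB // 2
    elif mode == "wgp":
        # Workgroup-processor mode: the full LDS is addressable.
        return LDS_PER_WGP_KB
    raise ValueError(f"unknown mode: {mode}")

print(lds_capacity_kb("cu"))   # 64, same as a GCN CU
print(lds_capacity_kb("wgp"))  # 128
```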


A BetaBoost option at 2GHz would be cute. :eek:
 
Last edited:
For MS, it's not decades of habit but a conscious decision made with XBO's design. The XBox was the box designed to run Direct X, little more than a PC in architecture. Emulation should therefore be 'easy' like on PC, but it isn't and XBO's selection of OXB games is as weak as the PS2 library on PS4. 360 emulation is achieved through bloody good emulation and ongoing work to make it happen. Choices for XBO to use VMs etc. will serve them well for BC, absolutely, but it's not a 'corporate DNA' type thing.

Microsoft's number one goal is maintainability of software, and the fundamental design philosophy behind this is dependency inversion. Reusing software libraries and frameworks, and building software on abstracted interfaces (application/OS API, software/hardware API), is how Microsoft does everything; it's not a conscious decision at all, because this is the only way Microsoft designs software. You will not find a single Microsoft product released since the 1990s that isn't built using this design philosophy. It absolutely is habit and part of their design DNA. It is also why Microsoft has their deserved reputation for legacy support, but also why Microsoft applications tend to struggle on non-Windows platforms. They can't not think and design like this, and it can conflict when their applications are required to run on any technology base that they didn't design top-to-bottom.
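As a toy illustration of that dependency-inversion style (every name here is made up for the sketch, not any real Microsoft or AMD API): the calling code depends only on an abstract interface, so the concrete hardware backend can be swapped underneath without the caller changing.

```python
# Toy dependency-inversion sketch: the "game" depends on an abstract
# interface, never on a concrete GPU. All names are illustrative only.
from abc import ABC, abstractmethod

class GpuBackend(ABC):
    @abstractmethod
    def draw(self, verts: int) -> str: ...

class GcnBackend(GpuBackend):
    def draw(self, verts: int) -> str:
        return f"GCN drew {verts} verts"

class RdnaBackend(GpuBackend):
    def draw(self, verts: int) -> str:
        return f"RDNA drew {verts} verts"

def render_frame(gpu: GpuBackend) -> str:
    # The caller never names a concrete architecture; swapping the
    # backend requires no change here.
    return gpu.draw(300)

print(render_frame(GcnBackend()))
print(render_frame(RdnaBackend()))
```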

The problem for Sony seems to be the GPU architecture changing from GCN to RDNA and working fundamentally differently. In theory, PS4 games could be rebuilt for RDNA if Sony implements the libraries on it.

Microsoft face the same challenge. The problem for Sony is their high-performance, super-thin API (GNM), which we know from dev interviews can be as simple as jamming values into the GPU's command registers. This is not good for forward compatibility. If there is no layer to abstract what the software wants the hardware to do, there is no layer to intercede when the hardware changes. This is a lesson that Microsoft learned when hardware changed every six months in the 1970s/80s.
 
Does full RDNA not support this then? That paper suggests all RDNA GPUs can operate with either mode.

Microsoft's number one goal is maintainability of software code and the fundamental design philosophy to this is dependency inversion...They can't not think and design like this...
Why doesn't XB360 have full OXB BC then? Why doesn't XBO have full BC with OXB and 360? Why isn't the hardware of these platforms as abstracted as it is in Windows?

If you look at Windows, MS is basically forced into creating legacy compatibility by their users. They've tried to introduce new ways of doing things (UWP) but the market just hasn't adopted it, so now they're embracing official support for legacy concepts. For their consoles, they didn't have to maintain compatibility so they didn't implement it. They didn't design OXB's software stack such that it'd be portable to the next console. They didn't design 360's software stack such that it'd be portable to the next console. Heck, even DirectX breaks things when it gets revisions and it's down to drivers to patch it all together. It's only XBO where they realised they'd want to take software forwards that they implemented suitable abstraction. XBO actually presented a unique opportunity for MS to create a software interface that's not tied to 1990s software and because of this, they could and did use the VM route.

Microsoft face the same challenge. The problem for Sony is their high-performance, super-thin API (GNM), which we know from dev interviews can be as simple as jamming values into the GPU's command registers. This is not good for forward compatibility. If there is no layer to abstract what the software wants the hardware to do, there is no layer to intercede when the hardware changes. This is a lesson that Microsoft learned when hardware changed every six months in the 1970s/80s.
Does Navi 10 maintain the same register setup as GCN? If not, using Navi 10 doesn't help Sony with GCN compatibility, so if they have BC running on RDNA, why couldn't they get it working on Navi 10, Navi 20, or whatever?

Would a possible solution be to implement a hardware mapping, having AMD include translation silicon that takes register references and maps them to the actual hardware? In effect, a driver in hardware seeing as there's only one driver abstraction that's necessary.
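A hypothetical sketch of what such a translation layer might do (the register offsets below are invented for illustration; they are not real GCN or RDNA offsets):

```python
# Hypothetical register-remapping "driver in hardware": old (GCN-era)
# register offsets get translated to the new hardware's offsets before
# a write lands. All offsets here are invented for illustration.

REMAP = {
    0x1000: 0x2400,  # e.g. a command register that moved
    0x1004: 0x2408,
    0x1010: 0x2500,
}

new_regs = {}  # stand-in for the new chip's register file

def legacy_write(old_offset: int, value: int) -> int:
    """Route a legacy register write through the translation table.

    Returns the real offset that was written, so the caller can see
    whether a remap occurred."""
    real = REMAP.get(old_offset, old_offset)  # unmapped regs pass through
    new_regs[real] = value
    return real

print(hex(legacy_write(0x1000, 0xAB)))  # remapped to 0x2400
print(hex(legacy_write(0x9999, 0x01)))  # unmapped: passes through as 0x9999
```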

Basically, I think the argument that PS5 is using Navi 10 needs a solid explanation why Navi 10 is necessary for PS4 BC.
 
The lack of raytracing tests confuses me, as the GPU 100% has it.

The lack of RT can be explained on many levels though.

folks will believe what they want to believe.

True, same for you :) What if the documents said 13TF?

it isn't and XBO's selection of OXB games is as weak as the PS2 library

I would like to see OG Xbox BC; there are no mature emulators for the OG Xbox the way the GC and PS2 have. It would be a way to preserve the OG Xbox games.
 
Who's Nate and how does he have intel on PS5 TFs? He mentions that the information tying the recent Oberon test to a PS5 devkit is inaccurate, and claims the TFs are a little higher than 10.5. Maybe the dude is parroting what Klee and Jason have been rumoring on Era.
At the 20min mark.
His channel Spawn Wave does have 373k subs tho.
 
Navi 10, like the first GPU in every AMD family released, only means it's the first one, nothing more, nothing less.

Before Navi, the x10 parts were generally the bigger chips, but if the 2018 rumor about AMD putting a whole lot of people on a Navi created for Sony is true, then Navi 10 in Sony's case just means it's the first one to be designed.

It doesn't say it can't incorporate RT or VRS, or that it's last gen vs RDNA 2.
 
Why doesn't XB360 have full OXB BC then? Why doesn't XBO have full BC with OXB and 360? Why isn't the hardware of these platforms as abstracted as it is in Windows?

Hardware and software abstraction only goes so far; it's why later versions of Windows (NT, Win2K+) themselves have compatibility modes for certain software, and more and more compatibility modes over Vista, Windows 7, Windows 8 and Windows 10. Just look at the compatibility options in Windows 10, they are there for a reason. With Xbox, Microsoft transitioned from 80x86/Nvidia to PowerPC/AMD to x64/AMD. APIs can't make 80x86 code run on PowerPC, nor PowerPC code on x64. While you can virtualise the CPU, or take any number of recompilation/emulation approaches, the mere existence of a fat API between the game and GPU lets you intercept calls and make them work on different hardware. We know there were GNM calls where this wasn't even a thing. As the dev proudly states: "I don't really get why they [Microsoft] choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all?". Yeah.. why care about legacy stuff at all. It's easy to say this if you're only looking backwards, but if you're also looking forwards... ?

This is why it was feasible for Microsoft to get some OG Xbox games running on 360 and a whole bunch of 360 games working on Xbox One. But none of this was "free", it all required effort. APIs just made it viable.
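As a toy sketch of that call-interception idea (function names and "formats" are invented for illustration): a compatibility shim sits between the legacy call and the new backend, rewriting whatever the new hardware no longer accepts.

```python
# Toy compat shim: a fat API lets you intercept a legacy call and
# rewrite it for new hardware. Names/formats are invented for the sketch.

def new_backend_draw(fmt: str, count: int) -> str:
    # The "new hardware" only understands the new vertex format.
    if fmt != "NEW_FMT":
        raise ValueError("new backend can't consume " + fmt)
    return f"drew {count} in {fmt}"

def legacy_draw(fmt: str, count: int) -> str:
    # Interception point: translate the retired format, then forward.
    # Without this layer (i.e. with direct register pokes), there is
    # nowhere to intercede when the hardware changes.
    if fmt == "OLD_FMT":
        fmt = "NEW_FMT"
    return new_backend_draw(fmt, count)

print(legacy_draw("OLD_FMT", 128))  # old call still works on new hardware
```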

If you look at Windows, MS is basically forced into creating legacy compatibility by their users. They've tried to introduce new ways of doing things (UWP) but the market just hasn't adopted it, so now they're embracing official support for legacy concepts.

Actually, if you look at Windows, you'll see Microsoft have changed a lot over the years, and a lot of old unmaintained code has to run in a compatibility mode. Most software still in use has evolved with Windows, so it's never a problem. You may assume things have not changed because things are not breaking, but this is not the case.

For their consoles, they didn't have to maintain compatibility so they didn't implement it. They didn't design OXB's software stack such that it'd be portable to the next console.
Abstraction =/= compatibility. Abstraction = possibility.

Who's Nate and how does he have intel on PS5 TFs? He mentions the information regarding the recent Oberon test belonging to PS5 devkit is inaccurate, as well as claiming TFs to be a little higher than 10.5.

If you don't know who he is, you have zero reasons to believe him. :nope:

The lack of RT can be explained on many levels though.

I'm listening..

True, same for you :) What if the documents said 13TF?

It still raises more questions than it answers. This does not change.

I would like to see og xbox BC, there are no emulators for the og xbox as GC and ps2 have. It would be a way to preserve the xbox og games.

There have been a few Xbox emulators, but I think all were discontinued. If you look at the popular emulator sites, the number of downloads for them was tiny, so I guess lack of interest killed them.
 
While sub count could be a useful indicator of a degree of credibility, since people wouldn't sub to you if you were talking BS or untrustworthy all the time, I'm not saying it's accurate due to that fact alone; it's worth taking note of if anything. After all, we're in the baseless thread, so all the info adds to speculation.
 
Could be, but I would be surprised if the driver was working for Sparkman / Arden and not Ariel / Oberon when everything indicates that Sony was further along. Everything from the GitHub leaks indicates to me that the files were bulk copied from somewhere else in July this year; this makes me feel like a lot of the files / folders may be 'stale' and not up to date.
From an API perspective MS had already finished DXR a long time ago. So that's 1 of 2 already completed. As for drivers, as Albert writes, MS is further along with Scarlett at this point in time than they were with Scorpio. So it has been running very smoothly for them. MS has been relatively quiet until the TGAs with a semi announcement at E3. Microsoft has a very large product portfolio, and the implications of these Scarlett chips could have additional uses outside of the console space which you may see in the coming years. There are lots of teams working on this project doing different things. But the most important aspect to consider is how much they had already completed with respect to features just within the Scorpio timeframe. This let them focus entirely on improving their existing items and on developing Scarlett with forward-looking features (referring to features specific to Microsoft's other goals outside the console space).

I wouldn't say Sony was further along unless you mean that they were ready to launch in 2019.

I think if you look at the rumours surrounding PS5, the rapid leadership changes, communication issues and lack of a conference in E3 - you'd probably have a much easier time painting the story that they were ready for 2019 and switched to 2020. That 1 year switch puts them behind because they will need a new design for 2020 and to incorporate different features that perhaps they were willing to launch without (like BC).
 
From an API perspective MS had already finished DXR a long time ago. So that's 1 of 2 already completed. As for drivers, as Albert writes, MS is further along with Scarlett at this point in time than they were with Scorpio. So it has been running very smoothly for them. MS has been relatively quiet until the TGAs with a semi announcement at E3. Microsoft has a very large product portfolio and the implications of these Scarlett chips could have additional uses outside of the console space which you may see in the coming years.

I wouldn't say Sony was further along unless you mean that they were ready to launch in 2019.

I think if you look at the rumours surrounding PS5, the rapid leadership changes, communication issues and lack of a conference in E3 - you'd probably have a much easier time painting the story that they were ready for 2019 and switched to 2020. That 1 year switch puts them behind because they will need a new design for 2020 and to incorporate different features that perhaps they were willing to launch without (like BC).

These are instruction-level benchmarks and, I'm going to be completely honest, I'm not even sure a driver or API is involved.
 
These are instruction-level benchmarks and, I'm going to be completely honest, I'm not even sure a driver or API is involved.
Good catch.
Though, if you don't have the instructions completed, you can't make the drivers, and you couldn't make the API commands for them.
So possibly one and the same; just the cause is further down the chain.

I don't believe that this means the hardware isn't present.
 
Good catch.
Though, if you don't have the instructions completed, you can't make the drivers, and you couldn't make the API commands for them.
So possibly one and the same; just the cause is further down the chain.

I don't believe that this means the hardware isn't present.

But if both are using the same raytracing implementation, then the instructions should be the same, meaning both should be able to be benched. I'm also still looking for answers for my first and third oddities: (2x L0 bandwidth / CU for Oberon vs Arden/Sparkman) and (double wave size sometimes for Oberon vs Arden/Sparkman (64 vs 32)).
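For reference on the wave-size oddity, the occupancy arithmetic is just ceiling division, nothing architecture-specific (numbers below are an arbitrary example, not from the leaks):

```python
# How many waves a dispatch of N threads occupies at wave64 vs wave32.
import math

def waves_needed(threads: int, wave_size: int) -> int:
    """Ceiling division: partial waves still consume a full wave slot."""
    return math.ceil(threads / wave_size)

# e.g. an arbitrary 1000-thread dispatch:
print(waves_needed(1000, 64))  # 16 waves at wave64
print(waves_needed(1000, 32))  # 32 waves at wave32
```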
 
But if both are using the same raytracing implementation, then the instructions should be the same, meaning both should be able to be benched. I'm also still looking for answers for my first and third oddities: (2x L0 bandwidth / CU for Oberon vs Arden/Sparkman) and (double wave size sometimes for Oberon vs Arden/Sparkman (64 vs 32)).
I don't know unfortunately. Hopefully new tests show up for December when silicon is finalized.

Here is a possibility of how 2 companies using the same IP block may not be on equal footing.
When we found Vega hardware in PS4 Pro, that was them pulling IP blocks from a newer architecture into an older one. Sony would have been responsible for writing custom instructions for that, because of trying to get the IP blocks to work in an older architecture. They wouldn't be able to just pull the instructions over from Vega and assume they work for their custom setup.

tldr; There might be differences in which architectures both MS and Sony chose even if they are allowed to pick and choose IP blocks.
 
As for drivers, as Albert writes, MS is further along with Scarlett at this point in time than they were with Scorpio. So it has been running very smoothly for them.
That was an interesting post and anybody wanting to read Albert Penello's post on XSX's development progress can read it here on resetera.
 
I don't know unfortunately. Hopefully new tests show up for December when silicon is finalized.

Here is a possibility of how 2 companies using the same IP block may not be on equal footing.
When we found Vega hardware in PS4 Pro, that was them pulling IP blocks from a newer architecture into an older one. Sony would have been responsible for writing custom instructions for that, because of trying to get the IP blocks to work in an older architecture. They wouldn't be able to just pull the instructions over from Vega and assume they work for their custom setup.

tldr; There might be differences in which architectures both MS and Sony chose even if they are allowed to pick and choose IP blocks.

I can see them picking and choosing next-gen IP blocks that aren't in current GPUs, but I honestly don't see them writing their own instructions; it seems like it would make more sense for Sony/MS to pay AMD to do exactly that. I guess for most of these answers we will just have to wait and see when they are finally revealed :p.
 
Actually, if you look at Windows, you'll see Microsoft have changed a lot over the years, and a lot of old unmaintained code has to run in a compatibility mode. Most software still in use has evolved with Windows, so it's never a problem. You may assume things have not changed because things are not breaking, but this is not the case.

That's a very important note that many people miss if they don't regularly run old and sometimes REALLY old software.

Not only do you have various legacy settings/modes for software written for specific versions of Windows that aren't 100% compatible with future versions of Windows, you also have the various Windows-on-Windows compatibility layers for running 16-bit software on 32-bit systems and 32-bit software on 64-bit systems. And almost all of this happens without the user even knowing it is happening.

And then there's the whole aspect of DirectX compatibility. There are some rough breaks along the way that make certain older DirectX titles not completely compatible with future versions of DirectX. It's relatively rare, but it still happens (I can't run my old Battle for Middle Earth games on modern Windows without a fair bit of wonking around, for example). But the great thing is that there's usually a way to do it on Windows. Much of the time it just comes down to a key file not being in a location that existed in previous versions of Windows but doesn't exist in newer ones, so you just have to recreate the file in the location the game expects. Sometimes it's as simple as the game looking for specific versions of Windows that existed when the game was released, etc.

It really is a marvel that MS is able to keep all of this working on as many hardware configurations as it does over as many hardware and OS generations as there are. And not just working, but performing well in general.

Heck, currently Microsoft is tackling that rather large challenge of having x86 and x64 Windows code running on ARM CPUs and have it be transparent to the user. There's still a long way to go, but they are making progress in compatibility and at some point I'd imagine they'll start focusing on speed.

From an API perspective MS had already finished DXR a long time ago. So that's 1 of 2 already completed. As for drivers, as Albert writes, MS is further along with Scarlett at this point in time than they were with Scorpio. So it has been running very smoothly for them. MS has been relatively quiet until the TGAs with a semi announcement at E3. Microsoft has a very large product portfolio and the implications of these Scarlett chips could have additional uses outside of the console space which you may see in the coming years. There are lots of teams working on this project doing different things. But the most important aspects to consider is how much they have already completed with respect to features just within the Scorpio timeframe. This let them focus entirely on improving their existing items and to develop Scarlett with forward looking features.

I wouldn't say Sony was further along unless you mean that they were ready to launch in 2019.

I think if you look at the rumours surrounding PS5, the rapid leadership changes, communication issues and lack of a conference in E3 - you'd probably have a much easier time painting the story that they were ready for 2019 and switched to 2020. That 1 year switch puts them behind because they will need a new design for 2020 and to incorporate different features that perhaps they were willing to launch without (like BC).

Additionally, if Phil Spencer is to be believed he at least has some type of prototype Xbox Series X device (or devkit) at his home that he is playing games on. Whether it's able to play games other than XBO/X360/Xbox games could be questionable, but I'd bet it's able to run anything that's being developed for Xbox Series X currently.

MS may have been slower to get to this point than Sony, but by now there's hardware available to develop on and run code on. This isn't the relative mess leading up to launch that XBO was.

Regards,
SB
 
but I honestly don't see them writing their own instructions; it seems like it would make more sense for Sony/MS to pay AMD to do exactly that
Yeah, you're right. I guess what I'm trying to say is that it's not as straight a port as if they had used the architectural base it was designed for. Perhaps this is a case of borrowing RDNA 2 features and placing them into some in-between RDNA 1-2.

Time will tell. Too little information. Too stale information. Too much speculation.
I have no reason to distrust Cerny or to think the feature will be in some half-baked form. Being slightly behind schedule in June might just be a sign of being behind on some things. Perhaps their priority was to get BC working first.

We saw proof of ray tracing at the TGAs if the DF video is correct. Even if it is a target running on PC, it does showcase what to expect.
 
So, the Shakespearean SoC code names used to link PlayStation platforms "now" seem to be a problem. The narrative misfits were so gung-ho on linking certain aspects of their supposed PS5 evidence to fit a certain narrative, even making up excuses for why a certain collection of works (Arden, another supposed SoC name) ended up in their dumpster-diving evidence.

Point being, the problem isn't presenting your supposed evidence, the problem becomes when you try to pass it off as ironclad factual proof.

As I see it, the teraflop difference between XBSX and PS5 will more than likely mirror the OG PS4/XBO when they first launched, possibly favoring XBSX this time around. The key differentiators this time will be the SSD and memory configurations in extracting the best performance out of these systems. People tend to forget: real-world performance is the key, not simply raw teraflops.
 