Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
Is there any reason why the github leak's devkit couldn't just be using a Navi 10 chip with 2 WGPs disabled, soldered on the motherboard?
Why would that have its own codename? If evaluating an existing slice of silicon, I'd have thought it'd have a code that describes it. Then again, I don't know how these things work. ;) Some seem to think the GitHub leaks show something particular about a particular processor, but all their reasoning is just so many numbers and I can't follow any of that discussion. I'm still not 100% sure what the flippin' code names even are!

For the same reason there weren't any Navi 2x chips mentioned in the sheets?
If both consoles are being released at around the same time, why is it only Oberon/Ariel getting all those tests done, and Arden only gets a feature checklist? And where is Lockhart?
I can understand the absence of platforms for a different customer, or later chips from the roadmap. I'd have thought if Sony were evaluating two different possibilities, we'd have comparable testing for the purpose of said evaluation. And yeah, the leak doesn't tell the whole story, but we should be able to logically determine a degree of scope for it. The fact it doesn't mention a 6 GHz 56 CU GPU doesn't prove such a chip doesn't exist, but in the face of other arguments against such a thing, it's the kind of proof that'd be necessary to give some credence to such speculation.

At the moment, a proposition of Sony running two configs is a proposition with zero evidence, making it a pretty tough sell.
 
Is there any reason why the github leak's devkit couldn't just be using a Navi 10 chip with 2 WGPs disabled, soldered on the motherboard?
After all, there is no RT being mentioned anywhere and the bus is still 256-bit.
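For what it's worth, the 256-bit figure on its own only gives bus width; the implied peak bandwidth depends on an assumed per-pin data rate. A quick sketch (the 14 Gbps GDDR6 rate is my assumption, not something in the leak):

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8.
# 14 Gbps is a typical GDDR6 speed grade -- an assumption, not leak data.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(256, 14.0))  # 448.0
```

So a 256-bit bus lands at 448 GB/s at 14 Gbps, or 512 GB/s if faster 16 Gbps chips were used.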



For the same reason there weren't any Navi 2x chips mentioned in the sheets?
If both consoles are being released at around the same time, why is it only Oberon/Ariel getting all those tests done, and Arden only gets a feature checklist? And where is Lockhart?

Like many other religious books, the GitHub gospel doesn't hold the answers to everything. At least not for all the GPUs and APUs being released in 2020.
This is a burden of proof scenario. We can't claim that there isn't a second chip, or that all these other possibilities don't exist. We have proof of what is shown in the GitHub leak today. So if the claim is made that PS5 is X, Y, Z because of the leak, then there is at least some evidence to support that claim.

If you make the claim that there is a second GPU with more CUs, more TF of power, and a non-256-bit bus, then the burden of proof is on you to bring that evidence forward for us to see.

Unfortunately, no such data points exist. Even the insiders who do make claims fail to provide any static data points; they are only interested in sticking with relative data points, i.e. PS5 is more powerful by X or Y, while refusing to acknowledge that we already have a fairly good idea that XSX is Z, which should by default make PS5 12.84 to 13.8 TF, for instance.

The higher those numbers go on one console, the less realistic it gets, which is why the relative data points are a fairly weak leak to rely on.
 
2) Why wouldn't the second option appear in the leaks? Was it not started until later and running six months behind the 9.2 TF option?
Maybe the GitHub leak is a Microsoft leak. :runaway:

Wait wait wait.... hear me out....

That chip is the joint venture Sony proposed to Microsoft: a chip that can run any PS4/Pro or XB1/X, for scaling and sharing the BC part of the edge infrastructure. It's much more efficient to share the same nodes.

They both made another chip for their respective real console and the next-gen part of their cloud infrastructure. The 56 CU XBSX is in there because it's from the MS side, and AMD clearly separates the teams working on competitors' contracts to avoid leaks.

I should have made a pastebin...
 
Unfortunately, no such data points exist. Even the insiders who do make claims fail to provide any static data points; they are only interested in sticking with relative data points, i.e. PS5 is more powerful by X or Y, while refusing to acknowledge that we already have a fairly good idea that XSX is Z, which should by default make PS5 12.84 to 13.8 TF, for instance.
What Kleegamefan said was:

1. More flops (PS5)
2. PS5 game performance leads (not double-digit percentage).

It doesn't mean PS5 is 12.84~13.8 TF.

For example, if PS5 is 12.2 TF (while XB is 12.0 TF) but with better RT, then PS5 may have better performance even though the TF lead is only 0.2 TF.
 
What Kleegamefan said was:

1. More flops (PS5)
2. PS5 game performance leads (not double-digit percentage).

It doesn't mean PS5 is 12.84~13.8 TF.

For example, if PS5 is 12.2 TF (while XB is 12.0 TF) but with better RT, then PS5 may have better performance even though the TF lead is only 0.2 TF.
I'm going to laugh when we get close to launch and start hearing about balance and TF to bandwidth again.
 
Yeah, that's one way. But then, if there is a ~3 TF difference between the two, many may say $50 is too small a price difference for that power difference.

With a $399 console, all else bar TF being the same, a $100 difference looks like a very good deal (which it would be, because Sony would be giving more for less).

Depends on what's important to consumers. I mean, XBO didn't do too badly considering it was underpowered by an even bigger margin and cost more before dropping to around the same price.

And PS5 has a lot more going for it next gen, with BC and who knows what else, aside from a smaller TF deficit (if it is 9 v 12).
 
1) How much really would it cost to evaluate a discrete SOC only to ditch it? What's the sunk cost there?
To add to #1, what would be the cost of an additional $4-5 spent per console on cooling vs. revising a chip?
I'd also assume that adding CUs to an existing chip wouldn't cost as much as developing a new chip.

Depends on what's important to consumers. I mean, XBO didn't do too badly considering it was underpowered by an even bigger margin and cost more before dropping to around the same price.
XBONE was sold out for the first half of the year despite being weaker and more expensive. Neither console, no matter what spec/price, will have trouble moving the first ~10 million consoles.
 
Why would that have its own codename? If evaluating an existing slice of silicon, I'd have thought it'd have a code that describes it. Then again, I don't know how these things work. ;)
Different GPUs using the same silicon but with different firmware and PCB configurations get different codenames: the RX 5600 XT is Navi 10 XLE, the RX 5700 is Navi 10 XL and the RX 5700 XT is Navi 10 XT.
Also, while Navi 10 is a discrete GPU that works with whatever x86 CPU it's paired with, Ariel and Oberon should be codenames for platforms with a specific CPU in them. That alone should be reason enough to give them different codenames.


I can understand the absence of platforms for a different customer, or later chips from the roadmap. I'd have thought if Sony were evaluating two different possibilities, we'd have comparable testing for the purpose of said evaluation.
Not if Oberon/Ariel were candidates for a console being released earlier (i.e. the rumored 2019 PS5). That would mean different availability timings between the two possibilities.
Besides, should we assume that the team doing regression tests on one platform needs to be the same team who does the tests on another platform?

There's not that much information in the Gospel, to be honest. There's this one Excel sheet with the 2 GHz / 18 WGP mention that gets repeated tens of times, under different names, across all the folders.

And what if Ariel/Oberon is simply the devkit they could make available for early development, one that uses the first (and at the time only available) RDNA1 GPU, Navi 10, for first-party devs to get used to low-level optimizations on the new architecture?

At the moment, a proposition of Sony running two configs is a proposition with zero evidence, making it a pretty tough sell.
Yes.
Though if it wasn't for the Gospel, a 2GHz Navi iGPU in a console would be another tough sell, yet here we are.
 
What Kleegamefan said was:

1. More flops (PS5)
2. PS5 game performance leads (not double-digit percentage).

It doesn't mean PS5 is 12.84~13.8 TF.

For example, if PS5 is 12.2 TF (while XB is 12.0 TF) but with better RT, then PS5 may have better performance even though the TF lead is only 0.2 TF.
Many insiders have suggested the difference between the consoles is approximately the difference between the XBO and the X1S, which is roughly 7.4%.
If we take a base value of 12 TF, that would be about 12.84 TF at a 7% difference, up to 13.8 TF at a 15% difference.
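The arithmetic above is just scaling a base figure, e.g. (as a sketch, with the 12 TF base being the rumored XSX number, not a confirmed spec):

```python
# Speculative scaling: PS5 TF if it leads a ~12 TF XSX by 7% or 15%.
def scaled_tf(base_tf: float, pct_lead: float) -> float:
    return base_tf * (1 + pct_lead / 100)

print(round(scaled_tf(12.0, 7), 2))   # 12.84
print(round(scaled_tf(12.0, 15), 2))  # 13.8
```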
 
Don't know, but I personally thought we would be lucky to get 10 GCN TFLOPS, so 9.2 Navi TFLOPS is good enough for me, but I'm still sceptical about the 2 GHz clock.
My original guess was PS5 would have 16 GB of memory, an SSD, and somewhere between 9.5 and 11 TF of power. The GitHub info is in line with this, but I don't put much stock in that leak because a 2 GHz clock makes zero sense to me.
 
Many insiders have suggested the difference between the consoles is approximately the difference between the XBO and the X1S, which is roughly 7.4%.
If we take a base value of 12 TF, that would be about 12.84 TF at a 7% difference, up to 13.8 TF at a 15% difference.
These insiders, for example Klee, stated PS5 has better game performance by about 7~10%, not 7~10% more FLOPS.

I mean, 10% better performance doesn't mean 10% more FLOPS.


If we look at the RT solution, AMD announced their solution for XSX but never mentioned it for PS5. I guess Sony has some better implementation of RT, so Sony doesn't use a pure AMD solution (maybe a mix of AMD and other IP, or other solutions we don't know about).
 
$500 BOM is more likely than $450 if you are going by the leaked specs. People seem way too optimistic.

$450 won't cover all the cost increases vs. the PS4. The increases from the SoC, RAM and SSD are more than $70 vs. the PS4, without even considering the cooling cost and the recent uptick in RAM and NAND pricing. $450 would maybe be possible with only a 500 GB SSD, and with such a small drive, speed would likely be lower too.
 
This is a burden of proof scenario. We can't claim that there isn't a second chip, or that all these other possibilities don't exist. We have proof of what is shown in the GitHub leak today. So if the claim is made that PS5 is X, Y, Z because of the leak, then there is at least some evidence to support that claim.

If you make the claim that there is a second GPU with more CUs, more TF of power, and a non-256-bit bus, then the burden of proof is on you to bring that evidence forward for us to see.

Unfortunately, no such data points exist. Even the insiders who do make claims fail to provide any static data points; they are only interested in sticking with relative data points, i.e. PS5 is more powerful by X or Y, while refusing to acknowledge that we already have a fairly good idea that XSX is Z, which should by default make PS5 12.84 to 13.8 TF, for instance.

The higher those numbers go on one console, the less realistic it gets, which is why the relative data points are a fairly weak leak to rely on.

My only claim is there is no proof that Ariel/Oberon is carrying a monolithic APU, and regardless of what the PS5 ends up being, those devkits could just have a Navi 10 chip connected to some CPU. Consider:
- There are no RT tests, nor any mention of RT at all;
- Performance results are completely in line with a Navi 10 with 1/9/18 WGPs enabled at 800/911/2000MHz;
- There are no CPU bandwidth tests claiming the CPU cores have access to 256-bit GDDR6;
- There are no CPU tests at all (why wouldn't there be?);



So if it's a bunch of tests showing a GPU that perfectly matches a Navi 10 with up to 18 WGPs enabled and up to 2 GHz core clocks, using variable GDDR6 clocks, and absolutely nothing else (no RT, which is confirmed; no CPU; no storage), why should we skip Occam's Razor and assume Oberon is anything other than a devkit with just a Navi 10 in it?
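For reference, the WGP/clock combinations in those tests line up with standard RDNA FP32 math (nothing leak-specific assumed here beyond the figures already quoted):

```python
# FP32 TFLOPS for an RDNA GPU:
# WGPs * 2 CUs/WGP * 64 ALUs/CU * 2 ops/clock * clock (GHz) / 1000.
def rdna_tflops(wgps: int, clock_ghz: float) -> float:
    shaders = wgps * 2 * 64
    return shaders * 2 * clock_ghz / 1000

print(round(rdna_tflops(18, 2.0), 3))    # 9.216 -- the "9.2 TF" config
print(round(rdna_tflops(18, 0.911), 2))  # 4.2  -- the 911 MHz test clock
```

Which is exactly why 18 WGPs at 2 GHz and the 9.2 TF figure are the same rumor, not two separate ones.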
 
So if it's a bunch of tests showing a GPU that perfectly matches a Navi 10 with up to 18 WGPs enabled and up to 2 GHz core clocks, using variable GDDR6 clocks, and absolutely nothing else (no RT, which is confirmed; no CPU; no storage), why should we skip Occam's Razor and assume Oberon is anything other than a devkit with just a Navi 10 in it?

Absolutely. The whole story is not presented, and the evidence presented is flawed. You are right to err on the side of caution about whether this leak represents release silicon. I know I am cautious about that claim.
 
These insiders, for example Klee, stated PS5 has better game performance by about 7~10%, not 7~10% more FLOPS.

I mean, 10% better performance doesn't mean 10% more FLOPS.


If we look at the RT solution, AMD announced their solution for XSX but never mentioned it for PS5. I guess Sony has some better implementation of RT, so Sony doesn't use a pure AMD solution (maybe a mix of AMD and other IP, or other solutions we don't know about).
Well, his claim is even worse then. Now we have to factor in the variability of how software could perform better on one versus the other.

Now we need to bring drivers into the fold: API maturity, kit maturity, whether all the features are implemented well, etc.
 
If you make the claim that there is a second GPU with more CUs, more TF of power, and a non-256-bit bus, then the burden of proof is on you to bring that evidence forward for us to see.

It is the Baseless section after all; we can make up our own devkits/GPUs :)

$500 BOM is more likely than $450 if you are going by the leaked specs. People seem way too optimistic.

$450 won't cover all the cost increases vs. the PS4. The increases from the SoC, RAM and SSD are more than $70 vs. the PS4, without even considering the cooling cost and the recent uptick in RAM and NAND pricing. $450 would maybe be possible with only a 500 GB SSD, and with such a small drive, speed would likely be lower too.

Sony probably aimed at a BOM similar to the PS4's in terms of the APU etc., but RAM/NAND and the SSD lift that cost, and higher clocks require a more expensive cooling solution (which still isn't anything expensive, a few dollars if we believe the rumours).

So if it's a bunch of tests showing a GPU that perfectly matches a Navi 10 with up to 18 WGPs enabled and up to 2 GHz core clocks, using variable GDDR6 clocks, and absolutely nothing else (no RT, which is confirmed; no CPU; no storage), why should we skip Occam's Razor and assume Oberon is anything other than a devkit with just a Navi 10 in it?

Maybe, but this close to production, are things really going to change too much? A year prior to production, I'd hope Sony had target-spec devkits out there.
 