Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Just to note, leaks have happened every gen.

360 block diagram leaked.

I don't remember the timeline for PS3 info, TBH.

Last gen, people were passing around Xbox One and PS4 specs early, on this very board in PMs among other places, I'm sure. Then Vgleaks.com collated those and put them on a website some time later. They turned out to be almost 100% correct, despite the same type of doubters you see about leaks today.

Here's another thing about the GitHub leaks: to my knowledge, no developer has really stood up and said "this is incorrect", which likely wouldn't break any NDAs and would be easy enough to do. We only have shadowy Klee types, who really won't even say that much.

Another rational thing to think about: there haven't been any consistent rumors of something "other" than Oberon or Arden, something that has been there throughout and has the credibility of specifics and time. There's only nonspecific "PS5 is better than that, 13TF" type stuff, with no regard for CUs etc. except to reverse-engineer them to get the number you need.

Really, IMO, analyzing leaks just requires being rational. Is there a chance these GitHub leaks are wrong? Yes. What would I put that chance at? Very, very low, maybe 5%. Maybe on a different day I'd put it at 10%.

What kind of sucks is that I let myself get back into these threads with possibly months still to go before any official spec confirmation, which could come at E3 or later, IMO.

Also, if Sony is behind on TF, I expect they may never officially reveal any TF numbers for the PS5 at all. Par for the course for all platform holders: you don't accentuate what you lack. It's the same reason MS doesn't tell you how many Xbox Ones have sold; the number pales next to the PS4's. Microsoft will probably gladly give you TF numbers closer to release, just as Sony will happily tell you how many PS4s have sold.
 
They're in the igpu and igpu_295 sheets in the Oberon regression test files. First column.

Cache Bandwidth
I$ (per SQ)
K$ (per SQ)
GL0 (per V$)
GL1 (WGP/SA = 1)
GL1 (WGP/SA = 2)
GL1 (WGP/SA >= 4)
GL2 (WGP=1, SA=1) (Read)
GL2 (WGP=9, SA=2) (Read)
GL2 (WGP=18, SA=4) (Read)
GL2 (WGP=1, SA=1) (Write)
GL2 (WGP=9, SA=2) (Write)
GL2 (WGP=18, SA=4) (Write)

They could quite easily be left over from an earlier revision if any changes were made. We simply do not know the context of these results with regard to the final console.


I don't think anyone would comment on the leaks even if they knew they weren't true. The reason people are doubting the leaks is that they make zero sense with regard to building a console that is cheap, fast, and cool. The high clock speeds would likely hurt yields even for a small chip, and the extra heat would require a more expensive cooling system. Considering that both MS and Sony get advance knowledge of product timelines, going this route doesn't really make any sense.
 
We actually don't know if the CUs and WGPs changed between Ariel and Oberon, just as the bandwidth did. There was a rumour previously that Ariel was an earlier chip that got chucked out in favour of something faster (Oberon).
288GT/s is the max texture fillrate at 2.0GHz. This gives us the number of CUs, since fillrate = CUs x 4 TMUs x clock, so CUs = 288 / (4 x 2.0) = 36.
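That arithmetic can be sketched as a quick check (assuming the standard GCN/RDNA figure of 4 texture units per CU; `cu_count` is a hypothetical helper name, not anything from the leaked files):

```python
# Derive the CU count from peak texture fillrate.
# Assumes 4 TMUs (texture mapping units) per CU, which is standard
# for GCN and RDNA GPUs.
TMUS_PER_CU = 4

def cu_count(tex_fillrate_gtexels: float, gfx_clock_ghz: float) -> float:
    """Peak fillrate (GTexels/s) = CUs * TMUs_per_CU * clock (GHz)."""
    return tex_fillrate_gtexels / (TMUS_PER_CU * gfx_clock_ghz)

print(cu_count(288, 2.0))  # 36.0 CUs for the leaked Oberon figures
```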

Haven't heard that rumor. I only saw Ariel popping up at the same time Arden was in the PCI ID repository. Sounds a bit far-fetched, especially if they were fixing bugs from Ariel in Oberon and comparing the two.
 
They might not make sense to us in retrospect, but for a company that is hardware-limited by its BC method, and that designed its next-gen console with certain requirements (TDP, form factor), it might not have looked the way it does to you.

I already said this a few pages ago, but the chip Oberon is based on retails for $400 and doesn't contain RT hardware.

The GPUs the PS4 and PS4 Pro were based on retailed (in the year those consoles were released) for ~$180-200. They were also smaller chips than Navi 10 is.
 
Last time around, I conceded the Xbox One's GPU deficit to the PS4 as soon as I saw the 12 CUs at 800MHz in an internal deck. I also remember seeing a 6x multiplier for the GPU increase in the deck, if I remember correctly.

There was also something about a CU in Durango being twice as good as a SIMD group in the 360.

The 360 had 3 SIMD groups. Durango has 12 CUs. Thus the real-world perf can be extrapolated to be ~8x over the 360.
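That extrapolation can be sketched out using only the numbers claimed in the deck (not measured performance):

```python
# Rough perf extrapolation from the leaked deck's claims:
# one Durango (Xbox One) CU is claimed to be worth two Xenos (360)
# SIMD groups.
X360_SIMD_GROUPS = 3
DURANGO_CUS = 12
CU_VS_SIMD_FACTOR = 2  # claimed equivalence, not a measured figure

speedup = DURANGO_CUS * CU_VS_SIMD_FACTOR / X360_SIMD_GROUPS
print(speedup)  # 8.0, i.e. ~8x the 360
```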
 
Last edited:
The first time I conceded there might be an unforeseen issue was when Joker454 came out and said the PS3 was vertex-limited.

Lol how time flies!
 

How does clocking a chip far above the knee for performance/watt make any sense with regard to a TDP target?
 

You're assuming the knee is in the same place for what is a different chip, one that will reside in a device whose power delivery is specifically tailored to run that specific chip at maximal efficiency.
 
Well, for one, we are comparing this to the knee of the 5700 XT, a PC chip released 18 months before the consoles as the first Navi card on a brand-new process.

While it gives us a good reference, we don't know how 2nd-generation Navi will clock on the improved process they will most likely be using, as it's just the 2nd-gen 7nm node (N7P), requiring no chip redesign.

So while the Navi sweet spot is currently 1750-1800MHz, nothing tells us next year's chips won't clock higher, as frequency will have to be a big point of improvement for both MS and Sony to get the most out of expensive chips.
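To illustrate why the knee matters, here is a toy model: dynamic power scales roughly with f * V^2, and voltage has to climb steeply past the knee, so the last few hundred MHz are disproportionately expensive. The voltage/frequency points below are made up for illustration, not real Navi data:

```python
# Illustrative (made-up) voltage/frequency points showing why clocking
# past the perf/watt "knee" is costly: dynamic power ~ C * f * V^2,
# and voltage rises steeply at high clocks.
points = [
    # (clock_mhz, voltage_v) -- illustrative values only
    (1500, 0.85),
    (1800, 0.95),   # around the rumoured Navi sweet spot
    (2000, 1.10),   # past the knee: voltage climbs fast
]

def rel_power(clock_mhz: float, voltage_v: float) -> float:
    # Relative dynamic power; the capacitance constant C cancels out.
    return clock_mhz * voltage_v ** 2

base = rel_power(1800, 0.95)
for clock, volts in points:
    ratio = rel_power(clock, volts) / base
    print(f"{clock} MHz: {ratio:.2f}x power vs 1800 MHz")
```

With these assumed numbers, the jump from 1800MHz to 2000MHz is ~11% more clock for ~49% more power, which is the shape of the trade-off being debated.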
 

What's "SA"?
 

For me the context is pretty clear: a bunch of tests to make sure that each spin of the chip doesn't introduce unwanted side effects or bugs, aka regressions.

The word "regression" doesn't have anything to do with BC either, as you should create regression tests even for brand-new hardware or software. In practice, lots of MVPs in the software world skip regression tests in the rush to market!

In the software world, regression tests often take the form of unit or integration tests.
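As a software analogy (with a hypothetical function and hand-picked known-good values), a regression test simply pins down outputs that are already correct, so a later change, or a new chip spin, can't silently break them:

```python
# Minimal regression-test sketch (hypothetical function and values).
# The point: lock in known-good outputs so each new "spin" of the code
# can be checked for regressions automatically.
def peak_tflops(cus: int, clock_ghz: float,
                flops_per_cu_per_clock: int = 128) -> float:
    # 64 shader ALUs per CU * 2 ops per FMA = 128 FLOPs per clock
    return cus * clock_ghz * flops_per_cu_per_clock / 1000

def test_peak_tflops_regressions():
    # Known-good values recorded from the current implementation.
    assert abs(peak_tflops(36, 2.0) - 9.216) < 1e-9
    assert abs(peak_tflops(40, 1.755) - 8.9856) < 1e-9

test_peak_tflops_regressions()
print("all regression checks passed")
```

If a later edit changes the formula's behavior, these pinned assertions fail immediately, which is exactly the role the spreadsheet tests appear to play for each chip revision.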
 
Story cont.

One day at the office, you come to work and everyone is gathered around a shiny silver PC tower. Confused, you ask what the commotion is. All your colleagues turn to you and collectively chant: "PS5 devkit with a ~13TF Vega GPU."

Excited, you message all your friends, "13>12".

So your friends didn't message you back and say, dude, Vega only goes up to 13TF?

At a certain point before or after you first laid eyes on the shiny Cerny tower, you acquire an updated/different set of targets from MS: 10+TF Navi.

"Ah you think to yourself, 12GCN translates to ~10TF Navi. Good to see that Navi is fine. AdoredTV must be a fraud!"
"Wait a minute." you realize. "This means that PS5 will be around ~10.8TF. That's better than Stadia!"

You message your retired journalist friend. "Scarlett and PS5 >= 10TF. PS5 > Scarlett by 5-10%."

Around the same time, your office receives the updated PS5 devkit. "Wow, that thing looks substantial; all these vents must mean it's very stackable! Wait, only 9.2TF? It performs better than the Vega 64 kits, though. The RDNA jump is bigger than expected! This means the final XSX is likely under 9TF, or more likely there will be an upgraded PS5 devkit down the line, over 10TF."

The same friends didn't tell you, dude, RDNA only has 40 CUs?


Fast forward to December:

Articles come out with the 12TF Navi rumor for the XSX. You read them and chuckle. "These fools think 2x the 1X is 12TF RDNA." Later, after the XSX reveal, Phil corroborates your secret knowledge by again using the same multipliers you've known for two years!

12/25: All hell breaks loose. How could AMD screw the pooch like this? You see that the XSX's CU count is 56 CUs and you begin sweating. Maybe, you think, you just screwed over your ex-journalist friend, who is currently in a bunker somewhere in a remote part of America.

Now those friends tell you, dude, you should go bet your mortgage on Xbox!!
 