Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

5. Based on all this, if we assume Sony and MS won't be running a HEAVY loss-leader business (which they most certainly aren't going for), we are going to see sub-9TF GPUs in these consoles.

While I generally agree, I found this quote from Sony's fiscal year report interesting.

https://www.eventhubs.com/imagegallery/2019/may/21/playstation-5-stuff/1/

“PS4 will be the source of engagement and profitability for the next three years.” So even over a year after PS5 launches.

The engagement part I understand, as it's mainly about install base. The profitability is more open to interpretation. To me it means either a high PS5 price (so expect a slow ramp until cost reductions allow the price to drop) or PS5 being sold at a loss, with profits coming only after cost reductions.

I really hope they will go all out with the PS5. However, my expectation is still a smaller die (~300mm²) with very high clocks, followed by a quick process shrink to 7nm+/6nm to get the thermals and power under control. Think of the quick PS3 revisions, when the Cell and RSX were shrunk from 90nm to 65nm in consecutive years and power dropped from about 200W to 130W.
 
I think this makes it sound more akin to what Cerny said, something like "it's gonna be a good price for what the user is getting". My point is, they can't really go too high on cost because node shrinks are delivering smaller and smaller gains, so a monster die will not get them to what PS4 achieved: good tech, a slight loss at launch, and a ton of profit after ~2 years on the market.

My prediction is that even with an ~8-9TF Navi GPU you still get a top-of-the-line 8-core Zen 2, a 1TB SSD, RT, gobs of GDDR6 RAM, a UHD drive and an expensive node (meaning bigger dies are more expensive than ever). And based on what we know of the PC parts, Navi 10 is bigger with a higher TDP than last gen's parts, which were already downclocked in the consoles, so the likelihood of getting something even bigger than an already near-the-limit Navi 10 is small.
 
I am almost 100% sure this is the person who leaked the PS5 devkit PCB on the 20th of May (he has since deleted his account). He also mentioned May 21st being his "cake day", but the post went up on the 20th of May GMT, so he was more than likely from the Asia region.

In any case, we got similar leaks from the most obscure sources last time around which turned out to be true (the December 2011 pastebin, sweetvar26 on GAF and his friend who works at AMD, etc.).


This would point to:

- 316mm² die
- Probably 40 CUs @ 1.8GHz with 4 disabled (the Gonzalo leak from April?)
- 256-bit bus with 18Gbps Samsung GDDR6 chips: 32GB in the devkit, 16GB in the console, @ 576GB/s
- An additional 4GB of DDR4 for the OS

So: Zen 2, an 8.3TF Navi with HW RT, 20GB of RAM (16GB GDDR6 for games) and 1TB of NAND.

Many would think this would be a $400 console, but at 7nm (a more expensive node mm² for mm² compared to 16nm) + 16GB GDDR6 (vs. only 8GB GDDR5 in the Pro) + an SSD (not found in the Pro) + a UHD drive (not found in the Pro), it would be at the very least $450 sold at a loss, or more likely $500.
I agree, this is still a good leak. At 7nm+ that would allow about 60 CUs (54 active), which would give about 12.4 TF at 1800MHz. Not that I think (anymore) they'll clock it that high, but they could reach ~11 TF with more reasonable clocks like 1650MHz.
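For anyone wondering where these TF figures come from: peak FP32 throughput for a GCN/RDNA-style GPU is just active CUs × 64 shader ALUs × 2 FLOPs (one FMA) per clock. A minimal sketch checking the numbers quoted above (the CU counts and clocks are the rumoured ones, not confirmed specs):

```python
# Peak FP32 throughput: active CUs * 64 shader ALUs * 2 FLOPs (one FMA) per clock.
def tflops(active_cus: int, clock_ghz: float) -> float:
    return active_cus * 64 * 2 * clock_ghz / 1000  # GFLOPs -> TFLOPs

print(tflops(36, 1.80))   # 40 CUs with 4 disabled @ 1.8GHz -> ~8.3 TF (the leak above)
print(tflops(54, 1.80))   # 60 CUs with 6 disabled @ 1.8GHz -> ~12.4 TF
print(tflops(54, 1.65))   # the same 54 CUs at a tamer 1650MHz -> ~11.4 TF
```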
 

FYI, the bandwidth in the devkit is definitely higher in order to offset the ~8% bandwidth loss from running the memory in clamshell mode, and probably also to feed the faster devkit.

I would say the final console uses 16Gbps chips.
 
I agree, yes. I think 512GB/s should be enough: probably 64GB/s for the CPU and 448GB/s for the GPU, which would match Navi XT's bandwidth.
 

A 256-bit bus, really, when the Xbox One X already has a 384-bit bus and the old PS2's video buffer was 2560 bits wide?? The CPU and GPU share the same bus, so you get lots of trouble with stalls and latency; for 120fps and ray-tracing effects you need ultra-fast video memory with low latency and without any disturbing CPU access. A combination of HBM2 and DDR4 RAM makes more sense to me, or the GPU needs a big eDRAM buffer. 316mm² and 40 CUs? So definitely NOT a monster, and too weak for a new console generation with a minimum lifespan of 4-6 years. I call these rumors fake, or Sony is going the weak route again.

So the guy from Platinum Games is right: no innovation, no future-proof tech, just the same repackaged stuff from the previous generation.
 
Atsushi Inaba has personally produced only Nintendo games since 2014 (plus one cancelled XB1 game), and before that he mainly worked on games running on Nintendo hardware. I don't think his opinion on anything next-gen can be considered valuable when he has never even shipped a current-gen game.
 
A 256-bit bus, really, when the Xbox One X already has a 384-bit bus and the old PS2's video buffer was 2560 bits wide?
Bus width doesn't matter one jot. It's bandwidth, latency, and amount only that matter (and latency not that much). 1 TB/s on an 8-bit bus to 16 GB of RAM is a good solution. A 2560-bit, 1 TB/s bus to a titchy 32 MB scratchpad would be rubbish. A 512-bit bus to 8 GB of RAM would be a fail.

Oh, of course cost matters, which is the limiting factor. But PS2 having a 2560 bit bus is utterly meaningless.
 
A 256-bit bus would be more than enough with 16Gbps chips, and it means a smaller memory controller. If they can hit those speeds in the console, that would mean 512GB/s of bandwidth: plenty for a full Navi 10 + Zen 2. If MS went with 14Gbps chips on a 320-bit bus, that would give slightly higher bandwidth but spend more die area on the memory interface (see the arithmetic below).
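For reference, the bandwidth numbers being traded here fall straight out of bus width times per-pin data rate. A quick sketch covering the configurations mentioned in this thread (the 320-bit/14Gbps case is purely hypothetical):

```python
# Peak GDDR6 bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 18))   # rumoured devkit: 256-bit @ 18Gbps -> 576 GB/s
print(bandwidth_gbs(256, 16))   # likely retail:   256-bit @ 16Gbps -> 512 GB/s
print(bandwidth_gbs(320, 14))   # hypothetical:    320-bit @ 14Gbps -> 560 GB/s
```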
 
Excerpt:

Platinum Games studio head Atsushi Inaba has said he’s finding it “hard to get excited” about Microsoft’s and Sony’s plans for next-generation consoles, stating that the upcoming hardware feels like “more of the same”.

Asked for his reaction to the platform holders’ plans, Inaba said: “It’s OK. And by that I mean, I’m sure that things will move faster, graphics will be better and maybe it will be easier with less wait times… that’s good for the consumer.

“But it’s more of the same, quite frankly, compared to previous generations. It’s nothing that’s disruptive or super innovative, if you ask me.”

He added: “Game hardware used to be about custom chips that you couldn’t do on PCs. Now you look at it and they’re just grabbing stuff that already exists.

“The Switch, for example, is a Tegra which already existed and the other consoles are using very similar chips and graphics cards to what you see on PCs, but maybe slightly updated. None of it seems unique to that hardware anymore.”


https://www.videogameschronicle.com/news/platinum-boss-says-next-gen-consoles-more-of-the-same/
----------

Once again, keep expectations in check.
If people are worried there won't be a big enough TF jump to enable true next-generation games... well, rest assured, your worries will be fulfilled ;)

It is not only the hardware of the consoles that is missing innovation; the software side has boring elements today too:

- Direct access to the hardware down to the chip registers, like in the 8/16/32-bit era, is not possible anymore. It doesn't matter if I had the money and manpower to do it; it is not allowed by Microsoft and Sony, so game development on today's consoles is far more restricted than in earlier times.

- Focusing more and more on high-level programming.

- Focusing on PC/Win10/DirectX 12 compatibility.

- Focusing on backward and forward compatibility between console generations.

- Less motivation from game companies to optimize their games (especially multiplatform games) on consoles, and the fact that gaming PCs with Intel and Nvidia hardware get better support from the industry than the AMD-based consoles, except from Sony's first-party/in-house studios.

- The same Unreal, Cry, Unity, Havok and tree/hair engines repeated ten thousand times over, which gives games a boring, uniform look. What next-gen consoles need are really new, fresh, individual engines.

- I see the same texture content in many different games, with the same bland look: washed-out, over-compressed textures and a complete absence of saturated, brilliant colors. Why?

- Untalented graphic designers with no sense or feeling for good artwork, colors, light and shadow.

- Focusing on useless DLC, microtransactions, loot boxes, social networks and services, and on the other hand less improvement and innovation in gameplay and graphics.

- Tearing everywhere. I didn't know a single game on my old PS1 or Dreamcast that had screen-tearing problems; it must be a phenomenon of the HD era, or the devs are incompetent, or today's GPUs are crap.

- Permanent update and patch terror and long loading times on today's consoles, leaving less time to actually play. The genius plug-and-play nature of consoles is no longer there.
 
Tearing everywhere. I didn't know a single game on my old PS1 or Dreamcast that had screen-tearing problems; it must be a phenomenon of the HD era, or the devs are incompetent, or today's GPUs are crap.
We only had CRT displays back then, so there was no tearing. That is one of the issues with signal processing that comes with digital vs. analog.

Most of these are issues with the industry: headaches with scaling, and the business of games has changed a lot as well. I'm not necessarily sure they're related to a hardware discussion.
 
It is not only the hardware of the consoles that is missing innovation; the software side has boring elements today too:
You are completely and utterly wrong. Half of what you want has been rightfully abandoned because hardware and software are so much more complex now that you can't do anything by messing about with registers other than break compatibility.

The other half is just nonsense...
The same Unreal, Cry, Unity, Havok and tree/hair engines repeated ten thousand times over, which gives games a boring, uniform look. What next-gen consoles need are really new, fresh, individual engines.
Engines mean very little when it comes to art style. A whole new engine won't give different-looking hair; that's an art choice.

All these people bitching about poor modern console gaming versus yesteryear, watered down and inferior, need to take an actual look at their hobby. More games than ever. More diversity than ever. More visual diversity than ever (2D, 2.5D, 3D, photorealism, stylised) thanks to the power and flexibility of shaders. Everything from indie darlings and abstract weirdness to AAA. The fact these games are also on PC isn't a bad thing, but a good thing, unless you feel devs should have to work harder and get paid less for their efforts. And every single complaint you can levy about screen tearing was either an issue on older machines, or their framerates just plummeted because they were vsync'd. Pointing fingers at hardware for 'tearing' and 'poor colours' is ludicrous. You need to get to grips with how these things work rather than making spurious claims that the entire industry is crap.
 
We only had CRT displays back then, so there was no tearing.
Tearing was possible whenever the hardware allowed the image to be updated mid-scan, and it was extremely common on PC CRTs. If your frame overshot the allotted render time, you either got tearing from a mid-refresh framebuffer swap or a completely dropped frame. Likewise, if rendering outpaced the refresh rate you could get multiple frames, and multiple tears, per refresh. Neverwinter Nights on PC sticks in my memory as a game I could never get smooth: enable VSync and you had smooth, stable frames in simple scenes but a chugging framerate when it got busy; disable VSync and the framerate was better when it was busy (with a little tearing) but simpler scenes would tear multiple times. Older machines also worked directly on the front buffer and didn't even have a back buffer to swap.
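A toy model of that trade-off, purely illustrative and not tied to any real graphics API: with vsync, a frame that misses the refresh deadline waits for the next vblank; without vsync, the swap happens immediately and lands mid-scanout.

```python
# Toy model (my own illustration) of the vsync trade-off described above.
import math

REFRESH_MS = 1000 / 60                 # 60 Hz display refresh
render_ms = [10, 14, 22, 30, 14, 10]   # hypothetical per-frame render costs

def present_cost(cost_ms: float, vsync: bool) -> float:
    if vsync:
        # Wait for the next vblank: the frame occupies a whole number of refreshes.
        return math.ceil(cost_ms / REFRESH_MS) * REFRESH_MS
    return cost_ms                     # swap immediately (mid-scanout -> tear line)

for vsync in (True, False):
    total_ms = sum(present_cost(c, vsync) for c in render_ms)
    fps = 1000 * len(render_ms) / total_ms
    # Without vsync, essentially every swap that isn't aligned to vblank tears.
    tears = 0 if vsync else len(render_ms)
    print(f"vsync={vsync}: average {fps:.1f} fps, tears shown: {tears}")
```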
 
Hmm. LOL

Just realized the difference in our ages is showing here. I don't think I was old enough back then to catch tearing on PC CRTs.
 
The same Unreal, Cry, Unity, Havok and tree/hair engines repeated ten thousand times over, which gives games a boring, uniform look. What next-gen consoles need are really new, fresh, individual engines.
I feel there are more in-house engines being used now than last gen, or even the gen before that. I don't know, maybe I'm wrong, but to me last gen felt like everyone was using Unreal 3, and the gen before that, RenderWare.

Shifty is right, though: the engines don't really dictate the look so much; the art style is the most important factor. I will agree that there hasn't been much innovation in games beyond how pretty they look, but I feel the weak CPUs played a big part in that this generation.
 
I agree, this is still a good leak. At 7nm+ that would allow about 60 CUs (54 active), which would give about 12.4 TF at 1800MHz. Not that I think (anymore) they'll clock it that high, but they could reach ~11 TF with more reasonable clocks like 1650MHz.

The PS5 could have 5 modes of operation (rough TF figures for each are worked out in the sketch after the list):

- PS4 BC mode: 18 CUs @ 800MHz
- PS4 boost BC mode: 18 CUs @ 911MHz
- PS4 Pro BC mode: 36 CUs @ 911MHz
- PS4 Pro turbo BC mode: 36 CUs @ 1822MHz
- PS5 mode: 54 CUs @ 1650MHz
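Running those speculated modes through the same CUs × 64 × 2 × clock arithmetic gives the following rough peaks (the mode list itself is pure speculation from the post above):

```python
# Theoretical FP32 peak for each speculated mode (CUs, MHz).
modes = {
    "PS4 BC":           (18, 800),
    "PS4 boost BC":     (18, 911),
    "PS4 Pro BC":       (36, 911),
    "PS4 Pro turbo BC": (36, 1822),
    "PS5":              (54, 1650),
}
for name, (cus, mhz) in modes.items():
    print(f"{name:>16}: {cus * 64 * 2 * mhz / 1e6:.2f} TF")
# Prints ~1.84, ~2.10, ~4.20, ~8.39 and ~11.40 TF respectively.
```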
 
https://pastebin.com/E5cg6ntZ
Scarlett

CPU: Custom Zen 2 - 8C/16T @ 3.4GHz
GPU: Custom RDNA - 54 Compute Units @ 1577MHz - 10.9TF (Targeting 2X Scorpio)
RAM: 16GB GDDR6 @ 560GB/s (Samsung 6x2GB / 4x1GB) + 8GB DDR4 (Samsung 2x4GB)
AUDIO: Tensilica HiFi 4 DSP
HD: 1TB SSD NVMe PCIe 4.0 + 32GB NAND eMMC 5.1
I/O: 2x HDMI 2.1, 2x USB 3.1, 2x USB-C, S/PDIF, IR-Out, RJ-45, Bluetooth 5.0, Wi-Fi Direct, Wireless 802.11ax

Dante XDK

CPU: Custom Zen 2 - 8C/16T @ 3.4GHz
GPU: Custom RDNA 58 CU @ 1577MHz
RAM: 32GB GDDR6 @ 560GB/s + 8GB DDR4
HD: 1TB SSD NVMe PCIe 4.0 + 4TB SSD NVMe PCIe 4.0 + 32GB NAND eMMC 5.1 + Thunderbolt interface for high-speed transfers

Dante Roadmap:

- Apr 2019 launch preview
- Production in Oct 2019
- Refresh in Jun 2020
- Final version around Aug 2020
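A quick sanity check on those pastebin figures (my own arithmetic, not part of the leak): 54 CUs at 1577MHz lands on the stated 10.9TF, and a 6x2GB + 4x1GB GDDR6 layout implies ten 32-bit chips, i.e. a 320-bit bus, so 560GB/s would mean 14Gbps parts.

```python
# Sanity check on the pastebin specs (arithmetic only, nothing here is from the leak itself).
print(54 * 64 * 2 * 1.577 / 1000)   # 54 CUs @ 1.577 GHz -> ~10.9 TF, matches the listed figure
# 6x2GB + 4x1GB = 10 GDDR6 chips at 32 bits each -> 320-bit bus;
# 560 GB/s over 320 bits implies 14 Gbps parts.
print(560 / (320 / 8))              # -> 14.0 Gbps per pin
```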
 
Wow, it seems the PS5 is really similar to Scarlett... Maybe one with GDDR6 and the other with HBM2 (just to dampen the overly strong similarities)...
 
I presume the Dante XDK in the post above yours is the devkit for Scarlett, not the PS5, hence the double RAM and the nearly identical specs.
 