Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

https://gematsu.com/2019/05/yakuza-...ory-is-finished-and-in-recording-ps5-comments

PlayStation 5

  • Even after PlayStation 5 is released, I think PlayStation 4 is going to be the main focus for a while longer.
  • The processing power of PlayStation 5 is incredible, so when we try to think of new gameplay that will utilize its full potential, I’m not really sure which aspects of existing mechanics we should translate.
  • First there was a time when graphics improved, then there were network features, and now I guess you can say it's a return to the "programmable" era? I think artificial intelligence and machine learning will continue to evolve.
  • If you pour its power into graphics it'll be the best that we've seen yet, and I think everyone is thinking about in what way we should use that power. I am as well. There's artificial intelligence and many other things I want to think about.


A lot of text that doesn't seem to say much, except maybe for continued PS4 support.
 
What leads you to this line of thinking?
Lack of patents?

Navi's development was started prior to Sony's input, as confirmed by AMD.
So there could easily be lots of decisions that were made that would make PS4 BC harder to achieve.
Which would mean customizing Navi to achieve it, if it was too much work on the software side to guarantee 100% seamless BC.

No. What I'm saying is that any type of customization on the GPU side for PS4 is more than likely straightforward or light, nothing comparable to what needs to happen on the CPU and memory configuration side towards BC.
 
Navi was in development prior to Sony's collaboration.

Remains to be seen exactly how much of Navi's earlier plans will be in this year's graphics cards, considering the architecture appeared with an early 2018 release window at some point.
 
Old news ;)

2014:
The Xbox One could one day make use of a powerful rendering technique called "real-time ray-tracing," an advanced system used to achieve photorealistic lighting effects. Microsoft Studios executive Phil Spencer has confirmed that internal tests are already underway.

https://www.gamespot.com/articles/x...ring-system-for-amazing-visuals/1100-6418060/

The rendering technique would, however, be yet another computational effort for the Xbox One, but it's possible Microsoft could offload that process to its vast network of cloud-powered servers. Respawn Entertainment is tapping into Microsoft's cloud network to help with performance for its upcoming multiplayer shooter Titanfall.

You don't say...

CAFÉ is that you... lol
 
No. What I'm saying is that any type of customization on the GPU side for PS4 is more than likely straightforward or light, nothing comparable to what needs to happen on the CPU and memory configuration side towards BC.
Ok, I have a different view than you do.
We were hoping Navi would be a reasonable improvement over the previous uarch, and now they're rebranding it RDNA. I doubt things will just work with a couple of light tweaks.
Yet the CPU needed an order of magnitude more work.
If games were being recompiled for PS5 then I could half believe it.

Remains to be seen exactly how much of Navi's earlier plans will be in this year's graphics cards, considering the architecture appeared with an early 2018 release window at some point.
It will definitely be interesting picking apart the differences.
Although we may not be able to tell if something only came about due to Sony, unless it's not in Scarlett.
 
https://gematsu.com/2019/05/yakuza-...ory-is-finished-and-in-recording-ps5-comments




A lot of text that doesn't seem to say much, except maybe for continued PS4 support.

I read that as saying they finally have enough excess CPU power that they can do more interesting things that aren't almost purely graphics- or presentation-related. So maybe we'll finally see AI in games catch up to what we had on the X360 again. IMO, the current generation (all platforms) took a step backwards compared to something like Halo: Reach, which had some of the best AI ever in a game.

Regards,
SB
 
Old news ;)

2014:
The Xbox One could one day make use of a powerful rendering technique called "real-time ray-tracing," an advanced system used to achieve photorealistic lighting effects. Microsoft Studios executive Phil Spencer has confirmed that internal tests are already underway.

https://www.gamespot.com/articles/x...ring-system-for-amazing-visuals/1100-6418060/
Lol that’s an impressive memory you have there. That totally would have flown under the radar for just about everyone here because we know there isn’t enough juice on Xbox.
 
It doesn't make sense to use a small 48CU chip on 7nm when they could fit an 80CU GPU on a 350mm2 APU die. You get much better perf/watt by going with more CUs clocked lower than fewer CUs clocked higher.
Realistically, 64 CUs (56 enabled, 8 disabled) is the lowest the PS5 GPU will go, leaving room for potential dedicated RT hardware, an audio chip and other silicon.
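
To illustrate why wide-and-slow tends to win on perf/watt, here's a rough sketch: dynamic power scales roughly with V^2 * f, and higher clocks need higher voltage. The clock and voltage figures below are made-up assumptions for illustration, not real Navi numbers.

# Perf/watt sketch: two configs with roughly equal TFLOPS (illustrative only).
def tflops(cus, clock_ghz):
    # GCN/RDNA-style CU: 64 FP32 lanes * 2 ops (FMA) per clock
    return cus * 64 * 2 * clock_ghz / 1000.0

def relative_power(cus, clock_ghz, voltage):
    # Dynamic power ~ C * V^2 * f, in arbitrary units
    return cus * voltage**2 * clock_ghz

wide   = dict(cus=64, clock_ghz=1.40, voltage=0.95)   # assumed wide-and-slow config
narrow = dict(cus=48, clock_ghz=1.87, voltage=1.15)   # assumed: higher clock needs more voltage

for name, cfg in (("64 CU @ 1.40 GHz", wide), ("48 CU @ 1.87 GHz", narrow)):
    t = tflops(cfg["cus"], cfg["clock_ghz"])
    p = relative_power(cfg["cus"], cfg["clock_ghz"], cfg["voltage"])
    print(f"{name}: {t:.1f} TFLOPS, perf/W ~ {t / p:.4f} (arb. units)")

With those assumed numbers the wide config delivers the same ~11.5 TFLOPS at noticeably better perf/watt, which is the argument being made here.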

Oh really? How could they fit 80 CUs? Radeon 7 is ~330mm2 with 64 CUs. A Zen chiplet is ~70mm2. Navi looks to have higher area usage per CU. So how are you going to fit all that in a 350mm2 APU on an even more expensive process?
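
As a rough back-of-the-envelope version of that objection (the Radeon VII and Zen 2 chiplet figures are approximate public numbers; the rest is naive scaling, not a die estimate):

# Naive scaling check: treat Radeon VII's whole die as the "cost per CU"
# and just add a CPU chiplet on top. Illustrative only.
RADEON_VII_DIE_MM2 = 331.0   # 7nm Vega 20, 64 CUs
RADEON_VII_CUS = 64
ZEN2_CHIPLET_MM2 = 74.0      # approx. 8-core Zen 2 CCD

area_per_cu = RADEON_VII_DIE_MM2 / RADEON_VII_CUS   # ~5.2 mm^2, uncore included
gpu_80cu = 80 * area_per_cu                         # ~414 mm^2
apu_naive = gpu_80cu + ZEN2_CHIPLET_MM2             # ~488 mm^2

print(f"~{area_per_cu:.1f} mm^2 per CU if you count the whole die")
print(f"naive 80 CU APU: ~{apu_naive:.0f} mm^2 against a ~350 mm^2 target")

The counter-argument later in the thread is that this naive division overstates things, since only part of the die is CU array and the uncore shrinks with the node as well.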
 
Old news ;)

2014:
The Xbox One could one day make use of a powerful rendering technique called "real-time ray-tracing," an advanced system used to achieve photorealistic lighting effects. Microsoft Studios executive Phil Spencer has confirmed that internal tests are already underway.

https://www.gamespot.com/articles/x...ring-system-for-amazing-visuals/1100-6418060/

OMG
Quotes like that were a gift to Misterxmedia. He served up kool-aid for two years with stuff like that. 15x Cloud power, APU + dGPU, embedded memory, and CUs waiting to be activated through firmware updates. I'm embarrassed to say that even I took a sip.

All that's required is the desire to believe what you're being told no matter how ridiculous it sounds and they've got you hooked. o_O
 
Old news ;)

2014:
The Xbox One could one day make use of a powerful rendering technique called "real-time ray-tracing," an advanced system used to achieve photorealistic lighting effects. Microsoft Studios executive Phil Spencer has confirmed that internal tests are already underway.

https://www.gamespot.com/articles/x...ring-system-for-amazing-visuals/1100-6418060/

I must quote this. It was a comment on that article, back in 2014, from a user named Jerusaelem:

"“The ‘Realtime Raytracing’ techniques will allow the Xbox One to finally generate the necessary 1.21 jiggawatts needed to propel the systems frame rate to a staggering 88fps, allowing both the Xbox One and it’s user to travel back in time to the year 1955!”, claims Microsoft’s studio boss/resident Peter Molyneux impersonator Phil Spencer in a follow up tweet. When asked where he came up with his concept in a phone interview later that day, Spencer replied, “I was hanging a clock in my bathroom after knocking back a fifth of Four Roses and a $2 gas station burrito when I slipped and hit my head on the toilet…when I came to, I had doodled a picture of the Xbox capacitor, which is what makes Realtime Raytracing possible! …also I drew a super cute gender swapped version of Bill Gates dressed up as Sailor Moon…they’re both on my DA page under the username qtPhilchanXB1 if anyone's interested. Constructive criticism only plz or your comments will be deleted."
 
Oh really? How could they fit 80 CUs? Radeon 7 is ~330mm2 with 64 CUs. A Zen chiplet is ~70mm2. Navi looks to have higher area usage per CU. So how are you going to fit all that in a 350mm2 APU on an even more expensive process?
Navi's CUs will be smaller with the 32-wide SIMD config. Contrary to popular belief, CUs don't take up most of the GPU die; they take around 40%, so a CU count increase is not a linear area increase.
Radeon 7 also has a lot of padding for yields and transistors that aren't needed for gaming.

PS4 Pro was 325mm2 at 16nm and had 40 CUs with 4 disabled. Xbox One X had 44 CUs with 4 disabled. Going from 16nm to 7nm is over a 2x density increase, so they could have put 80 CUs on the same size die.
For a deeper explanation I refer to @Proelite's post:
Need to revise my calculations for next generation APUs. I was under the assumption that the non-CPU and GPU portions of the chip would take up a constant percentage of the die, but further research shows that this is not the case. Bus controllers etc. shrink in size too with each node progression. In fact, you can fit a 384-bit bus in less die space than the 256-bit bus on the Pro.

So my new conclusion is that the non-CPU/GPU portion of the next-gen console should only be around ~80mm2 (1.8x scaling over the 140mm2 in the PS4 Pro) if they don't add extra bus controllers, caches, or dark silicon. In a 350mm2 die there should be 270mm2 to use; assuming the CPU and ROPs take around 100mm2, there is ~170mm2 remaining, which should fit 64 Vega 7-sized CUs with 30mm2 to spare.

My new estimate is that a >330mm2 SoC should be big enough to accommodate an 8-core Zen 2, 64 CUs (4 disabled), and 64 ROPs.

Another epiphany from the new conclusion is that a 512-bit bus for the GPU is not out of the question. It costs less die space than the 256-bit bus in the base PS4.

So we're straight up limited in next generation by Navi clock speed and the 64 CU limit.

If there is no CU limit, you can do ~100 CUs in an XB1X-sized 360mm2 die. :eek:
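
Running that budget as a quick sketch (all figures come from the estimates in the quoted post; the ~2.2mm2-per-CU number is an assumption back-derived from its "64 CUs with ~30mm2 to spare" line, not measured silicon):

# Area budget per the estimates above (all numbers are speculative).
DIE_MM2      = 350.0
UNCORE_MM2   = 80.0    # non-CPU/GPU (bus controllers, IO, etc.), ~1.8x shrink from PS4 Pro's 140 mm^2
CPU_ROPS_MM2 = 100.0   # 8-core Zen 2 plus 64 ROPs (assumption)
CU_MM2       = 2.2     # per Vega 20-sized CU at 7nm (assumption)

remaining = DIE_MM2 - UNCORE_MM2 - CPU_ROPS_MM2   # ~170 mm^2 left for the CU array
max_cus   = int(remaining // CU_MM2)              # ~77 CUs by raw area
spare_64  = remaining - 64 * CU_MM2               # ~29 mm^2 spare with 64 CUs

print(f"remaining for CUs: {remaining:.0f} mm^2")
print(f"raw area fits ~{max_cus} CUs; with 64 CUs there's ~{spare_64:.0f} mm^2 spare")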
 
Lol that’s an impressive memory you have there. That totally would have flown under the radar for just about everyone here because we know there isn’t enough juice on Xbox.
I am not sure how what he said in that tweet proves or contradicts anything. The tweet doesn't specify that said experiment with RT is on, or will appear on, X1. If anything, it means the platform team has had RT in mind as a future technology way back then. His recent statement in the 2018 video was a direct statement on what they are working on for the next generation of Xbox, in which he name-checks framerate and RT as two things he has in mind for the next-gen box.
 
I am not sure how what he said in that tweet proves or contradicts anything. The tweet doesn't specify that said experiment with RT is on, or will appear on, X1. If anything, it means the platform team has had RT in mind as a future technology way back then. His recent statement in the 2018 video was a direct statement on what they are working on for the next generation of Xbox, in which he name-checks framerate and RT as two things he has in mind for the next-gen box.
Sure. But in 2014 we all ignored it. It wasn't until DXR and the RTX demos in 2018 that RT meant anything to the entire gaming community.
It was an interesting hint at a larger idea that did eventually materialize. I just find it interesting he said this back in 2014.
 
This is something I've raised when talking about customisation. There could be a lot of work to support PS4 BC.
Even if it was straight GCN, if some instructions were removed or behaved differently, it could potentially need more work than people assume. Their software stack isn't as abstracted as Xbox's.
Hmmm, so perhaps while MS has moved on to Navi 20 (full Navi), PS5 is based on the Navi 10 architecture with GCN to facilitate easier BC. Sony does not have the same software stack and software engineering capacity as MS, so perhaps they had to go with a more hardware-based approach to backwards compatibility, given GNM IS an arch-specific API originally designed to go low level with Southern Islands and GCN 1.0.

edited for typo
 
Hmmm, so perhaps while MS has moved on to Navi 20 (full Navi), PS5 is based on the Navi 10 architecture with GCN to facilitate easier BC. Sony does not have the same software stack and software engineering capacity as MS, so perhaps they had to go with a more hardware-based approach to backwards compatibility, given GNM is an arch-specific API originally designed to go low level with Southern Islands and GCN 1.0.

It does seem like Sony is going in between GPUs for the PS5 (it will probably be like a Navi 1.5); that's not a bad thing to be honest.
I don’t believe there will be a big difference in the end between the two systems.
They still have similar constraints to contend with. These GPUs will be designed mostly for efficiency and grunt; they don't have the space (like PC GPUs do) for a lot of special sauce.
 