Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
All this talk of Sony having another ray tracing solution... isn't that just going to add more cost to the console by having another chip? Wouldn't it also reduce bandwidth if that chip has to access the same RAM, or increase costs if it has its own RAM pool?
 
All this talk of Sony having another ray tracing solution... isn't that just going to add more cost to the console by having another chip? Wouldn't it also reduce bandwidth if that chip has to access the same RAM, or increase costs if it has its own RAM pool?

The assumption is that it would be on-chip and integrated into the gpu.
 
If Sony has "their own" RT solution (collaborating, of course), I'd be afraid. And it could be the reason the PS5 is allegedly underspecced in other areas.

I feel like Sony's inability to admit they could no longer engineer console silicon is what led to the PS3 semi-disaster. They wanted to be involved in some way, any way, so they made Cell, even needing IBM's help. Probably wasn't a great idea. It would have been even worse if they'd used it as the GPU too, as they supposedly originally wanted.

That said, why do we think Sony might have their own RT solution? Surely not the mighty GitHub leaks. But that's the only place I can figure it started... since that test didn't include any RT test on the alleged PS5.

I would not say Cell was a mistake. Wasn't the GPU the weak spot? Maybe, when Cell came out, people realized it's hard to saturate parallel processors, which was something new back then but is now widely known and accepted. (To me, GCN compute still seems widely underutilized by games too.)

I do not remember all the points that led even me to consider custom Sony RT probable. But GitHub was one of them, because it mentioned RT for MS chips but not for Sony's. So AMD tested only their own RT?
Another point was: if Sony planned to release in 2019 with RT, but AMD could not guarantee delivery, Sony might have decided to do it on their own or with another partner.

Both make sense, but I don't know how practical it is to integrate custom RT into an AMD GPU, or if a separate RT block could make sense.
RDNA2 now being confirmed for the SX also reduces the probability. But both options remain possible.
 
Odium says both are RDNA 1.9, not RDNA 2.0.
We know the XSX has both VRS and RT blocks from RDNA 2.0. But even with that, it is still missing 0.1 for some unknown reason.
If the PS5 is also 1.9, it must also be missing that same 0.1, whatever that reason is.

So by default the PS5 must have the same RT blocks, because not having them would imply a significantly lower number, like RDNA 1.5 or less.

You're awfully stuck on that ".9" when the original poster mentioned it in passing, as a means of saying it doesn't have all the stuff in RDNA2 because some of it has been traded for custom optimizations.
Now I also think that conversation has run its course; we will just have to agree to disagree on whether the expression he used was symbolic rather than descriptive.

I also don't know where he says that said custom optimizations / blocks are the same for both consoles, which is what you are implying.
He wrote that both consoles are not RDNA2 but RDNA 1.9 + custom optimizations. Nowhere does he say the consoles are using the same instruction set.
The PS4 Pro and the XBoneX certainly didn't use the same ISA. They had noticeably different compute units.
 
If Sony has "their own" RT solution (collaborating, of course), I'd be afraid. And it could be the reason the PS5 is allegedly underspecced in other areas.

I feel like Sony's inability to admit they could no longer engineer console silicon is what led to the PS3 semi-disaster. They wanted to be involved in some way, any way, so they made Cell, even needing IBM's help. Probably wasn't a great idea. It would have been even worse if they'd used it as the GPU too, as they supposedly originally wanted.

That said, why do we think Sony might have their own RT solution? Surely not the mighty GitHub leaks. But that's the only place I can figure it started... since that test didn't include any RT test on the alleged PS5.
The problem I have with the PS5 having its own RT solution is that Sony worked hard to make life easier for devs after the PS3. So is this a return of arrogant Sony, if true?

Odium says both are RDNA 1.9, not RDNA 2.0.
We know the XSX has both VRS and RT blocks from RDNA 2.0. But even with that, it is still missing 0.1 for some unknown reason.
If the PS5 is also 1.9, it must also be missing that same 0.1, whatever that reason is.

So by default the PS5 must have the same RT blocks, because not having them would imply a significantly lower number, like RDNA 1.5 or less.
I think (again) you're reading things too literally. His interpretation of v1.9 could mean one feature is missing, and that feature could be different for each machine (if that makes sense). He's just saying they are not fully v2.

I would not say Cell was a mistake. Wasn't the GPU the weak spot? Maybe, when Cell came out, people realized it's hard to saturate parallel processors, which was something new back then but is now widely known and accepted. (To me, GCN compute still seems widely underutilized by games too.)

I do not remember all the points that led even me to consider custom Sony RT probable. But GitHub was one of them, because it mentioned RT for MS chips but not for Sony's. So AMD tested only their own RT?
Another point was: if Sony planned to release in 2019 with RT, but AMD could not guarantee delivery, Sony might have decided to do it on their own or with another partner.

Both make sense, but I don't know how practical it is to integrate custom RT into an AMD GPU, or if a separate RT block could make sense.
RDNA2 now being confirmed for the SX also reduces the probability. But both options remain possible.
Yes, I was under the impression Cell was too far ahead of its time. The problem was that while the PS3 was already complicated (including split memory pools), having Cell as the GPU too would have been a car crash due to complexity.
 
The problem I have with the PS5 having its own RT solution is that Sony worked hard to make life easier for devs after the PS3. So is this a return of arrogant Sony, if true?
That's exactly why this whole story about Sony having developed their own RT solution, instead of using what AMD would have served them on a silver platter, is as ridiculous as the one where Sony co-developed Navi.

I know this is the truth because I’m writing this on the internet.
 
I think (again) you're reading things too literally. His interpretation of v1.9 could mean one feature is missing, and that feature could be different for each machine (if that makes sense). He's just saying they are not fully v2.
So we're to assume he knows the complete details of an unreleased architecture and of two unreleased semi-customs based (at least partly) on said architecture.
And he was what, some QA game tester? Yes, that sounds plausible :confused:
 
All this talk of Sony having another ray tracing solution... isn't that just going to add more cost to the console by having another chip? Wouldn't it also reduce bandwidth if that chip has to access the same RAM, or increase costs if it has its own RAM pool?
Every inclusion has costs. If the benefits outweigh the costs, it's a good move. Same with the audio processor idea. Until we know (assuming this magical RT unit exists) how it all fits together, no judgements can be made about the choices. Theoretically, all your concerns could be true - this chip adds $30 per console and reduces BW to the other processors - but in terms of the system, it enables raytracing better than $60 of AMD GPU and reduces RAM BW hit from RT. OR, alternatively, it's a turkey that adds cost and performs worse. No-one can possibly know and no assumptions should be made.

The problem I have with the PS5 having its own RT solution is that Sony worked hard to make life easier for devs after the PS3. So is this a return of arrogant Sony, if true?
Of course not. Cerny is still at the helm. An RT solution will be an appropriate one.
 
The problem I have with the PS5 having its own RT solution is that Sony worked hard to make life easier for devs after the PS3. So is this a return of arrogant Sony, if true?

Since when is innovating or coming up with your own technology considered arrogant?

I'm not saying Sony has developed their own ray tracing solution, but this rhetoric that it's folly to try something when there are already incumbents is backwards and stifles innovation.

There would be no Tesla with that mindset.
 
What's the difference between him and VFXVeteran? Mind you the latter has a verified tag.

Tommy Fisher, OsirisBlack, Odium and VFXVeteran all have wildly different numbers, even for the XSX prior to Monday.

If they were consistent across the board I would have second guessed myself. :)

They are all frauds. The only dev verified on GAF and Era is BG, but he will not give numbers, and when someone made an article using his Era post he stopped posting. The truth is, no one with real data will give numbers; maybe a few hours before the reveal, and not from a verified developer, but maybe from a journalist.

And I am not Tommy Fischer. ;) This guy is a fraud. And I have no idea of the PS5's power, other than it will be between 8 and 14 TFLOPS for sure; but I would say less powerful than Xbox, probably between 8 and 11.9 TFLOPS.

My only "bet" is that the PS5's SSD speed will be above 7 GB/s (raw SSD speed multiplied by compression ratio). This is precise and will be proven true or false at the PS5 reveal. It is neither a vague assumption nor an exact SSD speed number.
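For what it's worth, that bet is just one multiplication. A quick sketch of the arithmetic, where the raw speed and compression ratio below are made-up illustrative figures, not leaked numbers:

```python
# Effective SSD throughput = raw read speed x average compression ratio.
# Both input figures here are hypothetical, chosen only to illustrate the bet.

def effective_throughput(raw_gb_s: float, compression_ratio: float) -> float:
    """Effective GB/s delivered to the game after decompression."""
    return raw_gb_s * compression_ratio

raw = 5.0     # hypothetical raw SSD read speed, GB/s
ratio = 1.5   # hypothetical average compression ratio

eff = effective_throughput(raw, ratio)
print(f"{raw} GB/s raw x {ratio} compression = {eff} GB/s effective")
print("bet holds" if eff > 7.0 else "bet fails")
```

So the bet can come true either through a fast raw drive or through aggressive compression; the claim itself doesn't pin down which.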
 
The assumption is that it would be on-chip and integrated into the gpu.
So that would rule out RDNA2 then, as you'd have redundant hardware. Also, you're going to have a bigger chip than your competitor, or have to sacrifice somewhere else in the APU.

Also, this implies that AMD would have been fine developing a hybrid of their technology with another technology, and that it all played nicely together.


Seems very pie in the sky to me. But I guess we will find out soon enough
 
Just FYI and for all insiders calculating Navi performance v GCN performance based on TFs...

https://www.gpucheck.com/compare-mu...md-radeon-rx-5700-xt-vs-amd-radeon-rx-vega-64

Across dozens of tested games :

Radeon 5700 (7.9TF) outperforms 12.6TF Vega 64 card by 10.2% in 1440p and 10.5% in 4K.

Radeon 5700XT (~9.5TF) outperforms 12.6TF Vega 64 card by 20% in 1440p and 22.2% in 4K.

Obviously, in this case, Vega 64 is slightly bandwidth-limited, and the Radeon VII would be closer to a 1.25x multiplier, but that card has 1TB/s worth of BW, which is a serious advantage over the 448GB/s found in the 5700 and XT.

In the case of a hypothetical 9.2TF card in the PS5, coupled with higher BW than the XT and 5700, this difference would likely increase by a couple of percent; therefore, anyone saying a 12.5-13TF GCN card would mean a 10-11TF Navi in the PS5 is almost certainly wrong.
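To make the implied per-teraflop math explicit, here's a back-of-the-envelope sketch. The 4K multipliers are taken from the benchmark numbers above; the 9.2TF figure is the hypothetical PS5 spec being discussed, and caches, clocks and bandwidth differences mean the real ratio wouldn't be this clean:

```python
# Back-calculate Navi's per-TF advantage over GCN from the quoted results,
# then invert it: how many GCN TF does a given Navi TF figure correspond to?

def per_tf_ratio(navi_tf: float, gcn_tf: float, perf_multiplier: float) -> float:
    """How much faster one Navi TF is than one GCN (Vega 64) TF."""
    return perf_multiplier * gcn_tf / navi_tf

vega_tf = 12.6
r_5700 = per_tf_ratio(7.9, vega_tf, 1.105)  # 5700 beats Vega 64 by 10.5% at 4K
r_xt   = per_tf_ratio(9.5, vega_tf, 1.222)  # 5700 XT beats it by 22.2% at 4K

print(f"per-TF ratio, 5700:    {r_5700:.2f}")  # ~1.76
print(f"per-TF ratio, 5700 XT: {r_xt:.2f}")    # ~1.62

# A hypothetical 9.2 TF Navi GPU would therefore match well over 13 TF of GCN:
print(f"9.2 TF Navi ~ {9.2 * r_xt:.1f} TF GCN")  # ~14.9 TF
```

Even using the less favourable 5700 XT ratio, 9.2TF of Navi lands well above the 12.5-13TF GCN equivalence some insiders suggest, which is the point being made.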
 
So we're to assume he knows the complete details of an unreleased architecture and of two unreleased semi-customs based (at least partly) on said architecture.
I don't know the complete details of KBL-G and GFX9, yet I'll still say I wouldn't call it Vega. I may even say the PS4 Pro is more of a Vega than KBL-G's Vega M.

What I do know of RDNA2 from AMD's slides is that it has AMD's RTRT implementation and it's on 7nm EUV. A developer who has developer friends who work on multiplatform titles and have had their hands on Big Navi and the next-gens will have heard that Big Navi does stuff the next-gens don't, or does it differently.


Why is it so hard to imagine that people talk, and that as long as there are no docs, stuff is said informally within groups of friends, and the information isn't compromising or damning to anyone, there's nothing wrong with it?
I know a shitload of stuff I wasn't supposed to, from my engineering friends, and vice versa. I talk about the non-compromising and non-damning info I get all the time.
OMG, they're all stepping on their NDAs and risking their jobs. Well, if you have a brain, you won't compromise anything.


And he was what, some QA game tester? Yes, that sounds plausible
Geez, how fast the fake news spreads.
o'dium is a developer, not a QA game tester.
 
I don't know the complete details of KBL-G and GFX9, yet I'll still say I wouldn't call it Vega. I may even say the PS4 Pro is more of a Vega than KBL-G's Vega M.

What I do know of RDNA2 from AMD's slides is that it has AMD's RTRT implementation and it's on 7nm EUV. A developer who has developer friends who work on multiplatform titles and have had their hands on Big Navi and the next-gens will have heard that Big Navi does stuff the next-gens don't, or does it differently.


Why is it so hard to imagine that people talk, and that as long as there are no docs, stuff is said informally within groups of friends, and the information isn't compromising or damning to anyone, there's nothing wrong with it?
I know a shitload of stuff I wasn't supposed to, from my engineering friends, and vice versa. I talk about the non-compromising and non-damning info I get all the time.
OMG, they're all stepping on their NDAs and risking their jobs. Well, if you have a brain, you won't compromise anything.



Geez, how fast the fake news spreads.
o'dium is a developer, not a QA game tester.

Don't trust GAF; he is not verified on Era. There is only one verified developer on GAF and Era, BG, but he never gives any numbers. No verified account will give any numbers.

I would say don't trust anyone giving a precise number if it is not rumored by a trustworthy source like a good website. And I would say a verified dev will not give anything before it is official.
 
Don't trust GAF; he is not verified on Era.
(...)
And I would say a verified dev will not give anything before it is official.


 
I remember when he worked on Quake II Evolved back in the day.

"maek it liek tenebrae1111" (no)
 
They are all frauds. The only dev verified on GAF and Era is BG, but he will not give numbers, and when someone made an article using his Era post he stopped posting. The truth is, no one with real data will give numbers; maybe a few hours before the reveal, and not from a verified developer, but maybe from a journalist.

And I am not Tommy Fischer. ;) This guy is a fraud. And I have no idea of the PS5's power, other than it will be between 8 and 14 TFLOPS for sure; but I would say less powerful than Xbox, probably between 8 and 11.9 TFLOPS.

My only "bet" is that the PS5's SSD speed will be above 7 GB/s (raw SSD speed multiplied by compression ratio). This is precise and will be proven true or false at the PS5 reveal. It is neither a vague assumption nor an exact SSD speed number.
O'dium is a fraud? Enlighten me.
 
The assumption is that it would be on-chip and integrated into the gpu.

I consider the second part of this assumption extremely unlikely, as I've questioned the same before. Integrating some foreign circuit logic into the GPU cache infrastructure sounds like a nightmare to me. This is surely not plug-and-play at that level, and anything outside it has huge transaction costs and needs redundant transistors for cache and other resources.
 