Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
AMD as a company…strongly believes in the value and capability of raytracing. RDNA 2, the next-gen, will support raytracing. Both the next-gen Xbox and PlayStation will support hardware raytracing with Radeon natively. We will be sure to have the content that gamers can actually use to run on those GPUs

We believe in our raytracing, and we will have it when the time is right.

-Mithun Chandrashekhar, Product Management Lead, AMD

21 January 2020
 
That is true, but the Oberon Native regression test is running Ariel SPEC iGPU tests, as noted below

[attachment: regression test spreadsheet screenshot]


If Ariel is GFX1000 (Navi10_lite), and it is, then there won't be an RT/VRS test when running the regression test with Ariel specs. Therefore, this does not prove Oberon has no RT/VRS; it proves that the Oberon Native regression test was running the Ariel SPEC iGPU (as was the case with BC1 and BC2). Thus I speculate Ariel was an early Navi chip used by Sony as a testbed for PS5 BC until RT/VRS became available.

Doesn't Oberon have a different clock speed, i.e. 2GHz? Is the assumption that they downclocked to 1.8GHz for this test?
 
Doesn't Oberon have a different clock speed, i.e. 2GHz? Is the assumption that they downclocked to 1.8GHz for this test?
In the test it's running at 2.0GHz.

Ariel from Gonzalo ran at 1.8GHz, but that is an assumption based on decoding AMD's identifier strings, which might not be 100% accurate (as is the case with Flute as well).

In any case, we will have to wait and see. It's March 2nd and we still don't have full specs, which is very surprising, tbh.
 
I see some people on other forums are quoting that Flute/Github post as one that discredits the entire thing, and claiming that some people on B3D have gone down the wrong route for the last 3 months without being aware of certain missing details in the Github leak.

No, it doesn't discredit the entire thing. The entire point of the post is that it confirms the Flute benchmark is using the Oberon chip. It connects the two, and it all makes sense given that Sony released actual APU-based dev kits in early summer 2019. It pretty much confirms that the next-gen PS5 uses a 256-bit bus and has 16GB of RAM, and there is a high chance that the chip tested as native there is what we will see in the PS5 retail chip, plus RT/VRS.
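As a quick sanity check on the 256-bit/16GB pairing (an arithmetic sketch, not a confirmed memory layout): GDDR6 chips each expose a 32-bit interface, so a 256-bit bus implies eight chips, and 16GB then implies 2GB (16Gb) density per chip.

```python
# GDDR6 chips each expose a 32-bit interface, so the bus width
# fixes the chip count, and total capacity fixes per-chip density.
bus_bits = 256
total_gb = 16

chips = bus_bits // 32          # 8 chips on a 256-bit bus
gb_per_chip = total_gb / chips  # 2.0 GB (16Gb) per chip

print(chips, gb_per_chip)  # 8 2.0
```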
 



Also, this one's old but recently came up due to VFXVeteran insisting on the 2-tier statements:
https://www.techradar.com/news/play...h-alongside-standard-model-new-rumor-suggests
Nishikawa has proven accurate in the past with his predictions of the PS4 Pro and Switch Lite.
(...)
According to Nishikawa, the PS5 Pro will cost around $100-$150 more than the basic PS5 console. The report states that Sony is taking this approach because it has "acknowledged the interest in a high-end model and wants to give players what they want right from the beginning of the generation."
Here's hoping there's no 2-tier PS5 at launch. I still think that's a terrible idea.



Ok, that ±0.25TF came from me interpreting his words :), I got the impression it's around that 11.25-11.75TF ballpark. But yeah, he's definitely not being deliberately shifty like some are suggesting; it's an ever-evolving matter, but the only way it's going is up.
Which is just proof that the user in question never really cared for the veracity or honesty of his own post.
He saw a couple of posts saying stuff that would fit his character assassination agenda and fired away.
(Despite the first part of his post saying "he's moving goalposts because 11 TF +/- 4.5% is not the same as 11TF +/- a little bit". Which is in itself pretty funny.)
And then a very predictable number of users put likes on said post.



AMD as a company…strongly believes in the value and capability of raytracing. RDNA 2, the next-gen, will support raytracing. Both the next-gen Xbox and PlayStation will support hardware raytracing with Radeon natively. We will be sure to have the content that gamers can actually use to run on those GPUs

We believe in our raytracing, and we will have it when the time is right.

-Mithun Chandrashekhar, Product Management Lead, AMD

21 January 2020

What is new here? Is it the bolded sentence?
If so, it looks like he's being really careful not to just say "they're using AMD's raytracing solution".
He didn't say they will support Radeon's raytracing. "They will support raytracing with Radeon natively" just leaves the field as open as it was before.

Someone at reeeesetera compiled all of Schreier's statements on the SeriesX and PS5. I think it's good for context.


[image: compilation of Schreier's statements on the SeriesX and PS5]




BTW, how many times must we see the GitHub followers' obsession with posting the exact same thing over and over again?

So now the narrative has slightly changed to "it was a Navi 10 dGPU after all", aaand... they're still doubling down on the 36CU @ 2GHz spec because it said "native" in our Gospel.
Yeah, despite the tests showing results for some alpha Navi 10 chips and not the final APU.


@AbsoluteBeginner great sleuthing, but you can't deny the possibility that this might just be a coincidence, bait planted by ninjas, or concerted misinformation created by Discord folks.
I think it's really funny that your pals (and/or alts; who knows, after your own admissions earlier in this thread?) have started to slightly change your narrative already.
As if trying to run away from the repercussions in case your religion is proven wrong.
 
Someone at reeeesetera compiled all of Schreier's statements on the SeriesX and PS5. I think it's good for context.


[image: compilation of Schreier's statements on the SeriesX and PS5]




BTW, how many times must we see the GitHub followers' obsession with posting the exact same thing over and over again?

So now the narrative has slightly changed to "it was a Navi 10 dGPU after all", aaand... they're still doubling down on the 36CU @ 2GHz spec because it said "native" in our Gospel.
Yeah, despite the tests showing results for some alpha Navi 10 chips and not the final APU.



I think it's really funny that your pals (and/or alts; who knows, after your own admissions earlier in this thread?) have started to slightly change your narrative already.
As if trying to run away from the repercussions in case your religion is proven wrong.
Before you start spewing BS, why are you posting 4chan leaks from guys who can't even calculate TF correctly?

Just a hint: 60CU at 1700MHz is not 13.3TF. Next time you create a 4chan post, learn how to calculate the correct TF number.
 
Before you start spewing BS, why are you posting 4chan leaks from guys who can't even calculate TF correctly?

Just a hint: 60CU at 1700MHz is not 13.3TF. Next time you create a 4chan post, learn how to calculate the correct TF number.

And before you start with your obsessive inquisition spewing, learn to read the numbers that are actually in the post.
No one wrote 1700MHz. 13.3TF could come from 1730MHz, and the poster might simply not care to write the clock down to the tens of megahertz, especially when writing in GHz.
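For anyone checking the arithmetic in this exchange: AMD GPUs of this generation have 64 stream processors per CU and issue 2 FP32 ops per clock via FMA, so peak TFLOPS is easy to sketch. The CU counts and clocks below are the rumored figures being argued over, not confirmed specs.

```python
# Peak FP32 throughput for a GCN/RDNA-style AMD GPU:
# TFLOPS = CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock (MHz) / 1e6
def peak_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1_000_000

print(round(peak_tflops(60, 1700), 2))  # 13.06 -- not 13.3
print(round(peak_tflops(60, 1730), 2))  # 13.29 -- which rounds to 13.3
```

So both posters are arithmetically consistent: 1700MHz gives ~13.06TF, while ~1730MHz is needed to reach the quoted 13.3TF.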
 
And before you start with your obsessive inquisition spewing, learn to read the numbers that are actually in the post.
No one wrote 1700MHz. 13.3TF could come from 1730MHz, and the poster might simply not care to write the clock down to the tens of megahertz, especially when writing in GHz.

Or he might be full of shit, considering where it's posted. To each his own, but please try to slow down with the aggressive posts toward anything you don't agree with, ok?
 
What is new here? Is it the bolded sentence?
If so, it looks like he's being really careful not to just say "they're using AMD's raytracing solution".
He didn't say they will support Radeon's raytracing. "They will support raytracing with Radeon natively" just leaves the field as open as it was before.

I don't see how this leaves anything open. Sounds more like he tried to pack all of AMD's brand words into it, namely RDNA 2 and Radeon. Good marketing job.

What I wonder is: is Sony fine with him revealing such details before they do? But who cares. They likely just got used to AMD leaking all of it, hihihi :)
 
I see some people on other forums are quoting that Flute/Github post as one that discredits the entire thing, and claiming that some people on B3D have gone down the wrong route for the last 3 months without being aware of certain missing details in the Github leak.

No, it doesn't discredit the entire thing. The entire point of the post is that it confirms the Flute benchmark is using the Oberon chip. It connects the two, and it all makes sense given that Sony released actual APU-based dev kits in early summer 2019. It pretty much confirms that the next-gen PS5 uses a 256-bit bus and has 16GB of RAM, and there is a high chance that the chip tested as native there is what we will see in the PS5 retail chip, plus RT/VRS.
There's already a strong chance Arden has a 256-bit bus as well given the 560 GB/s numbers in the GitHub leak.
 
Yes I agree, this place is much better when we don't engage in silly fanboy culture and stick to productive discussions.

PS. The heartbeat monitors and sweat analyzers all but confirm an 8TFlop system. It's clearly a built-in medical safety system that auto-dials 911 when you start to witness the frame rate tanking on your new $500 system. Cerny thinks of everything.
 
There's already a strong chance Arden has a 256-bit bus as well given the 560 GB/s numbers in the GitHub leak.
On the contrary, it pretty much points to a 320-bit bus (which would, again, corroborate the Scarlett SoC E3 video, lending even more credence to the leak).

The theoretical values for both Arden and Ariel seem to assume 14Gbps RAM modules; therefore, 560GB/s would point to a 320-bit bus with 14Gbps modules.

[image: theoretical bandwidth figures from the leaked tests]

As we have seen with Oberon, the theoretical BW value does not mean you cannot go higher if higher-clocked RAM chips are available (which the Flute and Oberon tests both confirm).
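The bandwidth arithmetic above is plain GDDR6 math: bandwidth in GB/s equals bus width in bits times the per-pin data rate in Gbps, divided by 8. A sketch using the leak's rumored figures:

```python
# GDDR6 bandwidth: bus width (bits) * per-pin data rate (Gbps) / 8 = GB/s
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(320, 14))  # 560.0 -> the 560GB/s Arden figure fits a 320-bit bus
print(bandwidth_gbs(256, 14))  # 448.0 -> what a 256-bit bus gives at the same 14Gbps
```

By the same formula, higher-clocked modules on an unchanged bus scale bandwidth linearly, which is the point about exceeding the theoretical value.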
 