What's your view of Next-Gen Rumor Religions? +Poll! *excommunicado thread*

Which is the next-gen true faith (TF)?

  • Orthodox Kleeism - PS5 is pretty much the same as XBSX at 12 TF

Will we? How will we know what the GPU is clocked at? Sony has a display on the front of the console showing clock speeds?? ;)

I was thinking of that tear-down: could it include a technical breakdown of the innards, what the boost clocks mean for actual frequencies, etc.? Common sense tells me that if they only downclock by 2% at most, why bother upclocking by 2% to begin with?
Would developers have to know, or is this all automatic, so developers won't have to design with dynamic clock speeds in mind?
 
The main take-home was a leaked spec for an SoC with 36 CUs, a bunch of would-be insiders claiming something notably faster, and the question of whether people believed the leak was final hardware (regardless of what it could be clocked at) or whether the insiders were actual insiders providing real info. The insiders were, AFAICS, by and large a bunch of fibbers. So the lesson here is: don't trust GAF/Era insiders.

Although if one of the would-be insiders like Osiris did state 10.x TFs and other details that make them look legitimate, that could be recorded for posterity come PS6 predictions.

The whole question was whether github/36 CUs was the final PS5, aside from the clock adjustments that do occur when finalising console hardware. I've said it many times: 'aside from clock adjustments'. Github was from summer 2019; it was never going to show final clocks. That they upclocked the damn thing was a surprise to many.
 
I was thinking of that tear-down:
Tear-downs are hardware. There's no reason a tear-down would reveal a dynamic clock.

Common sense tells me that if they only downclock by 2% at most, why bother upclocking by 2% to begin with?
Because you always want the fastest possible within your budget. Sony's objective here wasn't a performance target but a heat target that they could control and cool. We need the other half of the picture, the cooling solution, to fully understand the choices.
Would developers have to know, or is this all automatic, so developers won't have to design with dynamic clock speeds in mind?
I doubt they'll give it any consideration in the main. Their game will run, but there'll just be some slightly slower frames or slightly lower-res dynamic resolutions when it throttles.
 
HDMI 2.1's variable refresh rate will let developers stop worrying about variable frame rates. I think developers will opt for a lower framerate rather than a lower resolution.
 
In that, you are right. This poll picked TFs as the metric.
Why would there be any other metric? Why would we be discussing anything but compute throughput, especially between GPUs with similar architectures?
Whoever created this thread and its poll seemed to share this sentiment, as the options only show TFLOP count.

Do we consider Hawaii to be a better choice than Polaris 10 because it has many more execution units?

TFLOP count is obviously not the only metric relevant to the console's final performance, but between systems with similar architectures it's arguably the best we can get.
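For what it's worth, here's the napkin math behind that Hawaii vs Polaris 10 comparison, using the reference-card specs as I recall them (R9 290X for Hawaii, RX 480 for Polaris 10), so take the exact clocks with a grain of salt:

```python
# FP32 throughput: shaders * 2 ops/clock * clock in GHz -> TFLOPs
def tflops(shaders: int, ghz: float) -> float:
    return shaders * 2 * ghz / 1000

hawaii = tflops(2816, 1.000)     # R9 290X: 44 CUs, up to 1.0 GHz
polaris10 = tflops(2304, 1.266)  # RX 480: 36 CUs, 1.266 GHz boost
print(f"Hawaii: {hawaii:.2f} TF, Polaris 10: {polaris10:.2f} TF")
# -> Hawaii: 5.63 TF, Polaris 10: 5.83 TF
# Fewer execution units, similar raw throughput, and the newer
# architecture does more per FLOP anyway.
```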


However, that's somewhat skewed by dynamic clocks.
Sony says the clock deltas will be marginal and that most of the time both the CPU and GPU will run at the advertised max clocks. If we choose to reject their claims, why would we choose to accept that Microsoft isn't lying about their clock speeds too?
I won't pick sides and say one is lying and the other isn't. Both companies stated their specs. I won't engage in baseless assumptions over officially stated specs. I also think it's a rather poor practice to do so, but to each their own.



The main take-home was a leaked spec for an SoC with 36 CUs, a bunch of would-be insiders claiming something notably faster, and the question of whether people believed the leak was final hardware (regardless of what it could be clocked at) or whether the insiders were actual insiders providing real info. The insiders were, AFAICS, by and large a bunch of fibbers. So the lesson here is: don't trust GAF/Era insiders.

This is the one and only time I'll comment on this. It's a worthless conversation to have.

All I did these past couple of months was post several different sources with different specs in the baseless thread so we could discuss them, which was supposedly the purpose of that thread, until a mod decided it should instead be a thread for dogpiling on people outside this forum who couldn't defend themselves. A practice that same mod brought to this thread as soon as the baseless one was closed.
So much talk about what is beneath B3D during these last couple of months, yet it seems constant mockery of people (some of them publicly identified developers) outside the forum is somehow fair game.


My problem has always been the github inquisitors who constantly jumped to mockery, bullying and trolling towards any source who claimed the github data wasn't indicative of the final product (which turned out to be the case), as well as any user who dared to post these baseless rumors in the baseless rumor thread.
I don't remember swearing fealty to any one leaker; on the contrary, my purpose has always been to entertain different hypotheses... in the thread that was supposed to exist to entertain different hypotheses.

The only lesson I learned here is that B3D condones the dogpiling of users and external people who don't conform to a certain clique.
That was a lesson well learned.



Although if one of the would-be insiders like Osiris did state 10.x TFs and other details that make them look legitimate, that could be recorded for posterity come PS6 predictions.

All the certified leakers said the github gospel had outdated data. Which it does.
The PS5 isn't bringing a 9.2 TFLOPs GPU (36 CUs at 2GHz), which absolutebeginner repeated ad nauseam.
It's also not bringing an 8 TFLOPs GPU (36 CUs at 1.75GHz), which psman1700 repeated around the forum ad nauseam, in what's probably over a hundred posts if we bothered to count.

The GPU will be running at over 10% higher clocks than what's in the github, and the SeriesX's resulting compute throughput advantage is reduced from roughly 32% to roughly 18% (or, framed as a PS5 deficit, from 24% to 15%). 15-18% is close; 32% is not.
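As a sanity check on those percentages (announced figures: 36 CUs at 2.23GHz for the PS5, 52 CUs at 1.825GHz for the SeriesX, against the github leak's 36 CUs at 2.0GHz):

```python
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # 64 shaders/CU, 2 FP32 ops per clock

github = tflops(36, 2.00)   # ~9.22 TF, the github figure
ps5 = tflops(36, 2.23)      # ~10.28 TF, announced max clock
xsx = tflops(52, 1.825)     # ~12.15 TF

print(f"PS5 deficit at github clocks: {1 - github / xsx:.1%}")  # ~24%
print(f"PS5 deficit at final clocks:  {1 - ps5 / xsx:.1%}")     # ~15%
print(f"XSX advantage, github clocks: {xsx / github - 1:.1%}")  # ~32%
print(f"XSX advantage, final clocks:  {xsx / ps5 - 1:.1%}")     # ~18%
```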
Personally, my take has always been that Sony would be seeking close-to-parity performance because they know how badly it went for the XB1 for not achieving it. I thought this would be done through a wider chip, with Oberon actually being a 3-shader-engine part. Turns out Sony managed to achieve close-to-parity through humongously high clocks.

One user I can remember who claimed ±10.5 TFLOPs from start to finish was HeisenbergFX or something.
o'dium might have been tricked by devkit specs. If the devkit has all CUs enabled, like we saw with the XBoneX's devkit, then at the same 2.23GHz the devkit is pushing 11.4 TFLOPs, which is the number he'd been given for a while. I really don't think o'dium is an attention-seeking troll. He's a publicly identified developer with over a decade of experience working on AAA projects. His user avatar at GAF is a picture of him next to his daughter; that's not the profile of a troll.
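The arithmetic checks out if the devkit runs the full Oberon die at the same clock; the 40-CU figure is my assumption (36 active CUs at retail, full die on the devkit, as with the XBoneX), not something o'dium confirmed:

```python
devkit_cus = 40   # assumed: full die enabled on the devkit (36 active at retail)
clock_ghz = 2.23
print(f"{devkit_cus * 64 * 2 * clock_ghz / 1000:.2f} TFLOPs")
# -> 11.42 TFLOPs, matching the ~11.4 TF figure he'd been given
```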

OsirisBlack, VFXVeteran and Tommy Fisher were fake insiders in the end, yes. Tommy Fisher was a very lucky one BTW, since he guessed the SeriesX's GPU clockspeeds right down to the tens of megahertz.

There was also a French journalist who made a video over a week ago with statements that turned out spot on; its content was translated in a post at GAF.
I would have shared that here, had there been a thread actually dedicated to baseless rumors.
 
Can't you just forget it? Github was it, done. Just move on and stop attacking people so aggressively.
Also, you have no idea about the dynamic clocks yet; DF's Alex has a good view on it.
 
Will we? How will we know what the GPU is clocked at? Sony has a display on the front of the console showing clock speeds?? ;)
Don't even joke, Sony will do that. I don't know what to expect from PS5's case but based on PS3 and PS4, I'm expecting hideousness.
 
Sony says the clock deltas will be marginal and that most of the time both the CPU and GPU will run at the advertised max clocks. If we choose to reject their claims, why would we choose to accept that Microsoft isn't lying about their clock speeds too?

Because there is nothing subjective about "all the time", while "most of the time" is open to interpretation. And while I don't like the idea of people treating best-case scenarios as the most common scenario, I don't see a reason to disbelieve Sony and paint a picture that fits one's bias. Sony seems to have set an aggressive power profile for the PS5 that uses variable GPU frequencies to keep power-hungry games from readily shutting down the console. For the vast majority of games, I don't think this will be an issue. The question becomes: what happens with games that push the PS5's GPU to the max? But even that question is probably not worth making a big deal over if the downclock is limited to 2%, or roughly 200 GFLOPS.
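To put that 2% in absolute terms, using the announced 36 CUs at a 2.23GHz max:

```python
max_tf = 36 * 64 * 2 * 2.23 / 1000   # ~10.28 TFLOPs at the max clock
throttled_tf = max_tf * 0.98         # a 2% clock reduction
print(f"{(max_tf - throttled_tf) * 1000:.0f} GFLOPS lost")
# -> ~206 GFLOPS, i.e. about 0.2 TF off a 10.28 TF peak
```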
 
Personally, I think we need to make a sticky of this gen's insiders' names and claims as a reminder of why not to give these guys much credence.

Because, regardless of whether someone got it right, how are we as a forum supposed to pick that person out through all the noise? All of it should be taken with a grain of salt.
 
Next time around, I think the conversation will need to be handled better. These names may have disappeared by then, ousted by the forums that supported them. And if so, there'll be other pretenders in their place. Maybe we'll just ignore baseless rumours completely next gen, because nothing particularly good comes from them? The GitHub leak wasn't baseless, and should have been discussed more in the next-gen predictions thread.
 
Can't you just forget it? Github was it, done. Just move on and stop attacking people so aggressively.
Also, you have no idea about the dynamic clocks yet; DF's Alex has a good view on it.

I wouldn't be so uppity; you were wrong too. In fact, I don't think anyone predicted clocks this high. Once again: when Cerny says it runs at those clocks all the time except for worst-case scenarios, I believe him.

And no, I don't think all AAA games are worst-case scenarios, or at least that's not what Cerny means.
 
So maybe we could just treat those clocks as standard and the lower worst-case scenarios as outliers?
He said that higher frequencies have specific benefits beyond just having more CUs. I wonder what those are.
 
Tear-downs are hardware. There's no reason a tear-down would reveal a dynamic clock.
what? They’ll tear it down but leave the clock in there? ;)

I wouldn't be so uppity; you were wrong too. In fact, I don't think anyone predicted clocks this high. Once again: when Cerny says it runs at those clocks all the time except for worst-case scenarios, I believe him.

And no, I don't think all AAA games are worst-case scenarios, or at least that's not what Cerny means.
Yeah, my understanding is that games that don't require the full GPU power will get bumped down a couple of percent to save around 10% power... but maybe I misunderstood? This would likely mean less demanding games hitting the lower clocks more frequently than AAA titles.
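A rough sketch of why a small clock drop can buy a big power saving: dynamic power scales roughly with frequency times voltage squared, and since voltage has to rise with frequency near the top of the curve, power behaves closer to cubic in clock. This is a crude model, not Cerny's actual curve:

```python
# Crude model: P ∝ f * V^2 with V roughly proportional to f, so P ∝ f^3.
# Real voltage/frequency curves get steeper near the limit, so the
# actual savings from the last few percent of clock can be even larger.
for clock_drop in (0.02, 0.03, 0.05):
    relative_power = (1 - clock_drop) ** 3
    print(f"{clock_drop:.0%} clock drop -> ~{1 - relative_power:.0%} power saved")
# -> 2% -> ~6%, 3% -> ~9%, 5% -> ~14% under this model
```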
 
I wouldn't be so uppity; you were wrong too. In fact, I don't think anyone predicted clocks this high. Once again: when Cerny says it runs at those clocks all the time except for worst-case scenarios, I believe him.

I think everyone doubted the vaunted 2GHz rumour based on what was known about AMD's architecture, with a dollop of conventional wisdom. Nobody could have foreseen Sony employing a completely different model of clocks/performance and power utilisation.
 
Next time around, I think the conversation will need to be handled better. These names may have disappeared by then, ousted by the forums that supported them. And if so, there'll be other pretenders in their place. Maybe we'll just ignore baseless rumours completely next gen, because nothing particularly good comes from them? The GitHub leak wasn't baseless, and should have been discussed more in the next-gen predictions thread.

Overall I think the Baseless Rumors thread was good. It was ridiculous and some people were far too invested, but it isolated stuff that would have otherwise leaked into conversations all over the forum (even more than the few times it did anyway).

As long as it didn't leak outside or become overly toxic, I think the thread was a great idea for "silly season."

And for myself, it was amusing seeing people get so invested in what "insiders" were saying. Hope springs eternal as they say. In the end, it was all moot and both consoles will be capable machines trying new ways to enhance the game play experience.

Regards,
SB
 
I think everyone doubted the vaunted 2GHz rumour based on what was known about AMD's architecture, with a dollop of conventional wisdom. Nobody could have foreseen Sony employing a completely different model of clocks/performance and power utilisation.

Yup, I didn't think 2GHz was possible, but then RDNA 2 brought the 50% performance-per-watt improvement, which made it possible. Maybe the shape of the devkit was, or is, a clue that this was the plan all along: that they were going all-in on cooling because they were going narrow and damn fast.
 