What's your view of Next-Gen Rumor Religions? +Poll! *excommunicado thread*

Which is the next-gen true faith (TF)?

  • Orthodox Kleeism - PS5 is pretty much the same as XBSX at 12 TF
    Votes: 0 (0.0%)
  • Total voters: 48
  • Poll closed.
I wouldn't be so uppity; you were wrong too. In fact, I don't think anyone predicted clocks that high. Once again, when Cerny says it runs at those clocks all the time except in worst-case scenarios, I believe him.

And no, I don't think all AAA games are worst-case scenarios, or at least that's not what Cerny means.

No, not wrong, it was GitHub all along. Only the clock speeds were adjusted, which was a given since the leak was from mid 2019. I was aware of the clock almost two months ago.

About the dynamic clocks, we will have to see. 10 TF or 9.5, it won't matter, for example. Just this: why bother complicating things, if they could have clocked it somewhat lower, or why even have dynamic clocks at all, if this extreme case basically never happens? Even PS4/Xbox have extreme cases.

Just seems there is more to it.
 
I remember you saying 9, potentially 8 TFLOPS, so wrong. I'm not having a go at you though, because I thought the same thing, since I assumed 2 GHz was not possible in a console.

The sentiment was about whether GitHub was it. When the leaks appeared, I thought they would downclock that thing. Not long ago I heard that GitHub was the real thing, and that they upclocked it rather insanely high.
Going from a mid-2019 devkit, it's almost impossible to know the exact TF number to the decimal; the clocks were mostly going to be adjusted as final tuning.

But it doesn't matter anymore now, and yes, I voted for the wrong answer ;)
 
Cerny said the vast majority of the time the GPU clock will be at or close to 2.23 GHz.

So we don't know until we get numbers from devs in the real world. Frankly, a 5-10% drop isn't going to make a difference in on-screen visuals.
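To put that 5-10% in perspective, here's the standard peak-FLOPS arithmetic for an RDNA-class GPU (36 CUs and 2.23 GHz are the reported PS5 figures; the 5% drop is a hypothetical for illustration):

```python
def teraflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPS.

    Each RDNA CU has 64 FP32 lanes, and FMA counts as 2 ops,
    giving 128 FLOPs per CU per cycle.
    """
    return cus * 128 * clock_ghz * 1e9 / 1e12

peak = teraflops(36, 2.23)            # ~10.28 TF at full clock
dropped = teraflops(36, 2.23 * 0.95)  # ~9.76 TF with a hypothetical 5% downclock
```

So even the worst of the commonly floated downclock scenarios keeps the machine comfortably in the high-9-TF range.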
 
I expect most titles to be heavy on the GPU side and lighter on the CPU side.
At this point Sony only has to worry about price points and about keeping the box out of any sort of RMA process.
 
Did Cerny mention what the CPU clocks will be when the GPU is running at 2.23?
If 2.23 GHz is the number we're going with, what will the CPU be as a result?
GPUs normally eat more power than CPUs, so I'm starting to understand why he would say that feeding the CPU costs only a small fraction of GPU frequency.
So what will the CPU clock have to be, then, to keep the GPU at its max clock rate?
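The "only a small fraction of GPU frequency" claim makes sense if dynamic power scales roughly with the cube of clock speed: a small clock drop frees a disproportionately large chunk of the power budget for the CPU. Here's a toy fixed-budget model sketching that idea; all the wattage numbers are made up for illustration and this is not Sony's actual power-management algorithm:

```python
TOTAL_BUDGET_W = 200.0  # hypothetical total APU power budget
GPU_MAX_GHZ = 2.23      # reported PS5 GPU cap
GPU_MAX_POWER_W = 180.0 # hypothetical GPU power draw at max clock

def gpu_clock(cpu_power_w: float) -> float:
    """GPU clock after ceding part of a shared power budget to the CPU.

    Crudely assumes dynamic power ~ frequency**3, so frequency
    scales with the cube root of the available power.
    """
    gpu_power = min(TOTAL_BUDGET_W - cpu_power_w, GPU_MAX_POWER_W)
    return GPU_MAX_GHZ * (gpu_power / GPU_MAX_POWER_W) ** (1 / 3)

# With the CPU taking 20 W the GPU stays at 2.23 GHz; even at 40 W
# the GPU only drops a few percent, because ~11% less power costs
# only ~4% of frequency under the cube-law assumption.
light_cpu = gpu_clock(20.0)
heavy_cpu = gpu_clock(40.0)
```

Under that kind of curve, the CPU can claim a meaningful slice of the budget while the GPU gives up only a small fraction of its clock, which is essentially the trade-off Cerny described.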
 

I think the benchmarks for Gonzalo and Flute have some relevance.

For Flute, they indicate a 1.6 GHz base clock for the CPU and a 1.2 GHz base clock for the GPU. I'm guessing those are related to the variable clock rates. Just a hunch.
 