How can a hardware teardown determine what clock speed is dynamically set in game? How will we know (or how can we ascertain) whether the GPU is sitting at 2.2 GHz or 2.05 GHz when we're playing games?
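On the console itself there's no user-facing way to read the live clock, which is exactly the problem; a teardown tells you the silicon, not the DVFS behaviour. On a PC you'd just poll the driver. A minimal sketch, assuming an NVIDIA card with the stock nvidia-smi tool on the PATH:

```python
import subprocess
import time

# Poll the current graphics clock once per second via nvidia-smi.
# Works only on a PC with an NVIDIA GPU; a retail console exposes no
# equivalent interface, so outside observers can't log the clock in game.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(out.stdout.strip())  # e.g. "2205 MHz"
    time.sleep(1)
```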
Sorry, thought the wink gave it away!! Lol
I wouldn't be so uppity; you were wrong too. In fact, I don't think anyone predicted clocks this high. Once again, when Cerny says it runs at those clocks all the time except for worst-case scenarios, I believe him.
And no, I don't think all AAA games are worst-case scenarios, or at least that's not what Cerny means.
You voted the same option as ToTTenTranz, so if he's wrong, you're wrong.
No, not wrong. I remember you saying 9, potentially 8, TFlops, so wrong. I'm not having a go at you though, because I thought the same thing: I thought 2 GHz was not possible in a console.
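For reference, those figures all fall straight out of the RDNA throughput formula: FP32 FLOPS = CUs × 64 shaders × 2 ops per cycle (FMA) × clock. A quick sketch with the PS5's 36 CUs shows why a ~2 GHz ceiling implied roughly 9 TF:

```python
# FP32 throughput for an RDNA GPU:
# CUs x 64 shaders/CU x 2 FLOPs/cycle (fused multiply-add) x clock.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

for clock in (2.23, 2.05, 2.00):
    print(f"36 CUs @ {clock:.2f} GHz -> {tflops(36, clock):.2f} TF")
# 36 CUs @ 2.23 GHz -> 10.28 TF
# 36 CUs @ 2.05 GHz -> 9.45 TF
# 36 CUs @ 2.00 GHz -> 9.22 TF
```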
Ah, the voting. Y'all know what I've been pushing in the baseless thread...
Did Cerny mention what the CPU clocks will be when the GPU is running 2.23?
Cerny said the vast majority of the time the GPU clock will be at or close to 2.23 GHz. So we don't know until we get numbers from devs in the real world. Frankly, a 5-10% drop isn't going to make a difference in on-screen visuals.
Did Cerny mention what the CPU clocks will be when the GPU is running 2.23?
If 2.23 GHz is the number we're going with, what will the CPU clock be as a result?
GPUs normally eat more power than CPUs, so I'm starting to understand why he would say that feeding the CPU only costs a small fraction of the GPU's frequency.
So what will the CPU have to run at, then, for the GPU to hit its max clock rate?
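That lines up with how dynamic power scales: roughly f · V², and since voltage has to climb with frequency near the top of the curve, power falls much faster than the clock does. Here's a rough sketch of that trade-off; the cubic exponent is an assumption for illustration, not a measured PS5 curve:

```python
# Rough model: dynamic power ~ f * V^2, and V scales roughly with f near
# the top of the DVFS curve, so P ~ f^3. The exponent 3 is an assumption;
# real silicon is steeper or shallower depending on the operating point.
def relative_power(freq_drop_pct: float, exponent: float = 3.0) -> float:
    f = 1.0 - freq_drop_pct / 100.0
    return f ** exponent

for drop in (2, 3, 5):
    freed = (1.0 - relative_power(drop)) * 100.0
    print(f"{drop}% GPU clock drop -> ~{freed:.0f}% of the power budget freed")
# 2% GPU clock drop -> ~6% of the power budget freed
# 3% GPU clock drop -> ~9% of the power budget freed
# 5% GPU clock drop -> ~14% of the power budget freed
```

Under that assumption, a couple of percent off the GPU clock frees close to 10% of the power budget for the CPU, which is essentially Cerny's claim in the Road to PS5 talk.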
Yes, max 9 TF, it's in your invisible sig.