I don’t believe we have. Just talking through the data points available. What specs seem glaringly suspect that we should investigate?
Why are people buying into what fehu posted? Parts of the spec are glaringly suspect.
Reasons could be to improve CPU yields, or that they are power- and heat-constrained and will design their cooling and power circuitry around those limits.
They want to avoid jet-engine noise this time.
If that's true, then cutting CPU clock rates seems silly while at the same time running RDNA on 7nm @2GHz (?).
Makes more sense to run the CPU @~3GHz and the GPU @~1.6-1.7GHz. That's where the sweet spot is.
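A minimal sketch of the usual clock-vs-power tradeoff behind that "sweet spot" intuition (Python; the V-f knee and voltage parameters below are invented for illustration, not measured silicon data):

```python
# Back-of-envelope: why mid-range clocks win on perf-per-watt.
# Assumes dynamic power P ~ f * V^2 and that voltage must climb roughly
# linearly with frequency above some efficiency knee. The knee and
# voltage numbers are illustrative guesses, not real silicon data.

def relative_power(f_ghz, knee_ghz=1.6, v_min=0.80, v_slope=0.15):
    """Relative dynamic power at clock f_ghz (arbitrary units)."""
    v = v_min + v_slope * max(0.0, f_ghz - knee_ghz)  # crude V-f curve
    return f_ghz * v * v

for f in (1.6, 1.8, 2.0, 2.2, 3.0, 3.2):
    p = relative_power(f)
    print(f"{f:.1f} GHz: power {p:5.2f}, perf/watt {f / p:.2f}")
```

Perf/watt is flat below the knee and falls off steeply once voltage has to rise with clock, which is why a wide GPU run slow tends to beat a narrow GPU run fast.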
In the case of Microsoft, I don't think 8 Zen 2 cores at 2.2GHz are 4x faster than the Jaguar cores in the Xbox One X.
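A quick back-of-envelope check on that claim (Python; the One X's 8 Jaguar cores at 2.3GHz is the public spec, but the IPC ratio and SMT uplift are ballpark guesses, not benchmark numbers):

```python
# Rough throughput check on the "4x faster than Jaguar" claim.
# IPC ratio and SMT uplift are ballpark assumptions, not benchmarks.

jaguar_cores, jaguar_ghz = 8, 2.3   # Xbox One X CPU (public spec)
zen2_cores, zen2_ghz = 8, 2.2       # rumoured spec under discussion

ipc_ratio = 2.0     # assumed Zen 2 vs Jaguar per-clock uplift (guess)
smt_uplift = 1.25   # assumed throughput gain from SMT (guess)

speedup = (zen2_cores * zen2_ghz * ipc_ratio * smt_uplift) \
          / (jaguar_cores * jaguar_ghz)
print(f"Estimated multithreaded speedup: {speedup:.1f}x")  # ~2.4x, short of 4x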
From my perspective, working just on compute, it looks very different.
Compared to NV GPUs in some areas maybe, but compared to what was available, the PS4 was kinda low end. The 7970 GHz Edition was out for a year by the PS4 launch. NV GPUs weren't that bad either, mostly outperforming AMD in most games. This time around we're getting a real CPU, and a more advanced GPU for the time (I think).
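For a rough sense of the "low end" point: peak FP32 is just shader lanes × 2 ops (FMA) × clock. Using the public specs (PS4: 18 CUs at 800MHz; 7970 GHz Edition: 2048 lanes at its 1050MHz boost clock):

```python
# Peak FP32 = shader lanes * 2 ops (FMA) * clock. Public specs.
def tflops(lanes, ghz):
    return lanes * 2 * ghz / 1000

print(f"PS4 (18 CUs @ 800MHz):   {tflops(18 * 64, 0.80):.2f} TF")
print(f"HD 7970 GHz Edition:     {tflops(2048, 1.05):.2f} TF")
```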
Good post. Sorry my cache didn’t refresh for some reason. I was an odd 15 messages behind current.
With RT being a task of terrible random memory access, this could explain the insane 800GB/s bandwidth eventually. That's the interesting question for those given specs. (The 15 TF leaks really sounded too good.)
So obviously fake. 384 bit memory system to feed a 7TF GPU? GTFO!
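The bus arithmetic is easy to sanity-check: bandwidth = bus width × per-pin data rate ÷ 8. A sketch with common GDDR6 speed grades, plus the GB/s-per-TF ratios of shipped consoles for comparison:

```python
# Bandwidth = bus width (bits) * per-pin data rate (Gbit/s) / 8 bits per byte.
def bandwidth_gbps(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

for rate in (14, 16, 18):   # common GDDR6 speed grades
    bw = bandwidth_gbps(384, rate)
    print(f"384-bit @ {rate} Gbps: {bw:.0f} GB/s -> "
          f"{bw / 7:.0f} GB/s per TF at 7 TF")

# Reference ratios from shipped consoles (public specs):
print(f"PS4:        {176 / 1.84:.0f} GB/s per TF")
print(f"Xbox One X: {326 / 6.0:.0f} GB/s per TF")
```

At 7 TF, a 384-bit bus lands around double the One X's bytes-per-flop ratio, which is why it reads as either fake or RT-motivated.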
Cheers
GPUs have redundant CUs to shut off to improve yields. I don't believe this is the case with the CPU.
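A toy yield model shows why those spare CUs matter so much (the per-unit defect probability below is a made-up illustrative number, and defects are assumed to land independently):

```python
# Toy yield model: probability a die is usable when you can disable
# a few defective CUs, vs. a CPU block where every core must work.
# The per-unit defect probability is a made-up illustrative number.
from math import comb

def yield_with_spares(units, spares, p_defect):
    """P(at most `spares` of `units` are defective), defects independent."""
    ok = 1 - p_defect
    return sum(comb(units, k) * p_defect**k * ok**(units - k)
               for k in range(spares + 1))

p = 0.02  # assumed chance any one CU/core is defective
print(f"GPU, 40 CUs, 4 spares:   {yield_with_spares(40, 4, p):.1%}")
print(f"GPU, 40 CUs, 0 spares:   {yield_with_spares(40, 0, p):.1%}")
print(f"CPU,  8 cores, 0 spares: {yield_with_spares(8, 0, p):.1%}")
```

Under those made-up numbers, four spare CUs take a ~45% die yield to ~99.9%, while an 8-core CPU block with no spares still clears ~85%, which is why CPUs usually go without redundancy.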
It's talking through a speculation. I was actually in the process of moving this to the 'baseless rumour' thread, but it's taking too long to separate out the past pages of discussion!
I'm wondering that myself.
The 2.2GHz CPU clocks are especially glaring. Zen 2 at 3.2GHz is massively efficient. Why would Sony want to lose almost a third of potential CPU performance just to save what, 8 watts?
Because it is NOT needed...
This PS5 needs to not leave the PS4 behind, so a 3.2GHz CPU is way too much... better to save 8 watts (only 8 watts????).
Also because of bandwidth... too strong a CPU cuts into the GPU's bandwidth... the system needs balance. Please answer: why on the One X do we still have Jaguars?
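To put rough numbers on the "8 watts" question, using the usual dynamic-power scaling P ∝ f·V² (everything here is a guess: the baseline wattage and both voltages are illustrative):

```python
# Back-of-envelope on the "8 watts" question: dynamic power ~ f * V^2.
# The baseline wattage and both voltages are illustrative guesses.
p_hi = 35.0             # assumed 8-core cluster power at 3.2 GHz (W)
f_hi, v_hi = 3.2, 1.05  # assumed voltage needed at 3.2 GHz
f_lo, v_lo = 2.2, 0.85  # assumed voltage headroom at the lower clock

p_lo = p_hi * (f_lo / f_hi) * (v_lo / v_hi) ** 2
print(f"~{p_lo:.0f} W at 2.2 GHz, saving ~{p_hi - p_lo:.0f} W")
```

Under those assumptions the saving looks closer to ~20W than 8W, though real silicon curves could land anywhere nearby.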
Remember, RT seems to be bandwidth-bottlenecked in Nvidia's solution. Maybe it will throw a wrench into the TF/BW ratios we are used to.
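For a feel of why incoherent access patterns hurt effective bandwidth so badly, here's a tiny host-side sketch (NumPy, measuring CPU/DRAM rather than a GPU, so only the shape of the effect carries over):

```python
# Tiny illustration of why random access wrecks effective bandwidth:
# gather the same elements sequentially vs. in shuffled order.
# Measures host CPU/DRAM, not a GPU, but the cache effect is the same idea.
import time
import numpy as np

data = np.arange(32_000_000, dtype=np.float32)  # ~128 MB, bigger than caches
seq_idx = np.arange(data.size)
rnd_idx = np.random.permutation(data.size)

for name, idx in (("sequential", seq_idx), ("random", rnd_idx)):
    t0 = time.perf_counter()
    total = data[idx].sum()
    dt = time.perf_counter() - t0
    print(f"{name:>10}: {dt:.3f} s "
          f"({data.nbytes / dt / 1e9:.1f} GB/s effective)")
```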