Technical Comparison: Sony PS4 and Microsoft Xbox One

Status
Not open for further replies.
- 410GF Minor boost if used for rendering.
So it's not 1.8TF

There's absolutely zero reason to believe that developers cannot/will not use this for rendering, or that it is in any way "less viable" to use all 18 CUs for rendering.

Guerrilla Games, for instance, have stated that in KZ:SF they're not using compute for much of anything at all. Are we to assume, then, that they're not actually "using all 18 CUs"? That they're not getting 1.84TF of performance?
 
- 410GF Minor boost if used for rendering.
So it's not 1.8TF

Sony says 1.84TF.

Anand says, "Sony has confirmed the actual performance of the PlayStation 4's GPU as 1.84 TFLOPS. Sony claims the GPU features 18 compute units, which if this is GCN based we'd be looking at 1152 SPs and 72 texture units."

But you're welcome to show how 14 CUs would yield 1.84TF as nicely as 18 × 64 × 2 × 800MHz = 1.8432 TFLOPS does.
 
Are we back to this 14+4 stuff again? AFAIK it was settled that there were 18 CUs free to be tasked as needed, with 14+4 being an artifact of early dev kits. Ergo it's 1.8 vs 1.2 TF if both GPUs are tasked solely with graphics jobs.
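As a sanity check, the peak-FLOPS arithmetic being argued over is just CUs × SPs-per-CU × FLOPs-per-cycle × clock. A minimal sketch, assuming the standard GCN figures of 64 shader processors per CU and 2 FLOPs per SP per cycle (via FMA):

```python
def peak_gflops(cus, mhz, sps_per_cu=64, flops_per_sp=2):
    """Peak single-precision GFLOPS for a GCN-style GPU."""
    return cus * sps_per_cu * flops_per_sp * mhz / 1000.0

# 18 CUs at 800 MHz matches Sony's quoted 1.84 TFLOPS figure...
print(peak_gflops(18, 800))  # 1843.2 GFLOPS
# ...while 14 CUs at the same clock falls well short of it.
print(peak_gflops(14, 800))  # 1433.6 GFLOPS
```

So the only way 14 CUs reach 1.84TF is with a clock far above the 800MHz everyone agrees on.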
 
Sony says 1.84TF.

Anand says, "Sony has confirmed the actual performance of the PlayStation 4's GPU as 1.84 TFLOPS. Sony claims the GPU features 18 compute units, which if this is GCN based we'd be looking at 1152 SPs and 72 texture units."

But you're welcome to show how 14 CUs would yield 1.84TF as nicely as 18 × 64 × 2 × 800MHz = 1.8432 TFLOPS does.

Are we back to this 14+4 stuff again? AFAIK it was settled that there were 18 CUs free to be tasked as needed, with 14+4 being an artifact of early dev kits. Ergo it's 1.8 vs 1.2 TF if both GPUs are tasked solely with graphics jobs.

Not to speak for him, but I think his point is different than what you are actually replying to.

For example, could you not interpret the info from VGLeaks as saying that all 18 CUs could be used for rendering, but 4 of them are geared towards compute and don't add much to the system's rendering capabilities if they are used for rendering? That phrase about a minor boost for rendering could be read both ways: as suggesting those 4 offer limited rendering improvements, or as suggesting that if you use them for rendering you get a boost slightly above what you'd normally get with 18 CUs. Understand what I'm saying? It's kinda hard to phrase it... :/

Neither interpretation would run contrary to what Sony has said. Clearly the option to use them for rendering exists, as confirmed by Sony and as was likewise true in the VGLeaks info. Also, regardless of what those 4 CUs are used for, you still count the flops they provide; it's just not necessarily all going towards rendering. Those flops are still operations the GPU is doing, after all. For instance, one interpretation could be:

14 CUs + 4 CUs

...with the 4 CUs geared towards compute, which only offer 'minor boosts' for rendering if used; aka you get 14 CUs + a lil bit more juice when all are used for rendering.

Maybe that is what marcberry is trying to say? If not, then I'm posing that question anyhow. :p
 
Not to speak for him, but I think his point is different than what you are actually replying to.

For example, could you not interpret the info from VGLeaks as saying that all 18 CUs could be used for rendering, but 4 of them are geared towards compute and don't add much to the system's rendering capabilities if they are used for rendering? That phrase about a minor boost for rendering could be read both ways: as suggesting those 4 offer limited rendering improvements, or as suggesting that if you use them for rendering you get a boost slightly above what you'd normally get with 18 CUs. Understand what I'm saying? It's kinda hard to phrase it... :/

Neither interpretation would run contrary to what Sony has said. Clearly the option to use them for rendering exists, as confirmed by Sony and as was likewise true in the VGLeaks info. Also, regardless of what those 4 CUs are used for, you still count the flops they provide; it's just not necessarily all going towards rendering. Those flops are still operations the GPU is doing, after all. For instance, one interpretation could be:

14 CUs + 4 CUs

...with the 4 CUs geared towards compute, which only offer 'minor boosts' for rendering if used; aka you get 14 CUs + a lil bit more juice when all are used for rendering.

Maybe that is what marcberry is trying to say? If not, then I'm posing that question anyhow. :p

Right, but the GPU isn't "fixed function" or whatever. You can use the CUs any way you want, and Sony has optimized the GPU to more efficiently perform compute operations (the ACE units). But there's nothing anywhere that denotes 4 CUs are adjusted/separated for "compute only".
 
The more I look at the 8 ACEs and 64 compute queues, combined with UMA and HSA, the more I think Sony's intention is to turn the GPU into an out-of-order FPU. They may also have done it to make sure that the latencies from GDDR5 aren't an issue; there's nearly always something there to work on.

If they manage to accomplish that, it surely would be a game changer compared to anything else available today. But I'm not smart enough to know if that's what they're trying to accomplish, or whether it's even possible.
 
Also, why is everyone assuming that the total bandwidth figure of 200GB/s+ is automatically just marketing speak? Just because adding in the CPU happens to add up to 200GB/s? If so, it wouldn't make sense for them to say MORE than 200GB/s...that adds up to precisely 200GB/s, no?

Just because DF and Anandtech assume it's the CPU's contribution doesn't actually mean that's a given fact. We DID have that VGLeaks article which claimed specs had been adjusted a month or so ago (same rumor as the XboxMini + BC info). And that bandwidth is clock-dependent, so if they upped the clocks at all it'd point to a higher bandwidth figure.

Like Rangers, I too find it interesting that sebbbi noted the 200GB/s+ figure as a starting point when he could just as easily have qualified his estimates with a sentence like "if you assume the leaks are true and total bandwidth was 170GB/s...". Plus there are rumors of the consoles supposedly running hot and having relatively weak yields, slight delays in shipping beta kits out, and the fact that MS disclosed neither the clocks nor the bandwidth for the eSRAM/GPU/CPU at the reveal or the hardware panel. Seems to me the reason they bragged about the 8-core CPU, custom DX 11.1 AMD GPU, 8GB of RAM, etc. is because those are all qualitatively identical to the PS4 on the surface. But...so are the leaked clocks.

If their agenda was to paint the X1 as identical to the PS4 hardware-wise, they should have likewise trumpeted "an 800MHz GPU with 170GB/s of available bandwidth and a 1.6GHz CPU...", no? Unless, of course, they really did make last-minute changes to the clocks as VGLeaks' source claimed.
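For what it's worth, the arithmetic behind the "precisely 200GB/s" point can be sketched as follows; the bandwidth figures are the leaked VGLeaks numbers, not anything MS has confirmed, and the ~30GB/s remainder attributed to the CPU is DF/Anandtech's inference:

```python
# Bandwidth figures from the VGLeaks Durango spec (leaked, not confirmed by MS):
ESRAM_GBPS = 102.4   # eSRAM at the leaked 800 MHz clock
DDR3_GBPS = 68.3     # DDR3-2133 main RAM

gpu_visible = ESRAM_GBPS + DDR3_GBPS
print(round(gpu_visible, 1))          # 170.7 GB/s in the leak

# Reaching "more than 200 GB/s" from the leaked figures requires counting
# roughly 30 GB/s on top, e.g. a coherent CPU link as DF/Anandtech assume.
print(round(200.0 - gpu_visible, 1))  # 29.3 GB/s gap to the 200 mark
```

Which is the crux of the argument: the leaked figures only clear 200GB/s if something close to 30GB/s of CPU traffic is counted on top.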

Just for fun, how would having say 204.8GB/s of overall bandwidth change things? How does that affect flops, clocks, etc?

It'd be something like this perhaps:

204.8GB/s total to the GPU ("more than 200GB/s of available bandwidth"); does not include any CPU contribution
204.8 - 68.3 = 136.5GB/s for eSRAM
(136.5 / 102.4) * 800MHz = 1.066GHz GPU ==> 1.64 TFLOPS

Assuming coherency between the chips on the SoC...that's a CPU at 2.13GHz.
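Spelling that hypothetical out in code (all inputs speculative: the 204.8GB/s total, the linear clock-for-bandwidth scaling, and the 12-CU count and 800MHz baseline from the leaks):

```python
# All figures hypothetical or leaked, not confirmed by MS.
TOTAL_GBPS = 204.8           # assumed ">200 GB/s" total, CPU excluded
DDR3_GBPS = 68.3             # DDR3-2133 main RAM
LEAKED_ESRAM_GBPS = 102.4    # leaked eSRAM bandwidth at 800 MHz
LEAKED_GPU_MHZ = 800

esram_gbps = TOTAL_GBPS - DDR3_GBPS                          # 136.5 GB/s for eSRAM
gpu_mhz = (esram_gbps / LEAKED_ESRAM_GBPS) * LEAKED_GPU_MHZ  # ~1066 MHz, linear scaling
gflops = 12 * 64 * 2 * gpu_mhz / 1000                        # 12 CUs -> ~1638 GFLOPS
print(round(gpu_mhz), round(gflops))
```

The 2.13GHz CPU figure in the post is the same ~1.33x scaling factor applied to the leaked 1.6GHz clock.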

Before you shout at me and tell me how that CPU clock can't possibly work because of the thermal considerations of Jaguar's spec, I'll note that the X1's CPU isn't actually a Jaguar (even though everyone is reporting it as such by assumption).

:)
 
Right, but the GPU isn't "fixed function" or whatever. You can use the CUs any way you want, and Sony has optimized the GPU to more efficiently perform compute operations (the ACE units). But there's nothing anywhere that denotes 4 CUs are adjusted/separated for "compute only".

Re-read my post. I'm not suggesting they are 'compute only'. I'm wondering aloud if they might be geared towards compute, as in they perform fantastically on compute tasks but not so well on typical rendering tasks.
 
Please not the 14+4 again.

Has it been confirmed that none of the 18 CUs are geared towards compute over rendering specifically? Are they all identical? I seem to recall everyone just assuming that since Sony lumped them all together qualitatively in their PR interviews they must all be identical, but that's not actually a necessary contradiction of the original VGLeaks info.

If they did clarify I'd appreciate a link and/or quote from someone at Sony to clear me up. :smile:

I don't think it's helpful that at this stage everyone is just assuming that, since a handful of things from the VGLeaks Durango specs were correct, every single detail must have been correct as a blanket statement. I'm not sure the 14+4 CU thing was actually addressed adequately, nor am I sure we are in a strong position to just assume the clock speeds of the X1 chips, as we have no evidence supporting such assumptions beyond a VGLeaks article which *may* have got the eSRAM bandwidth wrong. To argue that they didn't get the bandwidth wrong requires the assumption that MS just added in the CPU, but that seems slightly at odds with the wording of 'more than 200GB/s of available bandwidth', as it would only add up to precisely 200GB/s.

Just saying we still don't have all the actual details quite yet. Everyone has given things a first-pass judgment, which is great and all, but refinements to the logic and set of assumptions might be prudent too. Plus...it's fun to go back and forth and speculate on the missing details in both directions!
 
We have insiders that have confirmed nothing changed in the Xbone. It's wishful thinking at this point, at best.

MS announced every spec that was on par with or better than the PS4. They are pushing the cloud as the secret sauce. It's just marketing.....
 
We have insiders that have confirmed nothing changed in the Xbone. It's wishful thinking at this point, at best.

MS announced every spec that was on par with or better than the PS4. They are pushing the cloud as the secret sauce. It's just marketing.....
They did say more than 200GB/s of bandwidth to the various caches, or something like that. Otherwise they left it at 8GB RAM, 8 core CPU and an AMD GPU.
 
We have insiders that have confirmed nothing changed in the Xbone. It's wishful thinking at this point, at best.

It's not wishful thinking at best. Read my post. It's speculation, but it's all based on some sort of evidence. It may yet be wrong, but trying to avoid discussing it simply on the basis of your assumptions isn't particularly helpful for forging interesting discussion either.

MS announced every spec that was on par with or better than the PS4. They are pushing the cloud as the secret sauce. It's just marketing.....

No, they didn't. As I pointed out, they didn't announce either the GPU or CPU clocks, and presumably they would also have been happy to boast about 170GB/s of total bandwidth. You're stretching the truth here in an effort to just dismiss the notion. :rolleyes:
 
They did say more than 200GB/s of bandwidth to the various caches, or something like that. Otherwise they left it at 8GB RAM, 8 core CPU and an AMD GPU.

Photos from Wired also seem to confirm the main RAM is DDR3 @ 2133MHz, for 68.3GB/s of bandwidth. Also, MS did likewise confirm 768 ops/cycle, which means 12 CUs. ;)
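A quick check of that inference, assuming standard GCN (64 SPs per CU, 2 FLOPs per op via FMA) and the leaked-but-unconfirmed 800MHz clock:

```python
OPS_PER_CYCLE = 768   # the figure MS confirmed
SPS_PER_CU = 64       # standard GCN shader processors per CU

cus = OPS_PER_CYCLE // SPS_PER_CU
print(cus)  # 12 CUs

# At the leaked 800 MHz, with 2 FLOPs per op (FMA), that implies:
print(OPS_PER_CYCLE * 2 * 800 / 1000)  # 1228.8 GFLOPS, i.e. the ~1.2 TF figure
```

Which is where the 1.2 TF number quoted earlier in the thread comes from.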
 
We have insider that have confirm nothing change in the Xbone. Its wishful thinking at this point at best.

MS announce every spec that was on pair or better than the PS4. They are pushing the cloud as secret sauce. Its just marketing.....

I think using names like "Xbone" is the sort of behavior I'd expect from NeoGAF, not from Beyond3D.
 