What is PS4's 14+4 CU thing all about? *spawn

But the bolded part is still what's confusing and, I guess, what I wonder about. What exactly is the issue (or issues)?

It doesn't need to be an issue; perhaps each of the CUs comes with everything it needs to be fully exploitable, and it's just a matter of workload.

For instance, let's say that for a particular game all the ALU work accounts for only 10% of the frametime. You want to increase the framerate, so you double the CU count. Even assuming perfectly linear scaling, you are only going to get a marginal increase in framerate. But what if it wasn't just a single game; what if you found yourself in this exact same scenario in almost every game you run? You probably wouldn't put more CUs in there, because you'd know that, for the games you are trying to run, you'd gain more performance by beefing up another area of the GPU.
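
A quick back-of-the-envelope sketch of that scenario (the 10% ALU share and 30 fps baseline are just the hypothetical numbers from above, not measurements from any real game):

```python
# Rough model: framerate gain from doubling CU count when ALU work
# is only a fraction of the frame time (basically Amdahl's law).
# The 10% ALU share and 30 fps baseline are purely hypothetical.

baseline_fps = 30.0
frame_time_ms = 1000.0 / baseline_fps   # ~33.3 ms per frame
alu_fraction = 0.10                     # ALU work = 10% of the frame
cu_scale = 2.0                          # double the CU count

alu_ms = frame_time_ms * alu_fraction   # ~3.3 ms of ALU work
other_ms = frame_time_ms - alu_ms       # ~30.0 ms of everything else

# Assume the ALU portion scales perfectly linearly; nothing else changes.
new_frame_time_ms = other_ms + alu_ms / cu_scale
new_fps = 1000.0 / new_frame_time_ms

print(f"old: {baseline_fps:.1f} fps, new: {new_fps:.1f} fps")
# -> old: 30.0 fps, new: 31.6 fps  (roughly a 5% gain for 2x the CUs)
```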

Now think of everything else a game has to do during a frame... You have vertex transforms, the setup engine, texture fetches, ROP output, ALU operations... I don't know exactly how GPUs work, but I'm going to assume they are not fully utilizing all of their components all the time, because that would be an unrealistic workload (unless GPUs have super-giant caches capable of storing data for several frames at once). So if any one of those stages is much faster than the rest, you still wouldn't get much performance back, because something else would still be holding your framerate back... So Sony first found the right balance between everything, then threw a few more CUs in there and increased the ACE count so they could run compute on the GPU.
 
The PS4's 32 ROPs alone will ensure a much smoother frame-rate in most 3rd party games, especially at 1080p. So regardless of how things are sliced & diced, the ROP deficiency on the XB1 can't be overcome by any magic math.

I'm happy with 1080p/30fps for the first generation PS4 games...

The PS4's 32 ROPs alone don't do anything. They are at the tail end of a long series of elements that have just as much potential to cause the frame-rate to nosedive regardless of the quantity or capability of that one group of functional units. All you can really say is that the ROPs won't be a bottleneck, which isn't the same thing.
 
The PS4's 32 ROPs alone don't do anything. They are at the tail end of a long series of elements that have just as much potential to cause the frame-rate to nosedive regardless of the quantity or capability of that one group of functional units. All you can really say is that the ROPs won't be a bottleneck, which isn't the same thing.

Ok, how about this... the PS4 will be "less likely" to have frame-rate issues across many 3rd party games compared to the X1, especially if such a game is 1080p native.

By the way....

http://www.eurogamer.net/articles/d...unlock-more-gpu-power-for-xbox-one-developers

ROPs are the elements of the GPU that physically write the final image from pixel, vector and texel information: PlayStation 4's 32 ROPs are generally acknowledged as overkill for a 1080p resolution (the underlying architecture from AMD was never designed exclusively just for full HD but for other resolutions such as 2560x1440/2560x1600 too), while Xbox One's 16 ROPs could theoretically be overwhelmed by developers.

Obviously though, it stands to reason that having more ROPs on call is the preferable scenario, even if they remain largely unused - and that's what PlayStation 4 offers. Microsoft's pitch is that its hardware set-up wouldn't necessarily be able to make use of them even if they were there.

Our take on the ROPs situation is that while these figures make perfect sense, there are many other scenarios that could be potentially challenging - depth-only passes, shadows, alpha test and Z pre-pass for example. But from a user perspective, the fact is that native 1080p isn't supported on key first-party titles like Ryse and Killer Instinct. Assuming this isn't a pixel fill-rate issue as Microsoft suggests, surely at the very least, this impacts the balanced system argument?
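
For what it's worth, the raw numbers behind the "overkill" claim are easy to sanity-check. A rough sketch (using the publicly reported clocks of roughly 800 MHz for the PS4 and 853 MHz for the Xbox One, and the obviously simplified assumption of one colour write per pixel per pass):

```python
# Back-of-the-envelope pixel fill rate vs. what a 1080p60 frame needs.
# ROP counts and clocks are the publicly reported figures; the model
# ignores blending cost, depth-only passes, bandwidth limits, etc.

def fill_rate_gpix(rops, clock_ghz):
    return rops * clock_ghz              # one pixel per ROP per clock

ps4_fill = fill_rate_gpix(32, 0.800)     # ~25.6 Gpixels/s
xb1_fill = fill_rate_gpix(16, 0.853)     # ~13.6 Gpixels/s

pixels_1080p = 1920 * 1080
needed = pixels_1080p * 60 / 1e9         # ~0.12 Gpixels/s for one write/pixel at 60 fps

print(f"PS4: ~{ps4_fill / needed:.0f} full-screen writes per 1080p60 frame")
print(f"XB1: ~{xb1_fill / needed:.0f} full-screen writes per 1080p60 frame")
# -> roughly 206 vs. 110 on paper, which is why the pressure only shows
#    up in overdraw-heavy cases (alpha, shadows, Z pre-pass and so on).
```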
 
Ok, how about this... the PS4 will be "less likely" to have frame-rate issues across many 3rd party games compared to the X1, especially if such a game is 1080p native.

By the way....

http://www.eurogamer.net/articles/d...unlock-more-gpu-power-for-xbox-one-developers

I think 1080p30 is a frame-rate issue, but then, that's just me.
Based on some general illogical argument, I could probably also say it's gonna stay this way, and probably get worse, for this generation of PS4 titles. ;-)
 
It doesn't need to be an issue; perhaps each of the CUs comes with everything it needs to be fully exploitable, and it's just a matter of workload.

For instance, let's say that for a particular game all the ALU work accounts for only 10% of the frametime. You want to increase the framerate, so you double the CU count. Even assuming perfectly linear scaling, you are only going to get a marginal increase in framerate. But what if it wasn't just a single game; what if you found yourself in this exact same scenario in almost every game you run? You probably wouldn't put more CUs in there, because you'd know that, for the games you are trying to run, you'd gain more performance by beefing up another area of the GPU.

Now think of everything else a game has to do during a frame... You have vertex transforms, the setup engine, texture fetches, ROP output, ALU operations... I don't know exactly how GPUs work, but I'm going to assume they are not fully utilizing all of their components all the time, because that would be an unrealistic workload (unless GPUs have super-giant caches capable of storing data for several frames at once). So if any one of those stages is much faster than the rest, you still wouldn't get much performance back, because something else would still be holding your framerate back... So Sony first found the right balance between everything, then threw a few more CUs in there and increased the ACE count so they could run compute on the GPU.

That's a bottom-up, hardware-centric approach to arriving at a "balanced" figure. It tells you some basic relationships and design points about the h/w parts. It may not tell you how high the h/w can shoot as a whole, because...

Games are designed and implemented by human developers (the software!). Their view will be at a higher level. Depending on what they hope to achieve, and on their own expertise and resources, they will engineer their software and content around these basic parameters, but they can indeed still use all 16-18 CUs for graphics (or compute) jobs at any one time, as they see fit.

A scientific study of human biology may conclude that the ideal age for one's first serious romance is 18 years old. But people may not bother with that recommendation, because romance is not biology or psychology. They have different tastes, creativity/imagination, experience, capabilities, etc.

In the early PS3 days, critics kept complaining that the SPUs were not suitable for graphics work. Then MLAA happened, simply because people learned to adapt and exploit the hardware.

We should give the developers a few years to see what they can do on the new hardware.
 
I think 1080p30 is a frame-rate issue, but then, that's just me.
Based on some general illogical argument, I could probably also say it's gonna stay this way, and probably get worse, for this generation of PS4 titles. ;-)

Magic math tends to confuse people, so I understand your situation. ;)
 
Ok, how about this... the PS4 will be "less likely" to have frame-rate issues across many 3rd party games compared to the X1, especially if such a game is 1080p native.

By the way....

http://www.eurogamer.net/articles/d...unlock-more-gpu-power-for-xbox-one-developers

Is your point here that the PS4 would be expected to perform better than the XBOne because it has (mostly) higher-spec hardware in it? Of course it would. What does this have to do with the topic at hand?
 
Is your point here that the PS4 would be expected to perform better than the XBOne because it has (mostly) higher-spec hardware in it? Of course it would. What does this have to do with the topic at hand?

Exactly. It's just people freaking out whenever anyone says anything that sounds like it's not 50% more powerful because of the extra flops (and it never was).

I also like how Cerny's not correct because he's making the system sound less powerful, but when the other guy said it, he was lying because he was making it seem more powerful. /Facepawn.
 
Exactly. It's just people freaking out whenever anyone says anything that sounds like it's not 50% more powerful because of the extra flops (and it never was).

I can guarantee I'm not freaking out over a game system, especially one that isn't using magic math and magic drivers. ;)
 
Exactly. It's just people freaking out whenever anyone says anything that sounds like it's not 50% more powerful because of the extra flops (and it never was).

I also like how Cerny's not correct because he's making the system sound less powerful, but when the other guy said it, he was lying because he was making it seem more powerful. /Facepawn.

Cerny didn't make the system sound less powerful.
Even if a game uses the PS4 GPU in a 14+4 config, it is still just as powerful as a 16+2 one. ^_^
The developers simply use those resources differently (perhaps because their goals are different). The compute load has to come from somewhere. They can change their usage as they see fit.

Who's the other guy ?

EDIT:
The "14+4" note is mentioned in Sony's slides, though Cerny may not be the person who created them. It would be interesting to see if it's still in the 2013 presentation.
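
As a toy illustration of why the split is just bookkeeping (the per-CU figure assumes the usual GCN arithmetic of 64 ALUs x 2 ops per clock at the PS4's reported 800 MHz):

```python
# Toy model: "14+4" vs "16+2" is only a usage split; the total ALU
# throughput of 18 CUs is identical either way.
GFLOPS_PER_CU = 64 * 2 * 0.8             # ~102.4 GFLOPS per CU at 800 MHz

def total_gflops(graphics_cus, compute_cus):
    return (graphics_cus + compute_cus) * GFLOPS_PER_CU

for split in [(14, 4), (16, 2), (18, 0)]:
    print(split, f"-> {total_gflops(*split):.0f} GFLOPS total")
# -> every split sums to ~1843 GFLOPS; the only thing that changes is
#    how much of that ALU time a given game hands to compute jobs.
```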
 
Cerny didn't make the system sound less powerful.
Even if a game uses the PS4 GPU in a 14+4 config, it is still just as powerful as a 16+2 one. ^_^
The developers simply use those resources differently (perhaps because their goals are different). The compute load has to come from somewhere. They can change their usage as they see fit.

Who's the other guy ?

EDIT:
The "14+4" note is mentioned in Sony's slides, though Cerny may not be the person who created them. It would be interesting to see if it's still in the 2013 presentation.

No one in their right mind (provided they are somewhat technical) would think of 14+4 as any sort of physical division. This is just as ludicrous as the stacked dGPU conspiracy.

Having said that, in his July 2013 interview Cerny brushed that off as just a leak (as in, not official words), but he actually confirmed the CU headroom for compute.

Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.
 
No one in their right mind (provided they are somewhat technical) would think of 14+4 as any sort of physical division. This is just as ludicrous as the stacked dGPU conspiracy.

*shrug* Some did. The point is that it's all 18 CUs, just used differently. Cerny can't make the system less powerful just by talking about it.

Having said that, in his July 2013 interview Cerny brushed that off as just a leak (as in, not official words), but he actually confirmed the CU headroom for compute.

He acknowledged that the GPU has extra ALUs for more than pure graphics work.

It doesn't really matter whether it's a leak or not. The developers can use the CUs based on their game's needs.
 
"Copyright 2012." Old news, already discussed when VGLeaks first leaked it and the subsequent ongoing discussion.

Probably should have locked it then. >:3

Locked because... the circle isn't going to be broken here (people making strawmen with hats! strange claims, not reading other people's claims, sarcasm++, dead horse/old topic, etc.), and generally it's not going anywhere!
 