How much of a bottleneck is Jaguar in current gen consoles?

Since 2013 most of us have been hearing that Jaguar is underpowered and that it's the reason developers target 30 fps.
Recent Pro patches showed only a slight increase in uncapped frame rates, which seemed tied to the roughly 30% CPU clock-speed increase, even though GPU power more than doubled.

If Jaguar was already a bottleneck for the PS4 and Xbox One, how big a bottleneck is it going to be for the PS4 Pro, and possibly Xbox Scorpio?

I understand the purpose of both the PS4 Pro and Xbox Scorpio is merely to run current-generation games at better image-quality settings and with more stable frame rates. But still, will Jaguar be able to keep up once the PS5 arrives?

Is Jaguar even able to drive a 4.2 TF GPU, let alone a 6 TF GPU?
IMO it really seems, I dunno... unbalanced?

edit: here is the video that got me thinking
 
Sony and MS made the right call. You weren't going to get 200 watt consoles again, with all the issues they had, not to mention the losses incurred. Corners had to be cut somewhere.

Between GPU and CPU, the GPU is more important in games today.

It seems to me that the biggest blunder of the current consoles is the obscene amount of memory reserved by the operating systems.

I mean there's definitely a CPU bottleneck, but nothing that stops developers from achieving a locked frame rate if it's one of their top priorities.


Since the PS4 Pro's extra power is largely going towards higher resolution, Jaguar won't be a bigger issue than it is for the PS4. As for Scorpio, we don't know for sure whether it uses Jaguar or something else.
 
Depending on the developer and the way the games are designed, CPU tasks can be offloaded to the GPU. A majority of CPU time goes towards the render block, and the more complex the scene, the more the CPU is loaded up.

We do have functions in place to move draw calls over to the GPU; I'm pretty sure a lot of work is being done in that space by the API holders, vendors and developers.
If we're successful in moving draw calls over to the GPU, then Jaguar has plenty of CPU left for other tasks.
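To make that concrete, here's a minimal, purely illustrative C++ sketch of the pattern: the CPU registers the scene once, a culling pass (a GPU compute shader in practice, written here as a plain loop) fills a buffer of indirect draw commands, and the GPU then consumes the whole buffer in a single submission. The structs, object counts and cull distance are all made up for illustration and aren't tied to any particular console API.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Layout of an indirect draw command (mirrors the common
// "draw indexed indirect" argument structure in modern graphics APIs).
struct DrawIndexedIndirectCommand {
    uint32_t indexCount;
    uint32_t instanceCount;
    uint32_t firstIndex;
    int32_t  vertexOffset;
    uint32_t firstInstance;
};

// Minimal per-object data the culling pass needs.
struct ObjectRecord {
    float    boundingRadius;
    float    distanceToCamera;
    uint32_t indexCount;
    uint32_t firstIndex;
};

int main() {
    // Hypothetical scene: 10,000 objects registered once by the CPU.
    std::vector<ObjectRecord> scene(10'000);
    for (size_t i = 0; i < scene.size(); ++i) {
        scene[i] = {1.0f, float(i % 500), 36, uint32_t(i) * 36};
    }

    // This loop stands in for a GPU compute shader: it culls objects and
    // appends one indirect command per visible object. The CPU never
    // touches individual draw calls.
    std::vector<DrawIndexedIndirectCommand> indirectBuffer;
    const float cullDistance = 250.0f;
    for (const ObjectRecord& obj : scene) {
        if (obj.distanceToCamera - obj.boundingRadius < cullDistance) {
            indirectBuffer.push_back({obj.indexCount, 1, obj.firstIndex, 0, 0});
        }
    }

    // In a real engine the GPU would now execute the whole buffer with one
    // multi-draw-indirect style submission; here we just report the result.
    std::cout << "CPU submissions: 1 (indirect buffer of "
              << indirectBuffer.size() << " draws, culled from "
              << scene.size() << " objects)\n";
}
```

The point is that the CPU's submission cost stops scaling with object count and scales with the number of buffers it kicks off instead.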
 
Sony and MS made the right call. You weren't going to get 200 watt consoles again, with all the issues they had, not to mention the losses incurred. Corners had to be cut somewhere.

Fully agree! According to AMD's roadmap, it would take until 2018 for APUs with more powerful CPU cores to become available, so today, and certainly back in 2013, Jaguar was the best choice. But if Jaguar is holding back the Polaris GPU in the PS4 Pro, then I believe a 460 equivalent (in terms of CU count) could have achieved similar results, at a lower cost of course, leading to faster market adoption.
 
The bottleneck of an 8-core Jaguar CPU is all relative if console game publishers also target the Nintendo Switch with its Tegra X1.

Not when driving 4+ TF of power.

Let me put it this way: would you rather have a system with a Titan X and an AMD Turion X2, or a system with a 980 Ti and an i5-7600K?
Would you say the Titan X in the Turion X2 system is bottlenecked by the CPU, or would you say it's all relative?
 
I'm sure every developer would prefer a quad-core i7 in place of the 8-core Jaguar, but it is what it is. It's much harder to market the advances made possible by a high-end CPU, whereas a GPU can render the same scene at higher resolutions with higher-fidelity settings. Luckily the bar was set with Jaguar on both Xbox One and PS4, and that had to make developers' lives a little easier; not because of the performance, but because they had a consistent baseline spec to target.

All games are different as well. I would bet most games are still GPU-limited. Console APIs cut down draw-call overhead significantly compared to PC, so a lot of the operations that hog performance on PC are handled much more efficiently on consoles, although Vulkan seems to be closing the gap considerably.

Products always have to make compromises. It's what AMD had to offer in the form of an APU at the price point PS4/X1 were targeting. Going to a quad-core i5 or i7 would have increased cost significantly, seeing as the CPU and GPU would then have to be discrete.
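To put rough numbers on why draw-call overhead matters so much here, below is a toy C++ calculation comparing a thick, driver-heavy submission path against a thin console/Vulkan-style one. The draw count and per-draw costs are assumptions picked for illustration, not measurements of any real API.

```cpp
#include <cstdio>

int main() {
    const int    drawsPerFrame         = 2'000;  // hypothetical scene
    const double frameBudgetMs         = 16.7;   // 60 fps target
    const double highOverheadUsPerDraw = 10.0;   // assumed "thick API" cost per draw
    const double lowOverheadUsPerDraw  = 1.0;    // assumed "thin API" cost per draw

    // CPU time spent purely on submitting draws each frame.
    const double highMs = drawsPerFrame * highOverheadUsPerDraw / 1000.0;
    const double lowMs  = drawsPerFrame * lowOverheadUsPerDraw / 1000.0;

    std::printf("High-overhead path: %.1f ms of a %.1f ms frame (%.0f%%)\n",
                highMs, frameBudgetMs, 100.0 * highMs / frameBudgetMs);
    std::printf("Low-overhead path:  %.1f ms of a %.1f ms frame (%.0f%%)\n",
                lowMs, frameBudgetMs, 100.0 * lowMs / frameBudgetMs);
}
```

Even with made-up figures the shape is clear: cutting per-draw overhead buys back milliseconds that a modest CPU clock bump alone never could.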
 
a Titan X and an AMD Turion X2, or a system with a 980 Ti and an i5-7600K?

That's a loaded comparison because you picked a very good GPU for the second system but a very, very poor CPU for the first. So you went great GPU + awful CPU vs. very good GPU + good CPU. Of course the second system is preferable.

An equally loaded analogy would be me asking something like "which would you prefer, an i5-6600K + 1080 Ti or an i7-6950X + GeForce 460?"
 
But your example was still pretty biased. You can create a PC system with a GPU or CPU imbalance easily enough. It doesn't prove a lot related to this topic.

I am not sure how to judge Jaguar against an i5, considering desktop consumer CPUs basically spend huge transistor budgets on making bad, generic code run as well as possible. That's not really applicable in consoles (in theory).

Also, you have eight Jaguar cores, which is at least something compared to your two-core Turion. That's still double the cores of an average desktop today; neglecting overhead, each core could be half as performant and the totals would be equal. Not ideal, but a positive.

Well, I certainly hope for 2.3 GHz or something on Scorpio if it's Jaguar. At least it's better than the paltry 1.6/1.75. We know the Pro is at 2.1, so it'll at least be that; a nice 30-45% per-core boost over the launch clocks. MHz improvements are always the best if you can get 'em: no parallelism overhead.
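For a rough feel of how eight slower cores stack up against fewer faster ones once you stop neglecting the serial part of a frame, here's a toy Amdahl's-law calculation in C++. The 80% parallel fraction, the 0.5x per-core speed for Jaguar and the clock ratio are all assumptions for illustration, not measured numbers.

```cpp
#include <cstdio>

// Toy Amdahl's-law model: effective throughput of a chip given core count,
// relative per-core speed, and the fraction of the workload that parallelizes.
double effectiveThroughput(int cores, double perCoreSpeed, double parallelFraction) {
    const double serialTime   = (1.0 - parallelFraction) / perCoreSpeed;
    const double parallelTime = parallelFraction / (perCoreSpeed * cores);
    return 1.0 / (serialTime + parallelTime);  // higher is better
}

int main() {
    const double parallelFraction = 0.8;  // assumed: 80% of frame work scales across cores

    // Assumed relative per-core speeds (desktop core = 1.0, Jaguar core = 0.5).
    std::printf("8 Jaguar-class cores  : %.2f\n", effectiveThroughput(8, 0.5, parallelFraction));
    std::printf("4 desktop-class cores : %.2f\n", effectiveThroughput(4, 1.0, parallelFraction));
    std::printf("2 desktop-class cores : %.2f\n", effectiveThroughput(2, 1.0, parallelFraction));

    // Same 8 cores with a clock bump from 1.6 GHz to 2.3 GHz (~1.44x per core).
    std::printf("8 Jaguar cores @2.3GHz: %.2f\n",
                effectiveThroughput(8, 0.5 * 2.3 / 1.6, parallelFraction));
}
```

With a serial chunk in the frame, eight half-speed cores land closer to two fast cores than to four, which is exactly why those per-core clock bumps are worth chasing.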
 
I don't think Jaguar is a major bottleneck in most games that only push for 30FPS. For games that push for 60FPS, a faster CPU would probably be much more useful.
 
Wouldn't it only be a bottleneck if it either leads to CPU-limited games or prevents new gameplay?
 
Sadly, the PS4 Pro and possibly even Scorpio (depending on whether it can have exclusive titles) aren't made to produce better gameplay, enhanced animation, or the other things CPUs can contribute to. They are simply made to let PS4 and Xbox One games get closer to 4K resolution.
 
But your example was still pretty biased. You can create a PC system with a GPU or CPU imbalance easily enough

I think that is his point though. If the PS4/X1 were PCs, they would be very imbalanced, heavily favoring GPU performance. In the PC world there is no question a Jaguar CPU would be a bottleneck, but in consoles not so much, thanks to the APIs reducing the workload on the CPU compared to PC.
 
Well, I certainly hope for 2.3 GHz or something on Scorpio if it's Jaguar.


Your wish came true! A 2.3 GHz Jaguar CPU :)
 
Perhaps the harsh reality is that, in a world of established GPU compute efforts, the Jaguar cores really aren't a bottleneck to the iGPUs in these SoCs, to the point where an 8-core Jaguar at 2.3 GHz can drive a 6 TFLOP GPU without holding it back.
 
Perhaps the harsh reality is that, in a world of established GPU compute efforts, the Jaguar cores really aren't a bottleneck to the iGPUs in these SoCs, to the point where an 8-core Jaguar at 2.3 GHz can drive a 6 TFLOP GPU without holding it back.

That raises an interesting question for next-gen machines:
Will they go with a 4c/8t Ryzen CPU or an 8c/16t one?
 
CPU choice will end up being about bang for buck in chip area and power consumption, since the graphics side will need the bigger focus for consumers. Devs will ultimately be forced to tailor their engines to the given CPU budget so that bottlenecks are shifted elsewhere, i.e. to where graphics are easier to scale.
 