New consoles coming with low-clocked AMD x86. Can we now save moneys on our PC CPUs?

Thanks to another thread, here's the first developer insight I've seen about the subject:

We have the exact same kind of Achilles’ heel on the PC too. People who have AMD chips have a disadvantage, because a single core on an AMD chip doesn’t really have as much horsepower and they really require you to kind of spread the load out across multiple cores to be able to take full advantage of the AMD processors.

Our engine sucks at that right now. We are multi-threaded, but the primary gameplay thread is very expensive. The biggest piece of engineering work that they’re doing right now, and it’s an enormous effort, is to go back through the engine and re-optimize it to be really, truly multi-threaded and break the gameplay thread up. That’s a very challenging thing to do because we’re doing a lot of stuff – tracking all these different players, all of their movements, all the projectiles, all the physics they’re doing.

It’s very challenging to split those really closely connected pieces of functionality across multiple threads. So it’s a big engineering task for them to do, but thankfully once they do it, AMD players who’ve been having sub-par performance on the PC will suddenly get a massive boost – just because of being able to take the engine and re-implement it as multi-threaded.

I’m very excited about that because I have a lot of friends, lots of people who are more budget minded, going for AMD processors because nine times out of ten they give a lot of bang for the buck. Where it really breaks down is on games with one really big thread. PlanetSide’s probably a prime example of that.
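
To make the idea concrete, here's a minimal fork/join sketch (purely hypothetical, nothing to do with PlanetSide's actual engine code): take work that used to run on one big gameplay thread and hand independent slices of it to worker threads.

```cpp
// Hypothetical sketch only -- not PlanetSide 2's engine code. It shows the
// general fork/join idea: take work that used to live on one big gameplay
// thread and hand independent slices of it to worker threads.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

struct Projectile { float x, y, vx, vy; };

// Each job updates a disjoint slice of the projectile array, so the workers
// never touch the same data and no locking is needed during the parallel phase.
void update_slice(std::vector<Projectile>& projectiles,
                  std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        projectiles[i].x += projectiles[i].vx * dt;
        projectiles[i].y += projectiles[i].vy * dt;
    }
}

int main() {
    std::vector<Projectile> projectiles(100000, Projectile{0.f, 0.f, 1.f, 1.f});
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;

    // Fork: one async job per worker, each owning its own slice.
    std::vector<std::future<void>> jobs;
    const std::size_t chunk = projectiles.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end   = (w + 1 == workers) ? projectiles.size() : begin + chunk;
        jobs.push_back(std::async(std::launch::async, update_slice,
                                  std::ref(projectiles), begin, end, dt));
    }
    // Join: wait for every slice before the next (serial) part of the frame runs.
    for (auto& job : jobs) job.get();

    std::printf("updated %zu projectiles on %u workers\n", projectiles.size(), workers);
}
```

The hard part the quote is getting at is that real gameplay state isn't neatly sliceable like this: players, projectiles and physics all touch each other, so the data has to be reorganised before a fork/join pattern like this is even possible.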
 
This would be great news for PC hardware in general since 1. it makes AMD more competitive, forcing Intel to up their game, and 2. (kind of following on from 1) if games start making heavy use of 8 threads, then 8-core Intel CPUs may start becoming mainstream. Although that would clearly tip the competitive advantage back in Intel's favour.
 
Do we really need 8-core CPUs, given that most of them run at over 3.2GHz, whereas the ones in the PS4 run at 1.6GHz?

How does Jaguar's IPC compare to Steamroller and Ivy Bridge?
Isn't any modern 4-core @ >3.2GHz always going to be faster than the 8-core Jaguar @ 1.6GHz?
 
We certainly don't need 8-core CPUs to keep up with the console CPUs. I think the IPC of an Ivy Bridge core is at least double that of Jaguar, which, on top of running at twice the clock speed (ignoring turbo), gives an easy 4x performance per core. So even with 4 cores we should have at least twice the overall performance and 4x the single-threaded performance.

But 8 cores would be cool regardless. For example, an 8-core Haswell running at full turbo is going to be pushing close to a TFLOP of pure SIMD performance. When that kind of power is mainstream, the whole question of low-latency GPGPU being a limitation on the PC side becomes moot.
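
Napkin math behind that TFLOP figure (the FLOPs-per-cycle and clock numbers below are assumptions for illustration, not measurements):

```cpp
// Back-of-the-envelope peak single-precision FLOPS. The FLOPs-per-cycle and
// clock figures are assumptions for illustration, not measured numbers.
#include <cstdio>

double peak_gflops(int cores, double ghz, int flops_per_cycle) {
    return cores * ghz * flops_per_cycle;
}

int main() {
    // Jaguar: 128-bit FP units, assumed ~8 SP FLOPs/cycle per core.
    double jaguar  = peak_gflops(8, 1.6, 8);   // ~102 GFLOPS
    // Haswell: 2x 256-bit FMA units = 32 SP FLOPs/cycle, assumed ~3.5 GHz all-core turbo.
    double haswell = peak_gflops(8, 3.5, 32);  // ~896 GFLOPS
    std::printf("8x Jaguar  @ 1.6 GHz: ~%.0f GFLOPS\n", jaguar);
    std::printf("8x Haswell @ 3.5 GHz: ~%.0f GFLOPS\n", haswell);
}
```

Even with a conservative assumed clock, that's roughly a 9x gap in peak SIMD throughput, on top of the per-core IPC and clock advantage above.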
 
Mantle has been announced and promises console-like efficiency on the PC, cutting most of the CPU overhead in GPU rendering.

So now that the efficiency and draw-call limitations seem to have been lifted (if Mantle becomes widely adopted), what will happen to the gaming CPU market?
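
To put some purely hypothetical numbers on the draw-call side of it (the per-call CPU costs below are made-up round figures, not measured driver overheads):

```cpp
// Rough illustration of a per-frame draw-call budget. The per-call CPU costs
// are made-up round numbers for illustration, not measured driver overheads.
#include <cstdio>

int calls_per_frame(double frame_ms, double cpu_us_per_call) {
    return static_cast<int>(frame_ms * 1000.0 / cpu_us_per_call);
}

int main() {
    const double frame_ms = 1000.0 / 60.0;  // 16.7 ms of frame time at 60 fps
    // Assumed CPU cost per draw call: 40 us on a high-overhead API path versus
    // 5 us on a low-overhead one (and that's if the CPU did nothing else all frame).
    std::printf("high overhead: ~%d draw calls/frame\n", calls_per_frame(frame_ms, 40.0));
    std::printf("low overhead : ~%d draw calls/frame\n", calls_per_frame(frame_ms, 5.0));
}
```

The point is that cutting per-call CPU cost either multiplies the draw-call budget at the same frame rate or frees that CPU time for everything else the game does.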

In this situation, we'll have:
1 - CPUs for $150 or less that compare to the ones in the next-gen console
2 - Games will be built for low-clocked x86 cores
3 - An API that severely reduces CPU overhead on rendering
4 - Games that will increasingly use GPGPU for many tasks


If a PC achieves performance/efficiency parity with a next-gen console using a $100 CPU, what will happen to the $300/400 CPUs that many PC gamers feel the need to purchase?

Why would anyone pay top dollar for an expensive CPU if the practical difference from a CPU that costs less than a third as much becomes marginal?


What could still call for expensive CPUs in multiplatform games? SLI/Crossfire? CPU-based physics?
 
You'd hope that programmers can start using CPUs for more interesting stuff than feeding the GPU in a very inefficient manner?
 
You think developers will put a lot of effort into adding more content/interaction exclusively for PCs just because some of them may use substantially better CPUs than the consoles?

Because they sure as hell did not do that during this last gen.
Besides larger textures (as in, less compressed from the originals), tessellation in a couple of titles, and enhanced shaders, what did we ever see in comparison with the console versions?
 
Some of the current games seemingly need an i5 to run fast. That's demanding enough, I think; also, some people game on laptops, which are a lot slower.

So I think not much changes. The console CPUs will be competitive, by the way: they're going from three weak cores (taking the X360 as the reference; Cell is less easy to compare) to eight considerably less weak cores, so the PC CPUs will get put to use.
CPUs have been stagnating anyway. The roadmap is just SIMD widening and AMD trying to catch up on IPC (while losing cores compared to the higher-end FX).

They will remain TDP-limited: in particular, mainstream CPUs are sold in laptops too, at lower clocks/voltages. The FX is an exception but will not be replaced. So I don't think you'll see an 8- to 12-core monster anytime soon.
 
If 8x Jaguar cores at 1.6GHz were definitely not considered a bottleneck to 1.2-1.8 TFLOPs GPUs in a gaming/multimedia system, how could 4/6x Vishera cores at 4GHz (which can be had for $100-130) be considered a bottleneck to 2-5 TFLOPs GPUs?
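
Napkin math on aggregate throughput (the IPC ratio below is a rough assumption for illustration only):

```cpp
// Crude aggregate-throughput comparison: cores x clock x IPC relative to a
// Jaguar core. The IPC ratio is a rough assumption for illustration only.
#include <cstdio>

double relative_throughput(int cores, double ghz, double ipc_vs_jaguar) {
    return cores * ghz * ipc_vs_jaguar;
}

int main() {
    double console = relative_throughput(8, 1.6, 1.0);  // 8x Jaguar @ 1.6 GHz
    double vishera = relative_throughput(6, 4.0, 1.1);  // 6x Vishera @ 4 GHz, assumed ~1.1x IPC per clock
    std::printf("8x Jaguar @ 1.6 GHz  : %.1f\n", console);
    std::printf("6x Vishera @ 4.0 GHz : %.1f (~%.1fx the console)\n",
                vishera, vishera / console);
}
```

Of course that only holds if the game actually spreads its work across all the cores; with one dominant thread it's single-core performance that counts, which is exactly the problem described in the developer quote at the top of the thread.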


And I think you missed the point that the new factor here is Mantle (in a hypothetical situation where it becomes standard), which should make games on lower-end CPUs run a lot faster. So there'd be no more need for an expensive Core i5 just to make the GPU the bottleneck again.
 