PS4 and Xbox One: Was AMD the right choice?

What about IBM?
There aren't many options for getting PowerPC in a low-power package with performance comparable to what AMD's Jaguar offers. Cell, I think, was out, despite the fact that most folks were only just getting the hang of the thing.
 
That depends on whether there was truth to the rumor that Intel tried marketing Larrabee to the console makers.
I'm sure there was discussion, but the Sony/Cerny goal from the outset appears to have been a developer-friendly hardware platform. I'm not sure how Larrabee figures into this; it could well have been a repeat of Cell.

If ARM was worth looking into because the Vita architecture used it, then PowerVR gets pulled in as well. It would depend on whether there's truth to the hearsay about an early run-off between ARM and x86.

I'm sure Sony looked at everything, even if briefly. Back when Sony started the PS4 project (2008, according to Cerny's recent talks), 64-bit ARM was on the drawing board, but the architecture (AArch64) hadn't been finalised and performance was a complete unknown, as was whether it would be delivered when ARM planned. Betting on it would have been a gamble.

Then again, Sony may not have required 64-bit ARM until the point it decided to shoot for more than 4GB of memory. As for how many console devs know ARM, from Sony's POV it's a question of how many devs know Vita.

I'm sure it wasn't an address space issue, but more to do with stepping back from a 64-bit Cell chip to a 32-bit ARM chip. Although there are 64-bit/128-bit SIMD extensions like NEON, ARMv7 is still just a 32-bit processor at heart, and nobody knew when ARMv8 would arrive.
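
As a minimal sketch of the 4GB point (plain C++, nothing console-specific assumed): on any 32-bit/ILP32 target like ARMv7, pointers are four bytes, so a single process can address at most 2^32 bytes, however much RAM you fit.

```cpp
// Why a 32-bit CPU caps usable memory: on an ILP32 target (e.g. ARMv7),
// sizeof(void*) == 4, so virtual addresses top out at 2^32 bytes = 4 GiB.
// On a 64-bit target the pointer is 8 bytes and the cap effectively vanishes.
#include <cstdint>
#include <cstdio>

int main() {
    const std::uint64_t max_bytes =
        (sizeof(void*) == 4) ? (std::uint64_t{1} << 32)   // 4 GiB on 32-bit
                             : (std::uint64_t{1} << 48);  // typical 64-bit VA space
    std::printf("pointer: %zu bytes, up to %llu GiB addressable\n",
                sizeof(void*),
                static_cast<unsigned long long>(max_bytes >> 30));
}
```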

As for how many console devs know ARM, from Sony's POV it's a question of how many devs know Vita.

I wonder which R&D project began first, PS4 or Vita. I'm sure Sony looked at ARM, but back in 2008 game development on ARM was still pretty niche. It didn't really explode until a couple of years later, after Apple opened up iPhoneOS/iOS and competitive Android phones began to appear.

Would it have been a misstep to go that route? AMD publicly put forward the narrative that neither Sony nor Microsoft knew of the other's intention to go with a custom APU, although I don't know about that.
I think it would be fun to imagine an alternate universe where Sony somehow blundered into a 'roided-up Vita, although I'm not sure Sony would have enjoyed things as much.

Yeah, that would have been a gamble. If they had bet on Vita doing really well and based the PS4 around similar tech, hoping to leverage Vita devs' experience, that would not have worked out as planned. But then again, there are an awful lot of devs working on ARM-based hardware. Hmmm.

I'm personally very happy with the AMD solution. Sony may have had little-to-no choice, but I don't for a second think they would have proceeded if they didn't think it was a good, solid chip on which to base their console. Clearly Microsoft thought the same.
 
That's spin. They were interested, and it wasn't that long ago that JHH was telling the world that they would be in one because "it is impossible for one company to do all three"! Cerny pinpointed the technical reasons they were out, i.e. the only APU-like solution NVIDIA could offer would be ARM-based, and that's not powerful enough.
Thanks - missed this when you posted it last night. Did they say this publicly? I googled the quote but couldn't find anything.
 
One thing that needs to be kept in mind is that Sony went to the developers and asked what they would like to see in the next-generation consoles. The developers wanted HSA. The only company that could pull that off was AMD, because of its APUs.

IMO, because Sony was so far ahead of MS in shaping the developer community, many titles that MS wanted were already being made for PS4 and HSA. I think MS jumped on the bandwagon to save a lot of porting costs, because the alternatives were:

1. Use an Intel CPU and an Nvidia GPU. The problem with this solution is that HSA isn't really possible; games would need significant MS and NV support, which would put Xbox versions way behind their PS4 counterparts.
Using a quad-core Intel CPU for heavily multithreaded, PS4-centric titles could be a problem due to cache thrashing, and I don't think an Intel six-core chip was feasible at the cost involved.

2. Use an Intel CPU and an AMD GPU. Again, the lack of HSA causes porting problems, and the hardware to match Jaguar's performance would likely draw considerably more watts.

IMO, going with AMD's 8-core Jaguar was the best possible choice. It pushes reliance on IPC to the back burner and forces developers to embrace multithreading and GPGPU. It also has the added benefit of stopping the majority of NV TWIMTBP titles.
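
To illustrate what "embracing multithreading" looks like on many slow cores, here's a minimal C++11 fork/join sketch; the per-entity workload is a made-up stand-in, not anyone's actual engine code. The point is simply that the frame's work is cut into independent slices so every core stays busy, rather than leaning on the single-thread speed a big Intel core would give you.

```cpp
// Minimal fork/join sketch: chop a big update into per-core slices.
// On 6-8 usable Jaguar cores, throughput comes from keeping every core
// occupied, not from any single thread being fast.
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

static void update_slice(std::vector<float>& data, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        data[i] = data[i] * 1.016f + 0.5f;  // stand-in for real per-entity logic
}

int main() {
    std::vector<float> entities(1 << 20, 1.0f);
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    const std::size_t slice = entities.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * slice;
        const std::size_t end   = (w + 1 == workers) ? entities.size() : begin + slice;
        pool.emplace_back(update_slice, std::ref(entities), begin, end);
    }
    for (std::thread& t : pool) t.join();  // join barrier at end of the "frame"
}
```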
 
The consoles would have been developed in parallel; they weren't copying each other. MS had already been with AMD since the 360 and simply stayed; Sony was with Nvidia last time and switched to AMD, where MS already was.

I always assumed the necessity of an SoC was cost-related. The cost savings must have been such that discrete components weren't even a consideration.

Well, we've been through this discussion a million times, but it's hard to say how viable an alternative Intel was. Intel and Nvidia both seem to be considered more difficult and expensive to work with.

I bet that for the next consoles (if they happen) ARM+Nvidia will be a viable option, since I expect ARM will be strong enough by then. Intel might also be an option, since the Haswell GPU is pretty powerful, but again that comes down to how much they want it.

Intel is in a position to dictate terms, and AMD seems a little desperate; they needed this console deal to survive.

I bet whether Nvidia makes a push next time depends on how well this console cycle works out for AMD. If AMD sees a lot of benefits and dollars, Nvidia obviously would be more inclined next time.

One aspect we have yet to see is how much the AMD consoles translate into better benchmarks on PC ports of console games.
 
Mark Cerny mentioned several times that he really wanted an x86 CPU in the new console. So he had to choose from:

AMD CPU + AMD GPU
AMD CPU + Nvidia GPU
AMD HSA APU
Intel CPU + AMD GPU
Intel CPU + Nvidia GPU
Intel "APU"
Intel MIC

What would you guys choose? Sony decided to go with a powerful modern-day Radeon GPU, cache coherency and unified GDDR5 RAM. Doesn't sound too bad, does it? And Cerny already explained that he plans to eventually boost the compute capabilities of the system by using asynchronous fine-grained compute. Looks like a nicely balanced system to me, especially since Sony can sell it for $399.
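
For a concrete (if hypothetical) picture of what that coherent, unified memory buys, here's a C++ sketch. The gpu* calls are invented stand-ins, stubbed out so the snippet runs; they are not any real console or driver API. The contrast is simply copy-then-dispatch versus dispatch-in-place.

```cpp
// Conceptual contrast between discrete and unified memory models.
// The gpu* functions are hypothetical stand-ins, stubbed so this runs.
#include <cstddef>
#include <cstdlib>
#include <cstring>

static void* gpuAlloc(std::size_t bytes) { return std::malloc(bytes); }
static void  gpuMemcpy(void* dst, const void* src, std::size_t n) { std::memcpy(dst, src, n); }
static void  gpuDispatchCompute(void*, std::size_t) { /* kernel-launch stand-in */ }

struct Particle { float pos[3], vel[3]; };

// Discrete-GPU model: data must be staged across a PCIe-style bus before
// the GPU can touch it, costing latency and bandwidth every frame.
void dispatch_discrete(Particle* cpu_buf, std::size_t n) {
    void* gpu_buf = gpuAlloc(n * sizeof(Particle));
    gpuMemcpy(gpu_buf, cpu_buf, n * sizeof(Particle));
    gpuDispatchCompute(gpu_buf, n);
    std::free(gpu_buf);  // release the staging copy
}

// Unified, coherent model (the PS4-style pitch): CPU and GPU share one
// pool of GDDR5, so the GPU consumes the same buffer in place, no copy.
void dispatch_unified(Particle* shared_buf, std::size_t n) {
    gpuDispatchCompute(shared_buf, n);
}

int main() {
    static Particle particles[1024] = {};
    dispatch_discrete(particles, 1024);
    dispatch_unified(particles, 1024);
}
```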
 
The consoles would have been developed in parallel; they weren't copying each other. MS had already been with AMD since the 360 and simply stayed; Sony was with Nvidia last time and switched to AMD, where MS already was.

MS already stated that development started late 2010.
http://news.xbox.com/2013/05/xbox-one-beauty-of-xbox-one

Cerny stated development for the PS4 began 2008
http://playstationgang.com/mark-cer...tation-4-wont-require-to-always-be-connected/

Seems pretty parallel to me.
 
One thing that needs to be kept in mind is that Sony went to the developers and asked what they would like to see in the next-generation consoles. The developers wanted HSA. The only company that could pull that off was AMD, because of its APUs.

IMO, because Sony was so far ahead of MS in shaping the developer community, many titles that MS wanted were already being made for PS4 and HSA. I think MS jumped on the bandwagon to save a lot of porting costs, because the alternatives were:

1. Use an Intel CPU and an Nvidia GPU. The problem with this solution is that HSA isn't really possible; games would need significant MS and NV support, which would put Xbox versions way behind their PS4 counterparts.
Using a quad-core Intel CPU for heavily multithreaded, PS4-centric titles could be a problem due to cache thrashing, and I don't think an Intel six-core chip was feasible at the cost involved.

2. Use an Intel CPU and an AMD GPU. Again, the lack of HSA causes porting problems, and the hardware to match Jaguar's performance would likely draw considerably more watts.

IMO, going with AMD's 8-core Jaguar was the best possible choice. It pushes reliance on IPC to the back burner and forces developers to embrace multithreading and GPGPU. It also has the added benefit of stopping the majority of NV TWIMTBP titles.

To be clear, developers asked for unified memory, not HSA. The 360 has unified memory, but we wouldn't say it has HSA.

I'm in the camp that AMD was the only choice. Intel has zero desire to make cheap chips, and Nvidia didn't have an ARM chip it could use to make an APU. Remember that this was all pre-2010, before anyone trusted Nvidia to make anything besides graphics cards; they had one commercial product with a CPU in it (Zune with Tegra 1). ARM64 is also still a year off, assuming it ships on time.
 
Thanks - missed this when you posted it last night. Did they say this publicly? I googled the quote but couldn't find anything.

http://venturebeat.com/2011/03/04/q...his-strategy-for-winning-in-mobile-computing/

Q: Do you think there will be another round of consoles coming?


A: Oh, no question about it.

Q: And can you predict when it will be in terms of how many years from now?

A: We will build one of them, right. And the reason for that is because the world doesn’t have enough engineering talent for anybody to build three of them at one time. It takes the entire livelihood of a computer graphics company to build one of them. And every single time they build one, my life is in danger. You build it once every five or seven years, but you have to build it all in a very short time. That’s because they wait and wait and then they say, ‘Can I have it next week?’

This was in March 2011, by which point, as this thread points out, all of them were already in development...
 
http://venturebeat.com/2011/03/04/q...his-strategy-for-winning-in-mobile-computing/

This was in March 2011, by which point, as this thread points out, all of them were already in development...
Thanks for this! I can't recall if the talk Cerny gave at Gamelab mentioned much in the way of timeframes for events; I recall the story jumped about a bit. I'm surprised they hadn't settled on a CPU and GPU as late as March 2011!

I must admit I wasn't paying a whole lot of attention to Nvidia when it came to nextgen consoles, I get the feeling they burnt some bridges with Microsoft after the original Xbox.
 
Using a quad-core Intel CPU for heavily multithreaded, PS4-centric titles could be a problem due to cache thrashing, and I don't think an Intel six-core chip was feasible at the cost involved.

A fast quad-core Intel CPU would have been significantly faster than the current 8-core Jaguars in the consoles. Far from being a performance issue, this would have been a significant performance advantage for Sony.

Apart from anything else, Intel CPUs are capable of running 8 threads as well, even though they're spread across 4 cores, and each thread would be much faster than a Jaguar core.
 
The execution resources of Intel's cores are frequently double what Jaguar offers in raw numerical terms, even before accounting for the fact that they readily clock twice as fast, so the cores are fine. Other factors intrude, though in general the idea that a higher-core-count Jaguar was a sufficient or superior stand-in for AMD's desktop cores in the consoles says more about Bulldozer than it does about anything else.

The L1 and L2 could thrash more readily in a multithreaded case, since they are per-core. For Intel this is mitigated, and probably bettered, by a far superior last-level cache situation compared to Jaguar, an area where there was a clear economy of effort by the console designers.

For Durango, we know from the leaks that the dual-module approach is particularly painful for remote accesses, as coherence between the L2s is not an optimized case. The numbers are about as bad as a main memory access elsewhere, which means main memory access on the consoles is probably incrementally more painful as a result.
The exact organization for Orbis is not detailed, although we haven't been given a reason to suspect it differs. Vgleaks indicates extra care is needed to make sure too much traffic doesn't cross between modules, which means the 8-core consoles shouldn't be treated the same as an 8-core PC. Assuming two reserved cores, a game faces four cores it can readily use and two that will tangle more frequently with the L2 traffic of the system reserve, and for which excessive interplay with the remaining four can cause performance problems.
There are still decoupled functions that can run well enough on those two, but that places a higher priority for the game on a single module.
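
A hedged sketch of what that means in practice (Linux pthreads; the cores 0-3 / 4-7 module numbering is an assumption for illustration, not a documented console mapping): pin the threads that share data heavily onto one module so their working set stays in that module's L2, and push decoupled work to the other.

```cpp
// Keep chatty threads on one Jaguar module so shared data stays in that
// module's L2 rather than crossing the slow inter-module coherence path.
// Assumes (illustration only) module 0 = cores 0-3, module 1 = cores 4-7.
// Linux-specific: uses the GNU pthread affinity extension.
#include <pthread.h>
#include <sched.h>
#include <thread>

static void pin_to_cores(unsigned first, unsigned count) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (unsigned c = first; c < first + count; ++c)
        CPU_SET(c, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main() {
    // Simulation threads share state heavily: keep them on module 0.
    std::thread sim([] { pin_to_cores(0, 4); /* tightly coupled game work */ });
    // Decoupled streaming/audio-style work can live on the other module.
    std::thread aux([] { pin_to_cores(4, 2); /* loosely coupled work */ });
    sim.join();
    aux.join();
}
```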

For Intel, there's a very large, high-bandwidth L3 that is uniformly shareable. In cycle terms (its being clocked twice as fast aside), it is moderately longer-latency than Jaguar's not particularly impressive L2 numbers.
 
A fast quad-core Intel CPU would have been significantly faster than the current 8-core Jaguars in the consoles. Far from being a performance issue, this would have been a significant performance advantage for Sony.

Apart from anything else, Intel CPUs are capable of running 8 threads as well, even though they're spread across 4 cores, and each thread would be much faster than a Jaguar core.

Intel also would have wanted $150 per chip just for the CPU alone. Compared to an estimated $80 for the PS4/Xbox One APU, you can see why Intel was never seriously in the running.
 
Intel also would have wanted $150 per chip just for the CPU alone. Compared to an estimated $80 for the PS4/Xbox One APU, you can see why Intel was never seriously in the running.

I don't dispute it. My argument was about performance only.
 
Note the original article has been updated:

After publication of this article, the developers reached out to us to clarify that the Forgelight Engine in particular struggles with AMD chips, and that it “is not something that is inherent in the PS4 hardware”, and “not something all developers will have as a hurdle,” but that the SOE team will have to overcome the issue.
 
Maybe he meant Project Shield.

Impossible. According to Nvidia, Shield is a project that started in early 2012.

http://blogs.nvidia.com/blog/2013/01/30/how-project-shield-got-built/

... In less than a year, SHIELD has grown from an idea dreamed up by Jen-Hsun, Tony, and a handful of others into a conspiracy involving hundreds of gaming fanatics across every department at NVIDIA. “We’ve been talking on and off about building something for more than five years, maybe 10,” says Tony.

The first prototype, assembled in early 2012, was little more than a game controller fastened to a smartphone with wood.
 
The execution resources of Intel's cores are frequently double what Jaguar offers in raw numerical terms, even before accounting for the fact that they readily clock twice as fast, so the cores are fine. Other factors intrude, though in general the idea that a higher-core-count Jaguar was a sufficient or superior stand-in for AMD's desktop cores in the consoles says more about Bulldozer than it does about anything else.

The L1 and L2 could thrash more readily in a multithreaded case, since they are per-core. For Intel this is mitigated, and probably bettered, by a far superior last-level cache situation compared to Jaguar, an area where there was a clear economy of effort by the console designers.

For Durango, we know from the leaks that the dual-module approach is particularly painful for remote accesses, as coherence between the L2s is not an optimized case. The numbers are about as bad as a main memory access elsewhere, which means main memory access on the consoles is probably incrementally more painful as a result.
The exact organization for Orbis is not detailed, although we haven't been given a reason to suspect it differs. Vgleaks indicates extra care is needed to make sure too much traffic doesn't cross between modules, which means the 8-core consoles shouldn't be treated the same as an 8-core PC. Assuming two reserved cores, a game faces four cores it can readily use and two that will tangle more frequently with the L2 traffic of the system reserve, and for which excessive interplay with the remaining four can cause performance problems.
There are still decoupled functions that can run well enough on those two, but that places a higher priority for the game on a single module.

For Intel, there's a very large, high-bandwidth L3 that is uniformly shareable. In cycle terms (its being clocked twice as fast aside), it is moderately longer-latency than Jaguar's not particularly impressive L2 numbers.

Bkilian said the CPU in the XBO can do six times the work of the 360 CPU...

Given the low clock that's impressive, though not unexpected, and it serves as a fine "next gen" jump.

Jaguars are also extremely light on power and die size, making them an excellent fit for 130-watt consoles and SoCs.
 
MS already stated that development started late 2010.
http://news.xbox.com/2013/05/xbox-one-beauty-of-xbox-one

Cerny stated development for the PS4 began 2008
http://playstationgang.com/mark-cer...tation-4-wont-require-to-always-be-connected/

Seems pretty parallel to me.

Microsoft and Sony start thinking about next gen the day their current gen ships. Whatever Cerny was doing in 2008, MS was doing something analogous at the same time; this would likely be high-level conceptual thinking. (IIRC Cerny said he did his heavy thinking about x86 vs. not in 2007.)

Otherwise one would ask what the hell took Sony so long: why is the PS4 coming out in 2013 and not 2011? :LOL:
 