PS4 and Xbox One: Was AMD the right choice?

Microsoft and Sony start thinking about next gen the day their current gen ships. Whatever Cerny was doing in 2008, MS was doing something analogous at the same time. This would likely be high-level conceptual thinking. (IIRC Cerny said he did his heavy thinking about x86 vs. not in 2007.)

Otherwise one would ask what the hell took Sony so long: why is the PS4 coming out in 2013 and not 2011? :LOL:

Processor design takes years. MS said in that video that concept work began in late 2010. IDC if you don't believe words straight from the horse's mouth.
 
Processor design takes years. MS said in that video that concept work began in late 2010. IDC if you don't believe words straight from the horse's mouth.

He's talking about the plastic box, controller, etc., obviously. Geez. http://www.merriam-webster.com/dictionary/industrial design

Definition of INDUSTRIAL DESIGN
: design concerned with the appearance of three-dimensional machine-made products; also : the study of the principles of such design

You were so adamant about that? Really?
 
I think that MSFT, at least, had no choice: they wanted lots of RAM, and ARM v8 was too late.
For Sony it is more disputable, as a high amount of RAM did not seem to be the focus of the design.
I wonder if it would make a difference in price; does anybody have data on the cost of something like Zacate vs. the cost of a high-end ARM SoC?
 
Do we know all there is to know about the Jaguar CPUs in both machines?
Or does anyone believe there would have been tweaks on the same level as seen on the GPU side of things?
I believe Jaguar already implements AVX (at 128 bits?); is there anything else they would have considered doing?
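
For reference, this is roughly what 128-bit vector code looks like with AVX intrinsics; a minimal sketch, assuming a compiler with AVX enabled (-mavx), and the saxpy kernel is just illustrative. Jaguar's FP datapath is reportedly 128 bits wide, with 256-bit AVX ops cracked into two halves internally, so __m128 is the natural width on it:

Code:
#include <immintrin.h>
#include <stdio.h>

/* saxpy: y[i] = a*x[i] + y[i], four floats at a time.
   With -mavx the compiler emits VEX-encoded 128-bit ops,
   which map directly onto Jaguar's 128-bit FP pipes. */
static void saxpy128(float a, const float *x, float *y, int n)
{
    __m128 va = _mm_set1_ps(a);
    for (int i = 0; i + 4 <= n; i += 4) {
        __m128 vx = _mm_loadu_ps(x + i);
        __m128 vy = _mm_loadu_ps(y + i);
        vy = _mm_add_ps(_mm_mul_ps(va, vx), vy); /* no FMA on Jaguar */
        _mm_storeu_ps(y + i, vy);
    }
}

int main(void)
{
    float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float y[8] = {0};
    saxpy128(2.0f, x, y, 8);
    printf("%.1f %.1f\n", y[0], y[7]); /* 2.0 16.0 */
    return 0;
}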
 
Do we know all there is to know about the Jaguar CPUs in both machines?
Or does anyone believe there would have been tweaks on the same level as seen on the GPU side of things?
I believe Jaguar already implements AVX (at 128 bits?); is there anything else they would have considered doing?

If one console had some major advantage over the other (say, more performance in the weakest parts, or better efficiency) they would mention it; the fact that both have glossed over the details pretty much points to stock-standard Jaguar.
 
When you say glossed over, are you talking about the vgleaks docs?
Shame; I was hoping there would be CPU-related things for people here to dissect in the future.
 
When you say glossed over, are you talking about the vgleaks docs?
Shame; I was hoping there would be CPU-related things for people here to dissect in the future.

When I say glossed over, I mean in the presentations and everything else. We only know it's Jaguar because of the vgleaks; other than x86 and 8 cores they haven't said anything.
 
Bkilian said the CPU in XBO can do six times the work of the 360 CPU...

Given the low clock it's impressive, though not unexpected, and serves as a fine "next gen" jump.

Xenon set the bar low enough that almost anything modern could do much better.
This goes to the point that an Intel quad core from Sandy Bridge on should be able to match or better two Jaguar modules. I'm less certain whether Nehalem could fit in that set. That would depend on the class of AMD chips that are causing problems for that particular engine. There are some newer AMD SKUs that have finally gotten into that performance range.
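
As a rough sanity check on that claim, here's the back-of-envelope peak single-precision math; the clocks and per-cycle issue rates below are my assumptions (Jaguar: two 128-bit FP pipes doing mul+add, ~8 SP FLOPs/cycle at ~1.6 GHz; Sandy Bridge: 256-bit AVX add and mul pipes, 16 SP FLOPs/cycle at ~3.3 GHz):

Code:
#include <stdio.h>

int main(void)
{
    /* Assumed figures, not vendor specs: */
    double jag_ghz = 1.6, jag_flops_per_clk = 8.0;  /* 2x 128-bit pipes */
    double snb_ghz = 3.3, snb_flops_per_clk = 16.0; /* 2x 256-bit pipes */

    double two_modules = 8 * jag_ghz * jag_flops_per_clk; /* 8 cores */
    double snb_quad    = 4 * snb_ghz * snb_flops_per_clk;

    printf("two Jaguar modules: ~%.0f GFLOPS peak SP\n", two_modules); /* ~102 */
    printf("Sandy Bridge quad : ~%.0f GFLOPS peak SP\n", snb_quad);    /* ~211 */
    return 0;
}

Peak FLOPS isn't the whole story (integer IPC and memory behavior matter more for game code), but it shows why a Sandy Bridge quad clearing two full Jaguar modules is plausible.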

The console games will be using six Jaguar cores, and there is a potentially awkward 2:4 split in thread resources thanks to the choices surrounding the integration of the second module that the current desktop chips won't need to worry about.
It's nominally 6 threads, although it might turn out to be less since two of them have more performance uncertainty.
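
If a developer wanted to keep that split from hurting, the obvious move is to pin threads by module. A minimal Linux-flavoured sketch (the console APIs are different, and the core numbering here, cores 0-3 on module 0 and cores 4-5 usable on module 1, is purely an assumption for illustration):

Code:
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *worker(void *arg)
{
    printf("worker %ld running\n", (long)arg);
    return NULL;
}

int main(void)
{
    /* Hypothetical layout: the game gets cores 0-3 (module 0) and
       4-5 (module 1); the OS keeps the last two cores of module 1. */
    int game_cores[6] = {0, 1, 2, 3, 4, 5};
    pthread_t t[6];

    for (long i = 0; i < 6; i++) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(game_cores[i], &set);
        pthread_create(&t[i], NULL, worker, (void *)i);
        /* Pin each worker so jobs that share data stay on one module
           and don't bounce between the two separate L2 caches. */
        pthread_setaffinity_np(t[i], sizeof(cpu_set_t), &set);
    }
    for (int i = 0; i < 6; i++)
        pthread_join(t[i], NULL);
    return 0;
}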

Jaguars are also extremely light on power and die size, making them excellent fits for 130-watt consoles and SoCs.
Intel's desktop CPUs don't need to skimp to fit alongside a GPU on a cost-sensitive SoC, and it shows.
 
If one console had some major advantage over the other (say, more performance in the weakest parts, or better efficiency) they would mention it; the fact that both have glossed over the details pretty much points to stock-standard Jaguar.

Not necessarily. MS won't talk up the eSRAM/flash cache/DMEs and doesn't really go into the other pieces of custom silicon in the Xbox One at all. Everything out of MS is vague and generalized.


(on the general topic)

Intel has better CPUs, but they aren't known for producing high-end performance GPUs, and historically even their IGPs have been outclassed by Nvidia and AMD.

Nvidia may have better GPUs, but their ARM experience is all in mobile devices, and their desktop-like product isn't slated until sometime in the next couple of years.

The current trend is toward heterogeneous processors.

Neither Intel nor Nvidia has been able to maintain a partnership beyond one generation of consoles.

Given that AMD is a known quantity on both sides of the hardware, is producing decent heterogeneous desktop parts, and is able to maintain long-term relationships with console manufacturers, AMD makes the most sense.
 
Remember that this was pre-2010, before anyone trusted Nvidia to make anything besides graphics cards. They had one commercial product with a CPU in it (the Zune with Tegra 1).

Not quite. My 2010 (2009 also) Audi A4's Multimedia Interface System (MMI), which is found in every model including the R8, is powered by Tegra 2. Audi is well known for being overly careful about putting technology into production models, especially something as important as the system that runs the navigation, audio system, video player, parking sensors/cameras, climate control, and even the electromagnetic suspension. If they had had any doubts about the reliability of the Tegra chip in my car, they would never have put it into production, especially after what happened to their brand image in the '80s, when the American media destroyed them for over a decade because of a woman who claimed the car magically ran over her kid. Only long after the damage was done did the investigation show that the car was not at fault and that the woman had actually run her own child over. But the media never reported that, much as they haven't done much to undo the damage from the fake Toyota Prius scandal.

This was in March 2011, after this thread points out they were all in development...

I'm quite certain that in early 2011 most companies were expecting the economy to really begin picking up speed, which in turn would have allowed larger capital investments in the hardware put into these consoles. Nobody at Sony was expecting another 10 consecutive quarters of losses on top of at least that much time already. But I'm sure they made the calculated decision to avoid producing a console that, like the PS3, would sell at a loss for many years to come, and luckily for them that was the correct choice. If the PS4 sold at a similar loss per unit as the PS3 did, there would have been a real possibility of Sony being broken up by a Chapter 11 bankruptcy court.
In early 2011 Sony was at $35/share; by the start of 2013 it had plummeted to $9, and it has since rebounded partially due to exceedingly good press over the upcoming PS4 release.
 
Why would consoles even come with Intel Core CPUs? If they decided on Intel for the CPU you'd be getting a Pentium or a Celeron at best, and worst case you'd get Atoms in your next-gen consoles. For the GPU, you'd probably be getting 1/4 the performance of the current hardware, all for the same price.

If you go to Nvidia for an SoC, it will be less powerful than the 360. Tegra 4 still doesn't even have unified shaders and isn't DX11.2, OpenCL, or OpenGL 3.0 compatible. You'd also be stuck at 4 GB of RAM. Basically you'd probably get sub-Wii-U-level hardware. Logan doesn't come out until next year, and even if you stuck two of those together it would still be 1/4 as powerful as the current PS4 GPU.

Other ARM vendors are all as woefully behind on GPU tech in their SoCs as Nvidia.

IBM doesn't have GPU tech, so they would have to license from Nvidia or AMD, making things more complex.

HSA and GPU features all come from AMD, along with low-power CPU cores. AMD is the only choice here.
 
If you go to Nvidia for an SoC, it will be less powerful than the 360. Tegra 4 still doesn't even have unified shaders and isn't DX11.2, OpenCL, or OpenGL 3.0 compatible. You'd also be stuck at 4 GB of RAM. Basically you'd probably get sub-Wii-U-level hardware. Logan doesn't come out until next year, and even if you stuck two of those together it would still be 1/4 as powerful as the current PS4 GPU.

Why are you looking at off-the-shelf SoCs from NV as a comparison point? AMD isn't offering off-the-shelf parts, so why would NV? Clearly it would be a custom part from NV, just as it is from AMD.
 
Why would consoles even come with Intel Core CPUs? If they decided on Intel for the CPU you'd be getting a Pentium or a Celeron at best, and worst case you'd get Atoms in your next-gen consoles. For the GPU, you'd probably be getting 1/4 the performance of the current hardware, all for the same price.

If you go to Nvidia for an SoC, it will be less powerful than the 360. Tegra 4 still doesn't even have unified shaders and isn't DX11.2, OpenCL, or OpenGL 3.0 compatible. You'd also be stuck at 4 GB of RAM. Basically you'd probably get sub-Wii-U-level hardware. Logan doesn't come out until next year, and even if you stuck two of those together it would still be 1/4 as powerful as the current PS4 GPU.

Other ARM vendors are all as woefully behind on GPU tech in their SoCs as Nvidia.

IBM doesn't have GPU tech, so they would have to license from Nvidia or AMD, making things more complex.

HSA and GPU features all come from AMD, along with low-power CPU cores. AMD is the only choice here.

If the focus is on HSA/SoCs, absolutely. But would discrete components be so terrible?

Or were the benefits of HSA/SoCs worth all the fuss?
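
One way to put a number on the fuss: with discrete components, anything shared between CPU and GPU has to cross a bus, while a shared-memory SoC can skip the copy entirely. A back-of-envelope sketch, where the 64 MB of per-frame shared data and the ~8 GB/s effective PCIe rate are purely my assumptions:

Code:
#include <stdio.h>

int main(void)
{
    double shared_bytes = 64e6;          /* assumed CPU<->GPU data per frame */
    double pcie_bps     = 8e9;           /* rough effective PCIe 2.0 x16     */
    double frame_ms     = 1000.0 / 60.0; /* 60 fps budget: ~16.7 ms          */

    double copy_ms = shared_bytes / pcie_bps * 1000.0;
    printf("copy cost : %.1f ms per direction\n", copy_ms); /* ~8 ms */
    printf("frame time: %.1f ms\n", frame_ms);
    return 0;
}

Under those made-up numbers a single one-way copy eats about half the 60 fps frame budget, which is exactly the kind of overhead unified memory makes disappear.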
 
Why are you looking at off-the-shelf SoCs from NV as a comparison point? AMD isn't offering off-the-shelf parts, so why would NV? Clearly it would be a custom part from NV, just as it is from AMD.

To be fair, AMD's offerings are based on "off-the-shelf parts"; there are just more of them in the APU. Tegra 4 wouldn't be good no matter how many of them you put together in a single chip.
 
To be fair, AMD's offerings are based on "off-the-shelf parts"; there are just more of them in the APU. Tegra 4 wouldn't be good no matter how many of them you put together in a single chip.

True, but that doesn't mean they couldn't integrate a desktop-equivalent GPU core into an ARM-based SoC. Although, if reports are true that both Sony and Microsoft found the CPU power of the ARM cores available at the time to be less than that of Jaguar, then that wouldn't have worked anyway. After all, it's the difference between a CPU core designed for smartphones and tablets versus one designed for notebooks and possibly tablets.

Regards,
SB
 