*spin-off* Length of the generation & HW... things

True, I was thinking of the "easy-BC" path more than anything. The power consumption is just a side-bonus, but even so, if that were the path taken, they wouldn't have to include the ARM core at all (one less license).
Ah I see, sorry I misunderstood :oops: Maybe. Maybe you won't need the x86 hardware in a few years. There are some people doing interesting work to get ARM working as a viable x86 platform (for compatibility, not as an alternative to real x86 hardware), but I'm guessing it's a complicated area in terms of getting the appropriate licence from Intel. From what I have observed, if your design is outside of the consumer and enterprise space, it looks as though there may be room for negotiation - particularly if you're avoiding a purely hardware solution. I think the lack of any form of x86 emulation in Windows RT speaks volumes.

Most of my hardware (1.2m core server farm) is x86, but we also have a lot of custom nodes with specialised processors for specific things. I never thought I'd see the day when anybody would even discuss x86 being replaced as the base architecture, but looking at ARM and their speed of development, they are increasingly moving to a place that I think will work better (than x86) for how we engineer software solutions.

And their approach is genius, most of the onerous work is done by their licensees :yep2:

Also, would it be feasible to have the OS reservation on said "little" cores instead of reserving the newer "big" ones :?: Maybe it's a mess for the memory bus. :s They do already have some funky stuff going on with multiple OS's though.

I definitely don't see the PS4 OS doing much, but that may change for PS5.
The Xbox One looks a little busier, though. It's hard to tell exactly what is going on; I suspect we'll hear more about the Xbox One once Microsoft really opens up the application OS to third parties, as developer information will inevitably leak.
 
Sony or Microsoft wouldn't go to ARM to design a custom core; they'd look into making or contracting a customized core that uses the ARM ISA.
Companies with the hardware experience of Microsoft and Sony could probably develop in-house expertise in this. Or do what Apple did and acquire it. Apple went from nothing to rolling their own designs (first 64-bit ARM chip in a commercial product) in a few years. Competitive well-thought out designs.

The number of such implementors is limited, and the number of such implementors with all the additional IP and architectural history that overlaps with what consoles (as we know them) need is very limited.
The number of implementors is actually huge but it's just in systems that you will never get close to unless you're a government contractor or work in the defence industry. It's worth stating that both of these industries are notoriously slow to embrace change, or risk changing what has proven to work. Pre-Core radiation hardened x86 chips are still being built into new products by some but ARM is really causing something of a revolution.

If AMD is successful in what its marketing has promised, there is going to be a higher-performance custom ARM core in the range that a console would like, and it's going to plug into an SOC whose infrastructure has the highest chance of matching Onion and Garlic.

I wait with bated breath ;)
 
Companies with the hardware experience of Microsoft and Sony could probably develop in-house expertise in this. Or do what Apple did and acquire it. Apple went from nothing to rolling their own designs (first 64-bit ARM chip in a commercial product) in a few years. Competitive well-thought out designs.
Apple's Cyclone chips go into devices that serve massively profitable and growing markets.
Bespoke hardware engineering goes to the markets that deserve it, and the measure of worthiness is the profitability of said market.
Consoles--as we know them--do not deserve it, and the A7 is not in the same league as the console APUs in performance.

The number of implementors is actually huge but it's just in systems that you will never get close to unless you're a government contractor or work in the defence industry.
It's worth stating that both of these industries are notoriously slow to embrace change, or risk changing what has proven to work. Pre-Core radiation hardened x86 chips are still being built into new products by some but ARM is really causing something of a revolution.
This indicates that they don't have the necessary experience or IP to provide something on the level of a consumer APU.
Rad-hardened CPUs don't use leading-edge processes, and their extreme reliability requirements trump performance and game developer approachability.
Which contractors have an architectural license from ARM, and which ones work on low-cost mass production at 28nm and below? How are their GPUs?
 
Apple's Cyclone chips go into devices that serve massively profitable and growing markets. Bespoke hardware engineering goes to the markets that deserve it, and the measure of worthiness is the profitability of said market.
Ok, let's start at the beginning. The economics of mobile devices like phones and tablets are based on a different profit model from games consoles. Apple and Samsung's profits come primarily from hardware; Sony and Microsoft's profits come primarily from software. R&D for mobile devices is a constant year-on-year exercise and investment; R&D for a games console is not. That does not mean R&D is done on a shoe-string budget.

Consoles--as we know them--do not deserve it, and the A7 is not in the same league as the console APUs in performance.
The A7 also draws a fraction of the watts that either console APU does. That's a reality of needing to preserve battery life in a mobile device, which isn't a consideration for a console. Nor is passive cooling.

This indicates that they don't have the necessary experience or IP to provide something on the level of a consumer APU.
I'm not sure I follow that logic at all. The consumer space for ARM is far less interesting in terms of technology delivering performance than aerospace and defence. Based on what I've seen, anyway. And again, as above, this is primarily because battery life is holding everything back.

Rad-hardened CPUs don't use leading-edge processes, and their extreme reliability requirements trump performance and game developer approachability.
Huh, where are you getting this from? You know that you can get high-performance octo-core chips in a rad-hardened package. And the process is absolutely state of the art. Hence the cost.

Which contractors have an architectural license from ARM, and which ones work on low-cost mass production at 28nm and below? How are their GPUs?
Right, because I'm going to disclose that :rolleyes:
 
Ok, let's start at the beginning. The economics of mobile devices like phones and tablets are based on a different profit model from games consoles. Apple and Samsung's profits come primarily from hardware; Sony and Microsoft's profits come primarily from software.
Apple also has content and software profits in addition to hardware, but it has very good volumes on high-margin devices.
Samsung doesn't command the same premium, but as a data point, it sells over 40 million devices a quarter. It may sell as many of its chips in a year as, or more than, the final lifetime total for the current gen. Samsung's often heavy reliance on standard cores or cores made by others like Qualcomm may be showing the limits of not having one's own design capability.

R&D for mobile devices is a constant year-on-year exercise and investment; R&D for a games console is not. That does not mean R&D is done on a shoe-string budget.
The scope of the semi-custom APU deals is in the hundreds of millions to several billion over a stretch of years, per some staff LinkedIn data. I think the figure includes the contracted payments to AMD for the APUs produced, however.

The A7 also draws a fraction of the watts that either console APU does. That's a reality of needing to preserve battery life in a mobile device, which isn't a consideration for a console. Nor is passive cooling.
That power constraint also means Apple didn't need to do a number of things that have proven themselves to be difficult to scale. Being mobile means a lot of anemic hardware choices are acceptable, and a number of design challenges don't need to be taken on.

I'm not sure I follow that logic at all. The consumer space for ARM is far less interesting in terms of technology delivering performance than aerospace and defence. Based on what I've seen, anyway. And again, as above, this is primarily because battery life is holding everything back.
The consumer electronics space occupied by consoles is at the intersection of decently high performance devices, good production volumes on a commodity process with promises of node shrinks, interfacing with a lot of standard components, while having all the IP for multimedia, games, and significant investment and distribution of software development tools.
The current gen leverages all of this to a very great degree.

AMD didn't have any IP or expertise that was leading-edge in any single category; competitors could frequently best it on any individual SOC or API/development component.

Huh, where are you getting this from? You know that you can get high-performance octo-core chips in a rad-hardened package. And the process is absolutely state of the art. Hence the cost.
I suppose the radiation-hardened products I'm dimly aware of don't opt for 28nm on TSMC at 80 dollars a chip.

Right, because I'm going to disclose that :rolleyes:
Then there's nothing I can say either way.
I can't debate the merits of undisclosed tech versus publicly disclosed technology that has had years and hundreds of billions of dollars spent on iterating and evolving it.
 
The scope of the semi-custom APU deals is in the hundreds of millions to several billion over a stretch of years, per some staff LinkedIn data. I think the figure includes the contracted payments to AMD for the APUs produced, however.

So if I follow this, you're suggesting that Microsoft and Sony have actually invested a fair bit in their APU designs. Hundreds of millions to billions is not small change (and certainly exceeds what Apple expend on R&D for a single chip each year - based on their public financials). But two posts back you had this view:

Apple's Cyclone chips go into devices that serve massively profitable and growing markets. Bespoke hardware engineering goes to the markets that deserve it, and the measure of worthiness is the profitability of said market. Consoles--as we know them--do not deserve it, and the A7 is not in the same league as the console APUs in performance.

Which looks to be the reverse, because you are saying that you do not think consoles are worth that investment. Of course it's what Microsoft and Sony think which matters, but I'm confused about your position. But also it's difficult to compare the design process of a chip costing around $17 to go in a phone with one costing $100 to go in a console. You can do a lot better with 5x the budget, more power and active cooling.

That power constraint also means Apple didn't need to do a number of things that have proven themselves to be difficult to scale. Being mobile means a lot of anemic hardware choices are acceptable, and a number of design challenges don't need to be taken on.

Well that's one motherlode of an assumption there. In terms of cores and clocks, Apple's A7 is well behind the curve. It tops out at a modest 1.4GHz (iPad Air) and is dual-core. You think Apple simply couldn't have managed a quad-core design like so many others, or offered higher clocks? Why? I personally don't think they needed to, but not needing to and not being able to are quite different.

The consumer electronics space occupied by consoles is at the intersection of decently high performance devices, good production volumes on a commodity process with promises of node shrinks, interfacing with a lot of standard components, while having all the IP for multimedia, games, and significant investment and distribution of software development tools. The current gen leverages all of this to a very great degree.

I personally struggle with calling either console a 'high performance' device. This is the second generation of HD consoles and 1080p still isn't being delivered often enough. Of the last four console generations the current crop are fairly lacklustre compared to technology in the mainstream PC space. Of course I offered ARM as a contender [for processor] in PlayStation 5. Microsoft and Sony don't need the best or fastest processor in a games console (look at what we have now), they just need one that's good enough.

AMD didn't have any IP or expertise that was leading-edge in any single category; competitors could frequently best it on any individual SOC or API/development component.

I'm not sure what this comment is in regard to. If it's about the choice of an AMD APU this generation, Mark Cerny made an interesting comment that suggested it really wasn't a choice at all. It was the only viable option given their target launch window.
 
Yup, very big if. But if any PlayStation architecture looks to make the leap into the following generation and stand any chance of architectural backwards compatibility, I'd say an 80x86 processor and AMD Radeon GPU without EDRAM/ESRAM is probably it.

However I think that ARM could well be a serious contender next time.

Sadly (I suppose?) I'm almost positive it will be ARM. It sounded like Sony/MS really wanted to go with them this time as it was; they just weren't quite powerful enough yet.
 
Sadly (I suppose?) I'm almost positive it will be ARM. It sounded like Sony/MS really wanted to go with them this time as it was; they just weren't quite powerful enough yet.
Yeah, one architecture would have appealed to them, but contrary to that, Cerny said Sony wanted to go with an architecture familiar to developers.

While there are a lot of ARM developers, they aren't their targeted devs - the console devs. But yeah, I reckon they debated that one for ages.
 
So if I follow this, you're suggesting that Microsoft and Sony have actually invested a fair bit in their APU designs. Hundreds of millions to billions is not small change (and certainly exceeds what Apple expend on R&D for a single chip each year - based on their public financials). But two posts back you had this view:


Which looks to be the reverse, because you are saying that you do not think consoles are worth that investment. Of course it's what Microsoft and Sony think which matters, but I'm confused about your position.
This only contradicts my view if the roll-your-own-and-play-with-the-big-dogs approach is cheaper than contracting with a company that knows what it's doing for hundreds of millions of dollars.

If you believe AMD, the head of the server group presented last year with an estimated 400 million dollars and three years to design an x86 server processor versus 30 million and 1.5 years for an ARM.
I don't actually believe that was really a fair comparison (it was for investors, not people AMD wants to educate) nor entirely applicable in this case, but a complex OoO design can take 3-5 years and some hundreds of millions of dollars all on its own without including a GPU or additional SOC hardware.

My view of the situation is that a console manufacturer starting from a non-presence in performant and manufacturable modern APU design faces more expense and risk for a device whose volumes are a fraction of what is considered sustainable on the modern market, and whose software as razorblades model nearly broke completely in the last generation.

Despite costs, the consoles started with an effectively finished and unmodified Jaguar CPU, and the fundamental framework of standard GCN.
Starting fresh would have incurred the above costs and then added on years and tens to hundreds of millions of dollars more.
Instead Jaguar and GCN have their costs amortized over a much wider range of SKUs than the comparatively small volume of consoles per quarter.
The tens of millions of units in aggregate of Jaguar/Puma APUs and various Sea Islands GPUs, and even Kaveri help provide sales volume where the consoles cannot.
The less said about how mobile SOCs sell an order of magnitude above that quarterly, the better.

If we believe Vgleaks and the diagram of an old Steamroller-based PS4 design, Sony had the choice between two expensive multiyear CPU projects and got to shrug one off.
To do the same in a from-scratch in-house effort would have been ruinous.

But also it's difficult to compare the design process of a chip costing around $17 to go in a phone with one costing $100 to go in a console. You can do a lot better with 5x the budget, more power and active cooling.
It's actually possible that the lower-cost high volume chip had more R&D go into it than the one that accepts higher per-unit costs. It's actually the general story of much of AMD's product mix versus Intel.
An Intel Core i3 might cost tens of dollars in silicon costs, but there is little doubt that Intel spent AMD into the ground to design it, given the process co-development and significant research and design costs that went into the cores, superior memory subsystem, LGA package, higher levels of integration, and superior per-unit costs.
Spend more now, and save money over a hundred million units. Be better than AMD, and charge more to boot.

Well that's one motherlode of an assumption there. In terms of cores and clocks, Apple's A7 is well behind the curve. It tops out at a modest 1.4GHz (iPad Air) and is dual-core. You think Apple simply couldn't have managed a quad-core design like so many others, or offered higher clocks? Why? I personally don't think they needed to, but not needing to and not being able to are quite different.
I haven't seen an architectural deep dive concerning Cyclone's features.
The apparent emphasis is on being physically optimized to have a lower max frequency in the current implementation.
Its width is fine at the low clocks it targets, but would be prohibitive if they were higher.
It has a 4-cycle load latency at low clocks, which might point to a cache subsystem that adds extra stages to help drive signals at lower voltages than a more tight load pipeline would permit.

Memory bandwidth is an order of magnitude below what the consoles work with, and getting a good memory subsystem is an even darker art than getting a good-enough core.
AMD's memory subsystem has some glaring deficiencies, but it is still beyond what mobile designs have tried to do, and outside of a few initial custom forays memory performance is a critical disadvantage that ARM is now working towards improving. The A7 is notable in that it improves upon things to some unspecified level.

The core is double the area of Jaguar, so there are likely physical tradeoffs for reduced leakage, and a lower clock ceiling can result.
I haven't seen discussion of the features of the A7's L2 and L2 interface, which is a rather significant consumer of area in Jaguar.

To compound the comparison, AMD's Puma core is Jaguar with functional turbo and bug-fixed power management.
This goes to show the complexity of Jaguar's less glamorous attributes and the limited engineering resources AMD had to devote to all its initiatives. For what it's worth, AMD's not-best autonomous power management is still quite good relative to most of the supposed competitors.

I personally struggle with calling either console a 'high performance' device.
For a consumer device of this type there is AMD, Intel, ARM in mobiles, and the last dregs of IBM's old designs.
The latter two are not in the same area code in terms of performance, at least not until the much higher core count A57 chips finally make their way to market 1.5 years after Jaguar (and possibly only this close because AMD's R&D was significantly disrupted at some point in the last few years).
AMD's chips are clearly inferior to Intel's, but can still reach some level of parity in some measures with the low- to mid-tier Intel consumer chips, and in that regard AMD has little company.
A console APU can manage the multithreaded CPU throughput of a single Steamroller chip, which nips at the lower end of Intel's i5 and more solidly matches i3.
Single-threaded, not so much.

If you want to go a bit further afield, there's POWER and perhaps Oracle's chips, but those are well out of consideration.
For graphics, the number of players in the same zip code as the consoles is two.

This is the second generation of HD consoles and 1080p still isn't being delivered often enough. Of the last four console generations the current crop are fairly lacklustre compared to technology in the mainstream PC space.
It's a price, power, and size constrained entertainment appliance whose traditional economic model is borderline unsustainable.
There are hundreds of watts of TDP the consoles can't have, and the prospects for cost reduction and power savings through node transitions are much poorer than they once were.
The last gen had a worse time of things than was expected, and if that were known in advance they might have scaled back as well.

Of course I offered ARM as a contender [for processor] in PlayStation 5. Microsoft and Sony don't need the best or fastest processor in a games console (look at what we have now), they just need one that's good enough.
As I mentioned, if AMD doesn't implode, then one good-enough contender is going to offer a good-enough custom ARM or a good-enough x86, with IP that is closely aligned with the existing platform.

I'm not sure what this comment is in regard to. If it's about the choice of an AMD APU this generation, Mark Cerny made an interesting comment that suggested it really wasn't a choice at all. It was the only viable option given their target launch window.
It was the viable option given their constraint that it all be one-chip, have a common CPU architecture, have a performant GPU, and be able to reach a final design in several years.
Had they relaxed the SOC constraint and release date, several other combinations would have been readily applicable.

AMD's CPUs are not the best CPUs, and Jaguar is not the fastest AMD CPU.
AMD's GPUs are not the best GPUs.
AMD's memory controllers and cache subsystems are not the best.
AMD's DVFS, physical design, IO, connectivity, compiler design, driver quality, 64-bit ISA, and so on are not the best either...

On the other hand, the ones that beat AMD in some of those categories were no-shows in others, which is why some unspecified defense contractor's super-ARM may not be enough to make it compelling if the designer's overall portfolio and expertise don't provide the total package.
 
http://www.tweaktown.com/news/38552...ill-be-here-quicker-than-last-time/index.html

"Both machines started fast, because gamers had been waiting for eight years for them. People were really anxious about having something different."

"I hope this gen will go more quickly also. My feeling is that because PC is growing fast and lots of people are trying to pick up the business on TV, that will put pressure on [platform holders] to not wait eight years next time. I think they will wait a lot less than that. First they need to make sure that they can sell the current machines at a lower price, so that there are enough games sold and enough extra content so that their platform is profitable. This will help the market".
 
This only contradicts my view if the roll-your-own-and-play-with-the-big-dogs approach is cheaper than contracting with a company that knows what it's doing for hundreds of millions of dollars.

ARM has issued 400 individual licences (although, because of the way ARM licence their IP - by architecture: Cortex A, R and M - there's some duplication), but that still leaves 200+ unique licensees. That's a lot of companies doing their own thing, which suggests that ARM core IP is not only an incredibly versatile, scalable and extensible architecture, but that it's easy to work with. The diverse scale of small-to-large organisations suggests it's also cost-friendly; 200+ companies would be busting their arses with it otherwise. You quote AMD as supporting this too.

There seems to be little argument that doing your own thing can be the most cost effective solution. And again, you don't have to be producing the best chip, only chips good enough.

If you believe AMD, the head of the server group presented last year with an estimated 400 million dollars and three years to design an x86 server processor versus 30 million and 1.5 years for an ARM.

I can well believe there is massive cost divide.

My view of the situation is that a console manufacturer starting from a non-presence in performant and manufacturable modern APU design faces more expense and risk for a device whose volumes are a fraction of what is considered sustainable on the modern market, and whose software as razorblades model nearly broke completely in the last generation.

I disagree that the basic economics of consoles is broken; last gen was not great for Sony and Microsoft due to fuckups. Sony let their project get crazy out of hand - it was as though there was no project management at all - and Microsoft incurred RRoD costs that could have been avoided with better design/testing. Those are not costs that you typically attribute to consoles - at least not Sony consoles.

Despite costs, the consoles started with an effectively finished and unmodified Jaguar CPU, and the fundamental framework of standard GCN. Starting fresh would have incurred the above costs and then added on years and tens to hundreds of millions of dollars more.

I don't follow here. Starting fresh from what? Both manufacturers started fresh from last gen.

If we believe Vgleaks and the diagram of an old Steamroller-based PS4 design, Sony had the choice between two expensive multiyear CPU projects and got to shrug one off.

My recollection of the VGleaks coverage was that this was a platform Sony was using as a very early development kit because it was close computationally to the final target hardware (4x 3.2GHz cores vs 8x 1.6GHz cores). I've not seen anything to suggest that was intended to be the final platform, but if you have something..... ?

It's actually possible that the lower-cost high volume chip had more R&D go into it than the one that accepts higher per-unit costs. It's actually the general story of much of AMD's product mix versus Intel.

Sure, it's possible. However, if you look at the R&D resources of Intel and AMD, then look at the R&D resources of ARM and all the ARM licensees, it looks unlikely.

Memory bandwidth is an order of magnitude below what the consoles work with, and getting a good memory subsystem is an even darker art than getting a good-enough core.

Right, but nobody is suggesting using an Apple A7 in a games console. Apple designed the A7 for phones and tablets, and I included it as an example of an ARM licensee that was using stock CPU cores and is now rolling its own. And in terms of getting a 64-bit processor into consumer products, Apple leapt ahead of ARM's own roadmap.

As I mentioned, if AMD doesn't implode, then one good-enough contender is going to offer a good-enough custom ARM or a good-enough x86, with IP that is closely aligned with the existing platform.

Finally we're talking the same language. Yeah, it only has to be good enough. And this is where AlNets came in for a solution to b/c. But I don't think b/c is in a console manufacturer's financial interest. I think it was Shifty who said it [in another thread]: they don't want you playing old games better on your new hardware, they want you buying new games for your new hardware.

It was the viable option given their constraint that it all be one-chip, have a common CPU architecture, have a performant GPU, and be able to reach a final design in several years. Had they relaxed the SOC constraint and release date, several other combinations would have been readily applicable.
I don't remember Sony ever saying a design constraint was one chip; indeed, Mark Cerny said completely the reverse at GameLabs last June. He said that "using Cell [in PS4] was an option" (his exact words). He also said that "we could have gone with a conventional architecture, with just a CPU and GPU". In fact the only design goals that Cerny stated were key were: 1) unified memory. 2) four or eight core CPU. 3) nothing exotic (no freaky ray-tracing GPU). 4) a familiar architecture.


Not everybody has been waiting that long, i.e. Sony didn't sell 80m PlayStation 3s at launch in 2006/07 ;) I believe the vast majority of PlayStation 3 sales have been in the latter half of the console's life - Rangers may be able to confirm. But it's less likely that anybody waiting that long to jump on a PlayStation 3 will be jumping on the newer consoles in the first year.
 
ARM has issued 400 individual licences (although, because of the way ARM licence their IP - by architecture: Cortex A, R and M - there's some duplication), but that still leaves 200+ unique licensees.
The number of architectural licensees was as of 2013 around 15.
That is the class of license that permits a custom core microarchitecture.
The rest don't want to mess with the CPUs, and the vast majority of them, in terms of device volume and possibly licensees, neither use nor even need a current process node.
Many uses of ARM cores rely on foundry-provided packages that do the physical validation and design for manufacturing work for them as well.
Save possibly for the A57 that hasn't come out yet, none of the other cores in any of those families would be close to matching the Jaguar position in terms of performance or 64-bit (or 32 or 16 depending) support.

There seems to be little argument that doing your own thing can be the most cost effective solution. And again, you don't have to be producing the best chip, only chips good enough.
Good enough in this case seems to be a rather awkward 2-module Jaguar CPU section, and no current standard ARM cores meet that bar.
Good-enough also moves around depending on who else is in the same price and capability range.


I disagree that the basic economics of consoles is broken; last gen was not great for Sony and Microsoft due to fuckups. Sony let their project get crazy out of hand - it was as though there was no project management at all
Maybe things like a CPU architecture they invested so much into that showed they had stepped beyond the limits of their architectural and leading-edge manufacturing expertise? The Cell experience led them to regress on both counts. They ditched leading-edge fabs and retreated from rolling their own microarchitectures.

and Microsoft incurred RRoD costs that could have been avoided with better design/testing.
Physical design and testing would be something Microsoft would have cooperated with AMD on for the current gen. RRoD occurred at the transition to lead-free solder, and Microsoft tried to do its GPU's physical package design in-house. Part of the fix, besides a process node transition, was involving outside help.

I don't follow here. Starting fresh from what? Both manufacturers started fresh from last gen.
Jaguar and GCN were already done or nearly so.
Last gen, the CPUs were not fully realized and neither was Xenos.

Sony and Microsoft were free to make changes at the periphery of the CPU section, and were able to customize the GPU and memory subsystems to a greater extent, although many significant elements are riffs on existing structures going back to earlier AMD chips.

My recollection of the VGleaks coverage was that this was a platform Sony was using as a very early development kit because it was close computationally to the final target hardware (4x 3.2GHz cores vs 8x 1.6GHz cores). I've not seen anything to suggest that was intended to be the final platform, but if you have something..... ?
Steamroller is what is in Kaveri, the processor that launched after the PS4 did, and it is the desktop chip that uses the Onion+ bus that Sony wanted for its design.
Having an expensive and high-performance internal CPU development project used solely to prototype a totally different internal development architecture would be an even greater extravagance for an internal CPU division.


Sure, it's possible. However, if you look at the R&D resources of Intel and AMD, then look at the R&D resources of ARM and all the ARM licensees, it looks unlikely.
It's possible, and it happens all the time when comparing an Intel chip to its equivalently priced AMD counterpart.


Right, but nobody is suggesting using an Apple A7 in a games console. Apple designed the A7 for phones and tablets, and I included it as an example of an ARM licensee that was using stock CPU cores and is now rolling its own.
And it happened because Apple is sitting on all the money, and the custom product is going into segments that will sell more units in 6 months than the PS4 ever will. This means Apple can accept the high costs and long lead times involved in starting a hardware design division, and it has the volumes and revenues from its own sales to justify the expense.

Neither Microsoft nor Sony can say the same, in terms of schedule, or the fact that they are using base hardware that the other uses, and which AMD sells millions of units of independently.

Finally we're talking the same language. Yeah, it only has to be good enough. And this is where AlNets came in for a solution to b/c. But I don't think b/c is in a console manufacturer's financial interest. I think it was Shifty who said it [in another thread]: they don't want you playing old games better on your new hardware, they want you buying new games for your new hardware.
Then create some arbitrary break in software compatibility, but keep the bulk of the tools and established development pipelines.

I don't remember Sony ever saying a design constraint was one chip; indeed, Mark Cerny said completely the reverse at GameLabs last June.
Then I misremembered the statement and had recalled the Durango design restriction instead.
 
The number of architectural licensees was as of 2013 around 15.
http://www.arm.com/products/processors/licensees.php Interesting that they don't even list Apple, but I'm not surprised to see certain others I'm aware of. Actually it looks like ARM are only highlighting vendors.

Good enough in this case seems to be a rather awkward 2-module Jaguar CPU section, and no current standard ARM cores meet that bar.
Again, PlayStation 5. Look forward.

Maybe things like a CPU architecture they invested so much into that showed they had stepped beyond the limits of their architectural and leading-edge manufacturing expertise? The Cell experience led them to regress on both counts. They ditched leading-edge fabs and retreated from rolling their own microarchitectures.

I'm lost on what this has to do with your assertion that the console economics model is broken? And if Cell was perceived so badly [internally at Sony], why did Mark Cerny even mention it as being considered as a possibility for the PS4? That makes no sense.

RRoD occurred at the transition to lead-free solder, and Microsoft tried to do its GPU's physical package design in-house. Part of the fix, besides a process node transition, was involving outside help.

We know why it happened; the fact that it did doesn't mean the console economics model is broken :???: I'm really lost at your responses here - it's like you completely reset the conversation every post and everything you said before didn't happen :???:

Jaguar and GCN were already done or nearly so. Last gen, the CPUs were not fully realized and neither was Xenos.

In 2007 when Sony started work on PlayStation 4? I don't think so. Again, flick back to the things Cerny said at GameLabs last June. They considered a revised Cell design; they had years to work on this. They could have gone with anything at all. It was developers who set the bounds and key objectives.

Having an expensive and high-performance internal CPU development project used solely to prototype a totally different internal development architecture would be an even greater extravagance for an internal CPU division.

If Steamroller was further along in silicon testing at that point, it would make perfect sense. Although it was further out on the product roadmap, it may have been a higher-priority project inside AMD. R&D completion of projects won't always linearly shadow a release roadmap.

And it happened because Apple is sitting on all the money, and the custom product is going into segments that will sell more units in 6 months than the PS4 ever will.

I don't follow. Apple aren't throwing crazy amounts of money at R&D even across the whole company (look at their financials) so they clearly aren't just throwing huge amounts of money at the problem to get ARM to do what they want it to. Ignoring Apple, what about all the other companies using ARM without vast resources? What are they using, magic?

Neither Microsoft nor Sony can say the same, in terms of schedule, or the fact that they are using base hardware that the other uses, and which AMD sells millions of units of independently.

What's this got to do with the next generation (PS5/XB4) of consoles? :???:
 
Again, PlayStation 5. Look forward.
At which point, the already cited 1-2 year delay in equivalent results between an architectural licensee's custom core and the bog-standard ARM cores comes into play.

I'm lost on what this has to do with your assertion that the console economics model is broken? And if Cell was perceived so badly [internally at Sony], why did Mark Cerny even mention it as being considered as a possibility for the PS4? That makes no sense.
It would have already been a sunk cost, and it could have reused many of the existing tools they'd spent 8 years on. The design expense, funds blown on in-house fabs, and the failure of Cell as a consumer electronics platform had already happened. It couldn't make them re-lose all that money, and I didn't interpret that as meaning Sony was going to reopen a full design project to create a new Cell microarchitecture.

The SPEs are not OoO and quite simple, while still being very good at what they do best. The PPE was something IBM might need to be contracted to replace, since it is already the contractor for managing the shrinks.
The wrinkle is that the implementation predates a lot of the modern power management techniques and has historically depended on SOI; the additional rework might have helped scuttle things.

We know why it happened; the fact that it did doesn't mean the console economics model is broken :???:
Physical implementation, design reliability, and risk management are part of what paying someone who knows what they are doing entails. Microsoft thought it could roll its own, and it got burned. Lead-free solder has been handled by now, but long-term device reliability is an area of active research because those challenges have gotten worse.

I'm really lost at your responses here - it's like you completely reset the conversation every post and everything you said before didn't happen :???:
I thought my position was consistent:

The refrain: Bespoke hardware is expensive. New hardware has gotten harder and will be getting harder to get right. If you aren't good at it already, getting good is expensive. Getting it wrong is expensive.
Examples of when custom designs are made involve massive investments to get them right, and then are only considered if the revenues and volumes justify the expense.

Sony and Microsoft paid someone who had already gotten it right, and a lot of the expense is being defrayed over multiple markets instead of relying solely on console hardware to make up for things.

That model was already failing last gen. Cell is a failure because it counted on multiple markets that never materialised.
Every design challenge and manufacturing challenge since then has only gotten worse, and neither console maker has shown sufficient growth in microarchitectural expertise. For Sony, it's the opposite. They've gotten worse at rolling their own, so back to the refrain.

Console volumes are small relative to the markets GCN and Jaguar go into, and minuscule relative to the volumes of custom ARM chips like Krait and Cyclone.


In 2007 when Sony started work on PlayStation 4?
They didn't contract AMD until much later. They didn't pay into AMD's R&D while GCN or Jaguar were finished up, so the risk and expenditures in that time period would have been all on AMD.

If steamroller was further along in silicon testing at that point, it would make perfect sense.
This is irrelevant to the comparison of rolling one's own architecture versus paying for one developed by someone with competence in the discipline.
AMD took on the bulk of the risk and has to worry about the monetization of Steamroller.
Sony isn't on the hook like it would be if there were a Sony Steamroller and Sony Jaguar in a multi-billion dollar internal development project.

I don't follow. Apple aren't throwing crazy amounts of money at R&D even across the whole company (look at their financials) so they clearly aren't just throwing huge amounts of money at the problem to get ARM to do what they want it to. Ignoring Apple, what about all the other companies using ARM without vast resources? What are they using, magic?
In the last 5 or so years, Apple's R&D has ramped up over 2 billion dollars a year.
That's enough to cover quite a bit, particularly since the core design costs cited by AMD are spread out over multiple years.
The AMD semicustom contract likely includes contracted wafer or unit purchases between AMD and Microsoft, which Apple would not account as R&D.
In addition, did Apple account for significant expenses such as the PA Semi acquisition outside of R&D?
Ramping up a high-end development division needed those expenditures as well.

What's this got to do with the next generation (PS5/XB4) of consoles? :???:
It goes to show they are going to greater lengths than last gen to avoid designing their own architectures.
They aren't interested in a console-only architecture, and don't even care if someone else is selling very similar hardware. They're actually counting on it.
 
At which point, the already cited 1-2 year delay in equivalent results between an architectural licensee's custom core and the bog-standard ARM cores comes into play.

But it's not 1-2 years of delay, it's 1-2 years of development during which you have complete control over your project and are not subject to the timetables of an external collaborator. In his interview with Eurogamer, Mark Cerny made a concerted point that the PlayStation 4 hardware was largely dictated by their partners and their timetables. Personally I think this pointed remark was indicating that it reduced their options, not opened them up.

But I challenge the 1-2 years for custom parts, given there are ARM licensees - who aren't big companies with lots of money, nor with evidence they have multiple chip teams - deploying custom ARM solutions at a faster pace.

The design expense, funds blown on in-house fabs, and the failure of Cell as a consumer electronics platform had already happened. It couldn't make them re-lose all that money, and I didn't interpret that as meaning Sony was going to reopen a full design project to create a new Cell microarchitecture.

It was Sony's choice to go all in on Cell and have manufacturing facilities. IBM continues research on Cell. But again, you are looking backwards and not looking forwards. Because that was Sony's approach in the past does not mean it is the only approach, or that it would have had to be that way for PlayStation 4 (or PlayStation 5).

The SPEs are not OoO and quite simple, while still being very good at what they do best. The PPE was something IBM might need to be contracted to replace, since it is already the contractor for managing the shrinks.

Nobody is suggesting that Sony would have rolled their own Cell part a second time. But Cell wasn't the discussion; it was a point I raised to disprove your assertion that Sony had a 'one-chip strategy' for PS4.

Physical implementation, design reliability, and risk management are part of what paying someone who knows what they are doing entails. Microsoft thought it could roll its own, and it got burned.

Yeah, with an IBM CPU and an AMD GPU. The choice and expertise of the chip designers were not the issue. :nope:

The refrain: Bespoke hardware is expensive. New hardware has gotten harder and will be getting harder to get right. If you aren't good at it already, getting good is expensive. Getting it wrong is expensive.

Except the entire ARM industry disproves this conventional wisdom. A diverse portfolio of small and large companies, embracing and deploying ARM solutions, some with no previous experience with processors. If ARM was hard to get right (as in x86 levels of complexity) then the 200+ companies using ARM would have left a trail of disastrous projects across the numerous product spaces. Let's count them... umm. well. Uh.

Examples of when custom designs are made involve massive investments to get them right, and then are only considered if the revenues and volumes justify the expense.

Custom will probably always be more expensive than standard. But again with the "massive investments" thing. ARM bucks this conventional wisdom; ARM's business model is the creation of inherently extensible, changeable, scalable processor cores without massive costs. It's why they are so successful. The console market remains very profitable - Sony and Microsoft aren't in this because of their philanthropic generosity to gamers :nope:. In terms of profits, Sony have got it more right (PS, PS2) than wrong (PS3). And yet PS3 became profitable (in real terms) a few years back. Last month Kaz Hirai predicted the PS4 would likely be more profitable than PS2 for them. If Sony wanted to do something extravagant, there's enough gold in them hills.

Sony and Microsoft paid someone who had already gotten it right, and a lot of the expense is being defrayed over multiple markets instead of relying solely on console hardware to make up for things.

That's not how business works. You don't seriously believe that AMD cut Microsoft and Sony a break on the chips because the IP was used in AMD's own product lines? Particularly given the lack of viable alternatives? :rolleyes: If this benefited anybody, it benefited AMD, because they got to pass along some of that original R&D cost to the console makers as well as charging for the changes they each wanted.

Console volumes are small relative to the markets GCN and Jaguar go into, and minuscule relative to the volumes of custom ARM chips like Krait and Cyclone.

What do the economics of the final platforms have to do with the economics of the chip R&D? You have a platform budget and somewhere in that you have a chip budget. x86, ARM, Cell, PowerPC - all have offerings that start low and work up.

They didn't contract AMD until much later. They didn't pay into AMD's R&D while GCN or Jaguar were finished up, so the risk and expenditures in that time period would have been all on AMD

Interesting, when did Microsoft and Sony first approach AMD?

This is irrelevant to the comparison of rolling one's own architecture versus paying for one developed by someone with competence in the discipline.

It is irrelevant, but you brought up the Steamroller leak, not me.

In the last 5 or so years, Apple's R&D has ramped up over 2 billion dollars a year. That's enough to cover quite a bit, particularly since the core design costs cited by AMD are spread out over multiple years.

It has, but Apple's acquisition of PA Semi was in 2008 (six years ago). Apple's massive investments in R&D have largely been attributed to the work being done to mass-produce sapphire glass and to US production facilities. I've seen no evidence that significant spend is going into ARM development, and why would it? Apple deploy one new chip a year. Other much smaller companies with much smaller R&D spend are more active.

The AMD semicustom contract likely includes contracted wafer or unit purchases between AMD and Microsoft, which Apple would not account as R&D. In addition, did Apple account for significant expenses such as the PA Semi acquisition outside of R&D?

The first half is speculation on your part, with no supporting evidence. The PA Semi acquisition, which cost Apple $278m, was six years ago. Peanuts for Apple, and also way below Sony's $400m acquisition of Gaikai. The increase in Apple's R&D spend has been attributed to investment by Apple to secure mass sapphire glass production. Apple are unusual in that they will heavily invest in their suppliers' production facilities in return for preferential supply and rates. They have a long history of doing this.

Ramping up a high-end development division needed those expenditures as well.

What high-end development division? Are we talking about Apple, Microsoft or Sony?

It goes to show they are going to greater lengths than last gen to avoid designing their own architectures.

That's assuming facts not in evidence. Even all these years later, it's not clear who in Sony was the driver for Cell. The original STI tripartite comprised Sony, Toshiba and IBM, but if you look back over the early STI papers it's Sony Corporation, not Sony Computer Entertainment (SCE) - they seemingly got sucked in later because of PlayStation 3. An alternative interpretation is that SCE got stuck with Cell because Sony had this grand vision to use one chip architecture across lots of products - games consoles, TVs, professional broadcast equipment and so on. It looked like a sensible strategic move, and may have been if it weren't for PS3, where the architecture was an issue not only for Sony's teams but also for PlayStation third-party developers. But if they were avoiding their own thing, like you think, why consider Cell at all for PlayStation 4? Mark Cerny said it was an option.
 
But it's not 1-2 years of delay, it's 1-2 years of development, during which you have complete control over your project and are not subject to the timetables of an external collaborator.
The two cases I'm discussing are 1) Sony using a custom core from someone else and 2) using an equivalent standard ARM core.
An equivalent standard core won't be out for a year or two after the custom one, at least.

The scenario you are discussing is having in-house development of a custom core.
If Cerny was dithering over whether to go with ARM or x86 up until several years ago, he deserved to have his options limited.
That choice would have needed full commitment six or more years ago, and I'd wager that the end result would be inferior.


It was Sony's choice to go all in on Cell and have manufacturing facilities. IBM continues research on Cell.
Which project would that be?

Yeah, with an IBM CPU and an AMD GPU. The choice and expertise of the chip designers weren't the issue. :nope:
The GPU was the reason for RRoD, and it had to do with the mechanical failure of the physical chip package. That part was Microsoft's, and it involved the hardware designers when it came to fixing it. With the high power densities and increasingly exotic and fragile materials and structures, that is a big part of chip design as long as we assume people are designing chips that are supposed to work in the real world.

Except the entire ARM industry disprove this conventional wisdom. A diverse portfolio of small and large companies, embracing and deploying ARM solutions, no previous experience in processors. If ARM was hard to get right (as in x86 levels of complexity) the 200+ companies using ARM would have left a trail of disastrous projects across the numerous product spaces. Let's count them... umm. well. Uh.
15 or so companies need to worry about getting a complex new architecture right.
200+ take the architecture as-is. They paid ARM or a foundry to get it right for them, with the drawback that they get a generic solution that can't match contemporaneous optimized designs.
Most of the general licensees are not on a process from this decade, and don't care.


Custom will probably always be more expensive than standard. But again with the "massive investments" thing. ARM bucks this conventional wisdom; ARM's business model is the creation of inherently extensible, changeable, scalable processor cores without massive costs.
The vast majority of ARM cores are not extensible or changeable. That's a different level of license, and the vast majority of licensees don't care.

Last month Kaz Hirai commented that PS4 would likely be more profitable than PS2 for them. If Sony wanted to do something extravagant, there's enough gold in them hills.
Are there numbers for the PS2's lifetime profit, inflation adjusted hopefully?

That's not how business works. You don't seriously believe that AMD cut Microsoft and Sony a break on the chips because the IP was used in AMD's own product lines? Particularly given the lack of viable alternatives? :rolleyes:
Its outside product lines are one factor that probably allowed AMD to bid lower than most other options would have been willing to.
As AMD described, it is willing to do semi-custom work for clients with a product with at least a hundred million dollars or so of revenue minimum, and if the client is willing to pay for the incremental R&D.

If this benefited anybody, it benefited AMD, because they got to pass along some of that original R&D cost to the console makers as well as charging for the changes they each wanted.
It benefits everybody involved.
AMD leverages its IP for money, which is good because the console market is one place where its level of mediocrity isn't as apparent.
AMD's CPUs cannot penetrate the mobile market, and cannot gain significant ground in desktop.
Its GPUs are second-best in a small market segment, and have lost serious ground in mobile.

Sony and Microsoft get a CPU and GPU they wouldn't be able to design on their own in several years, and their expenditures are closer to the R&D cost over the base IP AMD has already paid into.

(Addendum: They also are likely contracting yield improvement and manufacturing risk onto AMD. Already very difficult node transitions would also probably be handled by AMD if it is paid for them.)

This is good, because Sony is falling apart, and large investors want Microsoft to sell off Xbox because they are sick of a decade of ROI vastly inferior to other MS initiatives.

What do the economics of the final platforms have to do with the economics of the chip R&D?
My assumption is that the chips being researched are supposed to enhance the profitability of the platform.


Interesting, when did Microsoft and Sony first approach AMD?
The Xbox interview put the design as being laid down 2-3 years ago. I'm trying to find the PS4 timeline, which I think was in the same range to a bit longer.

It is irrelevant, but you brought up the Steamroller leak, not me.
To restate the point I'm trying to make: whether or not Steamroller was further along for AMD is not relevant to a hypothetical scenario where Sony is engineering two of its own core designs.
The idea that a division would build two expensive architectures, and use one of them only as an early debug platform is crazy.

The fact that it was even an option only existed because of the involvement of an outside company that is able to make money off of both of them.
If it weren't an outside company's architecture, Sony should have fired everyone involved.

It has, but Apple's acquisition of PA Semi was in 2008 (six years ago). The massive investments in R&D have largely been attributed to the work being done to mass-produce sapphire glass. There's no evidence that significant spend is going into ARM development, and why would they? They deploy one new chip a year.
Here's the graph I'm using to get a picture of the R&D increase.

http://www.tuaw.com/2014/02/12/a-look-at-apples-randd-expenditures-from-1995-2013/

Something averaging over $2 billion per year over the last six years.
Let's say $200 million for a non-server mobile architecture instead of the price cited by AMD. It's probably not even that.
I don't see how fitting $33 million a year in there is a challenge.
Apple had the time, and the revenues for this.
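
Just to spell out the back-of-envelope arithmetic behind that "$33 million a year" (a minimal sketch; the $2 billion, $200 million and six-year figures are the rough working assumptions from this post, not reported numbers):

```python
# Rough amortisation sketch using the assumed figures from this post.
apple_rnd_per_year = 2_000_000_000  # assumed average annual R&D spend, USD
core_design_cost = 200_000_000      # assumed total cost of a mobile core design, USD
design_span_years = 6               # assumed years the design effort is spread over

annual_design_cost = core_design_cost / design_span_years   # roughly $33M per year
share_of_budget = annual_design_cost / apple_rnd_per_year    # roughly 1.7% of the budget

print(f"~${annual_design_cost / 1e6:.0f}M per year, ~{share_of_budget:.1%} of annual R&D")
```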

That same amount would actually be noticeable in Sony's profit-loss statements these days.

The first half is speculation on your part, with no supporting evidence.
There is some evidence.
One is that AMD didn't report several billion dollars of unexplained revenue, and we know the multiple billions of dollars that go into AMD's wafer purchases over the years, which Microsoft would be paying AMD for.
The revenues for AMD in the R&D period don't have billions of dollars padding them out, so it looks like money is coming during the mass production phase.

What high-end development division? Are we talking about Apple, Microsoft or Sony?
Apple produced a 64-bit aggressively OoO architecture on a relatively leading edge process with IPC in that limited product window in the same area as a ULV Haswell.
That's high end, particularly given the inferior mobile cores it is compared to.

That's assuming facts not in evidence. Even all these years later, it's not clear who in Sony was the driver for Cell.
IBM's original goal was a homogeneous solution, which it had previously created and has been using since.
Toshiba liked the SPEs, which resembled other media processors it used, and the stripped-down SPU products it tried to make people use later.
Sony split the difference.

An alternative interpretation is SCE got stuck with Cell because Sony had this grand vision to use one chip architecture across lots of products - games consoles, TVs, professional broadcast equipment and so on.
This is exactly what I stated Cell was intended to be, and what it needed to have happen to justify the investment.
It didn't happen.

But if they were avoiding their own thing, like you think, why consider Cell at all for PlayStation 4? Mark Cerny said it was an option.
As stated, it's a sunk cost. The disaster had already happened and they could reuse 8 years' worth of development infrastructure, which first-party teams were fine with.
 
The two cases I'm discussing are 1) Sony using a custom core from someone else and 2) using an equivalent standard ARM core.
An equivalent standard core won't be out for a year or two after the custom one, at least.
Then we're not discussing the same thing. I'm defending, against your contention, that ARM is a contender for PlayStation 5. My only position here is that ARM would be viable for PlayStation 5, which I also said I would not expect to appear until at least five years after the PlayStation 4's release - these are my first two posts in this thread.

The scenario you are discussing is having in-house development of a custom core.

Well I feel like I've been brought here like a hostage! In every post you are looking backwards and not forward [to PS5], so to address your 'ARM isn't powerful enough for a console' stance, I figured it would at least be interesting to discuss Sony doing their own ARM chip. Then we did the whole tangent of cost/time/motivation. And now we're here. Can I go now? Stockholm syndrome is kicking in :yep2:

Which project would that be?
It's IBM, they are never that specific. http://www.heise.de/newsticker/meldung/SC09-IBM-laesst-Cell-Prozessor-auslaufen-864497.html

The GPU was the reason for RRoD, and it had to do with the mechanical failure of the physical chip package.
This started out as you arguing that rolling your own CPU was risky (I don't debate this), but your use of RRoD as an example isn't a good one because the problem was not a result of Microsoft rolling their own chip - IBM and AMD were their chip vendors. I could distort this by saying that if only Microsoft had designed the chip themselves, the experience would have furnished them with the necessary information to prevent that failure from happening.

15 or so companies need to worry about getting a complex new architecture right.
Ok, let's go with fifteen. So there must surely be a whole bunch of disastrous projects out there? No? Perhaps because it's not really that monumental a task.

Are there numbers for the PS2's lifetime profit, inflation adjusted hopefully?
Well the PlayStation 2 was on sale for 13 years. You'd need to derive the PS2 profits from each year and calculate for inflation then add them all together while removing R&D costs. I've never felt the urge to do this, myself!

Its outside product lines are one factor that probably allowed AMD to bid lower than what most other options would have been willing to do.

Well we don't really know what went on. You say bid, as though Sony (and Microsoft) let vendors bid, but from what Nvidia indicated (I'm detecting a touch of sour grapes, so this may be a bit of bravado) it wasn't that AMD bid less, it's that Nvidia weren't interested in whatever royalties were available.

This is good, because Sony is falling apart, and large investors want Microsoft to sell off Xbox because they are sick of a decade of ROI vastly inferior to other MS initiatives.
Sony have their share of investors wanting the company to sell PlayStation, but Loeb's incentive is so he can invest in PlayStation, which has traditionally been profitable, whereas other parts of Sony have not been. Sony has just been through a restructuring (arguably so has Microsoft, in management terms) to shed their unprofitable businesses. That's cost them, and it's why their books look so bad.

The original Xbox's short showing aside, and RRoD aside, I can't understand how Microsoft could really be making a loss on Xbox given they have the same economics as Sony but have benefited from Live revenue to a far greater extent than PS+. Unless Xbox execs are behaving like Al Pacino in Scarface(!). But the lack of visibility on Xbox finances, because of its inclusion with other business units within Microsoft, doesn't help the negative speculation.

The Xbox interview put the design being laid down 2-3 years ago. I'm trying to find the PS4 timeline, which I think was in the same range to a bit longer.
The only person who's really spoken about the PS4 development process is Cerny and he was vague about the latter timelines - like the traditional R&D cycle. But I'm also interested.

Here's the graph I'm using to get a picture of the R&D increase.

http://www.tuaw.com/2014/02/12/a-look-at-apples-randd-expenditures-from-1995-2013/
If you look at the financial calls, Apple will vaguely attribute R&D increases to [new] product research and new production. We know they put a fair amount of effort into TV because Jobs spoke about it. We've never seen a TV, but the effort may not have been hardware/software R&D so much as policy/market/legal exploration. They're supposedly looking at wearables. We know they've been sinking a lot into sapphire glass production because that amount of money sunk into a small manufacturing industry makes waves. But unless their CPU guys are like Al Pacino in Scarface, I can't see how they would be burning through large amounts of $$ on R&D for a chip a year.

That same amount would actually be noticeable in Sony's profit-loss statements these days.
Unfortunately Sony don't go into that much detail on their financial reports. For example, this is how Sony report (but with no real figures) the quarter covering PS4.

Sony 2013 FQ2 said:
In the IP&S segment, operating income decreased year-on-year mainly due to a decrease in sales of video cameras. In the Game segment, operating loss significantly increased year-on-year primarily due to an increase in research and development expenses related to the upcoming introduction of the PlayStation®4 (“PS4”) and the impact of a strategic price reduction for the PS Vita. In the MP&C segment, operating results significantly improved year-on-year primarily due to a significant increase in sales of smartphones.

That's just a couple of paragraphs from a cover that goes on for almost a page of A4. They do cover the entire organisation, but R&D is spread across consumer products (TV, games, cameras, A/V), medical, financial services, movies, music.

There is some evidence. One is that AMD didn't report several billion dollars of unexplained revenue, and we know the multiple billions of dollars that go into AMD's wafer purchases over the years, which Microsoft would be paying AMD for.

But unless you know what the payment/recompense agreement between AMD and Microsoft was, this could mean something or nothing.

Apple produced a 64-bit aggressively OoO architecture on a relatively leading edge process with IPC in that limited product window in the same area as a ULV Haswell. That's high end, particularly given the inferior mobile cores it is compared to.

Ok, so Apple. I agree, the A7 looks nice. Other companies have taken a more brute-force approach. Weren't Samsung putting 8-core Exynos processors into phones last year? Others are doing it old school, cranking the clocks into the 2GHz-and-above bracket. ARM. Choices.

This is exactly what I stated Cell was intended to be, and what it needed to have happen to justify the investment. It didn't happen.

Not to the degree of the original plan, no. But Sony launched Cell-based professional video editing equipment and they launched Bravia TVs with Cell chips in them (as did Toshiba). Of course it soon became cheaper to just use an embedded conventional processor and a purpose-built DSP.
 
Then we're not discussing the same thing. I'm defending, against your contention, that ARM is a contender for PlayStation 5.
I'm making the distinction between ARM as a standard and the implementers of the ISA.
The historical pattern is that custom cores, should they be optimized for performance, can outpace the standard cores by multiple years.

Then, if the desire is for a custom core, an implementer with a viable core has to be found, and their experience and other IP that the consoles also need would weigh in on that choice.
Should things go its way, the implementer with a higher-performance core and the other IP beyond the CPU would also have an x86 core that should be equivalent.
That seems pretty compelling.

Well I feel like I've been brought here like a hostage! In every post you are looking backwards and not forward [to PS5], so to address your 'ARM isn't powerful enough for a console' stance, I figured it would at least be interesting to discuss Sony doing their own ARM chip. Then we did the whole tangent of cost/time/motivation. And now we're here. Can I go now? Stockholm syndrome is kicking in :yep2:
I did discuss it, in how Sony has not demonstrated that it has the means, desire, or economic return to do so.
And with discussion speculating on a shorter time frame for the generation, it would have needed to start before the PS4 launched.


So going by the date of the article we should be expecting the non-specific project to poke its head out around 2011 or 2012?

This started out as you arguing that rolling your own CPU was risky (I don't debate this) but your use of RRoD as an example isn't a good one because the problem was not as a result of Microsoft rolling their own chip - IBM and AMD were their chip vendors.
The dies were from the vendors; Microsoft tried to do the much simpler physical packaging itself instead of letting the experts do it, and lost a billion dollars as a result.

Ok, let's go with fifteen. So there must surely be a whole bunch of disastrous projects out there? No? Perhaps because it's not really that monumental a task.
The architectural licensees whose chips I've seen discussed include vendors for mobiles, high-end networking, and dense server applications where there is volume or revenue to match.
Many of the ones I know of have established experience in doing this sort of work.
Samsung is one that has volume and reason to customize cores in order to avoid the race to the bottom that standard cores are engaged in. Its lack of experience in doing so may explain some of the rumors about things not going so well.

Other licensees may have custom cores, but not necessarily cores customized to fit a console or designs they care to talk about or sell. A few of them are competitors to either Sony or Microsoft.

Well the PlayStation 2 was on sale for 13 years. You'd need to derive the PS2 profits from each year and calculate for inflation then add them all together while removing R&D costs. I've never felt the urge to do this, myself!
It's possible then that the amounts in question are comparatively small relative to the total revenues the architectural licensees are pursuing.

Well we don't really know what went on. You say bid, as though Sony (and Microsoft) let vendors bid, but from what Nvidia indicated (I'm detecting a touch of sour grapes, so this may be a bit of bravado) it wasn't that AMD bid less, it's that Nvidia weren't interested in whatever royalties were available.
I'm sure they didn't want more money. Although if you want to take them at their word, why assume many of the architectural licensees with better things to do would want to engage in such a deal either?

But unless their CPU guys are like Al Pacino in Scarface, I can't see how they would be burning through large amounts of $$ on R&D for a chip a year.
I gave a six year window, during which R&D grew by billions of dollars a year, and I saw no difficulty in hiding a fraction of a billion dollars per year in the bigger pool.
It's a much more tractable undertaking when the economics of the vendor's product lines give it incredible volumes or revenues.

Ok, so Apple. I agree, the A7 looks nice. Other companies have taken a more brute-force approach. Weren't Samsung putting 8-core Exynos processors into phones last year? Others are doing it old school, cranking the clocks into the 2GHz-and-above bracket. ARM. Choices.
Those solutions are inferior in performance, 32-bit, and are architecturally maxed out relative to the consoles.
Samsung also screwed up cache coherence for the first run of product, so I don't see what the choices are.

Custom architectures lead the bog-standard ones in release schedule and capability, since standard cores must cater to a broader market that is ill-served by the things consoles need.
 
Now that your discussion seems to have run its course, let me chip in with an opinion on the original topic.
I believe that for a new console to have a good chance of acceptance, it needs to bring something new to the table. Otherwise it will simply be seen as an expensive device doing nothing new compared to what the consumer already has.
So far, for Sony and Microsoft, graphics has been the prime differentiator between generations. (With Nintendo proving that it is not the only way to differentiate new from old.)
Assuming that Sony and possibly Microsoft want to use graphics as a differentiator again, just how much time needs to pass for graphics to make a "generational leap"? As far as I can see, given that realism in rendering is an asymptotic process, it would take at least three lithographic generations to achieve, perhaps more. Incidentally, if you want to do a reasonably good job of VR (as an example of an alternative generational differentiator), it would require at least 4-8 times the graphical performance of the current generation of consoles if the graphical quality is not going to take a significant step back. Which again implies at least three generations of lithographic progress. Which would be what in TSMC's parlance is called their 7nm node.
At TSMC's latest annual technology summit at the end of April, they anticipated that risk production on this node would begin mid-2017. Grains of salt strongly recommended, of course. Assuming they are not completely off, console chips could conceivably go into volume production in 2019, making launch before Christmas of 2019 an early, but possible, target.
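As a rough sanity check on the "three generations" figure, here's a minimal sketch of the idealized arithmetic; it assumes roughly 2x transistor density per full node and performance scaling with transistor budget at constant die size, which real-world power, clock and cost-per-transistor constraints won't fully deliver:

```python
# Idealized node-scaling sketch (assumptions only): ~2x transistor density
# per full lithographic node, performance roughly tracking transistor budget
# at constant die size. Real scaling is worse than this.
density_gain_per_node = 2.0
nodes = 3                      # e.g. 28nm -> 20 -> 10 -> 7 in TSMC's naming

transistor_budget_gain = density_gain_per_node ** nodes
print(f"~{transistor_budget_gain:.0f}x transistor budget at constant die size")  # ~8x

vr_floor, vr_ceiling = 4, 8    # the 4-8x GPU performance floated above for decent VR
print(f"a {vr_floor}-{vr_ceiling}x target falls within that envelope")
```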

An alternative would be TSMC 10nm, going into risk production at the end of 2015, (which makes you wonder a bit at just how much will really change between 10 -> 7), which could allow a new console in 2018 or less likely late 2017. But I don't see that anyone really has a lot to gain by making this console generation much shorter than the previous, nor do I believe that a perceptually clearly smaller step in graphical fidelity would go over well with the early adopters. So - Autumn 2019 would be my guess for launch target if the market doesn't change markedly in the coming couple of years.
 
The primary cost for high-performance SoCs is in the memory subsystem: the proportion of on-die real estate put aside for local memories, caches and memory PHYs; the wide and fast external buses (with their high power consumption adding constraints as well as costs); and the DRAMs themselves all add up.

The next generation of consoles won't happen before the tight integration with DRAM becomes mature. I'd expect a minimum of five years, more likely more. A launch in 2021/2 seems likely.

What we will see is erosion from the bottom by mobile SOCs in tablets and the Google/Apple TVs of the world. I would expect this to be similar to the boom/bust of PC gaming. The initial boom fuelled by the ever larger capabilities of systems, and the subsequent bust when these systems end up with a million different hardware configurations/performance points.

Cheers
 