Predict: The Next Generation Console Tech

I think the markets for PCs, servers and netbooks/laptops are far larger and more lucrative than the cut-throat, thin-margin console business.

Sony, MS and Nintendo buy chip designs, right? Not production.

So getting x billion just to design a chip gives you free R&D (since the company that paid for the design covers the development). Creating a console chip that is supposed to hold up over time can produce valuable knowledge.

After that it is up to MS, Sony and Nintendo to reduce prices by selecting manufacturers. No sweat for Intel.
 
Shifty nailed it: the millions of computers sold every year (desktops, notebooks, netbooks, servers) are a much bigger AND higher-margin market than consoles. Especially when their best process node is capacity-constrained, hence the 45 nm IGP paired with the 32 nm Arrandale. If Intel goes into the console business, they won't be using their best process nodes, which sort of defeats the purpose of choosing Intel in the first place.

The biggest semiconductor markets are computers and servers, then I'd guess mobile devices and GPUs, not consoles. Intel would try to compete in those markets first, especially the GPU market, since GPGPUs will be threatening to take over from CPUs in the next 10 years.

True, but a console is a single product with a single chip design that stays in production over a span of 10 years; nothing can match that. PCs, servers and so on require you to constantly spend on R&D and marketing and to set up new production.

A contract for one product that you produce over 10 years is a great deal, a stable foundation. Why do you think Apple gets such good deals from its manufacturers? Because it offers long-term business, which means security.

The console market is growing each generation: so far 140 million machines sold, and they all use only three IBM chips. Show me three individual chips from one company that have been produced in larger quantities.

A next-gen console could also help Intel gain ground in the GPU area: if they could get one of the console makers to use an Intel GPU, they would get instant support for Intel GPUs from many developers.
 
I think the various models of specific processors, at their various speeds on the market, would rank very high in terms of numbers produced. The Conroe chip in Intel's Core 2 Duo line was sold at almost a dozen different speeds: same chip, different clock rates (and voltages, multipliers, etc.). Especially given how successful Conroe was, there must have been many millions of those produced by the end of its run.
 
The console market is growing each generation: so far 140 million machines sold, and they all use only three IBM chips. Show me three individual chips from one company that have been produced in larger quantities.

Core 2 Duo
Core 2 Quad
Pentium
Pentium Pro
Pentium II
Pentium III
Pentium 4
etc

You do realize that the yearly PC market is in the 250+ million range, right?
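
For rough scale (the 140M consoles and the three IBM chips -- Xenon, Cell, Broadway -- are from the post being replied to; the four-year window is my own round assumption), a quick back-of-the-envelope comparison:

# Rough volume comparison; the generation length is an assumed figure
console_units = 140e6        # consoles sold so far this gen (per post above)
chip_designs = 3             # Xenon, Cell, Broadway: the three IBM chips
gen_years = 4                # assumption: roughly four years into the gen
pc_units_per_year = 250e6    # yearly PC market (per post above)

print(console_units / gen_years)      # ~35M consoles/year across 3 designs
print(pc_units_per_year * gen_years)  # ~1e9 PCs over the same period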
 
I would not worry that much about PS4 not using CELL; Sony made too big an investment in CELL to just discard it freely.

Just want to point out that this thinking is a little off, and has been discussed numerous times (in this thread alone multiple times). In terms of a pure business decision, you weigh your costs going forward rather than your expenses in the past.
 
So basically you weigh the return of the current investment against the cost of a new investment and how fast that would start to pay off.
I agree that the cost of the current investment is irrelevant to the decision of what to choose going forward. It's the payoff of the current investment going forward that is relevant.
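
A minimal sketch of that logic, with completely made-up numbers just to show where the sunk cost does (not) enter:

# All figures invented for illustration; only future costs/payoffs matter
sunk_cell_rnd = 2.0e9                       # past Cell R&D: excluded entirely
keep_cost,   keep_payoff   = 0.5e9, 3.0e9   # evolve Cell: tools already exist
switch_cost, switch_payoff = 1.5e9, 4.5e9   # new arch: new tools, new deals

keep_net   = keep_payoff - keep_cost        # 2.5e9
switch_net = switch_payoff - switch_cost    # 3.0e9
print("switch" if switch_net > keep_net else "keep")  # sunk cost never used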

We can only guess how well Cell is paying off at the moment, but we can see more and more games taking advantage of its high processing power: new AA algorithms and other post-processing, etc.

If Cell's power draw and cost come down after the next die shrink, which I expect next year, Toshiba may start using it in their mainstream TVs (not just high-end) and some economies of scale may kick in. Of course that is just speculation, but I use it as an example of how Cell can pay off, and we can probably assume it is a pretty long-term investment. I see no reason not to believe Sony's talk of a ten-year life cycle; the PS3 is the only console at the moment with growing sales figures, so its sales have not peaked yet, and a next-generation console (PS4) is still far away.

The knowledge of the esoteric CPU design built up in Sony's internal studios is another huge investment that can give a pretty good return, i.e. instead of upgrading the hardware you are constantly upgrading/optimising the software, and Cell may have more room for optimisation than more traditional CPU designs: more ways to distribute computational tasks, etc.
 
If a mega-fast single-core, single-threaded CPU were to be PS4's CPU, then those skills might not be THAT useful, but if they go with a multi-core design, whether cache-based or not, those skills will be useful (I think that a very good Cell coder should have little problem handling a GF100 under OpenCL/DirectCompute/CUDA, or a 12-core multi-threaded Intel CPU).
 
I didn't want to write too much on the topic in my reply above because, as I mentioned, it has been discussed numerous times. Normally the pro-Cell argument is that, given devs' familiarity with the architecture by now, Sony would be doing themselves a disservice to throw the now-maturing tools, dev experience, and knowledge base out the window and 'start over.' But my point was more: if Intel makes it a compelling proposition on a cost basis, do we delude ourselves into thinking that they could not also offer robust tools and familiar environments for developers out of the gate?

It's not about downplaying the distance Cell development has come, but acknowledging where competitors may be able to mitigate those gains with quick-ramp alternatives that also make financial/performance sense.

I'm a fan of Cell myself and its design philosophies, so I'm not advocating one way or the other here; it's simply that if the rumors are true, the fate of the architecture going forward in PlayStation has already been decided, and we need to hypothesize within that context. I would be very happy from an enthusiast's standpoint if Cell were clocked/tweaked/evolved in a new incarnation for PS4 and that was the CPU. And I think if Sony were deciding the fate of Cell next year vs. last year, its odds would look better and better, but unless we have any news to the contrary (and if so, someone please share), it was seemingly decided last year.
 
I would not worry that much about PS4 not using CELL; Sony made too big an investment in CELL to just discard it freely. It would make much more business sense for them to provide, from the get-go, the software tools they did not have (or were late with) on the PS3, and bump up the quantitative characteristics of CELL in the PS4. Double the SPEs, offer middleware similar to DICE's Frostbite engine so developers who don't feel like designing a packet-based pipeline wouldn't have to reinvent the wheel, let alone feel intimidated by the platform; voila, there's your PS4 core architecture, the done-right successor to the, well, revolutionary PS3.

PS4 will be x86 based... :p.

Not that I'd like that... a CELL v2.0 + massively multi-core PowerVR GPU (used mostly as a pixel shading tool) + shared memory pool (UMA) could be nice :D.

No need to worry about the DX11 pipeline up until the triangle-setup stage... everything handled in software by CELL's SPEs, and the PowerVR TBDR would pick up the generated display list and perform the rest of the magic, outputting the final framebuffer back into the shared memory pool, with the display controller reading straight from there to put the output on screen. If CELL wanted to run some post-processing too... well... the framebuffer is right there (and more, if you output more data from the GPU back into the UMA pool)...
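
A toy walk-through of that frame flow (every name and behaviour here is invented; it only illustrates the data movement being described):

uma = {}                                  # the shared memory pool (UMA)

def spe_geometry(scene):                  # SPEs: software stages up to setup
    return "display_list(" + scene + ")"

def tbdr_render(display_list):            # PowerVR TBDR: per-tile raster/shade
    return "framebuffer(" + display_list + ")"

uma["dlist"] = spe_geometry("frame")      # SPEs write the list into UMA
uma["fb"] = tbdr_render(uma["dlist"])     # GPU reads it, writes the FB to UMA
uma["fb"] += " + cell_postfx"             # Cell post-processes in place
print(uma["fb"])                          # display controller scans out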

Split memory pools seem awesome IMHO when you go for the best performance possible (but then you'd also do well not to cripple one direction of the two-way data-sharing link like Sony did with PS3)... shared memory pools win if you want less trouble reaching peak performance (if you are sharing lots of data between CPU and GPU, you are going to waste bandwidth and add latency moving data around, and waste space on duplicated data sets/copy buffers...).
 
PS4 will be x86 based... :p.

You know something we... don't?

Edit: Scratch the price talk... I was half awake... those were normal retail prices... :oops:

WRT the i7... I wonder how much power is left untapped? Even the newest games like Dragon Age and Metro use a relaxed quarter of each of the four cores, with no HT... We have seen the i7 ripping through non-game CPU tasks like a very powerful CPU... but would such power be wasted on a PS4 that only needs to run HD games and media? What kind of unique software could be made on a PS4/720 powered by x86 that couldn't be done with the crappy IBM PPU?
 
+1TB/s RAMBUS
What kind of bus width would be needed to hit that data rate with the XDR tech projected to be available at the time of launch? Surely considerably wider than the current 64 bits (differential; 128 pins)...?

Not that I'd ever expect such ludicrous bandwidth to begin with, but hey... Speculation is totally free, and fun! :D
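
Free speculation invites free arithmetic, so here's a rough sizing (PS3's 64-bit, 3.2 GHz-effective XDR is real; the faster per-pin rates are pure assumptions):

target_bw = 1e12                        # 1 TB/s target, in bytes per second
for gbit_per_pin in (3.2, 6.4, 12.8):   # effective data rate per pin
    pins = target_bw / (gbit_per_pin * 1e9 / 8)
    print(gbit_per_pin, "Gb/s/pin ->", round(pins), "data pins")
# 3.2 -> 2500, 6.4 -> 1250, 12.8 -> 625 (vs the 64-bit bus in PS3)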
 
Just want to point out that this thinking is a little off, and has been discussed numerous times (in this thread alone multiple times). In terms of a pure business decision, you weigh your costs going forward rather than your expenses in the past.
I don't know how I left that impression. Let me try again.

The costs of going forward with a fixed software stack seem far preferable to me to switching HW architectures. That said, IANAB (I'm not a business [guy]), so things may look skewed from my end.
 
They would seem preferable, though, due to the benefits in both learning curve and dev familiarity/tool costs; yet it was the architecture itself that caused headaches in those very areas. And purportedly, if previous rumblings are true, a developer poll on whether to stay with or drop Cell is part of the reason for the would-be switch. Again, this was during a darker time, development-wise, for the architecture than the present day, but HW decisions need to be made in advance, and once made they are not easily changed.
 
The weird thing is that the fundamental reason why Cell is what it is, and any trouble that may cause, did not go away. Cell's designers did not toss the traditional memory model out by accident. It is a willful exchange for more core scalability, which leads to higher throughput per area. It is a very deliberate shuffling of complexity out of the hardware (which incurs material costs per unit) into the software (which incurs one-time costs but is then either free or very cheap to replicate at volume). Sony is in the business of mass-manufacturing things. They'd typically choose the solution that approaches the lowest cost when (and this is the important part) you assume very high-volume runs. They have an observed tendency to assume that they can single-handedly drive volumes of whatever component they integrate straight to viability.
Cell is not a surprising design for Sony to pick at all.

Coherent caches may be "nicer", but the cost of coherency scales steeply with the number of cores. For one or two or three cores, sure, coherent memory is a no-brainer. But when you look at the lengths Intel and AMD are going to for their "many-core" x86 CPUs -- which, for the record, are two to three full process nodes beyond the initial CBE design and run at roughly the same clocks -- you can already see it piling up.

So, more cores or a "nicer" memory model? They compete for the same die space, and it only gets worse the more cores you set as your baseline. What is the baseline for a next-gen CPU? Eight cores at 3.2 GHz with 8-way SP FMA... again? Or less?
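
To put a toy number on that scaling (this models naive broadcast snooping only; real directory-based protocols exist precisely to tame it):

# Naive broadcast snooping: every core's miss must be checked by all others
for cores in (2, 4, 8, 16, 32):
    print(cores, "cores ->", cores * (cores - 1), "snoop checks per round")
# 2 -> 2, 4 -> 12, 8 -> 56, 16 -> 240, 32 -> 992: roughly N^2 growth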
 
"Coherent" and "cached" are not synonyms... see Intel's latest massive multi-core research project (the 48-core SCC) and LRB: both have caches instead of local stores, but the former does not have HW cache coherency across cores while the latter does.

http://www.google.com/url?q=http://...zgQoAA&usg=AFQjCNFWj0lzLkw2IbdjCavSpZPrW_3cdw

CELL could have a future... IBM is not pushing it, though, and Sony has to decide if it wants to be the only one paying for CELL v2, or if Intel's push to get their tech into PS4 can get them a nice price:performance ratio and good tools.
 
Yeah, I read that. I thought the "Message Passing Buffer" sounded very similar to a local store with some limitations, while the L2 looked like a local store with some enhancements. The L2 can be seen as the core's private address space. There's corresponding data somewhere else in the system, but you can't rely on identity. If you want identity, you sync/fence/flush/whatever, whereas for an SPE local store, you'd initiate DMA. Different actions to solve the same issue in the same circumstance. If you can detect the trigger conditions and write handling code, adding another thing to do there shouldn't be hard.

Of course the ability to do actual ad-hoc reads from any memory address is a good thing. But likewise, it is not at all incompatible with the notion that your local store doesn't mirror system RAM.
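
A toy model of that "same circumstance, different action" point (dicts standing in for memories; nothing here is a real API):

ram = {"shared": "fresh"}           # system memory, updated by another core
local_store = {"shared": "stale"}   # SPE model: software-managed copy
nc_cache = {"shared": "stale"}      # non-coherent cache: line may be stale

# SPE: the local store never claims to mirror RAM, so you DMA explicitly
local_store["shared"] = ram["shared"]        # stands in for a dma_get
# Non-coherent cache: invalidate/flush, then the refill fetches fresh data
del nc_cache["shared"]                       # stands in for an invalidate
nc_cache["shared"] = ram["shared"]           # refill on the next access

assert local_store["shared"] == nc_cache["shared"] == "fresh"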
 
They would seem preferable, though, due to the benefits in both learning curve and dev familiarity/tool costs; yet it was the architecture itself that caused headaches in those very areas. And purportedly, if previous rumblings are true, a developer poll on whether to stay with or drop Cell is part of the reason for the would-be switch. Again, this was during a darker time, development-wise, for the architecture than the present day, but HW decisions need to be made in advance, and once made they are not easily changed.

I don't know where you are getting this darker-days talk WRT development from. It's just as dark now as it was then.
 