Predict: The Next Generation Console Tech

The PPC A2 is likely tuned for lower frequencies, as an Atom or E-350 is, so it might not be able to reach >3 GHz.
From the horse's mouth (the chief engineer who designed Xenon, too): it can be clocked at 3+ GHz, but that's not where they want to be in power efficiency (the +33% in clock speed from 2.4 GHz to 3.2 GHz is likely to come at a healthy power price).
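
To put a rough number on that "healthy power price" (my own napkin math, nothing from IBM): dynamic power scales roughly with frequency times voltage squared, and higher clocks usually need a voltage bump too. The 5-10% voltage figures below are assumptions for illustration, not measured A2 numbers.

/* Napkin math on why a 2.4 -> 3.2 GHz bump costs more than 33% power.
 * Dynamic power ~ f * V^2; the voltage increases here are assumed, not measured. */
#include <stdio.h>

int main(void)
{
    double f_ratio = 3.2 / 2.4;            /* +33% clock */
    double v_bumps[] = {1.00, 1.05, 1.10}; /* assumed voltage increases */

    for (int i = 0; i < 3; ++i) {
        double p_ratio = f_ratio * v_bumps[i] * v_bumps[i];
        printf("V +%2.0f%%: dynamic power x%.2f\n",
               (v_bumps[i] - 1.0) * 100.0, p_ratio);
    }
    /* Prints roughly x1.33, x1.47, x1.61 */
    return 0;
}

So even a modest voltage bump turns a 33% clock increase into something like 50-60% more dynamic power, before leakage is counted.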

EDIT
In regard to the TBE/Cell, for IBM and Toshiba the picture is pretty clear: they gave up on it.

For IBM, their new Cell is the Power A2. It's low power, has quite some muscle for its power consumption, and it has caches and transactional memory, i.e. it's a way more convenient base than the TBE.

If IBM were to add a cheap "gather" implementation as described in some Intel white papers, it would come close to an ideal platform for number crunching (the white papers should be available somewhere in the forum, I've no time now; they show great results, especially on narrow SIMD, 4-wide / 128-bit).
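
For anyone wondering what "gather" buys you on narrow SIMD, here's a minimal plain-C sketch of the operation (no particular ISA assumed; the function name is just illustrative). It's the indexed-load pattern that today costs four scalar loads plus vector inserts, which a hardware gather would do as one instruction.

/* What a 4-wide (128-bit) gather computes: out[i] = table[idx[i]].
 * Shown as a scalar loop; a hardware gather does the same thing in one op. */
#include <stdio.h>

void gather4(const float *table, const int idx[4], float out[4])
{
    for (int i = 0; i < 4; ++i)
        out[i] = table[idx[i]];
}

int main(void)
{
    float table[8] = {0, 10, 20, 30, 40, 50, 60, 70};
    int   idx[4]   = {6, 1, 3, 0};   /* arbitrary, non-contiguous indices */
    float v[4];

    gather4(table, idx, v);
    printf("%g %g %g %g\n", v[0], v[1], v[2], v[3]); /* 60 10 30 0 */
    return 0;
}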
 
Maybe I am wrong, but wouldn't Sony save themselves a lot of trouble, in terms of development tools, by going after a Cell 2? The same for developers: wouldn't it be possible for developers to reuse some of their know-how? And what about the price of a Cell? Considering that Sony at least has some share in the CPU, it may be cheaper to get that produced than paying for an Intel/AMD part where there is a cut for Intel/AMD design + production.

I am sure it has been discussed plenty of times, but how much effort would it take to bring a Cell 2 up to Intel i5 levels in terms of console performance?
 
Save themselves a lot of trouble by once again using the most troublesome chip in console history...? :p
 
From the horse's mouth (the chief engineer who designed Xenon, too): it can be clocked at 3+ GHz, but that's not where they want to be in power efficiency (the +33% in clock speed from 2.4 GHz to 3.2 GHz is likely to come at a healthy power price).

That's interesting, thanks for sharing. By raising the clock frequency while reducing the core count and boosting the VMX units per core, you could get something in the right performance range for a next-gen console while still keeping power consumption relatively low, couldn't you?
 
For Sony, 4K comes for free.

The old Cell can already decode two 1080p streams at 40 Mbps total. They only need twice that performance to do 4K in H.264 (or quad-HD). The drive costs practically nothing today; a firmware update can make the drive read 128 GB BDXL discs, which is plenty for a full-length film at over 100 Mbps (they could spec to a 144 Mbps mux, for a 4x drive requirement). Any GPU can display quad-HD. HDMI can output quad-HD too. It all comes for free; they'd be stupid not to implement it.
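
Napkin math on the disc side (my own numbers; the only extra assumption is the standard 36 Mbit/s per 1x BD read speed):

/* How long a 128 GB BDXL disc lasts at a given mux rate, and what drive
 * speed that rate implies (1x BD = 36 Mbit/s). Decimal GB/Mbit, as the
 * disc specs use. */
#include <stdio.h>

int main(void)
{
    double disc_bits = 128e9 * 8.0;           /* 128 GB BDXL in bits */
    double rates_mbps[] = {100.0, 144.0};     /* film average / max mux */

    for (int i = 0; i < 2; ++i) {
        double seconds = disc_bits / (rates_mbps[i] * 1e6);
        printf("%5.0f Mbit/s: %.0f min of video, needs a %.1fx drive\n",
               rates_mbps[i], seconds / 60.0, rates_mbps[i] / 36.0);
    }
    /* ~171 min at 100 Mbit/s, ~119 min at the 144 Mbit/s mux (a 4x drive). */
    return 0;
}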


Movies are starting to go with higher frame rates. The Hobbit is 4K at 48 fps (for stereoscopic 3D you double that). The new HDMI 1.5 is probably going to be released later this year. James Cameron may film Avatar 2 at 60 or 72 frames per second, or he may settle at 48. Either way a CELL 2.0 should pay off for Sony.
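
Rough pixel-bandwidth math behind the HDMI point (24 bpp assumed, figures are mine): HDMI 1.4 carries about 8.16 Gbit/s of video data (10.2 Gbit/s TMDS minus 8b/10b overhead), so 4K at 24 fps still fits, but 48 fps doesn't, and 48 fps stereo certainly doesn't.

/* Uncompressed pixel bandwidth for 4K (3840x2160, 24 bpp) at several
 * frame rates, mono and stereo, versus HDMI 1.4's ~8.16 Gbit/s of video data. */
#include <stdio.h>

int main(void)
{
    const double w = 3840, h = 2160, bpp = 24;
    const double hdmi14_gbps = 8.16;
    double fps[] = {24, 48, 60};

    for (int i = 0; i < 3; ++i) {
        double mono = w * h * fps[i] * bpp / 1e9;   /* Gbit/s, one eye */
        printf("4K @ %2.0f fps: %4.1f Gbit/s mono, %4.1f Gbit/s stereo (HDMI 1.4: %.2f)\n",
               fps[i], mono, 2.0 * mono, hdmi14_gbps);
    }
    return 0;
}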

Kaz Hirai should be able to correct the basic mistakes made with the PS3 launch. They'll have rumble in the controller (they have the rights to more advanced haptic feedback with the settling of that Immersion Corp. lawsuit) and will bundle a microphone headset. All the first-party developers will be familiar with CELL from day 1.
 
Doesn't it run smack in the face of GPGPU computing?

Not really; when little ALUs go to bed at night with visions of GPGPU greatness, they dream of growing up to be SPUs.



The entire SoC can be fabbed at GloFo if it's more standard PPC cores in a simpler configuration coupled with an AMD GPU, perhaps, whereas something more custom (Cell) might have to be done at IBM's Fishkill. I could be wrong about this point; I'm just looking at what would maximize manufacturing efficiency (having the entire chip and all its cores fabbed at one location). I'm drawing GloFo into this from something called Common Platform, a manufacturing alliance recently announced between GloFo and IBM.

Again, not sure what GloFo has to do with it. Unless there's an amazing coup on their part, they aren't in the picture. IBM fabs Cell because they came up with a competitive price and had the capacity to meet demand. Toshiba is a long-time trusted partner who will be part of the fab equation. As far as "having the entire chip and all its cores fabbed at one location", that's a given. Unless you're talking about a stacked-chip design, then keep on talking. :LOL:

Can you elaborate on this a bit more, please? I'm just not quite certain what you might be referring to...

From my viewpoint, I interpret Vita as a total abandonment of the Kutaragi design philosophy: more standardized parts, ease of programming, and efficiency as the design priorities.

Sure, Cell development was used as the crowbar to oust Kutaragi, at which point it became a poison pill in Sony HQ where showing support cost people their jobs (if you've worked in corporate long enough, you've seen this happen at least once). Almost all of this happened before Cell even had a chance to acquit itself. That has since happened, but unless something changed it still carries the stigma. So now enters Hirai, who owes Cell a lot for his ascension to the top and who is also aware of the cost throughout the PlayStation ecosystem of replacing it. Does that give hope for its survival? Certainly. Would he sacrifice it to consolidate power in HQ? Most definitely. That's why I still think its chances are less than 10% to be used next gen. That's better, though, than the almost 0% I'd have given it before.
 
Some devs claim it's not specifically the Cell which was the problem in the Kutaragi era; it's that they didn't pay enough attention to other studios, didn't pay enough attention to the software side, didn't provide adequate libraries to develop with, and expected everyone would just figure it out. Let's do assembly like it's 1992! :p The new management is supposedly giving a LOT of support to external developers, with a pretty good code library for the Cell. They took the criticism very seriously, so hopefully it's looking good for the PS4.

They said in the interview that Sony excels at integration and should focus on that strength. So here's my utterly uneducated guess for the PS4, point-and-laugh at will:
- The power supply on the PS3 is small, stable and efficient; they know how to design good power supplies. Should be similar.
- Silent cooling. They already have an amazing scirocco fan design that can dissipate over 200 W in silence. The first model should again target 200 W.
- PS Move, a better EyeToy (higher frame rate and a depth channel), Vita symbiosis. They can get ANY port from other systems. This neutralizes competitors' differentiating factors, unless MS surprises us with Kinect 2, which is very probable :D
- 128GB BDXL is a given, all RnD is done, low cost.
- Playback of any disc format imaginable, already covered. The rest is digital service offering, in progress.
- 1TB HDD, just get the best supplier, second SKU with a smaller HDD or maybe a few GB of internal flash (xbox360 style).
- Cell 2: they'll scale up and improve the Cell, for which they already own the design and the fabs. 4x PPE, 16 SPE (rough peak numbers sketched below).
- Deal with a supplier of wide-IO memory dies in a stackable format.
- A customized Nvidia Kepler core, with the I/O adapted for stacked memory and the bus to the CPU.
- They'll figure out a way to put CPU, GPU and memory on a 2.5D interposer. Heavy negotiation with all parties involved to make it happen. Standardize.

In this scenario, they'd have invested most of their RnD in this last point, essentially betting the farm on it. They already have a fab with a good output of stacked CMOS sensors, so maybe it's feasible and they have that expertise; if they can pull it off they have the performance edge they need. Great differentiation. The cherry on the cake is that it does everything again.
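
Napkin math on that Cell 2 bullet (purely my own assumption that the SPEs keep the original 4-wide single-precision FMA per cycle and the 3.2 GHz clock):

/* Peak single-precision throughput of a hypothetical 16-SPE Cell 2,
 * assuming each SPE still does a 4-wide FMA (8 flops) per cycle at 3.2 GHz. */
#include <stdio.h>

int main(void)
{
    double clock_ghz = 3.2;
    double flops_per_spe_cycle = 4 * 2;   /* 4-wide FMA = 8 flops/cycle */
    double spe_gflops = clock_ghz * flops_per_spe_cycle;

    printf("Per SPE      : %.1f GFLOPS SP\n", spe_gflops);        /*  25.6 */
    printf("PS3 (6 SPEs) : %.1f GFLOPS SP\n", 6 * spe_gflops);    /* 153.6 */
    printf("Cell 2 (16)  : %.1f GFLOPS SP\n", 16 * spe_gflops);   /* 409.6 */
    return 0;
}

That would be roughly double the full original Cell on the SPE side alone, before counting the four PPEs or any per-SPE improvements.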

Oh, and they should give the whole company to Richard Marks, he was right back in 2001 they didn't trust him, but he was f***** right all along ;)
 
I really wouldn't mind seeing a future iteration of CELL in the PS4. I know everyone here is in love with the idea of GPGPU, but really, how would CELL mess that up? It means less real estate in the console for the GPU? I just don't see the problem with including and improving upon CELL with the PS4. How much would a cut-down Power 7 with four cores and four SPEs attached to each core cost in die size?

And as for what CELL would be doing for the PS4 if the machine has a good GPU this time, well, it would probably end up making that good GPU better in some ways, in the hands of devs who decide to get every last bit of performance out of the machine. Seriously, what precludes a new CELL chip from again assisting the GPU with graphics tasks, even if the GPU is awesome? Or would it be useless, as there'd be little CELL would be able to assist and help out with?
 
I think Cell (the SPEs) could help out in many respects, and even some in graphics for devs that want to push the envelope (1st party), but regardless of how much Cell would be used for helping out on the graphics end, it would certainly be useful for audio and decompression. That's the baseline of what every dev, regardless of background, could use the minimum 6 SPEs (for BC) for in future PS4 titles.

Although, I must say, if Sony is intending on using Cell going forward, it may be best for them to skip Tesla/GCN entirely. The pre-GPGPU designs had better perf/mm, and if that means making way for more SPEs or a bigger GPU, it may be the more logical outcome for Sony. As a side bonus, it would also probably be cheaper to license pre-Southern-Islands GPU tech than GCN.

Food for thought.
 
Save themselves a lot of trouble by once again using the most troublesome chip in console history...? :p

Hehe, I think the most troublesome chip turned out to be the RSX; that is the one thing in the PS3 that almost everyone keeps bitching about. The Cell as a problem child comes more from it having to do the work that RSX couldn't.

I just think about the work that goes along with a new CPU/GPU. Sony would have to start from "scratch" on every in-house tool with a new CPU. With a Cell 2 they could start from where they are right now and focus on a new GPU. Maybe I am too optimistic, but I would say that anything that gives them and the developers a head start is worth considering. And a console with something that is already well known and explored should give a basis for good-looking launch games. And we all know how multiplatform games on the PS3 compared at launch.

As others have mentioned, I am not even sure how much of an upgrade is really needed; rather spend the money on the GPU and memory and do a minor but important upgrade to the Cell. And of course keep on using the 1-SPE-disabled configuration for better yields, which I am sure saved Sony some money to begin with :)
 
From the horse's mouth (the chief engineer who designed Xenon, too): it can be clocked at 3+ GHz, but that's not where they want to be in power efficiency (the +33% in clock speed from 2.4 GHz to 3.2 GHz is likely to come at a healthy power price).

I imagine that on 32nm SOI, they could get the power/performance to be roughly the same at 3.2 GHz as the 2.4 GHz part on 45nm SOI.

I wouldn't expect MS or Sony to use a 4-node/64-thread version either, but I keep thinking about Tim Sweeney's quote from his September interview with IGN (http://au.games.ign.com/articles/119/1196638p1.html).

He mentions that some of the design challenges would be scaling to many cores (like 20 some). I don't think he's referring to programmable shaders like GCN. IMO, the only way to achieve a high core count in the next gen would be with simple cores like the A2 has.

I think we could see a derivative of the A2 in the next Xbox: 32nm, 2 nodes (8 cores and 32 threads) @ 3+ GHz. Power would probably be around 30-40 W for the CPU. Add a 60-75 W GPU like the 7750 and I think you have a really powerful console under a 150 W power budget, and at a very reasonable cost.
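
Quick power-budget check on those figures (CPU/GPU ranges as above; the ~25 W for memory, drives, I/O and fans is my own guess):

/* Rough console power budget: A2-derived CPU plus a 7750-class GPU,
 * with an assumed ~25 W allowance for everything else. */
#include <stdio.h>

int main(void)
{
    double cpu[2] = {30, 40};   /* CPU range, W */
    double gpu[2] = {60, 75};   /* GPU range, W */
    double other  = 25;         /* assumed: RAM, optical, HDD, I/O, fans */
    double budget = 150;

    for (int i = 0; i < 2; ++i) {
        double total = cpu[i] + gpu[i] + other;
        printf("%s case: %.0f W total, %.0f W headroom\n",
               i ? "worst" : "best ", total, budget - total);
    }
    return 0;
}

Even the worst case lands around 140 W, so a 150 W budget looks plausible before PSU losses.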
 
http://www.develop-online.net/news/39951/Unreal-Engine-4-demo-at-GDC
Well, it looks like Epic is finally about to showcase its full-blown Unreal Engine 4 at GDC this year, although only to selected licensees, partners and prospective customers. I expect tidbits on how well the engine runs on future game consoles, at what spec, and so on.

Don't think so. They will showcase it to a few developers, in a tiny space. We are just going to see a bunch of tweets saying "OMG, amazing what I saw today".. :???:
But all the leaks will be really welcome. Let's hope for them.
 
The obvious hope is "shaky iphone video". The question is whether it will occur (if it does, I would have to wonder if Epic tried very hard to stop it).

I expect UE4 to put paid to all this diminishing returns nonsense :p
 
Software and ecosystem. Largely that was realised this generation.

I don't see that as particularly desirable ground for Sony to defend; imo, if that's the battleground, they should seriously consider whether it's even worth the effort.

Nintendo of course would be crushed under those conditions.
 
I imagine that on 32nm SOI, they could get the power/performance to be roughly the same at 3.2 GHz as the 2.4 GHz part on 45nm SOI.

I wouldn't expect MS or Sony to use a 4-node/64-thread version either, but I keep thinking about Tim Sweeney's quote from his September interview with IGN (http://au.games.ign.com/articles/119/1196638p1.html).

He mentions that some of the design challenges would be scaling to many cores (like 20 some). I don't think he's referring to programmable shaders like GCN. IMO, the only way to achieve a high core count in the next gen would be with simple cores like the A2 has.

I think we could see a derivative of the A2 in the next Xbox: 32nm, 2 nodes (8 cores and 32 threads) @ 3+ GHz. Power would probably be around 30-40 W for the CPU. Add a 60-75 W GPU like the 7750 and I think you have a really powerful console under a 150 W power budget, and at a very reasonable cost.

What about Wii U? That appears to be 45nm and quad-core for the CPU, based on early accounts. Would a quad-core, 2.4 GHz Power A2 be a good candidate for Wii U's CPU? Would that aspect be consistent with a console that is 1.5x the Xbox 360?
 
From what I can gather, the problem with going with a many-mini-core CPU vs a few-big-core CPU is that much of the code can't be threaded out and run in parallel. Some things can, but much of it can't.

I expect we will see CPUs capable of running more threads, but the core count will remain < 8. They will likely attempt to increase IPC for the cases where multithreading is limited.

I'd like to think this would mean a Power7-derived CPU (even a tri-core would be nice), but at this point I'm not getting my hopes up. Xenon/Cell x2 with improved VMX and a turbo-core option to run a subset of the cores at higher speed while disabling the unused cores is all I'm expecting.
 