Predict: The Next Generation Console Tech

Status
Not open for further replies.
Does that make sense when there looks to be less room for die shrinking, so less price reduction and probably no chance of hitting a non-loss-making $200 price point? I think the design will be based on predictions of what process tech will be like five years into the generation, and a rather conservative prediction at that, given this gen's lessons. PS3 was supposed to launch at 65nm but missed the mark, and process shrinks haven't had the gains that PS2's shrinks had. They'd possibly look at around 300-400 mm², aiming for a smaller die size and cheaper cooling by the time the consoles get slimified.

The bulk of the cost in silicon production is capital expenditure from equipment investment. The running (production) cost is only a fraction of the total.

Once downscaling slows down significantly, the amortization period of these expenditures increases. This means that the per-transistor price will continue to fall for a long time afterwards. Once production costs dominate, fabs will concentrate on reducing those through even more automation, a move to 450mm wafers, etc.
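To make the amortization point concrete, here is a toy model (all figures below are made-up illustrative assumptions, not real fab economics): spread a fixed capital cost over a growing cumulative wafer count, and the average per-transistor price keeps falling even with no further shrink, approaching the pure running cost.

```python
# Toy model of fab economics: average per-transistor cost vs. cumulative output.
# All figures are illustrative assumptions, not real fab data.

CAPEX = 5e9                      # assumed up-front equipment investment, $
RUNNING_COST_PER_WAFER = 3000    # assumed variable production cost per wafer, $
TRANSISTORS_PER_WAFER = 2e12     # assumed good transistors per wafer

def cost_per_transistor(wafers_produced):
    """Average cost once CAPEX is spread over all wafers made so far."""
    total = CAPEX + RUNNING_COST_PER_WAFER * wafers_produced
    return total / (TRANSISTORS_PER_WAFER * wafers_produced)

# The longer the amortization period, the lower the average cost,
# asymptotically approaching the pure running cost per transistor.
for wafers in (100_000, 1_000_000, 10_000_000):
    print(f"{wafers:>10,} wafers: ${cost_per_transistor(wafers):.2e} per transistor")
```

The floor here is `RUNNING_COST_PER_WAFER / TRANSISTORS_PER_WAFER`; everything above it is amortized capex, which is why the per-transistor price keeps drifting down long after shrinks stop.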

There will probably be more focus on initial power consumption. With downscaling slowing, shrinks are less likely to reduce power draw much. Power draw directly influences the cost of the cooling solution as well as reliability.

Cheers
 
liolio said:
In regard to perf, I think four Bobcats on the same chip as an HD 56xx-class GPU should already offer a meaty improvement over current systems.

The problem with Bobcat is that it is an extremely anemic CPU. Being better than Atom is not massively impressive. Bobcat's three biggest saving graces are its extremely small die size, low power consumption and strong integrated GPU.

For a console you want a small die size and strong performance. Unless Bobcat is modified, it is not suitable for anything other than a Wii 1.5 IMHO.

Looking at some announcements from PowerVR, ARM, TI and Nvidia, perhaps an ARM core is not so far-fetched.
 
The problem with Bobcat is that it is an extremely anemic CPU. Being better than Atom is not massively impressive. Bobcat's three biggest saving graces are its extremely small die size, low power consumption and strong integrated GPU.

For a console you want a small die size and strong performance. Unless Bobcat is modified, it is not suitable for anything other than a Wii 1.5 IMHO.

Looking at some announcements from PowerVR, ARM, TI and Nvidia, perhaps an ARM core is not so far-fetched.

I dunno, an 8-core Bobcat CPU would be very, very small. Current Zacate dies are two Bobcat cores with an HD 6310 GPU and weigh in at 75mm². If you look at the released diagrams, you can see that the two Bobcat cores would fit into the same space as the GPU core. So a 4-core Bobcat should weigh in at 75mm²; double that to 150mm² for an 8-core, and it should be pretty powerful and at the same time very small, as these sizes are on 40nm and we should see at least 28nm for next-gen systems. It would leave them a lot of budget for the GPU.

Of course I don't know how an 8-core Bobcat would stack up against other choices, but it should be up to the task, especially if it's clocked closer to 2GHz per core than 1.6GHz.
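The area arithmetic above can be sanity-checked quickly (the 75mm² Zacate figure and the "CPU cores ≈ GPU area" split come from the post; the 28nm scaling factor is an idealized assumption, since real shrinks rarely achieve the full ideal factor):

```python
# Back-of-envelope die-area estimate for a hypothetical 8-core Bobcat,
# based on the figures in the post above. The shrink factor is an
# idealized assumption, not a foundry-published scaling number.

ZACATE_AREA_40NM = 75.0  # mm^2: 2 Bobcat cores + HD 6310 GPU at 40 nm (from the post)

# Per the post, the two CPU cores occupy about the same area as the GPU,
# so a CPU-only quad-core would be roughly one whole Zacate die.
quad_core_40nm = ZACATE_AREA_40NM
octo_core_40nm = 2 * quad_core_40nm       # double it for 8 cores -> 150 mm^2

# Ideal area scaling from 40 nm to 28 nm: (28/40)^2, an optimistic bound.
shrink = (28 / 40) ** 2
octo_core_28nm = octo_core_40nm * shrink

print(f"8-core Bobcat at 40 nm: ~{octo_core_40nm:.0f} mm^2")
print(f"8-core Bobcat at 28 nm: ~{octo_core_28nm:.0f} mm^2 (ideal shrink)")
```

Even at 40nm the CPU block stays small relative to typical console SoC budgets, which is the post's point about leaving room for the GPU.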
 
I'm not sure Nintendo hasn't shown us the way forward with Wii.
I think MS and Sony will try to promote something other than raw performance as a differentiator.
I'm not sure I know what that is, and it clearly doesn't preclude a significant leap in technology, but I think the better-graphics-per-MFLOP train is pretty much at the end of its tracks as the sole driver of a new platform.
I've always considered peripherals like Wii, Move and Kinect gimmicks, but looking at the amount of hardware they have shipped, they clearly appeal to people.
 
The problem with Bobcat is that it is an extremely anemic CPU. Being better than Atom is not massively impressive. Bobcat's three biggest saving graces are its extremely small die size, low power consumption and strong integrated GPU.

For a console you want a small die size and strong performance. Unless Bobcat is modified, it is not suitable for anything other than a Wii 1.5 IMHO.

Looking at some announcements from PowerVR, ARM, TI and Nvidia, perhaps an ARM core is not so far-fetched.
In regard to Bobcat, manufacturers could decide to push the envelope (i.e. clock them higher than 1.6GHz, say somewhere above 2GHz). And I believe that the PPU and Xenon really suck. I keep 3dilletante's post in mind, though.
I myself remember Capcom making a comparison between Xenon (three in-order cores) and a Pentium 4 (dual core, OoO, but still pretty low IPC).

But that's an overall picture. I would tend to think that the more you offload to the GPU, the more this kind of comparison would favor a Bobcat-like architecture, i.e. not a number-crunching monster but one that manages to maintain high IPC on complex code.
I might be wrong (note to myself: that should be my signature :LOL: ).

In any case, Bobcat can be modified both by AMD and by a company that buys a license. I should look at AMD's roadmap in regard to Bobcat's evolution, but my feeling is that it would be a good building block for next gen as far as the CPU is concerned (and as it evolves it can only get better).

In regard to ARM, they are getting better, but if Bobcat's level of performance is not enough, I don't think the A15's will be either; nor do I think that the CPU that should feature in Nvidia's Denver would do better against "the Bobcat of that time" (I mean AMD is not static; they are sure to improve Bobcat).

For GPUs, well, I'm starting to wonder about something. PowerVR and the like offer greater and greater GPUs, but there have to be trade-offs. What I mean is: OK, the latest announcements are great, great perf per watt, but what about perf per mm²? My belief is that if PowerVR were to provide HD 5670-level performance it would take them quite a large chip. I think there is a middle ground (in a console, so not a mobile device), and that for now it favors AMD (and Nvidia), as power-management features are starting to appear on GPUs.
 
I'm not sure Nintendo hasn't shown us the way forward with Wii.
I think MS and Sony will try to promote something other than raw performance as a differentiator.
I'm not sure I know what that is, and it clearly doesn't preclude a significant leap in technology, but I think the better-graphics-per-MFLOP train is pretty much at the end of its tracks as the sole driver of a new platform.
I've always considered peripherals like Wii, Move and Kinect gimmicks, but looking at the amount of hardware they have shipped, they clearly appeal to people.

Since the 360's launch and the importance of its Live integration, I look at the future of consoles like phone services. You pick a "carrier" because they have the features you want in their online service and go with it. Hardware is obviously part of the equation, but the software is clearly important too. Yes, people will nitpick over technical specifications and which system can output better graphics or whatever, but clearly the future successes of the PS3 and Xbox 360, as well as their successors, will ride on their online integration. The same argument has been made for all the home-electronics ecosystems built around specific brands whose products all interact with each other just about flawlessly. Case in point: Apple products. Buy a Mac, iPhone, Apple TV, etc.

"We have the games, we have the service, we just don't have competition."

I should trademark that lol.
 
Yes. Remember EA's next-gen Madden 'trailer'? We are a long way from that, and there's plenty of room for improvement in how games look that makes them more fun to play. The cost issue is the same as in any generation: you pick a price point and develop for it.

I'm just wondering: is there any statistically sound analysis of the perceived effect of improved graphics on spending behaviour?
I mean some experiment to see how the same game, but with different visual content, affects perceived value? (In a nutshell; it has to be more complicated than that.)


I suspect the guy who actually decided on the release of the BR never ran a similar trial.
 
I disagree; the online integration is the easiest thing to replicate. As I said, if a company really can't figure out how to design it right, they can just copy the competition. The programming is basic, and the know-how for setting up the server infrastructure is widespread and can be bought (it's not fundamentally different from media serving). All the important media services (iTunes, Netflix, Hulu, YouTube, etc.) are third party and non-exclusive. Getting the online components right only requires competency.

The prime differentiator will still be the ability to do what consoles are supposed to do ... gaming. Here hardware rather than software makes the difference, so here you can differentiate and rely on that difference to survive for a while.

PS: well, apart from marketing of course ... sigh.
 
I disagree; the online integration is the easiest thing to replicate. As I said, if a company really can't figure out how to design it right, they can just copy the competition. The programming is basic, and the know-how for setting up the server infrastructure is widespread and can be bought (it's not fundamentally different from media serving). All the important media services (iTunes, Netflix, Hulu, YouTube, etc.) are third party and non-exclusive. Getting the online components right only requires competency.

The prime differentiator will still be the ability to do what consoles are supposed to do ... gaming. Here hardware rather than software makes the difference, so here you can differentiate and rely on that difference to survive for a while.

PS: well, apart from marketing of course ... sigh.

Between the 360 and PS3, it's how well integrated Live was into the 360 that I think made it ultimately successful. While PSN is just fine for most of us, I'm sure the common consumer prefers Live, especially when all their friends are probably on it. Yes, Live has existed since 2003 (or 2002?) and the 360 launched a year ahead of the PS3, but it was so important to the 360's launch and package.

The Wii is in a way held back by its terrible hardware (in comparison) and lacking online features, but I think it could have held a candle in that regard had Nintendo not botched it and actually encouraged the higher-end developers to cater to the Wii and online multiplayer.

The way I look at it with the 360, you're not buying a console as much as you're just getting Xbox Live. Yes, hardware can and will make a difference next time around, but if the same multi-platforming mayhem happens again (especially if Nintendo is at technical parity with MS and Sony, at least feature-set-wise), I'm sure people will take a very good look at how online integration pans out, just like I'm sure many do currently. But I will concede that the Xbox 360 has the better end of the deal as far as software exclusives go (at least when I consider everyone but myself) compared to the PS3, and the Wii is on its own with what happened to it.
 
I agree that online is insufficient as a differentiator, although I'm not sure that replicating others' services is really a viable way to go; people really underestimate the investment MS made in designing and building Live.

I think 3D might be a viable differentiator, and not just for displaying 3D games, though that's predicated on widespread adoption.
Given the success and cost of Kinect, I have to wonder if it couldn't be more radical. The whole Wii/Move/Kinect thing smacks of a move back to the VR/AR focus of the early '90s.
 
Since the 360's launch and the importance of its Live integration, I look at the future of consoles like phone services. You pick a "carrier" because they have the features you want in their online service and go with it. Hardware is obviously part of the equation, but the software is clearly important too. Yes, people will nitpick over technical specifications and which system can output better graphics or whatever, but clearly the future successes of the PS3 and Xbox 360, as well as their successors, will ride on their online integration. The same argument has been made for all the home-electronics ecosystems built around specific brands whose products all interact with each other just about flawlessly. Case in point: Apple products. Buy a Mac, iPhone, Apple TV, etc.

"We have the games, we have the service, we just don't have competition."

I should trademark that lol.

I think the importance of online/DD/integration is a bit overplayed around here. I don't have the data to back my point (I'd like to know either way; then again, who around here does?), but I can speak for myself and a number of people I know: Live/online is used only sporadically, especially for RPG/adventure/strategy types of games.

Take the very popular Dragon Age for example: it's got no online, and the DD add-ons are absolute crap. Actually, I'd say the Awakening expansion is also crap, but that's a different discussion.

Now, if in the *future* online/Live turns out to be the widely used way to play as well as to acquire content, I'm damn sure games will not cost $60 a pop (the inflation-adjusted value, of course).
 
While PSN is just fine for most of us, I'm sure the common consumer prefers Live, especially when all their friends are probably on it.
The common consumer doesn't care that much about multiplayer gaming ... and as for communicating with friends, the Facebook juggernaut is probably going to make that a third-party, non-exclusive part of the online experience as well.
 
The common consumer doesn't care that much about multiplayer gaming ... and as for communicating with friends, the Facebook juggernaut is probably going to make that a third-party, non-exclusive part of the online experience as well.

Facebook won't be integrated into every game.

FPS games are well into the mainstream, and therefore so is multiplayer. It might not be for everybody, but the best-selling games always seem to have it.
 
I think 3D might be a viable differentiator, and not just for displaying 3D games, though that's predicated on widespread adoption.
Given the success and cost of Kinect, I have to wonder if it couldn't be more radical. The whole Wii/Move/Kinect thing smacks of a move back to the VR/AR focus of the early '90s.
Motion controls are all about party and physical-training games (exercise, dancing, martial arts). They are successful at that (effective at pushing hardware, at least), but I don't think the consumers who buy into it are the same people looking for VR. Of course, a smart tech guy could sell VR to management and pretend it's just an extension of motion controls ... a little white lie.

Even though Sony has already presented an HMD, I still think Nintendo is the most likely to go this way.
 
Does anyone have a reckoning of how fast an ARM-based system could be when fitted to a home console? If the NGP is as powerful as it is on a handheld platform, what could be designed and fitted to a home console to give it similar performance?

I'm just thinking of Nintendo here: if they again choose a form factor under 25W for the entire system, would ARM be a good choice to maximise perf/watt and perf/cost, given the seemingly favourable licensing terms?

For instance, how would, say, a PowerVR Series 6 with a quad-core Cortex-A15 perform in comparison to AMD Fusion in a <25W form factor, and how would the costs compare? System-on-a-chip integration seems to suit Nintendo from a space and cost perspective, as they probably don't want the ancillary ports that a PC-based chipset brings. I'm also certain there will be a number of new chips targeting this power bracket, given the announcement of Windows 8 for ARM.

Finally, if they went in this direction, would it be at all beneficial to bring both their handheld and home-console lines onto the same architecture?
 
THQ said that they are expecting a Wii successor shortly. 2012 seems fair.


Possibly; however, I believe GDDR5 already provides more than enough bandwidth to emulate the eDRAM successfully.

With a large MC, yes. Xenos-to-eDRAM bandwidth is on the order of 250 GB/s, or am I wrong?

I think they will still use a form of off-die cache next gen. If the power/die-size budget is more constrained than this generation's, they will stick with a 128-bit MC for the GPU, as it will be around or under 200 mm².

Will a Q4 2013 console be able to use the 22nm process for the CPU and 20nm for the GPU? GF puts the production start for 20nm at the end of 2013.
 
With a large MC, yes. Xenos-to-eDRAM bandwidth is on the order of 250 GB/s, or am I wrong?

Yes, but the bandwidth to the eDRAM is only 32 GB/s IIRC.

I think they will still use a form of off-die cache next gen. If the power/die-size budget is more constrained than this generation's, they will stick with a 128-bit MC for the GPU, as it will be around or under 200 mm².

They may actually go with on-die cache: current GPUs, even on the PC side, are using ever-increasing amounts of cache as a substitute for bandwidth. It also means one fewer chip to fab, and as the power density of chips increases, they won't be pushing any reticle limits or yields by putting that cache on the same die as the GPU.

Will a Q4 2013 console be able to use the 22nm process for the CPU and 20nm for the GPU? GF puts the production start for 20nm at the end of 2013.

Give it another year for a console to come out on a particular node. If 22nm arrives in 2013, then expect to see it in consoles in 2014. The high-margin server and desktop chips get first dibs on a new process. It may happen the other way, but it is safer to expect a bigger node than to assume they will launch on a bleeding-edge one.
 
Yes, but the bandwidth to the eDRAM is only 32 GB/s IIRC.
To emulate the 360 effectively without eDRAM, you'd need enough framebuffer bandwidth to cope with the same workload the 360 runs in its eDRAM. On paper that's hard, as you have 250 GB/s to replace, but in real terms games rarely hit that peak as I understand it. Unless you go out of your way to saturate that internal bus with work, the bandwidth consumption of the 360 is no different from a PC of similar spec. Thus, for BC reasons, any conventional GPU could probably be used, with perhaps a bit of slowdown if a game was very heavy on this internal eDRAM bandwidth. That may be true of some titles with masses of transparency.
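As a rough illustration of why the internal eDRAM figure is a peak rather than a sustained requirement, it can be derived from fillrate and bytes moved per pixel. The ROP count, clock and MSAA level below are Xenos' published specs; the per-sample byte counts (32-bit color and 32-bit Z, each read and written, i.e. blending and Z test active on every sample) are a worst-case assumption:

```python
# Worst-case framebuffer bandwidth for a Xenos-class GPU, derived from
# fillrate. Per-sample byte counts assume 32-bit color and 32-bit Z,
# each read and written every cycle (a deliberately pessimal workload).

ROPS = 8
CLOCK_HZ = 500e6                    # 500 MHz
MSAA = 4                            # samples per pixel
BYTES_PER_SAMPLE = 4 + 4 + 4 + 4    # color read + write, Z read + write

peak_samples_per_sec = ROPS * CLOCK_HZ * MSAA
peak_bw = peak_samples_per_sec * BYTES_PER_SAMPLE  # bytes/s

print(f"Worst-case framebuffer bandwidth: {peak_bw / 1e9:.0f} GB/s")
```

This lands at 256 GB/s, close to the ~250 GB/s figure quoted above. A typical frame blends and Z-tests nowhere near every sample every cycle, which is why a conventional GPU with far less framebuffer bandwidth can usually keep up, as argued in the post.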
 