Predict: The Next Generation Console Tech

You can't please everybody ... once the dance game market is driven into the ground by oversupply, it will take a while to find the next big thing for motion controls. The real gamers are less fickle ... but they do want a significant improvement in graphics. If Microsoft and Sony don't collude, it's very dangerous to disappoint them in this regard, because the other one would mop up.
 
You can't please everybody ... once the dance game market is driven into the ground by oversupply, it will take a while to find the next big thing for motion controls. The real gamers are less fickle ... but they do want a significant improvement in graphics. If Microsoft and Sony don't collude, it's very dangerous to disappoint them in this regard, because the other one would mop up.

Will those real gamers get significantly better graphics with significantly better hardware when developers are targeting the following platforms?

Xbox 360/PS3 <<< Wii U <<<<< NG1 <<< NG2, where NG1/NG2 are the respective next-gen Microsoft/Sony platforms. Even once they drop support for the legacy platforms, they'll likely still release games for the Wii U. Once the base game design is set for the lowest common denominator, the difference faster hardware makes won't be realized, and doing justice to that extra performance will only make development more expensive. Exclusives won't cut the mustard either, given that they'll have significantly smaller markets to target. If people can't tell that a next-generation console is 2x faster than a current-generation one as a whole, then with diminishing returns they're even less likely to notice that one next-gen platform is 2x the performance of the other.

P.S. I never said slow hardware; my target has always been realistic hardware, given the other requirements they also must meet.
 
You need to convince mom to buy the upgrade, and it will take power to do that, not a fancy UI update. Early adopters (the people willing to pay $400 and more for a box) are also most likely to be looking for performance.

And the problem comes when you release something mediocre, your competition decides to push the envelope, and you wind up looking second best. So I expect MS and Sony will both try to get the most out of their boxes.
 
The laptop cost is somewhat immaterial. For instance, what Microsoft did with the Xbox 360 was buy the rights to the Xenos design; ATI/AMD only gets royalties. Thus Microsoft's cost is essentially just the fab cost plus a small royalty. They don't need to make a huge markup on the fab cost because they plan to make it up in software, accessories and services.

The difference between a robust cooling solution and a meager one can be just a few dollars in parts. Those dollars can add up here and there, but if it is necessary, it is necessary.

My point is you're referencing laptop GPUs all the time when we talk about what MS could fit into a console. But those GPUs come at a premium price compared to a similarly specced desktop part. Just because they can get that kind of power into a $2000 laptop doesn't mean MS are happy to do that in a $400 console.

I know MS won't buy each chip off the shelf, but the design cost, the royalties, and even how much it costs to fab them (surely mobile GPUs have significantly lower yields) will all change depending on what they want from the chip.

Also, if a great cooling solution is such a small cost increase compared to one that makes the solder melt in your console, then why didn't MS use one with the Xbox 360? The PS3 was a big, hot and loud box too; why didn't they spend the extra few dollars for a robust cooling system?
 
They charge a premium for laptop GPUs because they can, but as far as the silicon goes, isn't cost mostly down to die area (plus some choice binning, I guess)? Not that they'll be using anything like a discrete part in a console anyway.
 
There are a couple of big differences between laptop cooling and console cooling that haven't been discussed yet.

1) A laptop can downclock if it gets too hot; a console always needs to be able to run at peak performance.
2) Combine 1 with worst-case conditions (a hot summer day, high ambient temperatures) and a Do Not Fail requirement.

I wouldn't expect what's been stuffed inside a laptop, and seems to work there, to function reliably in a console once you combine 1 and 2 above. Especially not when you run the console at peak performance for a 6h+ gaming session...
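
To make point 1 concrete, here's a minimal sketch of the escape hatch a laptop has and a fixed-clock console doesn't; all the constants are toy numbers, not anything vendor-specific:

```python
# Toy model of DVFS thermal throttling; every constant here is illustrative.

T_LIMIT = 95.0        # junction temperature ceiling (deg C)
HEAT_PER_GHZ = 30.0   # heating pressure per GHz of clock (made up)
COOLING = 70.0        # cooling capacity per tick (made up)

def tick(temp_c, clock_ghz, ambient_c, can_throttle):
    """Advance the toy thermal model one step."""
    temp_c = max(ambient_c, temp_c + HEAT_PER_GHZ * clock_ghz - COOLING)
    if temp_c > T_LIMIT:
        if can_throttle:
            clock_ghz *= 0.9  # laptop: shed heat by dropping frequency
        else:
            raise RuntimeError("fixed-clock console overheats: cooler under-specced")
    return temp_c, clock_ghz

# Hot summer day, peak load, long session:
temp, clock = 40.0, 3.0
for _ in range(1000):
    temp, clock = tick(temp, clock, ambient_c=35.0, can_throttle=True)
print(round(clock, 2))  # the laptop survives, but no longer at peak clocks
```

With can_throttle=False the same loop raises instead, which is exactly why a console's cooler has to be sized for the worst case rather than the typical one.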


It would only downclock if it was overheating or on battery power; a gaming laptop would not be fit for sale if it downclocked under normal conditions when running a high workload (like playing a game).

A console would have a lot more space to work with than a laptop.


Looks noisy. Dunno if the console companies are going to weigh the noise profile along with the heat profile, but there's going to be some point where they decide their box is too noisy if they just stuff lots of small fans in there. The PS3's cooling solution looked well engineered, and was pretty costly and space-consuming. As cooling tech hasn't advanced a massive amount (or any amount beyond fans and heat sinks), perhaps the best cooling we can hope for is something on par with the PS3's. In which case whatever chips are used need to fit within the PS3's thermal envelope.

Unless the PS3's cooling solution wasn't actually that optimal, and better engineering can deal with more heat in the same space.

It does not matter if the laptop looks noisy: not only is it highly unlikely that the next-gen consoles will have two GPUs, but the laptop is also much thinner than a console. In a console you could fit one slower, larger fan to do the cooling (like the PS3) instead of a bunch of small, fast and noisy ones.
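
The big-slow-fan point falls out of the standard fan affinity laws, under which airflow scales with rotational speed and roughly with the cube of fan diameter for geometrically similar fans. A quick illustration (the fan sizes and speed below are made-up but typical figures):

```python
# Fan affinity laws, roughly: airflow Q ~ N * D^3 (N = RPM, D = diameter),
# for geometrically similar fans. Slower rotation is the main reason a
# single large fan can be far quieter for the same airflow.

def rpm_for_same_airflow(rpm_small, d_small_mm, d_large_mm):
    """RPM a larger fan needs in order to match a smaller fan's airflow."""
    return rpm_small * (d_small_mm / d_large_mm) ** 3

rpm_small = 6000.0  # a small 40 mm laptop-style fan at full tilt
print(round(rpm_for_same_airflow(rpm_small, 40, 120)))  # ~222 RPM for 120 mm
```

Real fans aren't perfectly similar and static pressure matters too, so an actual 120 mm fan would spin faster than that, but the direction of the trade-off is the point.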

They charge a premium for laptop GPUs because they can, but as far as the silicon goes, isn't cost mostly down to die area (plus some choice binning, I guess)? Not that they'll be using anything like a discrete part in a console anyway.

I doubt that laptop GPUs are high-binned parts; they have all the signs of being low-binned parts.

e.g. the GTX 580M runs the same core as the GTX 560 Ti, but is clocked at 620MHz on the core vs 822MHz, shaders at 1240MHz vs 1645MHz, and memory at 3000MHz vs 4008MHz.
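
For what it's worth, those figures all land at almost exactly the same ratio, which is what you'd expect from one chip binned and downclocked for power rather than a different design:

```python
# GTX 580M vs GTX 560 Ti clocks (same core, per the numbers above).
pairs = {"core": (620, 822), "shader": (1240, 1645), "memory": (3000, 4008)}
for name, (mobile, desktop) in pairs.items():
    print(name, round(mobile / desktop, 2))  # all ~0.75: a uniform ~25% downclock
```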
 
They charge a premium for laptop GPUs because they can, but as far as the silicon goes, isn't cost mostly down to die area (plus some choice binning, I guess)? Not that they'll be using anything like a discrete part in a console anyway.

But when it comes to a custom GPU that's only being used in that one system, binning surely means throwing away a lot of working chips, which will drive up cost.

For example, the chip being mentioned (6990M) is 100W, but the standard non-binned version is the 6870, which is 150W. If they were only using that chip for laptops, they'd have to be throwing away a hell of a lot of 6870s...
 
I doubt that laptop GPUs are high-binned parts; they have all the signs of being low-binned parts.

e.g. the GTX 580M runs the same core as the GTX 560 Ti, but is clocked at 620MHz on the core vs 822MHz, shaders at 1240MHz vs 1645MHz, and memory at 3000MHz vs 4008MHz.

I'd suggest they are binned for heat and power rather than high performance.
 
But when it comes to a custom GPU that's only being used in that one system, binning surely means throwing away a lot of working chips, which will drive up cost.

A lot? A few? Some? Obviously, in a PC environment they have the advantage of still being able to sell a part which might not reach certain goals. My point being, I don't think what they are charging for a high-end laptop part has any significant relation to its cost. The console is going to benefit from economies of scale that these laptops most certainly do not.
 
A lot? A few? Some? Obviously, in a PC environment they have the advantage of still being able to sell a part which might not reach certain goals. My point being, I don't think what they are charging for a high-end laptop part has any significant relation to its cost.

Of course, I'm just saying that I'm looking at this within the context of a console.

anexanhume is pointing out a GPU which is binned so they can drop it from 150W to 100W for use in a laptop, and saying that kind of thing would be fine for a console. I don't agree; that sounds like pretty picky binning to me. I'm no expert, but if you're only going to use chips that run 17% cooler than normal, you're going to be throwing away quite a lot of chips.

EDIT: forgot the 20% downclock.
 
Well, it's not just binned to reach 100W; it's downclocked by 20% or so. I really don't think there's much point in looking at a 6990M in terms of what will be in a console launching in 2013: you're going to see two or three generations of parts between now and then, and while the console part might be similar, I wouldn't expect it to be the same. It might be useful as a reference point in terms of TDP, but I'm not sure how helpful that really is.
 
Good point, and something I noticed and then totally forgot in my last post :) A 20% downclock, and another 17% from binning.

I agree it's not useful to look at a laptop GPU to see what next-gen consoles will use.

EDIT: I could be wrong on the TDP, actually; looking around, the 6990M is often reported as 75W. That doesn't do anything to change my mind, though. I can't see any console manufacturer designing their console to require significant CPU/GPU binning over and above "this chip works, this one doesn't".
 
Remember that power and heat do not scale linearly with clock speed.

The drop in clocks alone might be enough to cut the heat and power down that much.
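
As a rough illustration: dynamic power follows the classic relation P ∝ C·V²·f, and a lower clock usually permits a lower voltage as well, so power falls much faster than frequency. The 10% voltage drop below is an assumed figure, not a 6870/6990M spec:

```python
# Back-of-envelope dynamic power scaling: P ~ C * V^2 * f.
# The voltage scaling here is an illustrative assumption.

def relative_power(freq_scale, volt_scale):
    """Power relative to baseline when frequency and voltage are scaled."""
    return freq_scale * volt_scale ** 2

base_tdp = 150.0                        # desktop 6870 board power (W)
clock_only = relative_power(0.8, 1.0)   # 20% clock drop alone -> 0.80
clock_and_v = relative_power(0.8, 0.9)  # plus an assumed 10% voltage drop -> ~0.65

print(round(base_tdp * clock_only))   # ~120 W
print(round(base_tdp * clock_and_v))  # ~97 W, right around the 6990M's figure
```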
 
My point is you're referencing laptop GPUs all the time when we talk about what MS could fit into a console. But those GPUs come at a premium price compared to a similarly specced desktop part. Just because they can get that kind of power into a $2000 laptop doesn't mean MS are happy to do that in a $400 console.

I know MS won't buy each chip off the shelf, but the design cost, the royalties, and even how much it costs to fab them (surely mobile GPUs have significantly lower yields) will all change depending on what they want from the chip.

They wouldn't need the same yields. As others have pointed out, they could afford to exceed the 200W barrier given the space they have to work with.

Also, if a great cooling solution is such a small cost increase compared to one that makes the solder melt in your console, then why didn't MS use one with the Xbox 360? The PS3 was a big, hot and loud box too; why didn't they spend the extra few dollars for a robust cooling system?
Lowest BOM cost while meeting specs. They didn't build in margin because they are graded on cost per console, not engineering excellence. I would almost bet my left arm that the engineers on the project wanted beefier cooling.

I agree with many here.

My bet for next gen is something like 1408 SIMDs/stream processors, as in a future Radeon HD 7850 (effective performance of a 5870?). As many here have said, today we have the 6990M (1120 streams/715MHz) at 40nm dissipating 75 watts in the relatively small space of a notebook (alongside a large battery, etc.), so with a good cooling system it may be possible to fit even a 100-watt GPU.

(A 1408-SIMD HD 7850/6950-class GPU could reach new highs in a closed-box console.)

6990M is a 100W card to compete with the 485M/580M. The 6970M is the 75W card.
 
6990M is a 100W card to compete with the 485M/580M. The 6970M is the 75W card.


Maybe you're right. However, there is conflicting information, with some sources showing the 6990M at 75 watts TDP and others claiming 100 watts. Either way, that's still at 40nm; 28nm might allow a 1408-ALU GPU to still meet the 100 watts, so I still believe an efficient cooling system would allow this level of GPU in a next-gen console.



http://computerinvoices.com/geforce-gtx-580m-sli-radeon-hd-6990m-cf.htm

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

http://www.tomshardware.com/reviews/gtx-580m-sli-hd-6990m-crossfire,3022-2.html

Nvidia on the GeForce 500M revision, the N13-series enthusiast model, at 28nm and 75 watts:

http://www.nordichardware.com/news/71-graphics/43990-nvidias-first-mobile-28nm-gpus-revealed.html

[Image: 28nm.mobile.png]
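
Treating the node-shrink claim as a back-of-envelope exercise: scale the 6990M's 1120 ALUs up to 1408 and credit the 40nm-to-28nm shrink with an assumed perf-per-watt gain (the 1.6x figure below is a guess, not a vendor number):

```python
# Back-of-envelope: could 1408 ALUs at 28nm fit in ~100 W?
# node_gain is an assumed benefit of the full-node shrink, not a vendor figure.

alus_40nm, tdp_40nm = 1120, 100.0  # 6990M, using the 100 W interpretation
alus_target = 1408
node_gain = 1.6                    # assumed 40nm -> 28nm perf/W improvement

est_tdp = tdp_40nm * (alus_target / alus_40nm) / node_gain
print(round(est_tdp))  # ~79 W, under a 100 W budget if the assumption holds
```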
 
You can't please everybody ... once the dance game market is driven into the ground by oversupply, it will take a while to find the next big thing for motion controls. The real gamers are less fickle ... but they do want a significant improvement in graphics. If Microsoft and Sony don't collude, it's very dangerous to disappoint them in this regard, because the other one would mop up.

This is exactly what I've been saying from the beginning. MS and Sony need to push for as much performance as possible, or they risk losing a good chunk of the early adopters to the competitor.

Will those real gamers get significantly better graphics with significantly better hardware when developers are targeting the following platforms?

Xbox 360/PS3 <<< Wii U <<<<< NG1 <<< NG2, where NG1/NG2 are the respective next-gen Microsoft/Sony platforms. Even once they drop support for the legacy platforms, they'll likely still release games for the Wii U. Once the base game design is set for the lowest common denominator, the difference faster hardware makes won't be realized, and doing justice to that extra performance will only make development more expensive. Exclusives won't cut the mustard either, given that they'll have significantly smaller markets to target. If people can't tell that a next-generation console is 2x faster than a current-generation one as a whole, then with diminishing returns they're even less likely to notice that one next-gen platform is 2x the performance of the other.

P.S. I never said slow hardware; my target has always been realistic hardware, given the other requirements they also must meet.

IMO, if we see a repeat of this gen, where the two high-end systems are roughly equal, I see those two systems becoming the target platforms/specs.

If there are sales to support the investment, I do see the Wii U enjoying far greater support than the Wii did this gen. Even if the 720/PS4 outclass the Wii U in specs, it should still be far easier to down-port a 720/PS4 game to a Wii U than it was to port a 360/PS3 game to the Wii.
 
I doubt that. Its peak theoretical performance is only slightly better than the PS3's and 360's, from what I've seen. There's no way that performance can be achieved while consuming only 30W. My guess is between 60 and 100W.

It's using newer, and thus more efficient, designs. The Wii had GPU heating problems when it was drawing less than 20W, and the current Wii U box is only a bit bigger than that. 50W+ seems impossible for that design.
 
It might be slightly more efficient due to experience at the node, but if it's going to have 100%+ of the performance of the PS3/360 on the same node of technology (40nm), it's probably going to be in the same neighborhood for power consumption.
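
A quick sanity check of that reasoning, using the die-shrunk Xbox 360 S as the reference point; its roughly 90 W wall draw while gaming is an approximate figure from contemporary measurements, not an official spec:

```python
# If performance per watt were identical at the same node, matching PS3/360
# performance would imply similar power draw. The 90 W figure is approximate.

xbox360s_watts = 90.0      # approx. wall draw of the die-shrunk 360 S under load
wiiu_perf_ratio = 1.0      # "slightly better than PS3/360" -> call it ~1x
claimed_wiiu_watts = 30.0

implied_watts = xbox360s_watts * wiiu_perf_ratio
print(implied_watts)                              # ~90 W at equal efficiency
print(round(implied_watts / claimed_wiiu_watts))  # 30 W would need ~3x the perf/W
```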
 