Predict: The Next Generation Console Tech

I think you could get a GPU with similar, if not much better, power numbers than a laptop part by doing a custom design. Right now I believe the laptop parts are downclocked and binned for the best power numbers from the same production runs (as the full 7870, for example). All of those parts are on the same process variant (likely 28nm HP, as Nvidia is using) and target ~1 GHz operation.

If, for instance, they chose to build a part with the same number of shaders but targeting only 750 MHz with a custom layout, they should be able to achieve higher density (and thus a smaller die) since the transistors don't need to be as large. Larger transistors give you stronger drive and faster switching times; shrinking them will also lower leakage.

Going to a slower clock may also allow them to use an LP variant of the 28nm process, which should further help power consumption and leakage.

With customization and a lower clock (750 MHz), I think you could get a GPU of around 175mm² and about 2 TFLOPS (1280 shaders at 750 MHz) in about 75W. At least I hope so. Another 35-45W for the CPU, 20W for fast RAM, and 20W for the rest of the system, and you have a system with roughly a 150W power requirement. Not unreasonable.
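
Just to show the quick arithmetic behind those numbers, here's a back-of-envelope sketch in Python; the shader count, clock, and per-component wattages are only the guesses above (taking 40W as the midpoint for the CPU), not confirmed specs:

```python
# Back-of-envelope check of the figures above (all values are guesses, not specs).
shaders = 1280                      # assumed shader (ALU) count
clock_ghz = 0.75                    # assumed 750 MHz target clock
gflops = shaders * 2 * clock_ghz    # 2 FLOPs per clock per ALU (MADD/FMA)
print(f"GPU: {gflops / 1000:.2f} TFLOPS")      # ~1.92 TFLOPS, i.e. "about 2 Tflop"

# Rough system power budget, using the per-component guesses from the post.
power_w = {"GPU": 75, "CPU": 40, "fast RAM": 20, "rest of system": 20}
print(f"System: ~{sum(power_w.values())} W")   # ~155 W, call it roughly 150 W
```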

I'm going to guess that migrating an existing design to an LP process won't be peas and carrots. Designs tend to be optimized for the process they're on.
 
From GAF. Grain of salt and all that (he mistakenly said 32nm Steamroller, where he likely meant Llano or Trinity). The post didn't come out of nowhere, though; it was pestered out of him after multiple posts detailing the origin of his information (he's in Markham, Ontario, Canada and apparently lives with one of the engineers).

The implication here is that the CPU has been downgraded from its original target of a Steamroller-based APU to... yech... a Jaguar-based APU. He's also suggesting, in line with what I've heard, that Microsoft has revised their console to be beefier and that it will essentially be the "OG Xbox" of the generation, with more of AMD's resources dedicated toward that particular console.

Links to original sources are considered good etiquette ;)
 
That GAF guy also pretty much claimed AMD will be doing the next Xbox CPU as well, IIRC, so... yeah, that seems dubious to me. If MS want BC (and it seems to be a priority for them, unlike Sony), pretty sure they would need an IBM CPU again?

He also said something about his roommate/source moving from the Sony project to the MS one. If I recall, back when AMD was doing Wii/360 hardware, they were incredibly strict about keeping the two projects separate, with no communication between them; and that was for two projects that basically weren't competing on performance, so I'd imagine it'd be ten times stricter for Sony vs MS. That sounded like a possible red flag too, for what it's worth.
 
Ok guys, GTX 690. 3072 shaders :p

Put this in a console. 400 watts. Make it as big as you need. 5.6 teraflops. You'll need 8GB of RAM to go with.

599 subsidized :p

A generation gap on top of a generation gap...
 
Ok guys, GTX 690. 3072 shaders :p

Put this in a console. 400 watts. Make it as big as you need. 5.6 teraflops. You'll need 8GB of RAM to go with.

599 subsidized :p

A generation gap on top of a generation gap...

It's $999 just for the card; I'm not sure anyone wants to sell 10 million of something that loses $600+ a unit... that's $6 billion in losses. ;)
 
That GAF guy also pretty much claimed AMD will be doing the next Xbox CPU as well, IIRC, so... yeah, that seems dubious to me. If MS want BC (and it seems to be a priority for them, unlike Sony), pretty sure they would need an IBM CPU again?

He also said something about his roommate/source moving from the Sony project to the MS one. If I recall, back when AMD was doing Wii/360 hardware, they were incredibly strict about keeping the two projects separate, with no communication between them; and that was for two projects that basically weren't competing on performance, so I'd imagine it'd be ten times stricter for Sony vs MS. That sounded like a possible red flag too, for what it's worth.

For what it's worth, like I mentioned in that thread, I had heard MS made an all-AMD switch as well, though I'm still working on getting more confirmation of that. I give MS credit for how well they've kept things close to their chest; the early rumors may have reflected how the console was shaping up before the switch that supposedly made it more powerful than before.
 
I distinctly remember that back in the Wii/360 development days, AMD had separate campuses in separate cities for the two projects, and no communication at all was allowed between them.

And that was for the Wii and 360, which weren't competing in performance terms, so you'd think it wouldn't even be that big a deal. As I say, MS vs Sony would be a hundred times stricter.

So this guy saying his roommate worked on the Sony hardware, then moved to the MS hardware, seems odd, though not entirely impossible.
 
It's $999 just for the card; I'm not sure anyone wants to sell 10 million of something that loses $600+ a unit... that's $6 billion in losses. ;)

Video card pricing bears little relation to cost; it's driven more by market segmentation and margins. The GTX 690 doesn't cost anywhere near $1k to build, though yes, it would be expensive.

Very roughly, most big GPU dies cost ~$100 I think, and GK104 is actually quite small for a big chip. Although I don't doubt yields currently suck, once that gets sorted it should be a pretty economical chip.

Of course my initial suggestion was tongue in cheek, but then again, once you realize it's a 294mm² part, it probably isn't so wild!
 
Ok guys, GTX 690. 3072 shaders :p

Put this in a console. 400 watts. Make it as big as you need. 5.6 teraflops. You'll need 8GB of RAM to go with.

599 subsidized :p

A generation gap on top of a generation gap...

Technically anything is possible; it's just a question of Sony's and Microsoft's subjective cost/benefit analysis, or in other words how they predict consumers will react to different price/performance levels of the hardware. The best entrepreneurs are those (like Steve Jobs, for example) who best predict what future consumers will want to buy. But if they get it wrong, they lose a lot of money...

Sony judged that the $600 PS3 hardware would be a huge financial success; consumers proved them wrong, and Sony was forced to cut the price from $600 to $400 within a year, losing a lot of money.

Would this influence Sony's decisions for the PS4? I think absolutely yes, but how? Only Sony decides, and if they make the wrong prediction about future consumer behavior, they'll be in real trouble...
 
Good piece on chip stacking, new nodes, interposers, bigger wafers etc. All affect potential next gen processors for consoles if MS and Sony want to be enterprising:

http://cdn.eetimes.com/electronics-...ee-20-nm-variants--3-D-ICs-ahead?pageNumber=0

Basically, we are on the cusp of so many technologies: FinFET adoption, TSVs and the use of interposers, 450mm wafers, EUV lithography, etc.

Even that article touches on the high costs and challenges of many of those technologies. I'm doubting any will be ready for console use in 2013-14.

More interesting to me is that the article says there are no fundamental atomic-scale barriers to node shrinkage down to 7nm (it will just be more difficult, slower, and costlier to get there).
 
Is there any technical reason why the Xbox 720/PS4 won't be sporting AMD HD 8850 or higher GPUs with 3-4GB of GDDR5, and still be ready to launch in holiday 2013?
 
Is there any technical reason why the Xbox 720/PS4 won't be sporting AMD HD 8850 or higher GPUs with 3-4GB of GDDR5, and still be ready to launch in holiday 2013?

It may well be sporting newer graphics technology than what's available today, but if you see 3 or 4GB of GDDR5, it would probably be for the entire system. A console isn't going to be running above 1080p (4K is a pipe dream, get over it already), so the benefits of more than 2GB just for graphics are pretty limited, and GDDR5 still isn't cheap.
 
It may well be sporting newer graphics technology than what's available today, but if you see 3 or 4GB of GDDR5, it would probably be for the entire system. A console isn't going to be running above 1080p (4K is a pipe dream, get over it already), so the benefits of more than 2GB just for graphics are pretty limited, and GDDR5 still isn't cheap.

I agree, 2GB of GDDR5 (or XDR2 :p:LOL:) for graphics is realistic; that can do 1080p on modern games with room for high-res textures and processing effects. A 2GB/2GB split (or 4GB unified) seems like our best hope.
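
To put a rough number on why 2GB per pool already goes a long way at 1080p, here's an illustrative Python sketch; the render-target count and bytes per pixel are made-up but plausible assumptions, not figures from any actual engine:

```python
# Illustrative 1080p memory math (assumed render-target layout, not a real engine budget).
width, height = 1920, 1080
bytes_per_pixel = 4          # assumed 32-bit render targets
render_targets = 5           # assumed: back buffer + depth + a few G-buffer targets
double_buffered = 2

rt_mb = width * height * bytes_per_pixel * render_targets * double_buffered / 2**20
print(f"Render targets: ~{rt_mb:.0f} MB")                          # ~79 MB
print(f"Left for textures/geometry: ~{2048 - rt_mb:.0f} MB of a 2 GB pool")
```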
 
There's been a lot of talk online about next gen consoles being underpowered. Why would anyone pay $400 plus for a console that's only a little improvement over this gen?
 
There's been a lot of talk online about next gen consoles being underpowered. Why would anyone pay $400 plus for a console that's only a little improvement over this gen?

They won't. If they are that underpowered (say, worst case scenario, a 6670), they will be priced at $199 or something.

I don't think they will be underpowered, though.

Plus, it's almost impossible to build something that isn't a lot better than this gen. A 6670 is 3-5x this gen. What that 6670 would prevent, IMO, is a move to 1080p, as you wouldn't see enough difference unless we stayed at 720p; plus I imagine it's pretty fill-rate limited.

A 6670 would only have about 1.3x the flops per pixel at 1080p that Xenos has at 720p, whereas it would have 3x at 720p, speaking only of raw flops per pixel. Nobody is going to buy a next-gen console whose games only look 1.5x better, just at 1080p; 3x at 720p would be more compelling.
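
For what it's worth, here's that flops-per-pixel comparison worked out in Python; the peak figures (~768 GFLOPS for a 6670, ~240 GFLOPS for Xenos) are rough assumptions, and the exact ratio shifts a bit with the clocks you assume:

```python
# Rough flops-per-pixel comparison (assumed peak throughput figures).
GFLOPS_6670, GFLOPS_XENOS = 768.0, 240.0
PIXELS_1080P, PIXELS_720P = 1920 * 1080, 1280 * 720

xenos_720p = GFLOPS_XENOS / PIXELS_720P                 # Xenos flops per pixel at 720p
print(f"6670 @1080p vs Xenos @720p: {GFLOPS_6670 / PIXELS_1080P / xenos_720p:.1f}x")  # ~1.4x
print(f"6670 @720p  vs Xenos @720p: {GFLOPS_6670 / PIXELS_720P  / xenos_720p:.1f}x")  # ~3.2x
```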

The more I think about it, the more ridiculous the 6670 idea seems. Next gen would be a laughingstock... Imagine: the 360 got called "Xbox 1.5" more than a few times in its early years...
 