Predict: The Next Generation Console Tech

Looking at R800 leaks, 800-900MHz, ~2.1B transistors at 330mm^2, 2GB of memory, on 40nm (iirc), it would appear some of my estimations of what we could see at 32nm were pretty close. Obviously 2012-2013 is a while off and we don't know a lot about potential shrinks beyond that, but all signs indicate there is a lot of potential for very fast GPUs in future consoles that are significantly better than current gen hardware. 3-4TFLOPs with high levels of IQ (AA, AF, shader precision, efficiency, more memory and management, etc) should be attainable. The amount of work per pixel really will shift the focus to art and good tools for generating content.

Will this end the age of "bigger, better, faster" and move to "fast enough--and more investment in fast product development over sheer platform 'potential'"?
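
Rough napkin math on that (the ALU count below is my assumption of a Cypress-class configuration, not something from the leak):

Code:
# Napkin math: theoretical single-precision throughput for a Cypress-class
# part, and what it takes to hit 3-4 TFLOPs. The ALU count is an assumption.
alus = 1600          # assumed Cypress-class shader ALU count
flops_per_clock = 2  # one multiply-add per ALU per clock
clock_hz = 850e6     # middle of the leaked 800-900MHz range

tflops = alus * flops_per_clock * clock_hz / 1e12
print(f"{tflops:.2f} TFLOPs")  # ~2.72

for target in (3.0, 4.0):
    print(f"{target} TFLOPs needs {target / tflops:.2f}x that throughput")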
 
??? I don't get your point, 3dilettante certainly knows that, hence his comment.
"Seriously" without a question mark means I agree with him :)
Just supplying additional info to underline the argument. His main point seemed to be that it was too early for G80. I'd like to add that G80, as an architecture, can be seen as wasteful for a closed-box design. The extra transistors beyond G70 are in large part spent on wider, fancier ALUs, when in a closed box you'd probably prefer to have more ALUs to get more work done per clock, i.e. old architecture, lots of throughput.

It's the old FP32 vs FP24 vs FP16 debate, basically.
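
To make the precision tradeoff concrete, here's a quick numpy sketch of what FP16 gives up against FP32; there's no FP24 type on CPUs, but it sits between the two:

Code:
# What FP16 gives up versus FP32 (numpy has no FP24 type; R3xx-style
# FP24 sits between the two).
import numpy as np

x = 1.0 / 3.0
print(np.float16(x))  # 0.3333     -- 10-bit mantissa, ~3 decimal digits
print(np.float32(x))  # 0.33333334 -- 23-bit mantissa, ~7 decimal digits

# Error piles up quickly in long shader-style accumulation at low precision:
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(1000):
    acc16 += np.float16(0.1)
    acc32 += np.float32(0.1)
print(acc16, acc32)   # fp16 drifts visibly from 100; fp32 stays close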
 
"Seriously" without a question mark means I agree with him :)
Just supplying additional info to underline the argument. His main point seemed to be that it was too early for G80. I'd like to add that G80, as an architecture, can be seen as wasteful for a closed-box design. The extra transistors beyond G70 are in large part spent on wider, fancier ALUs, when in a closed box you'd probably prefer to have more ALUs to get more work done per clock, i.e. old architecture, lots of throughput.

It's the old FP32 vs FP24 vs FP16 debate, basically.
I completely misunderstood you, no problem then :)
 
The PC is not a console to be launched, it doesn't need AAA exclusives ... the PC will simply be there regardless. Ports, RTSs, CRPGs, MMORPGs keep it in existence. MMORPGs, if nothing else, will keep it alive.

Every major MMO studio is either developing or has plans to develop an MMO on consoles.

In a year Valve might be able to market a DX 11 PC in console form factor for $400 ... a box which would be able to render, say, Mass Effect 2/3 substantially prettier than the Xbox 360.

The Xbox 360 is $299 right now and you want consumers to pay $400 in a year? For higher AA and a tiny game selection.

PS: the casual market isn't a market I'm interested in ... it's an orthogonal issue. It might be the bigger market, it might not be; it certainly doesn't have the games I spend money on. You can't service both markets at the same time, and for the moment the gamer market is still there and too substantial to simply ignore. You won't pull people like me along into the depths of hell consisting of PopCap games and shovelware.

You are a market of one. Nobody is going to spend $50M to make a game to sell to just you, that is, assuming you even pay for it.
 
...3-4TFLOPs with high levels of IQ (AA, AF, shader precision, efficiency, more memory and management, etc) should be attainable. The amount of work per pixel really will shift the focus to art and good tools for generating content.

Will this end the age of "bigger, better, faster" and move to "fast enough--and more investment in fast product development over sheer platform 'potential'"?
This post got me thinking: perhaps next-gen needs the best, most considered design of any console yet? Process shrinks are running out. Hardware afterwards is going to offer very limited improvements, and future processing leaps are either going to need new engineering tech to be mainstreamed, or to go networked.

This suggests to me next gen is going to be long, and the launch hardware is going to have to be designed to last. It'll need lots of leg-room for software optimisations to do more and more with the hardware.

As such I think the hardware should be as open as possible. A big lump of very fast RAM would be more flexible than an eDRAM solution like the 360's. Looking at the 360 now, we see some rendering options it isn't suited to, which limits developers to a certain set of rendering methods. Thankfully that set is plenty good enough for this gen, and the hardware is fast at it! But next gen, if there's to be eDRAM, I think it needs to be an open scratchpad rather than a specialist framebuffer-resolve-only thing. If devs want to stick textures in there, or audio data, let them!

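To put rough numbers on why a resolve-only eDRAM is restrictive, assuming a Xenos-like 10MB and a 720p 4xMSAA target:

Code:
# Why a fixed 10MB eDRAM (Xenos-like) can't hold a full 720p 4xMSAA target,
# forcing tiled rendering instead of free-form use of the memory.
import math

width, height = 1280, 720
bytes_colour, bytes_depth = 4, 4       # 8888 colour + 24/8 depth-stencil
msaa = 4

fb = width * height * (bytes_colour + bytes_depth) * msaa
edram = 10 * 2**20

print(f"framebuffer: {fb / 2**20:.1f} MB")      # ~28.1 MB
print(f"tiles needed: {math.ceil(fb / edram)}")  # 3 tiles at 10 MB
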
I'm feeling programmability will be everything. A slower Larrabee could win out over a faster CPU+GPU combo in the end, after 7 years on the shelves and still selling strong, because developers will be brilliant at turning programmability into attained results.
 
Every major MMO studio is either developing or has plans to develop an MMO on consoles.
Meh, the triple-A MMORPGs are on the PC ... now and in the near future. Plans are nice, but no one is going to spend $50 million targeting consoles while Blizzard is pulling in a small country's GDP and the console MMORPG market is peanuts.
You are a market of one. Nobody is going to spend $50M to make a game to sell to just you, that is, assuming you even pay for it.
I don't pay for your mum either ... but let's not make this personal.
 
Meh, the triple-A MMORPGs are on the PC ... now and in the near future. Plans are nice, but no one is going to spend $50 million targeting consoles while Blizzard is pulling in a small country's GDP and the console MMORPG market is peanuts.
But the console market is peanuts because no-one's put a decent MMORPG on consoles! I'm sure if WoW appeared on console, it'd do as nicely as on PC. Why wouldn't it? It's also stupid to take WoW as the benchmark. Most MMORPGs don't do fantastically. Just like most console games don't sell in Halo/Wii Sports/GTA numbers.

And looking up at the thread title, I see this line of discussion has to make sure it's not off topic. ;)
 
I'm feeling programmability will be everything. A slower Larrabee could win out over a faster CPU+GPU combo in the end, after 7 years on the shelves and still selling strong, because developers will be brilliant at turning programmability into attained results.

How about a dual-GPU solution (w/ or w/o scratchpad) with a handful of fast OOOe cores (e.g. AMD64 cores are fairly small but efficient) with slightly beefed-up vector units? The GPUs with CPUs would allow first gen software to excel and "substantiate" the cost of next gen, but with more flexible GPUs and CT as time progresses more heavy work could be moved GPU side. Think of the GPUs as your graphics & vector hardware, with the CPUs being the low-hanging fruit, the task managers, and everything that doesn't easily work well with the GPU/SPE paradigm. Some things will always be hard to make parallel, so having at least a few very fast, easy-to-use cores (especially if the market moves toward accessibility) could be worthwhile.

How many publishers want to invest $50M in a title where they have to work with 64 SPEs or an all-Larrabee system? Do these designs really meet their immediate and intermediate goals? Can they get similar "legs" going with a different design? How about managed code acceleration ...
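
On the "hard to make parallel" point, Amdahl's law puts numbers on why a few fast cores matter; the 10% serial fraction below is just an illustrative figure:

Code:
# Amdahl's law: a small serial fraction caps what piles of simple cores
# can deliver, which is the case for keeping a few fast OOOe cores around.
def speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (4, 16, 64):
    print(f"{cores:3d} cores -> {speedup(0.10, cores):.1f}x")
# 4 -> 3.1x, 16 -> 6.4x, 64 -> 8.8x: with 10% serial code, going from
# 16 to 64 cores buys surprisingly little.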
 
For PC MMORPGs to work on consoles we would need to bring back the Jaguar controller, or the games would need to come with keyboard and mouse, or possibly sell those stand-alone for extra accessory sales. I'm not sure how many people would use keyboard and mouse with their consoles while still in front of a TV, lying back on their couch. Some type of stand would be needed that made it really comfy.

For MMOs to work with a controller on a console they would need to be simplified quite a bit in some ways, and perhaps made more complex in others.
 
Shifty said:
Lots of stuff, cut for clarity
Larrabee is indeed tempting; I guess it all depends on the performance gap between Intel and ATI/Nvidia. If ATI is right, i.e. "we will crush them by a factor of 4" (I'm not sure I remember the quote properly, but that was the idea), then Larrabee may not be that good a choice; if the gap is minimal, say 50% faster, that's another story.
From what we know the new HD5870 is power hungry; ATI's own word is 190 watts. If we are to double the throughput once again to reach 3-4 TFLOPs, even @28nm power consumption will be way too high for a console. Thus I'm not that optimistic.
Back to Larrabee: it seems to suffer from quite some disadvantages right now. For the same transistor budget (2+ billion) as ATI they end up with a bigger chip, 500+mm² against supposedly 330mm², and to reach the same FLOPS figure they need to clock the thing @2GHz, which @45nm might very well be out of their reach (late hints were more about ~1GHz). That's a consistent disadvantage in costs and perfs.
GPUs will also pay their tribute to flexibility, but it looks like they have some room left. To catch up, Intel needs to reach higher clock speeds to make up for lower process density, and they have to reduce the cost of the massive L2 cache (relative to a GPU; that's 8MB for a 32-core Larrabee). I remember that Intel was working on something close to what IBM did to reduce the number of transistors needed for a memory cell; that may be a savior if they manage to halve (or a bit better) the die space taken by the L2 cache (and it has to be fast enough).

By the way, there is a lot of talk about CPU vs GPU right now (and right here) and about whether the architectures will meet or not. What would a super-flexible GPU look like without going the Larrabee route?
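
Putting rough numbers on the clock-speed point, using the commonly rumoured Larrabee configuration (32 cores, 16-wide vectors with multiply-add), none of it confirmed:

Code:
# Peak single-precision TFLOPs under the rumoured configurations.
def tflops(units, flops_per_unit_per_clock, ghz):
    return units * flops_per_unit_per_clock * ghz / 1000.0

# Larrabee: assumed 32 cores, 16-wide vectors, multiply-add = 32 flops/clock/core
print(f"Larrabee @2GHz: {tflops(32, 16 * 2, 2.0):.2f} TFLOPs")  # ~2.05
print(f"Larrabee @1GHz: {tflops(32, 16 * 2, 1.0):.2f} TFLOPs")  # ~1.02
# Cypress-class: 1600 ALUs, multiply-add = 2 flops/clock/ALU
print(f"HD5870 @850MHz: {tflops(1600, 2, 0.85):.2f} TFLOPs")    # ~2.72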
 
I don't think the next gen consoles will have the cutting-edge technology that a good lot of us want to see. Since their target resolution is 1920x1080... They'll probably just use existing technologies available now and wait for a die shrink, while improving some of the parts to give them a higher level of efficiency. I wonder if they'll continue to use a PPC-based CPU, or go with AMD's or Intel's offerings. The only thing I want to see in those specs is at least 4GB of RAM; I'd be ecstatic if they have 8GB.
 
If Sony goes with Intel, MS may have to anticipate/adapt. Even if Larrabee won't be standard in the PC realm, x86 CPUs (I mean the huge cores) are. That would allow for easy PC ports.

There is a lot of talk about many-core designs for the next generation systems, but I wonder who will have what it takes to support the development of such chips. Intel will have one, that's for sure.
Where GPUs will stand by that time is unclear; what kind of flexibility will DirectX 12 require?
Basically, to match Larrabee one would have to convince IBM to support part of the R&D. It's possible, as there is nothing new on the Cell front and IBM may want something to compete against Larrabee in the HPC market.
In any case, if one console maker is on Intel's side, it would make sense for the other to go with AMD/ATI.
 
In a year Valve might be able to market a DX 11 PC in console form factor for $400 ... a box which would be able to render, say, Mass Effect 2/3 substantially prettier than the Xbox 360.

That PC will have to run Windows, which is at least $50-60 in bulk, not to mention MS will be reluctant to give them bulk pricing since it will compete directly with the 360. This cost isn't shared with other consoles. Microsoft could have done it themselves by releasing a GameOS that only worked with a limited set of hardware, like OS X; it's obvious that they're pushing the 360 instead.
 
These days you can get a $400-500 PC that will shame all consoles, but I truly believe power/heat will be a big factor next gen. That's the reason I'm not getting an HTPC: it'll be big, loud, and will have high power and cooling requirements. A console is almost like a DVD player or a receiver; it needs to have lower power and heat than a PC to sell well.

That is why I could see the PS4 using Cell and Larrabee: Cell is already pretty efficient when it comes to performance/watt, and ever since the P4, Intel has learnt their lesson and made power/heat the #1 consideration in their chip designs. This is directly opposed to ATI/Nvidia, whose video cards use more power than the entire X360 or PS3. Current GPUs definitely aren't power efficient compared to the Core architecture we've seen from Intel, and I believe Intel can make a much better GPU regarding performance per watt than both ATI and Nvidia.

I feel both chips should be connected to at least 4GB of super-fast RAM. The RAM will make next-gen huge open-world games possible. Think Killzone 2 in a city of a million people, with each person having their own AI etc. Also, the PS4 probably wouldn't be any bigger than the PS3 Slim; I think we're done seeing monster-sized consoles at launch.
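
Napkin math on what that much RAM buys per agent (the AI budget below is a made-up illustrative figure, not from any real engine):

Code:
# What 4GB leaves per agent for a million-person city.
ram = 4 * 2**30
agents = 1_000_000

print(ram // agents, "bytes/agent if every byte went to agents")  # ~4294
ai_budget = int(ram * 0.10)  # assume 10% of RAM reserved for AI state
print(ai_budget // agents, "bytes/agent with a 10% AI budget")    # ~429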

Higher resolution textures could be included in PS3 games to make them forward compatible, since Blu-ray has a lot of storage space. MS can do the same by putting them on a second disc and having an install, or as a free download from XBL, since the next Xbox will definitely have some sort of hard drive or SSD as standard. Existing games could also be patched to look better on the next gen consoles if all you're doing is rendering at higher res, with better textures and higher AA.
 
Basically, to match Larrabee one would have to convince IBM to support part of the R&D. It's possible, as there is nothing new on the Cell front and IBM may want something to compete against Larrabee in the HPC market.
You have it all backwards. Intel hasn't proven yet that they can compete with Cell, let alone its next iteration. We know there's an x86 base, which means more transistors spent on instruction decode, which means fewer transistors spent on execution resources. IOW, if both designs are tailored to the same die size, Intel's architecture should be expected to perform worse.
 
I don't think I got it all "backwards"; Intel has nothing to prove in regard to Cell at all.
In some markets (not games) it will be ATI, and more importantly Nvidia, that will have to prove themselves.
 

Given what, say, KZ2 looks like on a lowly 7800GTX... it's kind of absurd what the 2.1B transistors and 2.6 teraflops of a 5870 would give us in a closed-box environment... I mean wow. That's 7 times the transistors of the console GPUs, if not more.
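
Quick sanity check on that "x7" figure, using the usually quoted transistor counts:

Code:
# Transistor-count ratio: Cypress vs the current console GPUs.
cypress = 2.15e9   # ~2.1B+ per the leaks
rsx     = 300e6    # RSX, commonly quoted at ~300M
xenos   = 232e6    # Xenos parent die; the eDRAM daughter die adds ~100M more

print(f"vs RSX:   {cypress / rsx:.1f}x")    # ~7.2x
print(f"vs Xenos: {cypress / xenos:.1f}x")  # ~9.3x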

And one would think we'll see better than a 5870 in next gen consoles. It's pretty scary.

Of course every gen the consoles start out looking scary powerful (all the talk of the gigaflops in Cell and 300M transistors in RSX sounded pretty imposing at one time), but before long they look weak in so many areas. It never fails.
 
The Xbox 360 is $299 right now and you want consumers to pay $400 in a year? For higher AA and a tiny game selection.

But that's always the choice that consumers have to make whenever we get a next-gen console.

Are we going to stop progress now?

However, in previous generations consoles were at $199 or lower price points before the next-gen consoles launched.
 
Of course every gen the consoles start out looking scary powerful (all the talk of the gigaflops in Cell and 300M transistors in RSX sounded pretty imposing at one time), but before long they look weak in so many areas. It never fails.

Yes, but look at, say, FMIII's photomode. Something like a 5870 could run that in realtime at 60fps with all the highest IQ settings (AA, AF, motion blur, shadows, etc). As there are diminishing returns in resolution, poly count, etc., and aliasing of various sorts gets cleaned up, the equation becomes "not a matter of what we can do, but how much we can do" in terms of objects/object detail, and the balance shifts decidedly to art. Sure, better hardware will always look better, but seeing how the major issues this generation (aliasing of edges, textures, and shaders; texture resolution and variety; shadow quality; GI hacks) are being addressed in software, and how the new HW really brute-force resolves them, I, personally, am excited about the prospects of next gen hardware being something we can "live with" for a while.

IQ this gen drives me nuts, especially textures and shadows and the lack of 3D grass and brush. But a lot of progress has been made as well.
 