Predict: The Next Generation Console Tech

Status
Not open for further replies.
Fair point. What is the power envelope for today's console GPUs?

I think RSX and Xenos both started in the 50W range. There would be savings moving the chips to consoles. The total power for both HD systems is under 200W now. I'm not sure what manufacturers would see as the limit for a console. I don't expect them to be 500W monsters, but I suppose it's possible.
 
The PS3 has 256MB of main memory, and the original design has 4 XDR DRAM chips of 512Mbit each. As lithography improves, these four could be reduced to a single 2GBit device soldered to the mainboard, with all communication with the rest of the system unchanged. I'm convinced that at the end of the PS3 life cycle, we'll see this done. To do a similar number with the 512Mbit GDDR3 chips however, you would have to move away from commodity chips.
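The consolidation arithmetic above can be sketched in a few lines (the capacities and device counts are the post's own figures):

```python
# Rough sketch of the XDR consolidation described above.
# Capacities are in megabits; figures come from the post.
MAIN_MEMORY_MBIT = 256 * 8   # 256 MB of main memory = 2048 Mbit

launch_chips = MAIN_MEMORY_MBIT // 512    # four 512 Mbit XDR devices
late_chips = MAIN_MEMORY_MBIT // 2048     # one 2 Gbit device

print(launch_chips, "->", late_chips)     # 4 -> 1
```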
Interesting, but IMO that's not really related to Rambus technology. They just have a lot more data lines per chip, right?

Speaking of which, when will we see the 360 move to four 1Gbit chips?
 
Yeah, for the present version of Cell they'd have to stick with four DRAM devices. Now, on the assumption they do a redesign at 32nm - which seems likely given how much dead area the chip is taking on in these automated shrinks - maybe we'll see them go with XDR-2 instead, since the I/O will be set for a revamp. In that instance we might expect a mem-clock of 6.4 GHz to Cell's 3.2, and thus the ability to reduce to two devices.
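Under those assumptions the device-count argument is simple per-pin arithmetic; the 16-bit interface width per device is an assumption here, not something stated in the thread:

```python
# Sketch of the bandwidth math behind halving the device count:
# doubling the per-pin data rate (3.2 GHz -> a hypothetical XDR-2
# at 6.4 GHz) keeps aggregate bandwidth constant with half as many
# chips. A 16-bit interface per device is assumed.
BITS_PER_CHIP = 16

def bandwidth_gb_s(chips, data_rate_ghz):
    """Aggregate bandwidth in GB/s across all chips."""
    return chips * BITS_PER_CHIP / 8 * data_rate_ghz

four_at_3_2 = bandwidth_gb_s(4, 3.2)   # current XDR arrangement
two_at_6_4 = bandwidth_gb_s(2, 6.4)    # hypothetical XDR-2 arrangement
print(four_at_3_2, two_at_6_4)         # 25.6 GB/s either way
```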
 
I think IBM will feel far more comfortable using a commodity technology than a Rambus one in their enterprise products ...

Maybe IBM is happy with just sharing the cost of CPU logic design with Sony and Toshiba, I mean they already changed the memory interface to DDR2 for their HPC Cell.
 
I'm not sure how involved Toshiba and Sony are in the evolution of Cell (the DP version and beyond).
In some slides I posted on another forum (on Cell performance), the Cell B.E. isn't tagged with a brand, but the other Cell evolutions are branded IBM, or Toshiba in the case of the SpursEngine.

I don't know how STI works in regard to IP.
It looks like everybody can use the original IP, but evolutions, redesigns, etc. are proprietary.
 
Then how will they get 25.6 GB/s? A single 2GBit chip will only get you a quarter of that at 3.2 GHz.
Thanks.
You're right, they use four chips connected in parallel. I should have known that writing from memory on peripheral knowledge is stupid. :)
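The per-chip arithmetic behind the "quarter of that" point works out as follows (again assuming 16-bit device interfaces at an effective 3.2 GHz data rate):

```python
# One 16-bit XDR device at an effective 3.2 GHz data rate moves
# 16/8 * 3.2 = 6.4 GB/s -- a quarter of the 25.6 GB/s that four
# chips running in parallel provide. 16-bit interfaces assumed.
per_chip_gb_s = 16 / 8 * 3.2
total_gb_s = 4 * per_chip_gb_s
print(per_chip_gb_s, total_gb_s)   # 6.4 25.6
```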
It seems getting access to the overviews requires registration at Rambus.
Rambus presentations
Sigh. I think it's preferable to plow through the Qimonda tech docs. Ugh.
 
I think RSX and Xenos both started in the 50W range. There would be savings moving the chips to consoles. The total power for both HD systems is under 200W now. I'm not sure what manufacturers would see as the limit for a console. I don't expect them to be 500W monsters, but I suppose it's possible.


I assume you meant started in the 500W range?
 

Since they are only using half the capacity, they could go to a 2-chip design or even an exotic single split chip. Either way, there's a consolidation strategy available to them.

Another consideration is shrinkage: fewer and more flexible pin placements are desirable, as pin placements must maintain a certain amount of spacing. This is a large part of why the Cell already has a 45nm shrink while the Xenon struggled to get a 65nm shrink and may never see 45nm.

Btw, currently the XDR is the single largest component cost on the PS3, and its shrink/consolidation is the primary focus of cost reductions this year.
 

Complete bullshit...
The Xenon is not directly linked to the RAM; the Xenos acts as the northbridge.
Besides, the connection between the Xenon and Xenos was designed with shrinks and price reductions in mind; it's made through a few really fast serial lanes.

The Xenon still hasn't been shrunk because it costs money and MS is alone on this front... unlike Sony, who has STI.
 
I know every gen we all say this will be the gen of 60fps for all games, but I really expect that by the time the next gen is out, it will be smashing 1080p @ 60fps for most things. I literally cannot see why this won't happen.

60 FPS will never, ever, become the standard, because devs will always push the hardware to its knees.

People thought this gen's GPUs were so beastly they'd never drop below 30 FPS too. And the gen prior to that. And prior to that...

Now we see Xenos and RSX as the pathetic weaklings they are, the exact same way we will see "Xenos 2" and "RSX 2" three years after they are out.

Crysis alone will likely not run at 1080p 4xAA 60 FPS on next-gen consoles... a game that is out now. Current cards don't get close to that, and even a couple more doublings won't do it.

Something that looks as good or better probably will, since Crysis is terribly unoptimized, but it illustrates the point.
 

Indeed, it is a fallacy based on a PC mindset. Yes, these new GPUs can push current games to 60fps at uber resolutions etc, but that is hardly related to the nature of console-developed games. As you say, developers will just push the engine to get the most visual bang a screenshot can show. Whether or not they do it efficiently will be the crux.

Sure we could have 60fps games this gen, but we'd be stuck with last gen graphics. Heck, even the XBLA/PSN games aren't all 1080p 4xAA.
 
I'm not sure how involved Toshiba and Sony are in the evolution of Cell (the DP version and beyond).

They're still very much a collaborative partnership, but yeah, IBM's HPC branch and Toshiba's SpursEngine branch are projects funded by the individual companies pursuing them. The partnership as a whole is mainly concerned with die shrinks and process technologies right now. They'll get more active when the time comes for 'Cell 2' development.

I assume you meant started in the 500W range?

Definitely 50W rather than 500W. :)

Since they are only using half the capacity, they could go to a 2-chip design or even an exotic single split chip. Either way, there's a consolidation strategy available to them.

Another consideration is shrinkage: fewer and more flexible pin placements are desirable, as pin placements must maintain a certain amount of spacing. This is a large part of why the Cell already has a 45nm shrink while the Xenon struggled to get a 65nm shrink and may never see 45nm.

Btw, currently the XDR is the single largest component cost on the PS3, and its shrink/consolidation is the primary focus of cost reductions this year.

Upnorthsox, you've completely lost me. Not only do I not understand your statement wrt 'half the capacity', but the Xenon is definitely going to get down to 45nm.

As for XDR being the single largest cost center right now, I would find that nearly impossible to believe.
 
Indeed, it is a fallacy based on a PC mindset. Yes, these new GPUs can push current games to 60fps at uber resolutions etc, but that is hardly related to the nature of console-developed games. As you say, developers will just push the engine to get the most visual bang a screenshot can show. Whether or not they do it efficiently will be the crux.

Sure we could have 60fps games this gen, but we'd be stuck with last gen graphics. Heck, even the XBLA/PSN games aren't all 1080p 4xAA.

I will carry on living in my fantasy world that one day the majority of games will have 60fps....

get goal....
 
In some sense, the 60+ fps we see on the PC side is due to the overall decline of the PC market - there are very few PC games made to really push the hardware. If PCs were the main target for most graphics-intensive games, you'd be seeing sub-30fps all over the place.
 
Indeed, I concede, you are all correct. We will not have too many 60fps console games next gen. It will be the same as this gen, but at 1080p instead of 720p...
 

Which will be very interesting, as it means PCs won't even be scaling up resolution over console games any more. Image quality should be roughly even as well (although by then 8xMSAA may well be a high-end standard for PCs).

So where will PCs pour all that extra power? We're only going to need double the power for 60fps/1080p in "next gen" console games, so that's achieved a year after their release. What do even more powerful GPUs do after that?

Hopefully devs will do something with all that spare power. Since mid-range PCs will usually run at a lower res than the consoles, and hence not need as much power, we may see console-to-PC ports becoming even more common.
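The "double the power" estimate above can be sanity-checked with simple pixel-rate arithmetic. This is only a first-order sketch; real GPU load does not scale purely with pixels per second:

```python
# Rough fill-rate comparison: 1080p60 is 2x the pixel throughput
# of 1080p30, and 4.5x that of 720p30. Shader, bandwidth, and CPU
# costs don't scale this simply, so treat it as a lower bound.
def pixels_per_second(width, height, fps):
    return width * height * fps

p1080_60 = pixels_per_second(1920, 1080, 60)
print(p1080_60 / pixels_per_second(1920, 1080, 30))  # 2.0
print(p1080_60 / pixels_per_second(1280, 720, 30))   # 4.5
```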
 
That's if 1080p will be the standard for "next-gen" consoles to begin with (something I doubt)!
 