Predict: The Next Generation Console Tech

Status
Not open for further replies.
That's about the technical side of console design. Cell is special for Sony because its IP cost is supposed to be cheaper than that of IP developed entirely by a third party. In other words, SCE has to recover Cell's development cost by reusing its IP in a game console, unless it's embedded in every TV. Early in a console's life cycle transistor cost matters most, but after die shrinks the IP cost becomes the more apparent target for cost cuts.
Sony will have to pay for the GPU no matter what.

And even if they can reuse Cell freely, designing a huge 4-PPU/32-SPU chip wouldn't come free in terms of R&D, and implementation could be costly:
it needs huge internal and external bandwidth
the GPU needs bandwidth too, so a UMA design could be difficult to achieve
etc.

In short, making a chip four times bigger won't come anywhere close to free in regard to costs.
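The cost of scaling up a die can be sketched with a standard Poisson yield model; the defect density and the Cell-sized base area below are illustrative assumptions, not figures for any real process:

```python
# Why a 4x-bigger die costs far more than 4x per good chip: a simple
# Poisson defect model. Defect density and die area are assumptions.
import math

def die_yield(area_cm2, defects_per_cm2=0.5):
    """Fraction of good dies under a Poisson defect model."""
    return math.exp(-defects_per_cm2 * area_cm2)

base_area = 2.2  # cm^2, roughly a Cell-sized die @ 90nm (assumption)
base_cost = base_area / die_yield(base_area)  # relative cost per good die

for factor in (1, 2, 4):
    area = base_area * factor
    y = die_yield(area)
    rel_cost = (area / y) / base_cost
    print(f"{factor}x die: yield ~{y:.0%}, cost per good die ~{rel_cost:.1f}x")
```

Under this toy model a die four times larger costs far more than four times as much per good chip, since yield falls off exponentially with area.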
 
Too Tricky...
Fewer chips on the board = better board.

Start with a mature die process and design a complementary internal bus architecture.
Join separate chips into System on Chip units as production and die processes mature.
OK, but look at this generation: depending on the silicon spent on the future system, it will be tough to pack all these little cores together.

Anyway you're right: fewer chips on the board = better board.

But I was short on time last night (had to go to bed, working early in the morning :( ).

I think that implementing two GPUs is non-trivial.

If we look at how SLI/Crossfire systems work right now, it seems clear that manufacturers won't be in a position to afford it:
two VRAM pools
twice the ROPs
etc.

How to deal with that and reduce the overheads?
I guess that's where it could hurt in regard to R&D.

I guess that at some point the CPU provider and the GPU provider will have to work together.

What could it look like?

I don't know what is possible in regard to process/R&D cost, etc.

I see two possibilities:

1) Include most of the GPU's fixed-function hardware on the CPU, as well as the memory controller, and you're left with two coprocessor chips made mostly of shader cores.

2) Something that looks more like what AMD could come up with. It could be made of only two chips:
1: CPU + shader cores; 2: shader cores plus fixed-function hardware.


While the latter might lead to a cheaper motherboard design, I feel the former would yield better results and wouldn't cost that much

(always questionable when we speak of millions and millions of units...).

You could give the CPU/GPU huge bandwidth and use two fast serial lanes to connect the two coprocessors to the CPU (to the GPU thread scheduler, in fact; think of a Y shape for the datapath).
If the main thread scheduler is on the CPU, it could make load balancing easier than having two schedulers (one on each die) trying to negotiate with each other.
 
Sony will have to pay for the GPU no matter what.
GPU designs and associated costs won't be the same in every possible case. In addition, there are other factors such as available process technology, memory technology, yield, and motherboard layout. In the above examples by Shifty, "300 M on CPU and 300 M on GPU, or 50 M on CPU and 550 M on VPUs, across multiple dies if necessary", the cost structures of these 2 designs will be very different. If I were a GPU manufacturer and it had to be on 2 dies, I would demand 2x licensing fees plus a SLI/Crossfire license! Another drawback is, the CPU in the latter configuration will not benefit from process shrinks as much as the former.
 
Last edited by a moderator:
GPU designs and associated costs won't be the same in every possible case. In addition, there are other factors such as available process technology, memory technology, yield, and motherboard layout. In the above examples by Shifty, "300 M on CPU and 300 M on GPU, or 50 M on CPU and 550 M on VPUs, across multiple dies if necessary", the cost structures of these 2 designs will be very different. If I were a GPU manufacturer and it had to be on 2 dies, I would demand 2x licensing fees plus a SLI/Crossfire license! Another drawback is, the CPU in the latter configuration will not benefit from process shrinks as much as the former.
I agree completely.
That's why I made a post about how this kind of setup could be implemented.

But more insight is welcome on what is possible or not (even with cost put aside) ;)

EDIT
About the size of the CPU: it's one of the reasons why I wondered whether it would be possible to put all the GPU's fixed-function units on the CPU.
A bigger chip (higher pin count, I guess) can help in regard to bus width.
 
About the size of the CPU: it's one of the reasons why I wondered whether it would be possible to put all the GPU's fixed-function units on the CPU.
A bigger chip (higher pin count, I guess) can help in regard to bus width.
AMD would want it, since they could license a future version of their Torrenza co-processor technology for silicon-level integration along with all the software infrastructure (though I don't know how difficult that would be for game programmers), while Intel is trying to use x86 and SMP for the very same reason.

Then again, WRT Sony, the Cell B.E. architecture (and its DMA programming model) already covers it as a heterogeneous design. It has a ring bus (EIB) for scalability. SpursEngine has 4 SPEs and 4 H.264/MPEG2 fixed-function cores interconnected by EIB. Of course they'll continue to evaluate all possible deals as they did with NVIDIA for RSX till the last deadline, but reinventing a wheel and getting a license of a new third-party IP is not a very likely option for them at this moment, IMHO, unless it can be a bargain for some external reason like Rambus and NVIDIA deals or it's super-efficient in perf/watt.

Also, Wii's effect on the market is tremendous. Investors won't like a console design that is ambitious in terms of manufacturing cost and long-term vision. It will make a console life cycle shorter, and make a leap between consoles smaller. In a word, it will become conservative. The age of "300 M on CPU and 300 M on GPU, or 50 M on CPU and 550 M on VPUs, across multiple dies if necessary" may be already over for all 3 companies.
 
Also, Wii's effect on the market is tremendous. Investors won't like a console design that is ambitious in terms of manufacturing cost and long-term vision.
Do investors get a say? They can choose to pull out of funding a company if they think the hardware's gonna be a flop, but the companies don't need to declare their hardware intentions. Investors in Nintendo had no idea what they were gonna pull.
 
I don't think Wii's success will make Microsoft & Sony do modestly upgraded 360/PS3 for the NextXBox/PS4.

32nm should be the process for NextXbox CPU and GPU.
Maybe PS4 also.

The leap from current multicore CPUs (Xenon and CELL) which have ~165M and ~235M transistors to manycore CPUs is going to require at least a billion transistors.

The current console GPUs are over 300M transistors (Xenos GPU+EDRAM and RSX), so I'm thinking at least 2B transistors for each next-gen GPU. That's being really conservative too.

The upcoming GT200 has at least 1B and it's a design that should've been out in late 2007, instead it's coming Q3 2008. Nvidia's next-gen architecture for 2009-2010 should be at least 2B transistors.


3B transistors in next-gen Microsoft and Sony consoles is a pretty conservative estimate given that current consoles have roughly 1/2 a billion.
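As a rough sanity check on these transistor budgets, ideal area scaling (density growing with the square of the feature-size ratio) gives an upper bound on what a 90nm-sized silicon budget could hold at smaller nodes; real processes scale worse than this:

```python
# Quick sanity check on the transistor-budget guesses above, assuming
# ideal area scaling (density ~ 1/feature_size^2). Treat the results
# as upper bounds; real processes fall short of ideal scaling.

def density_gain(old_nm, new_nm):
    """Ideal transistor-density gain from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

current_transistors = 0.5e9  # roughly CPU+GPU of a current console, per the thread

for node in (65, 45, 32):
    gain = density_gain(90, node)
    print(f"90nm -> {node}nm: ~{gain:.1f}x density, "
          f"same silicon area holds ~{current_transistors * gain / 1e9:.1f}B transistors")
```

On those ideal numbers, a 32nm console with the same silicon area as today's could hold roughly 4B transistors, which is why 3B reads as conservative.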
 
AMD would want it, since they could license a future version of their Torrenza co-processor technology for silicon-level integration along with all the software infrastructure (though I don't know how difficult that would be for game programmers), while Intel is trying to use x86 and SMP for the very same reason.

Then again, WRT Sony, the Cell B.E. architecture (and its DMA programming model) already covers it as a heterogeneous design. It has a ring bus (EIB) for scalability. SpursEngine has 4 SPEs and 4 H.264/MPEG2 fixed-function cores interconnected by EIB. Of course they'll continue to evaluate all possible deals as they did with NVIDIA for RSX till the last deadline, but reinventing a wheel and getting a license of a new third-party IP is not a very likely option for them at this moment, IMHO, unless it can be a bargain for some external reason like Rambus and NVIDIA deals or it's super-efficient in perf/watt.

Also, Wii's effect on the market is tremendous. Investors won't like a console design that is ambitious in terms of manufacturing cost and long-term vision. It will make a console life cycle shorter, and make a leap between consoles smaller. In a word, it will become conservative. The age of "300 M on CPU and 300 M on GPU, or 50 M on CPU and 550 M on VPUs, across multiple dies if necessary" may be already over for all 3 companies.

A collaboration between ATI and IBM seems more likely than with Nvidia, I agree.
On the other hand, ATI/AMD may be lacking on the software side: if Intel and Nvidia offer Havok and PhysX support everywhere, they are likely to make most of the optimisations for their respective platforms.
While this wouldn't be much of a problem for MS, I agree it could be more annoying for Sony.

I've agreed on the silicon budget since the very beginning of this thread ;)
Wii's impact, and maybe less room for future die shrinks, will push manufacturers to design cheaper systems.
But I don't think MS or Sony will go with something as conservative as Nintendo did with the Wii.

May edit and add some stuff later.
 
I don't think Wii's success will make Microsoft & Sony do modestly upgraded 360/PS3 for the NextXBox/PS4.

32nm should be the process for NextXbox CPU and GPU.
Maybe PS4 also.

The leap from current multicore CPUs (Xenon and CELL) which have ~165M and ~235M transistors to manycore CPUs is going to require at least a billion transistors.

The current console GPUs are over 300M transistors (Xenos GPU+EDRAM and RSX), so I'm thinking at least 2B transistors for each next-gen GPU. That's being really conservative too.

The upcoming GT200 has at least 1B and it's a design that should've been out in late 2007, instead it's coming Q3 2008. Nvidia's next-gen architecture for 2009-2010 should be at least 2B transistors.


3B transistors in next-gen Microsoft and Sony consoles is a pretty conservative estimate given that current consoles have roughly 1/2 a billion.

I guess it's easier to count in die size.

In the 360 @ 90nm:
Xenon ~170mm²
Xenos ~180mm²
eDRAM ~70mm²
= ~420mm²

As I stated earlier in this topic, and as One reminds us, we're likely to have less.
I would put my bet between 300 and 350mm².
Even if MS and Sony come up with powerful systems, I feel they will try to offer a tighter package.
@90nm, 1mm² was roughly a million transistors.
Will edit later... definitely have to go to bed.

EDIT
With such a silicon budget @32nm there would be no need for a dual-GPU system.
But if the systems launch @45nm, why not.
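The arithmetic above can be sketched directly; the die sizes and the ideal (new/old)² shrink are the thread's own rough assumptions:

```python
# The 360 die-size arithmetic above, plus ideal shrinks. Die sizes and
# the ~1 Mtransistor/mm^2 @ 90nm rule of thumb are the thread's figures.

xbox360_90nm = {"Xenon": 170, "Xenos": 180, "eDRAM": 70}  # mm^2

total = sum(xbox360_90nm.values())
print(f"360 total @ 90nm: ~{total} mm^2")

# Ideal area scaling: the same design shrinks as (new/old)^2.
for node in (65, 45, 32):
    print(f"same silicon @ {node}nm: ~{total * (node / 90) ** 2:.0f} mm^2")
```

Ideally the full 420mm² of 90nm silicon would shrink to around 105mm² at 45nm and just over 50mm² at 32nm, which shows how much budget a 300-350mm² next-gen target leaves for new logic.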
 
Do investors get a say? They can choose to pull out of funding a company if they think the hardware's gonna be a flop, but the companies don't need to declare their hardware intentions.
If investor info is blacked out, then higher executives who are not directly responsible for games will complain about high-risk moves, anticipating what investors will do once they find out.
The upcoming GT200 has at least 1B and it's a design that should've been out in late 2007, instead it's coming Q3 2008. Nvidia's next-gen architecture for 2009-2010 should be at least 2B transistors.
Will it have a reasonable thermal design power at 32nm for a console? Even though we seem to need a bigger PC PSU each year, I doubt there's no ceiling.
 
I have not read the thread all the way so excuse me.

My prediction is as follows...

Sony will stick with IBM and Nvidia for the PS4. A new more powerful revision of Cell most likely starting on the 45nm process and the GPU being a distant cousin of GT200 of some sort.

I believe Nintendo will go all AMD next time around ditching the IBM based CPU. I'm thinking Nintendo may go with a Fusion APU with no external GPU.

MS on the other hand I think is going to be the most interesting of them all, with an Intel Nehalem-based CPU and a Larrabee-based GPU.
 
I have not read the thread all the way so excuse me.

My prediction is as follows...

Sony will stick with IBM and Nvidia for the PS4. A new more powerful revision of Cell most likely starting on the 45nm process and the GPU being a distant cousin of GT200 of some sort.

Some variant of GT200 would be far too old by 2012-2013, when PS4 will probably arrive. GT200 is almost certainly going to be an overhaul of the G80 architecture, much like G70 was an overhaul of NV40. GT200 is not Nvidia's true next-gen, all-new, clean-sheet architecture, like NV40 and G80 were in 2004 and 2006 respectively. The next completely new Nvidia architecture should see the light of day in 2009 or 2010. It'll combat Larrabee, R800, etc.

Let's call that all-new next-gen architecture NV60/GeForce 11, with GT200 being NV55/GeForce 10 (G80 & G92 being NV50). The PS4 GPU should be a distant cousin of NV60, not this year's upcoming GT200/NV55, IMO.


I believe Nintendo will go all AMD next time around ditching the IBM based CPU. I'm thinking Nintendo may go with a Fusion APU with no external GPU.

That would be one of several good options for Nintendo. They do need a new, modern architecture that takes them out of the late 1990s and into the 21st Century. Wii's Broadway CPU and Hollywood GPU are very much 90s tech.

MS on the other hand I think is going to be the most interesting of them all, with an Intel Nehalem-based CPU and a Larrabee-based GPU.

I doubt Microsoft will go with Intel technology for the CPU. The only Intel thing Microsoft might go with is Larrabee, if Larrabee starts to prove itself with its first showing this year. Microsoft will either stay with IBM for the CPU (something like an improved Xenon with more, better cores, more cache, etc.) or do something custom of their own with help (a lot of help) from IBM or AMD.
 
Some variant of GT200 would be far too old by 2012-2013 when PS4 will probably arrive.

For goodness' sake, I would hope not. 2012 sounds a bit too far off. I think we will start hearing about the new Xbox by 2009/2010.


I doubt Microsoft will go with Intel technology for the CPU. The only Intel thing Microsoft might go with is Larrabee, if Larrabee starts to prove itself with its first showing this year. Microsoft will either stay with IBM for the CPU (something like an improved Xenon with more, better cores, more cache, etc.) or do something custom of their own with help (a lot of help) from IBM or AMD.

Well, I doubt they will stick with IBM, as the PowerPC is long in the tooth. I suppose they could stick a Cell in there, as I don't think it's tied to just Sony, but that's kind of a stretch. AMD seems possible, but nothing from that camp sounds like it would fit. Regardless, I do think Intel will put roots in the next Xbox one way or another.
 
For goodness' sake, I would hope not. 2012 sounds a bit too far off. I think we will start hearing about the new Xbox by 2009/2010.

Hearing about a new Xbox by 2009 or 2010, yeah, but you were talking about PS4 in that paragraph. I don't expect to hear about PS4 until 2011-2012, with a release in 2012-2013, which is 6-7 years after PS3, roughly the same amount of time PS2 had before PS3 came out.

The GPU won't be related to GT200 because that GPU will be far too old and outdated for PS4. That would be like thinking in April 2003 that PS3 would be using something related to NV35/GeForce FX, a part several years behind PS3's actual NV47-based GPU.

Here's what I mean, looking down the road, an example of how things could go:

2006: new Nvidia architecture (NV50/G80) ==> 2007: G80 respin (G92) ==> mid-2008: G80/G92 refresh/upgrade/overhaul (GT200) ==> early 2009: GT200 respin ==> late 2009/early 2010: new Nvidia architecture (NV60) ==> 2011: NV60 refresh/upgrade/overhaul (NV65) ==> PS4 GPU an NV65 derivative?
 
For the 360 it would come from increasing the size of eDRAM to about 30MB. All IMHO, of course!
 
For the 360 it would come from increasing the size of eDRAM to about 30MB. All IMHO, of course!

One of the things I never understood about the 360's architecture is why they decided on 10MB.

Knowing that this generation would be completely HD, why not make the eDRAM large enough to handle the minimum HD resolution, 720p?
 
One of the things I never understood about the 360's architecture is why they decided on 10MB.

Knowing that this generation would be completely HD, why not make the eDRAM large enough to handle the minimum HD resolution, 720p?
It's about cost, really.

The eDRAM is enough to handle 720p, just not with AA. It's a bit odd that they didn't go with 7.5MB or 15MB, but I guess 10MB allows those ~600p 2xAA resolutions (clearly better than SD but not possible with only 7.5MB) as well as 3 tiles for 4xAA instead of 4, and 15MB was probably too expensive.

I'm sure ATI would have loved it if they could put the eDRAM on the same die like Sony did with PS2, thus allowing better use of it for texturing, post-processing, and deferred rendering, but the economics of the fabs they had access to precluded it.
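The framebuffer arithmetic behind those tile counts can be checked quickly, assuming the usual 32-bit colour plus 32-bit depth/stencil per sample for Xenos render targets:

```python
# Back-of-envelope framebuffer sizes for Xenos' 10 MB eDRAM.
# Assumes 32-bit colour + 32-bit depth/stencil per sample (8 bytes),
# the usual accounting for the 360's render targets.

MB = 1024 * 1024
EDRAM = 10 * MB

def fb_bytes(width, height, msaa=1, bytes_per_sample=8):
    """Colour + Z bytes for one render target at a given MSAA level."""
    return width * height * msaa * bytes_per_sample

def tiles_needed(width, height, msaa):
    """How many tiles Xenos must render to fit the target in eDRAM."""
    return -(-fb_bytes(width, height, msaa) // EDRAM)  # ceiling division

for msaa in (1, 2, 4):
    size = fb_bytes(1280, 720, msaa)
    print(f"720p {msaa}xAA: {size / MB:.1f} MB -> "
          f"{tiles_needed(1280, 720, msaa)} tile(s)")
```

720p with no AA is about 7.0 MB, so it fits in one tile; 2xAA needs two tiles and 4xAA needs three, matching the tiling trade-off described above.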
 
It's about cost, really.

Indeed! From a physical standpoint, an extra 50% in eDRAM space is not trivial, particularly considering the heat problems we already see and the manufacturing issues with larger chips. eDRAM is also more difficult to produce, I believe.

The eDRAM is enough to handle 720p, just not with AA. It's a bit odd that they didn't go with 7.5MB or 15MB, but I guess 10MB allows those ~600p 2xAA resolutions (clearly better than SD but not possible with only 7.5MB) as well as 3 tiles for 4xAA instead of 4, and 15MB was probably too expensive.

hm... I'm inclined to believe the 10MiB figure was also chosen to make adding 4xMSAA to backward-compatible games very easy. Seems like the simplest rationale. :p
 
I predict that Microsoft will release the Xbox 720 (or whatever it is called) this fall, during the Christmas retail season. It will have a Blu-ray drive, 2 GB of RAM, and a 3-year-newer GPU and CPU, but otherwise will just be mildly faster (say 4 to 8 times the existing pixel fill rate overall) than the existing 360. Microsoft-brand Xbox 360 peripherals will be compatible with it, but third-party accessories will all be broken. Oh, and it will include the MS version of the Wiimote. Because it is, from a technical standpoint, the existing 360 with just a few upgrades, it will be 100% backwards compatible with 360 software.

I think the MS version of the Wiimote will forgo the motion controls and just be a pointer. Its main use will be for some version of Internet Explorer on the device. Web developers' only option for getting multimedia-enabled web pages on it will be Silverlight; it will never support the Adobe products. It might even be hampered in the areas of DHTML; again, to get those sorts of effects, you will need to use Silverlight.

It will retail in North America for about $600.
 