Nintendo Wii specs (courtesy of maxconsole.net)

The problem I see is that Nintendo listed the PowerPC in Wii as a 90nm CPU, while the GL is a 130nm CPU. Unless Nintendo has paid to shrink the GL from 130nm to 90nm, I don't understand how they can be the same processor, and my confusion only grows when I remember that Iwata said Nintendo didn't build new factories for Wii components.

I don't really see that as a problem, to be honest. If the chip is a 750GL then no doubt it will be tweaked anyway (just like the 750CXe was for Gekko). Also, IBM/NEC would do the work to shrink the CPU to the new process, and NEC's factories would fab the chips, so Nintendo wouldn't need to build any new factories for that.
 
The problem I see is that Nintendo listed the PowerPC in Wii as a 90nm CPU, while the GL is a 130nm CPU. Unless Nintendo has paid to shrink the GL from 130nm to 90nm, I don't understand how they can be the same processor, and my confusion only grows when I remember that Iwata said Nintendo didn't build new factories for Wii components.
I'd missed that in all the 750GX/GL talk. I just thought perhaps Nintendo went with a 130nm chip. It says right on their webpage, though, that Broadway is made on a 90nm SOI process.

Transitioning from 130nm to 90nm should take a 750GX (with 1 MB L2 cache) from 52.5 mm^2 to 25 mm^2. That's a fair bit smaller than Gekko's die size of 43 mm^2 on 180nm. At this point, I can't see it being economical for Nintendo to pay IBM to shrink a 750 to a process it is not produced on and underclock it to about half of its potential speed. (I'm estimating a 750GX @ 90nm would top out at 1500 MHz. The 750CXe @ 180nm went to 600 MHz and the 750GX @ 130nm hits 1000 MHz.)
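As a quick back-of-the-envelope sketch of those figures (assuming ideal 2D scaling, where die area shrinks with the square of the feature size; real shrinks only approximate this):

Code:
# Rough die-shrink estimate under ideal 2D scaling.
# Real processes rarely scale this cleanly, so treat these as rough bounds.

def scaled_area(area_mm2, old_nm, new_nm):
    """Estimate die area after a process shrink, assuming ideal scaling."""
    return area_mm2 * (new_nm / old_nm) ** 2

# 750GX with 1 MB L2: ~52.5 mm^2 on 130nm
print(round(scaled_area(52.5, 130, 90), 1))   # -> 25.2 mm^2 on 90nm

# Very hand-wavy clock extrapolation (frequency ~ 1 / feature size):
# 750GX tops out around 1000 MHz on 130nm
print(round(1000 * 130 / 90))                 # -> ~1444 MHz on 90nm

Both numbers line up with the ~25 mm^2 and ~1500 MHz estimates above.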
 
I wonder if this isn't true (750GL-based), as I remember a rumour from 1-2 months ago that the CPU would end up at 1.1 GHz?
 
I wonder if this isn't true (750GL-based), as I remember a rumour from 1-2 months ago that the CPU would end up at 1.1 GHz?

The 750GX/GL run at up to 1 GHz on 130nm; I'm sure they would surpass 1.1 GHz on 90nm. Though we haven't heard much to back up this 1.1 GHz rumour since it first came out, so...

is the 750GX comparable to a G5 at the same clock speed?

No.
 
The 750GX/GL run at up to 1 GHz on 130nm; I'm sure they would surpass 1.1 GHz on 90nm. Though we haven't heard much to back up this 1.1 GHz rumour since it first came out, so...

alternatively, they may not be so interested in upping the clock as downing the wattage.
 
is the 750GX comparable to a G5 at the same clock speed?

Maybe in memory access times or int performance, but overall no way.

alternatively, they may not be so interested in upping the clock as downing the wattage.

For such a low power chip, I don't know if dropping to 90nm would help all that much. Wouldn't SOI help more?
 
alternatively, they may not be so interested in upping the clock as downing the wattage.

I meant that there's no reason to think it can't be a 750GX/GL derivative based on a rumour about a 1.1 GHz clock speed, because either of those CPUs should run faster than 1.1 GHz on a 90nm process.
 
Why use an old G3-series CPU instead of a G4e or G5 CPU??
The G4 was released before the Gamecube was released, and the G5 was out there in 2003.

The G4 is basically a G3 with AltiVec vector units, but a G4e, with improved integer and floating-point performance, a deeper pipeline and instruction queues, would be a hell of a great CPU for Wii.
Is it a matter of cost??
 
EVERYTHING is a matter of cost in the console biz.

It's also a matter of power dissipation, size, complexity, ease of integration, convenience, and likely a host of other reasons.

Nintendo is apparently building Wii as a slightly scaled-up GC performance-wise, and a significantly scaled-down one size-wise. Therefore it makes sense they'd use the original CPU; why spend a lot of effort adapting a new CPU to an already finished hardware platform? 'If it ain't broke, don't fix it' applies perfectly in this situation.
 
Why use an old G3-series CPU instead of a G4e or G5 CPU??
The G4 was released before the Gamecube was released, and the G5 was out there in 2003.

The G4 is basically a G3 with AltiVec vector units, but a G4e, with improved integer and floating-point performance, a deeper pipeline and instruction queues, would be a hell of a great CPU for Wii.
Is it a matter of cost??

Because it is Freescale (the old Motorola semiconductor division) who holds the rights to the G4e, and at the same time the original VMX implementation patent and design are Motorola's property. IBM used a VMX design of their own for the PowerPC 970 only, and they don't have the tech for a VMX unit on the G3 architecture.

It's sad that my iMac G4 1 GHz is more powerful than Wii (G4 1 GHz, 256MB RAM (upgraded to 768MB), 64MB video, NV17 GPU), and if we look at the Mac mini we can see how they could have gotten better results out of a little box.

All the blame for the simple Wii design lies with IBM: the G5 cannot be used in a little box (this is why Apple jumped to Intel, because their laptops had fallen behind PCs in technical specs), and Nintendo was forced to re-use the GCN technology.

An AMD-ATI HD console for Nintendo next time? Who knows? Perhaps.
 
It's also a matter of power dissipation, size, complexity, ease of integration, convenience, and likely a host of other reasons

I know. But the G4 at 0.20um is 80 mm^2 in size, and surely it could fit in 40 mm^2 at 0.12um or 0.09um. It also may not dissipate much more. These things also depend strongly on on-die cache size and frequency.
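For what it's worth, the same ideal-scaling arithmetic sketched earlier in the thread puts the G4 well under 40 mm^2 (ideal scaling is optimistic, so 40 mm^2 is a sensibly conservative figure):

Code:
# Same ideal die-shrink arithmetic applied to the G4 figure quoted above
# (~80 mm^2 at 0.20 um). Real shrinks scale less than ideally.
g4_area = 80.0  # mm^2 at 0.20 um

for new_um in (0.12, 0.09):
    print(f"{new_um} um: ~{g4_area * (new_um / 0.20) ** 2:.1f} mm^2")
# 0.12 um: ~28.8 mm^2
# 0.09 um: ~16.2 mm^2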

IBM used a VMX design of their own for the PowerPC 970 only, and they don't have the tech for a VMX unit on the G3 architecture.

Didn't the G4 implement VMX/AltiVec?? It had 3 AltiVec vector units, if I recall correctly.

G5 cannot be used in a little box

Why?
 
I know. But the G4 at 0.20um is 80 mm^2 in size, and surely it could fit in 40 mm^2 at 0.12um or 0.09um. It also may not dissipate much more. These things also depend strongly on on-die cache size and frequency.

Didn't the G4 implement VMX/AltiVec?? It had 3 AltiVec vector units, if I recall correctly.

Why?

AltiVec/Velocity Engine/VMX was originally a Motorola design from outside the AIM consortium (Apple-IBM-Motorola) that was added to Motorola's version of the 750 to create the PowerPC 7400, aka the PowerPC G4. This is why IBM cannot use it and why a G3+VMX from IBM doesn't exist. Big Blue designed their own VMX for the PPE and G5 pipelines, but never for the 750; you have to remember that the 750 is a short-pipeline processor, while the 970 and PPE are long-pipeline processors.

In other words, IBM never had a VMX unit for the 750 architecture in development; one never existed.

And about the G5: Apple had plans to use it in the PowerBook line, but the huge amount of heat the G5/970 produces makes it impossible to build a laptop, a Mac mini-like computer, or a console the size of Wii with a 970 inside. The PowerPCs from the 603 to the G4 were designed for the consumer market and their performance per watt was great (this is why things like the original iMac, the original iBook, the PowerMac G4 Cube and other computers were possible, and it was that same argument, performance per watt, that drove the jump from PowerPC to Intel), but the G5 is just a Power4 core with VMX, and the Power4 design wasn't made with laptops and systems like Wii in mind.
 
What would stop them from putting in a second core? Even if it's just two CPUs on one die (like the Pentium D), that would still make the CPU about 21 mm^2 (cheap, small, low power, low dev costs...), and it should still give a nice boost to performance (especially given that the renderer on Wii still puts a lot of work on the CPU), even if it's not the best option.

Anyway, I still find it hard to believe that Wii is just a GC at 150% of the performance. For example, why are they upgrading their tools (Nintendo/Havok...)? That would be normal if there were a new architecture or a superset of the old one, but otherwise it doesn't make much sense.
 
What about Xenon and Cell?? I'm sure they are big in size, and I don't think they consume little power ;)
Sure, Sony and Microsoft are not looking for a small, wattage-efficient system, but still...
 
What about Xenon and Cell?? I'm sure they are big in size, and I don't think they consume little power ;)
Sure, Sony and Microsoft are not looking for a small, wattage-efficient system, but still...

That is a completely different thing; Cell alone should cost more than the Wii (all of its hardware).
 
That is a completely different thing; Cell alone should cost more than the Wii (all of its hardware).

You get what you pay for, and, truth be told, Cell is probably more powerful than all of the Wii chips combined (in terms of theoretical FLOPS performance, anyway), and then some.

To me, the real letdown with the Wii was the CPU and GPU, both not much more powerful than the Gamecube, a 5-year-old technology showcase...
Let's hope the gameplay makes up for it.
 
You get what you pay for, and, truth be told, Cell is probably more powerful than all of the Wii chips combined (in terms of theoretical FLOPS performance, anyway), and then some.

To me, the real letdown with the Wii was the CPU and GPU, both not much more powerful than the Gamecube, a 5-year-old technology showcase...
Let's hope the gameplay makes up for it.

Heh, that's an interesting point. Cell probably could put out better graphics on its own (though you might have to cut out some filtering algorithms) than the entire Wii system can.
 
Heh, that's an interesting point. Cell probably could put out better graphics on its own (though you might have to cut out some filtering algorithms) than the entire Wii system can.

hasn't more than a decade of GPU history taught you anything?

i mean, i'm all for software rasterizers, but you're being outright ridiculous here.

just one simple question for you: what hit do you think EMBM would impose on the SPE's LS scheme? mind you, flipper/hollywood have oomphs of transistors dedicated to the task, combined with low-latency embedded memory at all levels of the memory hierarchy.

but let's assume for the sake of argument that, after some inhuman heroism, a clever coder manages to squeeze high-enough framerates out of cell at some hollywood-biased tasks. what CPU power do you expect to have left after that for actual game code?

i think some people on these forums should really cut down on the hype dose..
 