Nintendo Wii specs (courtesy of maxconsole.net)

Maxconsole said:
The GPU of the Wii is identical to the GC’s but it is on average 1.5X faster

:LOL:, all credibility lost in one sentence.

Not that it was ever credible; they don't have any documents or info themselves, they are simply quoting the bloke from GAF...
 
Teasy said:
:LOL:, all credibility lost in one sentence.

Not that it was ever credible; they don't have any documents or info themselves, they are simply quoting the bloke from GAF...

Eh... it's not from GAF.

Maybe people just want to live in denial.. but Q4 will bring the cold truth.
 
ZiFF said:
Eh... it's not from GAF.

Maybe people just want to live in denial.. but Q4 will bring the cold truth.

I only thought that the story originated from GAF because you gave GAF as the source when you posted this rumour a couple of days ago....

So Maxconsole was actually the original source for this story. Either way, it's not confirmation of the original rumour; it's just the same rumour from the same single source in a new thread.

By the way, why don't you actually tell us why you believe this rumour is true, instead of posting things like "you're in denial" and "you'll see the cold hard truth"...
 
Squeak said:
There was a guy some years ago who proved mathematically that everything Renderman can do can, with enough perseverance, be replicated on a basic GPU with indirect texturing.

Old tech is by no means bad tech just because it's old. On the contrary, in some cases going back to the roots and applying the process improvements made since can result in a more "streamlined", cleaner design.
What's more, I'd hardly call Flipper really old; it's 2000-2001 tech, and it was way ahead of its time in its basic features. It just wasn't clocked high enough / didn't use enough transistors.

Well, I was always fascinated by Rendition's use of a MIPS RISC core to perform most of the graphics tasks in their Verite line.

They were totally programmable chips. But, as I'm sure you know, that's not usually the performance-optimal approach; ASICs specifically tailored to the calculations you want to do are the way to go. The Verite V1000 chips were horribly slow and buggy compared to the less flexible Voodoo Graphics, and the V2200 was neat but far too late to compete.

I bet you could program those chips to do an awful lot of nifty stuff, just not in real-time. And that's the key here.
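
To make Squeak's point about indirect texturing a bit more concrete, here's a rough software-only sketch of the idea - a dependent texture read, where the result of one fetch perturbs the coordinates of the next. The helper names are made up for illustration; this is not Flipper's actual GX interface.

Code:
#include <cstdint>

// One-channel texture with wrap addressing (illustrative only).
struct Texture {
    int width, height;
    const uint8_t* texels;
    uint8_t sample(int u, int v) const {
        u = (u % width + width) % width;
        v = (v % height + height) % height;
        return texels[v * width + u];
    }
};

// "Indirect texturing": the first fetch reads an offset map, and the result
// displaces the coordinates of the second fetch.  This kind of dependent
// lookup is the building block behind heat shimmer, water ripple, fake
// refraction and similar tricks.  scale must be > 0.
uint8_t indirectSample(const Texture& offsetMap, const Texture& baseMap,
                       int u, int v, int scale) {
    int du = (offsetMap.sample(u, v) - 128) / scale;      // signed x offset
    int dv = (offsetMap.sample(u + 1, v) - 128) / scale;  // signed y offset
    return baseMap.sample(u + du, v + dv);
}

The point is just that one texture result feeds the next lookup; on hardware that happens in the texture pipeline rather than in C, which is what makes it cheap enough to do per pixel.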
 
Sounds like EA has really cozied up to Wii as well. I heard that at LGC there will be a joint EA/Nintendo conference. I've only liked a few EA games myself, but the support can't hurt.
 
I noticed a pertinent blurb on Ars Technica:

http://arstechnica.com/news.ars/post/20060802-7407.html

Apparently Broadway is exactly the same as Gekko according to the specs. If that's the case, then Broadway is a PowerPC 750CXe at 729 MHz. The 750CXe tops out at 600 MHz. If Nintendo had wanted a Gekko at 729 MHz I'm sure it would have been more in their interest to pursue the 750GX instead.
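
Worth noting too: 729 MHz is exactly 1.5x the GameCube CPU's clock, which is a big part of why the leak reads like a straight overclock rather than a new part. A quick sanity check, assuming the commonly cited GameCube clocks (Gekko ~486 MHz, Flipper 162 MHz) and taking the 1.5x figure from the rumour itself:

Code:
#include <cstdio>

int main() {
    const double gekko_mhz   = 486.0;  // commonly cited GameCube CPU clock
    const double flipper_mhz = 162.0;  // commonly cited GameCube GPU clock
    // The rumoured Wii parts come out as simple 1.5x scalings:
    std::printf("CPU: %.0f MHz x 1.5 = %.1f MHz\n", gekko_mhz, gekko_mhz * 1.5);     // 729.0
    std::printf("GPU: %.0f MHz x 1.5 = %.1f MHz\n", flipper_mhz, flipper_mhz * 1.5); // 243.0, if the rumoured 1.5x applied to the GPU too
    return 0;
}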
 
OtakingGX said:
I noticed a pertinent blurb on Ars Technica:

http://arstechnica.com/news.ars/post/20060802-7407.html

Apparently Broadway is exactly the same as Gekko according to the specs.

well, correct me if i'm wrong but i believe many a b3der pointed that out already here. repeatedly ; )

and now Hannibal, too, does not seem to buy too much into that "leak".

broadway being an overclocked gekko, and hollywood being an overclocked flipper? surely not impossible but not very probable.
 
OtakingGX said:
I noticed a pertinent blurb on Ars Technica
Sounds to me like Hannibal is a bit bitter at the idea that he could have been completely wrong about predicting the CPU :p
At any rate I disagree with him - if Wii sells well, the joke will be on MS and Sony.

If that's the case, then Broadway is a PowerPC 750CXe at 729 MHz.
729MHz Gekko - the CXe doesn't exactly have all the features listed there. I wonder though - is the clock a result of wanting to recycle an existing design, or was there a need anyway to update a higher-grade 750x core with the Gekko features to reach 729MHz (after all, a GX changes basically nothing save for some pipeline lengths etc., IIRC).

darkblu said:
broadway being an overclocked gekko, and hollywood being an overclocked flipper? surely not impossible but not very probable.
To be fair, this rumour has been around for about a year now (that I'm aware of, perhaps longer). And the problem for the skeptics is that leaks from IGN were heavily implying a 750x core as well.
 
darkblu said:
well, correct me if i'm wrong but i believe many a b3der pointed that out already here. repeatedly ; )

and now Hannibal, too, does not seem to buy too much into that "leak".

broadway being an overclocked gekko, and hollywood being an overclocked flipper? surely not impossible but not very probable.
I pointed it out myself on the first page and linked to the 750CX/CXe/CXr datasheet. When I got this reply I thought perhaps that meant people thought a descendant of Gekko would share the same features as Gekko:
Asher said:
Which isn't surprising, given that Broadway is a very direct descendant of the 750CXe.
But the problem is, as pointed out on Ars, that this doesn't appear to be a descendant of Gekko; rather, it appears to be a copy/paste job of GameCube documentation. Just because the specs aren't laughable, like all the 5.0 GHz quad-core PowerPC 970 and 700 MHz GPU with 256 MB embedded 1T-SRAM rumours, doesn't make their source any more credible.
 
Fafalada said:
To be fair, this rumour has been around for about a year now (that I'm aware of, perhaps longer). And the problem for the skeptics is that leaks from IGN were heavily implying a 750x core as well.

well, i, personally, would not be that surprised, as i can hardly see many other options for robust gekko emulation but a cpu that is essentially a gekko in itself - either a well-clock-upped one or a multi- (read: dual-) core one.

as for hollywood, things there stand a bit differently, i believe. we'll hopefully see.
 
darkblu said:
well, i, personally, would not be that surprised, as i can hardly see many other options for robust gekko emulation but a cpu that is essentially a gekko in itself - either a well-clock-upped one or a multi- (read: dual-) core one.
I think general ISA compatibility could have been robust enough, but anyway. As I said before - it would have been nice to see at least a dual core when they opted for a low clock speed already.

I agree that the GPU is a different story (as in, the compatibility options are a lot more open), but we have yet to hear anything about it.
 
OtakingGX said:
I pointed it out myself on the first page and linked to the 750CX/CXe/CXr datasheet.

yes. please, excuse me, Otaking, i had already forgotten some of the people who originally noted that and i thought you were emphasizing Hannibal's discovery of the 'sameness' of that broadway leak with gekko. my bad.
 
Obviously those specs just scream "Gekko", but is everyone buying this?

It may be the truth, but it may also be just people working off previous (possibly fabricated) "leaks". Personally I'm hesitant to swallow this without some undeniable confirmation. Faf, you seem to believe this is true; you don't have a dev kit, do you?
 
I don't think only a few are 'buying it'. So far the discussion has mostly been about how plausible the specs are. As for the CPU, they could have gone with a 1.5 GHz part for little effort, and I'm trying to think why you wouldn't. Penny pinching seems a rather weak reason; in the grand scheme of things it's not going to make a huge difference. I guess a reason to go slower is that, considering what the processor has to do, they decided 4x the speed won't show much improvement over 2x the speed in what the players see, so they went with the simplest option. We're used to console makers trying to get maximum performance for the price. What happens when they shift tack and try to select the minimum hardware to do the job they want? Maybe this is it... :???:
 
darkblu said:
either a well-clock-upped one or a multi- (read: dual-) core one.

That's what I thought when I first saw (on IGN) the claim of a 2.5-3x improvement with a 50% higher clock.

Wouldn't it be easier to do a good job of getting high performance out of a second core in this case than on the more advanced machines, since much more of the rendering work sits on the CPU? I ask because, IIRC, the Q4 patch (the engine originally targets DX7-level HW, IIRC) moved the rendering to the second core and got an improvement of ~69%, and this kind of work is also the best suited to parallel code.
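
For what it's worth, the usual shape of that kind of change is just overlapping the render submission for one frame with the game logic for the next on a second core. A minimal, generic sketch (nothing to do with the actual Q4 patch internals, and the function names here are stubs):

Code:
#include <functional>
#include <thread>
#include <utility>
#include <vector>

struct FrameData { std::vector<int> drawCalls; };

// Game logic for one frame, producing a list of things to draw (stubbed).
FrameData updateGame(int frame) {
    return FrameData{std::vector<int>(1000, frame)};
}

// Submit a finished frame's draw work (stubbed) - the part that SMP-style
// patches push onto the second core.
void renderFrame(const FrameData& fd) { (void)fd; }

int main() {
    FrameData previous = updateGame(0);
    for (int frame = 1; frame < 60; ++frame) {
        // Second core: render the frame finished last iteration...
        std::thread renderer(renderFrame, std::cref(previous));
        // ...while the first core runs game logic for the next frame.
        FrameData next = updateGame(frame);
        renderer.join();            // sync before handing the next frame over
        previous = std::move(next);
    }
    return 0;
}

The more of the per-frame work that lives on the CPU (as it would on a Gekko-class machine with a fixed-function GPU), the more a second core like this would have to chew on.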

BTW, wouldn't a dual-core Gekko variant be really cheap? I mean, Gekko is 43 mm² IIRC; at 90nm it would be around 22 mm², which is very small, no?
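
Rough arithmetic behind that ~22 mm² estimate, assuming the ~43 mm² figure is for a 130nm Gekko and an ideal square-law shrink (real shrinks scale worse than this):

Code:
#include <cstdio>

int main() {
    const double gekko_area_mm2 = 43.0;          // assumed to be the 130nm figure
    const double linear_shrink  = 90.0 / 130.0;  // 130nm -> 90nm
    const double shrunk_area    = gekko_area_mm2 * linear_shrink * linear_shrink;
    std::printf("43 mm^2 x (90/130)^2 = %.1f mm^2\n", shrunk_area);  // ~20.6 mm^2
    return 0;
}

Even doubling that for a second core (and ignoring shared logic like the L2) only lands around 40 mm², so on this back-of-envelope basis die area alone wouldn't rule a dual-core variant out.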
 
Since this isn't being mentioned, those specs are either off or fake. I know this because IGN pulled a story mentioning that the CPU was a 750GL (they said CL, but that's most likely an error). I also know this because a user there who is legit said his source within IBM has said this as well. My question now is: how does a 750GL help make Wii 2-3x better than what the GC does?
 
The 750GL is basically the same as the 750GX. The only difference between the GX and GL is the speed ratings and power consumption.
 
The 750GL is basically the same as the 750GX. The only difference between the GX and GL is the speed ratings and power consumption.

The problem that I see is that Nintendo listed the PowerPC in the Wii as a 90nm CPU, while the GL is a 130nm CPU. Unless Nintendo has paid to shrink the GL from 130nm down to 90nm, I don't understand how they can be the same processor, and my confusion only grows when I remember that Iwata said Nintendo didn't build new factories for Wii components.
 