Nintendo's hardware philosophy: Always old, outdated tech?

Can we please stop with the 'Nintendo likes using old hardware' line when the only time they used hardware that was really outdated was with the Wii.

Incorrect.

Gamecube was also outdated for its time. Wii even more so.

Then all you need to do is look at its handhelds, including its latest and greatest.

Nintendo learnt an important lesson after N64 - one we all hope it unlearns for the geek in us but I doubt they will.
 
Gamecube was also outdated for its time. Wii even more so.
I don't think that's a fair assessment. It was unconventional, and Nintendo scrimped a bit, but the only reason the XB was much better was the stupid amount of money MS was willing to throw at the project. Overall the GC was a comparable piece of hardware, not a generation behind like the Wii.

Then all you need to do is look at its handhelds, including its latest and greatest.
That's not a valid comparison, as the 'old' tech is more power efficient. The GB was feeble in specs compared with the likes of the Game Gear, but those choices made it the system handheld gamers preferred.

The only theme I see as constant with Nintendo is that they tend to be conservative with RAM, both capacity and bandwidth. I can imagine 512 MB actually being a valid choice in their eyes, excused by claiming they want to keep development costs down or some such. If they bite the bullet and go with two gigs of fast RAM, they'll at least be able to hold a candle to the MS and Sony offerings that come afterwards, whereas being cheap now will leave the platform looking old real fast.
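For a sense of scale on the RAM question, here's a rough back-of-envelope in Python (my own illustrative numbers, nothing from any leak): a handful of full-screen render targets alone already claims a noticeable slice of a 512 MB pool, before textures, geometry, audio or game state are even counted.

```python
# Back-of-envelope: how much RAM do full-screen surfaces eat at 720p?
# (Resolutions and buffer counts below are my own assumptions.)

def framebuffer_mb(width, height, bytes_per_pixel, buffers):
    """Total size in MB for `buffers` full-screen surfaces."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

# Double-buffered 32-bit colour plus a 32-bit depth/stencil buffer at 720p.
fb = framebuffer_mb(1280, 720, 4, 3)
print(f"720p colour + depth: {fb:.1f} MB")  # ~10.5 MB
```

That's only the display side; multiply by extra off-screen targets (shadow maps, post-processing buffers) and the capacity argument gets tighter still.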
 
Can we please stop with the 'Nintendo likes using old hardware' line when the only time they used hardware that was really outdated was with the Wii (...)


Wut?
DS: ARM9 from the early 90's and a "basic" GPU with no texture filtering.
Gamecube: GPU from the 3-year-old Aladdin IGP and a G3-ish PowerPC from the mid 90's.
Wii: rehashed GC GPU with no extra functionality and the same CPU at a higher clock.
3DS: 4-year-old GPU.


How can you not agree that Nintendo has been using old hardware for the last 10 years?
 
Wut?
DS: ARM9 from the early 90's and a "basic" GPU with no texture filtering.
Gamecube: GPU from the 3-year-old Aladdin IGP and a G3-ish PowerPC from the mid 90's.
Wii: rehashed GC GPU with no extra functionality and the same CPU at a higher clock.
3DS: 4-year-old GPU.


How can you not agree that Nintendo has been using old hardware for the last 10 years?

Both GameCube's CPU and GPU were upgraded and managed to be very competitive in comparison to the other systems, so I don't really think you can call them outdated. I mean, if the GC's CPU was outdated, then so was the Xbox's. The Wii, yes, obviously, but as for the 3DS, we don't know what the final parts are, so it's moot.
 
Gamecube was at least reasonable for the time. Imo it was PS2 level, which was solid 2000 tech, while clearly a notch below Xbox. But they made an effort with GC, albeit not exactly a superb one.

Anyways I think a lot of people are trying to discredit the "leaks" and then they start assuming "well, 512MB is crazy low so it must be at least 1GB" and then they proceed to act as if 1GB+ is a given. Imo we really have to go with the leak as the best info we have now, not wishful thinking.

However, the more I think about it, the more I believe that if Nintendo is aiming performance-wise at 360+10%, they are going to be in big trouble. With all this controller tech it's going to be at least $299 if not more, while by 2012 the 360 will be flooding the market with $149 (or less) machines and late-gen, graphically impressive games like Gears 3, creating an almost PS2-like tsunami of late-gen sales.

But then on the other hand it is Nintendo so if people get excited about the new gimmick it could fly off the shelves like Wii, too. But I think in either case it's going to be in trouble when the next round of consoles from Sony/MS come along which is the real 800lb gorilla.

Unless, again, Nintendo is actually planning staggered console cycles like I said: 2012 Wii2, 2014 PS4/XB3, 2016 Wii3... but that would just get ugly :(

I mean if GC's CPU was outdated, then so was XBox's

Err, not really as Xbox's GPU was quite superior.

I do agree that GC was clearly a reasonably competitive, if again imo less than stellar, technical effort for the time frame though. Nintendo's last.
 
GameCube used ArtX's Flipper GPU. It was comparable to the original Xbox in most ways, stronger in some, weaker in others. Overall it was extremely good tech for 1999 (when Flipper was designed) released in 2001. Nintendo basically sat on the GameCube hardware for one year; it was originally meant for a 2000 release. GameCube turned into my favorite Nintendo console because it was like having an SGI InfiniteReality engine in my home. I loved Twilight Princess.

Anyway, I hope Nintendo uses an RV770 at 28nm in Wii 2.
 
GameCube used ArtX's Flipper GPU. It was comparable to the original Xbox in most ways, stronger in some, weaker in others

I knew the thread would get derailed here but yeah, I strongly disagree.

Heck Doom 3 on Xbox still looks better than any Wii FPS let alone Gamecube.
 
I'm not trying to derail the thread. I'm simply stating fact: Flipper and NV2A each had various strengths and weaknesses compared to each other. That is all.
 
It's been a while since I've debated that subject, but I seem to remember the NV2A being quite significantly superior in most ways.
 
Superior on paper but when it came to games the GC showed it was able to compete looking at games like RE4, Metroid Prime and Rogue Squadron.

Also remember Xbox had more RAM...
 
Metroid Prime 2 was just as good looking as anything on Xbox. BTW Chronicles of Riddick used too much shiny shine bump mapping...I guess if you're easily impressed with that effect then I could see why you'd say it's way better than anything on GC.
 
Anyways I think a lot of people are trying to discredit the "leaks" and then they start assuming "well, 512MB is crazy low so it must be at least 1GB" and then they proceed to act as if 1GB+ is a given. Imo we really have to go with the leak as the best info we have now, not wishful thinking.

Err, not really as Xbox's GPU was quite superior.

I do agree that GC was clearly a reasonably competitive, if again imo less than stellar, technical effort for the time frame though. Nintendo's last.

FYI, 512 MB of RAM has never been mentioned as a leaked hardware spec; it was a guess by a French website (they said they would guess 512 MB to be the minimum Nintendo would include, based on the other info they had).

Also I said CPU not GPU. Xbox's and GC's CPUs were pretty comparable. Certainly one wasn't significantly ahead of the other, so if one was outdated then both were outdated.

GC's GPU was a couple of years behind XBox's GPU in many ways (though ahead in others). I would agree that XBox had the more powerful GPU overall obviously.
 
Also I said CPU not GPU. Xbox's and GC's CPUs were pretty comparable. Certainly one wasn't significantly ahead of the other, so if one was outdated then both were outdated.


I agree, the Intel 733 MHz x86 CPU in the Xbox and the 485 MHz IBM PowerPC Gekko CPU in the GameCube were pretty much even. The 256 KB L2 cache in Gekko was nice :)

GC's GPU was a couple of years behind XBox's GPU in many ways (though ahead in others). I would agree that XBox had the more powerful GPU overall obviously.

You're right, the Flipper GPU in GameCube was at least a year behind the NV2A in Xbox. Flipper was being brainstormed way back in 1998, designed mainly in 1999 and ready for production in 2000.

IGNcube: You say you began talking to Nintendo® in 1998. So from white paper designs and initial design to final mass production silicon how long was the development process?

Greg Buchner: Well, there was a period of time where we were in the brainstorm period, figuring out what to build, what's the right thing to create. We spent a reasonable amount of time on that, a really big chunk of 1998 was spent doing that, figuring out just what [Flipper] was going to be. In 1999 we pretty much cranked out the gates, cranked out the silicon and produced the first part. In 2000 we got it ready for production, so what you saw at Space World last year was basically what became final silicon.

http://cube.ign.com/articles/099/099520p1.html



The NV2A was designed in 2000-2001. NV2A was certainly more powerful than Flipper and had features (true programmable shaders) that hadn't yet been invented for consumer GPUs at the time Flipper was being made. Overall, though, Flipper was a far better design choice than the NV2A, which cost Microsoft dearly. Microsoft changed GPU makers for the 2nd-gen Xbox (360), while Flipper went on in the form of a shrunk-down, higher-clocked version called Hollywood in the Wii, which reached mass-market success.
 
Incorrect.

Gamecube was also outdated for its time. Wii even more so.

I totally disagree. GameCube was superior to PS2 in many (not all) ways. Sure, Nintendo sat on the GC hardware for a year (it was originally intended for a late 2000 release), but it certainly wasn't outdated. The GameCube could go head-to-head with the somewhat (in some areas) more powerful Xbox.
 
Wut?
DS: ARM9 from the early 90's and a "basic" GPU with no texture filtering.
Gamecube: GPU from the 3-year-old Aladdin IGP and a G3-ish PowerPC from the mid 90's.
Wii: rehashed GC GPU with no extra functionality and the same CPU at a higher clock.
3DS: 4-year-old GPU.


How can you not agree that Nintendo has been using old hardware for the last 10 years?

Deadmeat is that you? Nobody's Perfect, is that you? ;)

GameCube's GPU, Flipper, was a completely fresh new design, not derived from the Aladdin IGP.

p.s., I still like the old Lockheed Martin Real3D tech. Too bad Lockheed couldn't / wouldn't adapt to the consumer market, for Sega.
Thank god ATI swooped in and bought much of Real3D. You can bet that the Xbox 360's Xenos GPU produces such awesome visuals (for the time) because of not only ArtX (acquired in 2000) but also Real3D. We (no pun intended) shall see that outstanding IP in Nintendo's next console ;)
 
Superior on paper but when it came to games the GC showed it was able to compete looking at games like RE4, Metroid Prime and Rogue Squadron.

Also remember Xbox had more RAM...

Thank you. I think GC could compete quite well with Xbox, which had vastly superior specs on paper.
Although GC had nothing like Halo, Halo 2 or Doom 3, I'll take the 60 fps of Metroid Prime 1/2 over the 30 fps FPSes on Xbox. I own both consoles, love them both, and am not a fanboy of either. I just like to point out that Xbox, GC and PS2 each had various strengths and weaknesses compared to each other. No one console was superior in every aspect.
 
Nintendo are very careful when they spec their machines, they always have been.

Take the SNES. It had a weak CPU compared to the Megadrive and it ran at a lower resolution almost all the time, but Nintendo gave it lots of colors, more and bigger sprites (even though it couldn't do as much with them) and a couple of fancy background modes and on balance people were more impressed with the SNES.

It was a smart choice. Nintendo calculated that a powerful CPU wouldn't be as big a benefit for the games they wanted to sell to customers (and support on their platform), and they wanted to keep the price of the unit down, so they put in a cheap CPU and a graphics chip with carefully targeted strengths.

It's a very different approach to Sony and MS (and formerly Sega), but it normally works well for them. They were caught out by the Xbox, but you can see how the GC was designed to give them the edge over the PS2 in terms of how pretty things looked, while being as cheap to manufacture as possible.

Nintendo know that cripplingly expensive, powerful hardware does you no good when it comes to exclusive games. When most people are playing a platform exclusive killer app they don't actually care if it could be better on another system, and the relatively small number of zealots will take care of themselves by blindly insisting that their favourite first party game couldn't be done on any other system.

Yeah, occasionally the stars align and you get a Halo moment, when breakthrough game design takes advantage of that hugely expensive, very powerful new games system, but betting on that happening wouldn't be good business. Actually, apart from Halo, I can't think of any other examples.

[Edit] In short, Nintendo's hardware philosophy isn't "Always old, outdated tech"; I think it's carefully targeted strengths, with a requirement for the hardware costs to justify themselves through the software they enable.[/Edit]
 
Nintendo's strategy in the handheld market has always been to use what they called "withered" technology: old enough to be dirt cheap to produce, but just good enough for fun games. For consoles they were smart, and generally produced competitive designs. That changed in the last decade. They migrated their "withered technology" approach to the home market, paired it with a compelling gimmick and made bank on the Wii. It looks like they're going to do the same again with their next console. But that means their success will largely rest on how well their next control innovation goes over with the mainstream. If it hits, they could have another Wii. If it falls flat, they'll probably be back to N64/GameCube levels.
 
Nintendo are very careful when they spec their machines, they always have been.

Take the SNES. It had a weak CPU compared to the Megadrive and it ran at a lower resolution almost all the time, but Nintendo gave it lots of colors, more and bigger sprites (even though it couldn't do as much with them) and a couple of fancy background modes and on balance people were more impressed with the SNES.

It was a smart choice. Nintendo calculated that a powerful CPU wouldn't be as big a benefit for the games they wanted to sell to customers (and support on their platform), and they wanted to keep the price of the unit down, so they put in a cheap cpu and a graphics chip with carefully targeted strengths.


I agree for the most part. Very interesting post, actually.

Your post reminded me of something....I recall reading a usenet post about the Super Famicom/SNES. Among other things, it mentioned how the SFC/SNES was originally supposed to have stronger sprite and/or background manipulation hardware (re: scaling & rotation) than it had (Mode 7). This more advanced hardware was cut out of the released console, but added back into certain games in the form of various DSP chips in the carts. That's how I understood it anyway.

Ahh here it is:
Wow, bet you didn't know Super Mario World, PilotWings, and F-Zero were
developed on the GS! A new Super Nintendo emulator, supposedly developed
by former Nintendo employees, has been released. Here is a very
interesting clip from the documentation:


(the full text of the docs can be found at
http://members.aol.com/emunews/silhouette.txt)


----


The first batch of games for the Super Famicom were developed around
1988 and 1989. Popular Super Famicom titles, like F-Zero and Super
Mario World, were the most difficult for several reasons--if nothing
else, the Super Famicom hardware specifications changed in small ways
at least twice during the development project, requiring changes to
existing code. (Trivia tidbit: the original Super Famicom plans called
for much more extensive onboard 3D hardware--PilotWings was developed
assuming that this hardware would be present, and since this chip was
scrapped from the Super Famicom at the last minute, Nintendo was forced
to include this 3D chip on the PilotWings board in order to keep the
game on schedule.)



The other reason for difficulty in development is much less known, and
very surprising--almost all the programming for these titles was done
on the Apple IIgs! This seems ridiculous until you realize that both
the Super Famicom and the Apple IIgs are based on the 65816 processor,
a cheap toy with inadequate processing power that was stuck in the
Super Famicom to smooth over the early development process (since it is
backwards compatible with the 6502, the NES' processor). However, it
was soon realized by the development teams that a reliable 65c816
development platform could not be found on the usual Nintendo platforms
(most Nintendo devs at that time had a generic PC, excluding the art
and marketing department which was mostly Macintosh and a few segments
of the development team). Deadlines approaching, the Apple IIgs was
chosen as a quick if inelegant solution--several C and assembly
compilers were available, and testing and debugging was easier since we
were able to use a native 65816 for testing.


However, the IIgs proved woefully inadequate for large projects. Most
machines didn't even have hard drives! Compile times for even meager
projects were horrendous, and keeping all the work on floppies was
getting out of hand. And obviously, the graphics support on IIgs' was
minimal, so testing out small programs required switching between a
Super Fami prototype on the left and a IIgs on the right. Programmers,
in general, hated the thought of Super Famicom development. Many
continued to write 6502 code using the old NES development environment,
choosing to ignore the 16-bit advantages of the 65816 in order to
complete the project without losing their sanity.

http://groups.google.com/group/comp.sys.apple2/msg/c48edd4e1b8fd69c?hl=en&dmode=source

I am confused about what Mode 7 could do in the released SNES without an extra chip in the cart. Is Mode 7 boosted by the DSP? Can the SNES actually do scaling & rotation without the DSP?
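As an aside on what Mode 7 actually computes: it is a pure PPU feature that applies a 2x2 affine matrix to one background layer, so rotation and scaling of that layer need no cart-side DSP (sprites are another story, and that is where games fell back to extra chips). A small Python sketch of the mapping follows; the register names M7A/M7B are real, but the helper itself is my own simplification (real hardware works in 8.8 fixed point and games reload the matrix each scanline via HDMA to fake perspective).

```python
import math

def mode7_sample(x, y, angle, scale, cx, cy):
    """Map screen pixel (x, y) to background-layer coordinates,
    rotated by `angle` radians and zoomed by `scale` about (cx, cy).
    This mirrors the affine transform Mode 7 applies per pixel."""
    a = math.cos(angle) / scale  # plays the role of register M7A
    b = math.sin(angle) / scale  # plays the role of register M7B
    dx, dy = x - cx, y - cy
    bg_x = a * dx - b * dy + cx  # hardware: 8.8 fixed-point multiply-add
    bg_y = b * dx + a * dy + cy
    return bg_x, bg_y

# Identity transform: a screen pixel maps straight to itself.
print(mode7_sample(10, 20, 0.0, 1.0, 0, 0))  # (10.0, 20.0)
```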


Anyway, getting completely back on-topic, I hope Nintendo doesn't get too cheap when it comes to things like RAM. One thing that nobody has mentioned here or on GAF is, will the Wii 2 GPU have at least as much on-chip bandwidth as the Xbox 360's Xenos? 256 GB/sec was really impressive for 2004 hardware released in 2005. Can 1T-SRAM reach that?
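On where that headline number comes from: peak bandwidth is just bus width times clock, and the commonly cited figures for the eDRAM daughter die work out to exactly 256 GB/s. A quick sketch (the bus width and clocks below are the commonly quoted figures, so treat them as illustrative rather than official):

```python
# Peak bandwidth = bus width (bytes) x transfer rate.
# Figures are the commonly quoted ones for Xbox 360, used illustratively.

def bandwidth_gbs(bus_bits, transfers_per_sec):
    """Peak bandwidth in GB/s for a `bus_bits`-wide bus."""
    return bus_bits / 8 * transfers_per_sec / 1e9

# eDRAM-to-ROP path inside the daughter die: very wide, modest clock.
print(bandwidth_gbs(4096, 500e6))  # 256.0
# For contrast, the main GDDR3 pool: 128-bit at 1.4 GT/s effective.
print(bandwidth_gbs(128, 1.4e9))   # 22.4
```

The point being that the 256 GB/s figure is internal to the daughter die; whether 1T-SRAM could be made that wide on-chip is exactly the open question.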
 