Nintendo's hardware choice philosophy *spawn

Shifty Geezer
Well, why not give Nintendo the benefit of the doubt for the time being?
Because the evidence isn't really there at the moment, and Nintendo have a reputation based on prior designs of going with lower options. The original rumours of eDRAM pointed at high bandwidth, but as some have mentioned, there's actually a good chance that it's low bandwidth in a cheap package. There's just as much reason to believe that as to believe there's loads of BW there, so why choose to believe either case (give them the benefit of the doubt or take it as low BW) when all we really have is a big question mark?
 
Because the evidence isn't really there at the moment, and Nintendo have a reputation based on prior designs of going with lower options. The original rumours of eDRAM pointed at high bandwidth, but as some have mentioned, there's actually a good chance that it's low bandwidth in a cheap package. There's just as much reason to believe that as to believe there's loads of BW there, so why choose to believe either case (give them the benefit of the doubt or take it as low BW) when all we really have is a big question mark?

That's simply not true. Apart from Wii that has never been the case. Frugal design, yes, as in getting the most for the money and using unusual solutions towards that goal.
But Wii is the first time they have launched a home console that at the time of release was not pretty much cutting edge for the price and compared to the competition.

Now, I'm not saying Wii U will be cutting edge because it's clearly not.
But I'm saying that there is nothing in the released data to indicate that it couldn't at least be a match for the 360.
 
That's simply not true. Apart from Wii that has never been the case. Frugal design, yes, as in getting the most for the money and using unusual solutions towards that goal.
But Wii is the first time they have launched a home console that at the time of release was not pretty much cutting edge for the price and compared to the competition.

You've also got the SNES which launched with an incredibly weak (for its launch time) CPU. Maybe if the CPU weren't so poor we wouldn't have seen such severe slowdown in early titles like Gradius 3. I would argue that N64 also had some pretty major design compromises but they were probably more unforeseen; with SNES Nintendo was releasing games using an add-on DSP from day one.

That aside, Shifty Geezer never said anything about home consoles. His comment easily applies to GBC, GBA, DS, and 3DS.
 
That's simply not true. Apart from Wii that has never been the case.
Don't look at whole systems; look at individual choices. Look at everything surrounding Wii U. They could have easily used a larger battery in the Wuublet. They could have used a beefier CPU. They could have used a larger RAM bus. They could have used a 16GB minimum flash chip. They could have used USB ports with enough juice to power an HDD or Wuublet over USB. Pretty much every bit of info we've had regards Wuu has been on the low end. We had the same thing with Wii. All the possibilities for fancy, or even just contemporary, hardware, and in the end it surprised everyone by being a doubled-up GC. 3DS is seriously underspec'd compared to what the competition is doing too.

One doesn't, and probably shouldn't, need to go back through Nintendo's whole history - whatever their design decisions in the 1980s and 90s don't necessarily show anything about current mindset, any more than the Sony of now is revealed in the Sony of the 80s and 90s. But the past ten years have been pretty consistent with Nintendo choosing the cheapest solution out of whatever ideas this board has entertained in discussion. That doesn't mean any unknown is a foregone conclusion, but it sets considerable precedent to believe that out of two possibilities regards Wuu's hardware, there's a good chance it'll be the lesser performing choice that made it in; certainly enough that I won't give Nintendo the benefit of any doubt! ;)
 
Their choices and success with the Wii are enough to think they would do it again.

Do what? Re-release the Gamecube with an upped clock and more memory? That's clearly not what they are doing. They are being parsimonious, not cheap or stupid.
The Wii released at a lower price with a new controller, with the requirement of full BC. I have a hard time seeing how it could have been made much different.
There are a few things I would have changed (like the classic controller not working with GC games) but those are details.

You've also got the SNES which launched with an incredibly weak (for its launch time) CPU. Maybe if the CPU weren't so poor we wouldn't have seen such severe slowdown in early titles like Gradius 3. I would argue that N64 also had some pretty major design compromises but they were probably more unforeseen; with SNES Nintendo was releasing games using an add-on DSP from day one.

That aside, Shifty Geezer never said anything about home consoles. His comment easily applies to GBC, GBA, DS, and 3DS.

The 3.58MHz 65816 was pretty much equivalent to the 7.5MHz 68000 in the Megadrive. A little worse in some cases, a little better in others. Again, a wise choice for a console, where the deficiencies of the 65816 (like harder assembly coding) are easily overcome and the advantages of cost and lower heat generation are very important.
Where the SNES really shined is in the support chips and the memory department though.

Going for the lower option with handhelds back in the day and to some extent even today was right, given the battery consumption issue and because people tend to break and lose handhelds more often, and they would be less inclined to rebuy (or even buy in the first place) if it's too expensive.
 
The 3.58MHz 65816 was pretty much equivalent to the 7.5MHz 68000 in the Megadrive. A little worse in some cases, a little better in others.

Nintendo's 65c816 may have executed about as many instructions per cycle on average as a 7.5MHz 68k, but those instructions were on average of poorer quality, which made a tremendous difference in all sorts of tasks.

And this isn't considering that SNES came out two years after Megadrive. Or you could consider that it came out three years after PC-Engine. While PC-Engine only had an 8-bit (also 65xx-derived) CPU, it was clocked twice as high and could therefore easily beat SNES's CPU a good deal of the time.

A 68k wouldn't have made sense for SNES because Nintendo needed something they could license to integrate into a custom chip that had other functions. But they could have easily spent some of the big money they accrued on the NES in customizing the chip so it sucked less. Originally they intended to fix it by including the DSP I mentioned on board, but they scrapped that (if this weren't the case you'd have never seen a launch title like Pilotwings utilizing it).

Where the SNES really shined is in the support chips and the memory department though.

Yes, the PPUs and SPU were nice, but we're not talking about those.

Going for the lower option with handhelds back in the day and to some extent even today was right, given the battery consumption issue and because people tend to break and lose handhelds more often, and they would be less inclined to rebuy (or even buy in the first place) if it's too expensive.

It's not "right" from a power consumption standpoint to go for ARM11 today. Or an ARM9 in DS really, or even an ARM7 in GBA, or a slightly modified GB CPU in GBC, because these were all old designs that didn't give you the best perf/W bang for your buck.

I don't really have anything to say about your comment that people break and lose their handhelds, can you back that up with much?
 
And that was a year ago when prices were higher, and for 28nm instead of the dated 40nm process the WiiU is likely to be made on. And according to Nvidia's presentation on the transition between process nodes (from early this year, IIRC), the cost would be a fraction of that of the 28nm node last year.

Granted, I'm drunk, but even now that compares favourably in terms of cost per MB/s to using 8 x 256MB chips once you factor in a single $ for a more complex motherboard for the double-width memory bus. And in 5 years things would only get worse and worse if you'd chosen the 8 memory chips and 128-bit bus over the eDRAM.

The only figures I have to go on are from DRAM Exchange (http://www.dramexchange.com) but they show 1GB of 800MHz DDR2 (8 x 128MB) going for well over half the price of 4GB of 1600MHz DDR3 (8 x 512MB), and 1GB of 1333MHz DDR3 (8 x 128MB) being around half the price of the massively slower DDR2 alternative.
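To make that back-of-envelope comparison concrete, here's a rough sketch of the cost-per-bandwidth arithmetic in Python. The chip price and the $1 board premium are placeholder assumptions for illustration, not actual DRAM Exchange quotes:

```python
# Rough cost-per-bandwidth comparison for two DDR3 configurations.
# Chip price and the $1 board premium are made-up placeholders.

def peak_bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Peak bandwidth in GB/s: bus width in bytes times transfer rate."""
    return bus_width_bits / 8 * data_rate_mtps * 1e6 / 1e9

def cost_per_gbs(chip_price_usd, n_chips, bus_width_bits, data_rate_mtps,
                 board_premium_usd=0.0):
    total_cost = chip_price_usd * n_chips + board_premium_usd
    return total_cost / peak_bandwidth_gbs(bus_width_bits, data_rate_mtps)

# Eight DDR3-1600 chips on a 64-bit bus vs the same chips on a 128-bit
# bus, charging $1 extra for the more complex motherboard routing.
narrow = cost_per_gbs(1.50, 8, 64, 1600)
wide = cost_per_gbs(1.50, 8, 128, 1600, board_premium_usd=1.0)
print(f"64-bit bus:  ${narrow:.2f} per GB/s")   # ~$0.94 per GB/s
print(f"128-bit bus: ${wide:.2f} per GB/s")     # ~$0.51 per GB/s
```

On those made-up numbers the wider bus wins on cost per GB/s despite the board premium, which is exactly the kind of trade-off being weighed against the eDRAM here.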

This isn't taking into consideration any processing overhead of using eDRAM. Is the eDRAM even done on the same 40nm process as the GPU and not on a separate 45nm island?

Your number is only for the big vendors: Samsung, Hynix, Spectek, Micron, not the Chinese vendors I mentioned. I don't have price numbers off hand but you needn't look further than all the super cheap Chinese tablets still using DDR2. Mind you, a console would command millions of DRAM chips a year just by itself, so I doubt that the prices will go up.

Nintendo hasn't been shrinking their console chips (at least with Wii), but the Chinese DRAM chips will continue to shrink for them.
 
Don't look at whole systems; look at individual choices. Look at everything surrounding Wii U. They could have easily used a larger battery in the Wuublet. They could have used a beefier CPU. They could have used a larger RAM bus. They could have used a 16GB minimum flash chip. They could have used USB ports with enough juice to power an HDD or Wuublet over USB. Pretty much every bit of info we've had regards Wuu has been on the low end. We had the same thing with Wii. All the possibilities for fancy, or even just contemporary, hardware, and in the end it surprised everyone by being a doubled-up GC.

The same could be said for almost any other manufacturer of midline electronics. It's all those extra small costs that make the product more expensive. You can always add one more little thing to make something better. But in the end you'll have to price it so people will buy it and you'll make money.
In my eyes Microsoft was much worse in some instances with the launch of the 360: 1982 PCB production values, pisspoor quality control, no wireless out of the box with a super expensive wifi dongle, wired controllers, no HDMI etc.

One doesn't, and probably shouldn't, need to go back through Nintendo's whole history - whatever their design decisions in the 1980s and 90s don't necessarily show anything about current mindset, any more than the Sony of now is revealed in the Sony of the 80s and 90s.

Most big companies have a personality of some sort, or even a soul if you will, that is remarkably stable over the years. Just like a person has all their molecules replaced every 7 years and is still more or less the same, a company can go through time and keep a similar approach to the one it had 20 or 30 years ago. That's why it's so difficult for large corporations to change course and do something new if their core market/expertise is changing rapidly. Look at Xerox, IBM and soon Microsoft for prime examples of that.

But the past ten years have been pretty consistent with Nintendo choosing the cheapest solution out of whatever ideas this board has entertained in discussion. That doesn't mean any unknown is a foregone conclusion, but it sets considerable precedent to believe that out of two possibilities regards Wuu's hardware, there's a good chance it'll be the lesser performing choice that made it in; certainly enough that I won't give Nintendo the benefit of any doubt! ;)

Six of those ten years have been with the Wii and the GC was anything but cheap compared to PS2. The Xbox had Microsoft hemorrhaging money, so that doesn't really count.


Nintendo's 65c816 may have executed about as many instructions per cycle on average as a 7.5MHz 68k, but those instructions were on average of poorer quality, which made a tremendous difference in all sorts of tasks.

Define poorer. The 65xx(x) line was more RISCy than the 68000, apart from not being a load-store architecture.

And this isn't considering that SNES came out two years after Megadrive. Or you could consider that it came out three years after PC-Engine. While PC-Engine only had an 8-bit (also 65xx-derived) CPU, it was clocked twice as high and could therefore easily beat SNES's CPU a good deal of the time.

The PC-Engine ran out of a small but fast 8KB SRAM. That's the prime reason it could go so fast. Look at other late-80's 65xx systems: none of them go above 4MHz. I suspect yields must also have been an issue, even though it was a small chip, due to the larger thermal strain put on the silicon.

A 68k wouldn't have made sense for SNES because Nintendo needed something they could license to integrate into a custom chip that had other functions. But they could have easily spent some of the big money they accrued on the NES in customizing the chip so it sucked less. Originally they intended to fix it by including the DSP I mentioned on board, but they scrapped that (if this weren't the case you'd have never seen a launch title like Pilotwings utilizing it).

The 68000 wouldn't have made sense because it was too large to put together with other stuff on the same die, and because they were intending to have BC with the NES during development, where the 65816 would have been perfect. But they ended up with "just" more bang for the buck.

Yes, the PPUs and SPU were nice, but we're not talking about those.

Why not?

It's not "right" from a power consumption standpoint to go for ARM11 today. Or an ARM9 in DS really, or even an ARM7 in GBA, or a slightly modified GB CPU in GBC, because these were all old designs that didn't give you the best perf/W bang for your buck.

What would you have suggested then?
 
Define poorer. The 65xx(x) line was more RISCy than the 68000, apart from not being a load-store architecture.

Where do I even begin? I'm guessing you didn't do a lot of programming in either.. That "RISC" claim about 65xx is just an attempt to give props to the zero page, which is a relatively small optimization. It's not very RISCy in a lot of other ways, like having variable-width instructions and memory-indirect addressing modes.

It's a pain to program in because everything has to go through a pair of accumulators or index registers vs 8 data + 8 address registers on the 68k. Since most instructions had to load either a 16-bit immediate or a 16-bit memory value over a crippled 8-bit bus, it didn't even need a real 16-bit ALU and may not have even had one.

And even more fundamental things, like having to set up the carry flag before adds and subtracts because there's only adc and sbc instructions. 68k was pretty
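To put the carry-flag point in concrete terms, here's a minimal Python sketch of the byte-wise dataflow a CLC/ADC/ADC sequence implies on a 65xx-style 8-bit datapath. It models the dataflow only, not cycle timing, and on a 68k the same operation is a single add.w with no explicit carry management:

```python
def add16_65xx_style(a, b):
    """16-bit add done as two 8-bit adds chained through the carry flag,
    the way a CLC/ADC/ADC sequence works on a 65xx-style 8-bit datapath."""
    carry = 0  # the explicit CLC step: clear carry before adding
    lo = (a & 0xFF) + (b & 0xFF) + carry
    carry = lo >> 8                               # carry out of the low byte
    hi = ((a >> 8) & 0xFF) + ((b >> 8) & 0xFF) + carry
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

assert add16_65xx_style(0x1234, 0x0FFF) == 0x2233  # carry crosses the byte boundary
```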

The PC-Engine ran out of a small but fast 8KB SRAM. That's the prime reason it could go so fast. Look at other late-80's 65xx systems: none of them go above 4MHz. I suspect yields must also have been an issue, even though it was a small chip, due to the larger thermal strain put on the silicon.

And yet SuperGrafx, released a year before SNES, managed fine with 32KB of said SRAM. 65xx computers are irrelevant because SNES wasn't an 8-bit computer being sold in the late 80s (which would have been well past their expiration date); what applies to discounted 8-bit computers shouldn't apply to SNES. I mean, are we really talking about new computers here or computers that had been sold for years prior? We could extend the argument to 16-bit derivations of 65xx if anyone outside of SNES and Apple IIgs even used them.

In the worst case scenario, if they really were DRAM-latency limited to < 4MHz, then they could have at least put in a small SRAM it could run faster from (like with SA-1). It already had variable wait-states based on memory type, so it was even slower than the 3.58MHz rate implied. Besides that, why are we only looking at 65xx-derived designs in the first place? There's a reason this CPU was quickly left behind by the 90s, and Nintendo was using it pretty much after everyone else had moved on. They should have realized they could do better.
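To put rough numbers on those wait-states, the usual effective-clock arithmetic looks like this (the master clock and per-region cycle counts are the commonly cited figures; treat them as assumptions):

```python
MASTER_HZ = 21_477_000  # commonly cited NTSC SNES master clock

# Master-clock cycles per CPU access for each memory region (assumed).
ACCESS_CYCLES = {"FastROM": 6, "SlowROM": 8, "slow I/O": 12}

for region, cycles in ACCESS_CYCLES.items():
    print(f"{region}: ~{MASTER_HZ / cycles / 1e6:.2f} MHz effective")
# SlowROM works out to ~2.68 MHz, well under the headline 3.58 MHz figure.
```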

The 68000 wouldn't have made sense because it was too large to put together with other stuff on the same die, and because they were intending to have BC with the NES during development, where the 65816 would have been perfect. But they ended up with "just" more bang for the buck.

I already said 68k wouldn't have been a good choice either. Maybe they were lured in by the BC, but that was obviously a bad factor no matter how you look at it.


Because we're not? You're arguing that until Wii Nintendo never compromised on a part. I don't have to make a case that the entire system sucked.

What would you have suggested then?

For any of those?

GBC was a petty cash-in that should have been skipped altogether, with the next one accelerated. It barely improved on a handheld that was released 9 years earlier..
GBA could have used a StrongARM as was available in Newtons years prior. They could have at least clocked the ARM7 quite a bit higher. Not related to the CPU, but they really screwed the audio, both in depriving it of any kind of hardware synth and pushing the sound through an awful PWM connected to an awful speaker.
DS could have used an ARM11 or even much faster ARM9 (see GP32, released years prior, that had a 133MHz ARM9 and ran off of AAs). DS also continued the legacy of really gimping audio quality, although this time more on the analog side than digital.
3DS could have used a Cortex-A5 at the very least, if not a Cortex-A8 or A9.
 
Where do I even begin? I'm guessing you didn't do a lot of programming in either.. That "RISC" claim about 65xx is just an attempt to give props to the zero page, which is a relatively small optimization. It's not very RISCy in a lot of other ways, like having variable-width instructions and memory-indirect addressing modes.

It's a pain to program in because everything has to go through a pair of accumulators or index registers vs 8 data + 8 address registers on the 68k. Since most instructions had to load either a 16-bit immediate or a 16-bit memory value over a crippled 8-bit bus, it didn't even need a real 16-bit ALU and may not have even had one.

And even more fundamental things, like having to set up the carry flag before adds and subtracts because there's only adc and sbc instructions. 68k was pretty
Yes, the 68000 was said to be nice to work with. But it was also a dead end in the long run; a halfway house between the minis of the 70's and a real software/high-level-language-centric architecture (sadly also not something that won out in the end). It was certainly not RISCy in any way.
The 6502 was RISCy in its approach and spirit but not in the later formal definition of the term. It was a lean, faster version of the already lean 6800. Of course not the way to do a programmer-friendly CPU, but fast and cheap.

And yet SuperGrafx, released a year before SNES, managed fine with 32KB of said SRAM.

That was a desperate response to SNES, released a few months before the Japanese launch. It was discontinued shortly after. Maybe because it was too expensive to produce?

In the worst case scenario, if they really were DRAM-latency limited to < 4MHz, then they could have at least put in a small SRAM it could run faster from (like with SA-1). It already had variable wait-states based on memory type, so it was even slower than the 3.58MHz rate implied. Besides that, why are we only looking at 65xx-derived designs in the first place? There's a reason this CPU was quickly left behind by the 90s, and Nintendo was using it pretty much after everyone else had moved on. They should have realized they could do better.

The Apple IIc Plus used the cache approach in 1988, running a 6502 at 4MHz. It would take until 1990 or so before (expensive) accelerators would seriously best that. The trouble with using higher clocks is of course that the chips will have lower yields.
The 6502 is long dead for anything but microcontrollers, but the spirit is still very much alive in ARM processors, which were originally designed to replace the 6502 in Acorn's computers.

Because we're not? You're arguing that until Wii Nintendo never compromised on a part. I don't have to make a case that the entire system sucked.
Where did I do that? I'm only arguing that overall their consoles were, if not on the cutting edge, then not as far from it as suggested.

GBC was a petty cash-in that should have been skipped altogether, with the next one accelerated. It barely improved on a handheld that was released 9 years earlier..
GBA could have used a StrongARM as was available in Newtons years prior. They could have at least clocked the ARM7 quite a bit higher. Not related to the CPU, but they really screwed the audio, both in depriving it of any kind of hardware synth and pushing the sound through an awful PWM connected to an awful speaker.
DS could have used an ARM11 or even much faster ARM9 (see GP32, released years prior, that had a 133MHz ARM9 and ran off of AAs). DS also continued the legacy of really gimping audio quality, although this time more on the analog side than digital.
3DS could have used a Cortex-A5 at the very least, if not a Cortex-A8 or A9.

GBC was cheap and cheerful and the priority was on being an update on GB and having BC. If the advantages of going ARM were so obvious at the time, why didn't other handhelds at the time do it (Wonderswan and Neo Geo etc.)? Some of the same things can be said for GBA. I was disappointed with the GBA, but in hindsight I don't think they would have been able to succeed with a more expensive machine then. The market and the cost of the technology for quality 3D were not there for high-end handhelds yet.
I don't see the GP32 doing anywhere near the 3D that DS was able to do, plus there was no touchscreen and no rechargeable battery to pay for.
And with regards to 3DS it's still a lot more affordable than Vita, which is currently borderline bombing.
 
Yes, the 68000 was said to be nice to work with. But it was also a dead end in the long run; a halfway house between the minis of the 70's and a real software/high-level-language-centric architecture (sadly also not something that won out in the end). It was certainly not RISCy in any way.
The 6502 was RISCy in its approach and spirit but not in the later formal definition of the term. It was a lean, faster version of the already lean 6800. Of course not the way to do a programmer-friendly CPU, but fast and cheap.

So to answer my question, you did not work with them, right? I'm sorry but I don't agree with the common claim that 65xx was RISC-like at all. You can't even say it was a new approach to begin with, given that it's little more than a mildly stripped-down 6800. The 6800 even had the fast zero page, which is the biggest "RISC"-like feature people attribute to 6502.

Maybe you think that this cutting down was what the "RISC spirit" was about but that's really not it - RISC was about making simple instructions that could execute quickly. 6502 was about making a small processor so they could sell it for very little. Totally different premise.

And yeah, 68k was a dead end in the long run, like almost everything else. But it made it through several serious generations, much better than you can say for 6502's descendants.

That was a desperate response to SNES, released a few months before the Japanese launch. It was discontinued shortly after. Maybe because it was too expensive to produce?

It doesn't really matter why it was released nor why it failed; what I'm trying to demonstrate to you is that you could easily pair more than 8KB of SRAM with a 6502 derivative and still clock it as high.

But if you really care, the real reason why it failed is probably because it offered too little over PC-Engine; just the extra RAM and an extra BG layer/set of sprites. So if it was a reaction to SNES it didn't come close to the same graphical or audio levels. There were some claims going around that it failed because the CPU was too weak for the enhanced graphics, but I don't see how that could possibly hold for SuperGrafx and not SNES.

The Apple IIc Plus used the cache approach in 1988, running a 6502 at 4MHz. It would take until 1990 or so before (expensive) accelerators would seriously best that. The trouble with using higher clocks is of course that the chips will have lower yields.

I thought the trouble with clocks was DRAM. You sure that in 1988 there was a yields problem hitting 4MHz for 6502s? I guess those > 7MHz Hu6280s were really screwed after all, huh?

I don't know if it took a long time before expensive accelerators beat a 4MHz 6502 in 1988, I just know how incredibly silly that sounds when native 16-bit computers were already beating that for a while...

The 6502 is long dead for anything but microcontrollers, but the spirit is still very much alive in ARM processors, which were originally designed to replace the 6502 in Acorn's computers.

There's that old ARM = 6502 in "spirit" nonsense. You really need to go program in both for a while. Maybe ARM's developers had a fondness for 6502 and wanted something like this, but the execution was nothing of the sort.

Where did I do that? I'm only arguing that overall their consoles were, if not on the cutting edge, then not as far from it as suggested.

Now you're just giving me the run-around.

Shifty Geezer said: "Nintendo has a reputation based on prior designs of going with lower options" to which you said that has NEVER been the case apart from Wii.

I'm pretty sure that's saying that Nintendo has not once before Wii used a lower-end part than they should have. Now you're trying to tell me that it's only true if it applies to everything used in the console, or what?

GBC was cheap and cheerful and the priority was on being an update on GB and having BC. If the advantages of going ARM were so obvious at the time, why didn't other handhelds at the time do it (Wonderswan and Neo Geo etc.)? Some of the same things can be said for GBA. I was disappointed with the GBA, but in hindsight I don't think they would have been able to succeed with a more expensive machine then. The market and the cost of the technology for quality 3D were not there for high-end handhelds yet.

Haha, "cheerful." Do you have any idea how biased you sound? You should be a Nintendo spokesperson. You really want to assume Nintendo made the best possible choice at every turn. Why can't you accept that they aren't perfect? I have no problem pointing out a variety of hardware mistakes all the other game companies made..

And I'm not sure where you picked this up, but I really never said they should have used ARM in 1998. Regarding GBA, I also didn't say anything about 3D acceleration - what I said is they should have used a faster CPU (even a 24MHz ARM7TDMI would have been a big improvement, preferably something where you can control the clocks to some extent, and they should not have certified, much less MADE, games that used busy loops, buhhh) and they should have used a better audio subsystem on all fronts. The graphics were generally OK.

I don't see the GP32 doing anywhere near the 3D 	that DS was able to do, plus there was no touchscreen and no rechargeable battery to pay for.

You're doing it again. I said the CPU was a lot weaker (vs a handheld released years earlier). I didn't say anything about any other part of the system (okay, I said the analog audio quality sucks.. I can't believe they still used this awful ~10-bit PWM that aliases like crazy). I'm starting to think you don't think CPU matters if the rest of the system is nice.. Not to say that the rest of DS was terribly impressive... Very awkward design, that one.

And with regards to 3DS it's still a lot more affordable than Vita, which is currently borderline bombing.

So? That's for a whole ton of reasons. I'm pretty sure picking a pair of Cortex-A5s with NEON wouldn't have broken the bank vs a pair of ARM11s. It would have barely made a difference at all. But I suspect that Nintendo likes to have their hardware decisions finalized years earlier than other companies.
 
The same could be said for almost any other manufacturer of midline electronics. It's all those extra small costs that make the product more expensive. You can always add one more little thing to make something better. But in the end you'll have to price it so people will buy it and you'll make money.
In my eyes Microsoft was much worse in some instances with the launch of the 360: 1982 PCB production values, pisspoor quality control, no wireless out of the box with a super expensive wifi dongle, wired controllers, no HDMI etc.
We're specifically talking about choice of technology as to why I feel we shouldn't give Nintendo the benefit of the doubt. Regardless of whether MS went too cheap in production values etc., their hardware choices had a decent amount of 'highs', such that during a discussion of what XB360 might be, we could entertain any possibility and it panned out. Speaking specifically about Nintendo's hardware choices, they have all landed on the cheap side, no? Their reasoning may well be to save costs, but that reinforces the belief that their choice regards the GPU is the cheap option rather than the powerful option.

Six of those ten years have been with the Wii and the GC was anything but cheap compared to PS2.
Wii and DS and 3DS. GC was comparable to its peers, not a cheap hardware design, but that seems to come from a different philosophy, from when Nintendo were trying to compete. Now they aren't. Their behaviour towards hardware design for their past 4 devices has been generally to cheapen out, sometimes extremely so. Ergo my expectation of them regards Wuu follows the logic of a pattern of behaviour that'll be repeated. Doesn't mean it will be, but there's no real logic to giving Nintendo the benefit of the doubt when trying to guess what's in Wuu. Assuming the eDRAM is high bandwidth is based not on the evidence of what we see on screen, nor the evidence of Nintendo's current MO, or indeed any evidence AFAICS, so I see no sense in taking it as read that it's a high BW part.
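For what it's worth, the bandwidth arithmetic behind that question mark is simple - width times effective transfer rate - and the two readings land very far apart. The figures below are hypothetical, purely to show the spread between a 'cheap package' and a 'wide internal bus' interpretation of the same eDRAM:

```python
def peak_bandwidth_gbs(bus_width_bits, clock_mhz, transfers_per_clock=1):
    """Peak bandwidth in GB/s from bus width and effective transfer rate."""
    return bus_width_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

# Hypothetical low-BW reading: eDRAM behind a narrow DDR-style interface.
print(peak_bandwidth_gbs(64, 800, transfers_per_clock=2))  # 12.8 GB/s

# Hypothetical high-BW reading: a wide on-die bus at a modest clock.
print(peak_bandwidth_gbs(1024, 550))                       # 70.4 GB/s
```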
 
I don't think it's fair to include the DS and 3DS. At the time I doubt Nintendo could have added a whole lot more powerful hardware to the DS and still included 2 screens and good battery life for just 150 euros while not losing money on it.

Even the Wii wasn't that bad. Obviously the hardware wasn't anything to write home about, but it was cheap. At the time the PS2 was about half the price of a Wii, so Nintendo could have probably dropped the price another 50 if they needed to, but lucky for them Wii took off. Anyway, at 250 euros I didn't feel like the system offered such bad value for money. At least it wasn't slower than the last-gen consoles, and it came with a (probably) more expensive controller and built-in wifi and memory.

Point being: Wii was built to a certain price, so I wouldn't call that cheaping out, as they never intended to compete on specs and the price showed that. Not as much as it could, but that's because the idea took off so well.

Wuu OTOH, to me, is bad value for money and Nintendo cheaping out in every way. Wuu should be 250 euros, not 350. It's not even faster than this gen's consoles but is a whole lot more expensive, while the only thing extra is the wuublet. Well, whatever Nintendo claims, I just don't believe the tablet is that expensive. Plus all the other stuff Shifty already mentioned, like USB ports not even being capable of powering an HDD (while there is HDD support; to me that's just stupid).

Wuu is just half a dozen GCs stuck together with a cheap LCD thrown in, selling for a price close to what you would expect of a highly specced new console.
 
I don't think it's fair to include the DS and 3DS. At the time I doubt Nintendo could have added a whole lot more powerful hardware to the DS and still included 2 screens and good battery life for just 150 euros while not losing money on it.

Sure it's fair. Nintendo could have allowed a higher clock speed and it wouldn't have cost them anything. They did just that on DSi, which also bizarrely came with a reduced-capacity battery. In fact I bet Nintendo intentionally crippled DS's clock speed so they could create more market segmentation down the road with DSi.

The higher clock speed does affect worst-case power draw, but with a PLL on the clock it could have been avoided except where necessary (halting does help.. alarmingly, some games still use idle loops some of the time). This would have probably been preferable to what you got with a fixed 66MHz CPU speed. GP32 managed this much.
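As a toy model of why those busy loops are so alarming, take the standard CMOS switching-power approximation (P ≈ C·V²·f); every constant below is made up purely for illustration:

```python
C_EFF = 1e-9           # effective switched capacitance, farads (made up)
VOLTS = 1.8            # core voltage (made up)
FRAME_S = 1 / 60       # one frame at 60 Hz
WORK_CYCLES = 500_000  # CPU work needed per frame (made up)

def dynamic_power_w(freq_hz):
    return C_EFF * VOLTS**2 * freq_hz  # P ~ C * V^2 * f

def energy_per_frame_j(freq_hz, halt_when_done):
    busy_s = WORK_CYCLES / freq_hz
    active_j = dynamic_power_w(freq_hz) * busy_s
    # A busy loop keeps burning full dynamic power for the rest of the
    # frame; halting drops it to (roughly) zero in this toy model.
    idle_j = 0.0 if halt_when_done else dynamic_power_w(freq_hz) * (FRAME_S - busy_s)
    return active_j + idle_j

print(energy_per_frame_j(66e6, halt_when_done=False))  # fixed clock, busy loop
print(energy_per_frame_j(133e6, halt_when_done=True))  # double clock, then halt
```

At a fixed voltage the active-portion energy is the same either way (C·V² per cycle of work), so the busy loop loses purely by burning power through the rest of the frame.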

3DS is a more flagrant example of being behind the times, releasing with an ARM11 3 years after Cortex-A8 devices started appearing. It's hard to justify their decisions when a billion other mobile devices of every form, down to the lowest-end phones and PMPs, include better CPUs. A modest improvement here wouldn't have done much to increase the worst case of an already awful battery life, dominated by other power suckers. It wouldn't have cost that much in either licensing or die space to use something substantially better.

Even the Wii wasn't that bad. Obviously the hardware wasn't anything to write home about, but it was cheap. At the time the PS2 was about half the price of a Wii, so Nintendo could have probably dropped the price another 50 if they needed to, but lucky for them Wii took off. Anyway, at 250 euros I didn't feel like the system offered such bad value for money. At least it wasn't slower than the last-gen consoles, and it came with a (probably) more expensive controller and built-in wifi and memory.

Wii hardware should have barely cost more than Gamecube hardware plus the controller, and Nintendo was selling the Gamecube at $99, allegedly for profit. This doesn't account for R&D but the hardware R&D for Wii should have been very low too unless Nintendo spent $2b developing the controller.

In other words, Wii could have been a lot better hardware for the price. It's just hard to notice when it had competitors which were both a lot more expensive and a lot more capable.

Perhaps the gap between Wii and previous gen stuff was bigger than the gap between Wii U and XBox 360/PS3.. especially when you consider most people were playing PS2s which typically had the weakest showing. But Wii was released a lot later in its generation.

Wuu is just half a dozen GCs stuck together with a cheap LCD thrown in, selling for a price close to what you would expect of a highly specced new console.

That's not awfully fair for a console that has a totally different GPU. The only place the popular "X predecessors duct taped together" language really works to any extent at all is DS.
 
Nintendo does hardware design for survival.
The WiiU is a low-cost design - this gives them wiggle room to lower prices to achieve a sufficient number of consoles in consumer hands to provide an ecosystem that works, even if it has to change a bit.
They've done everything they can to make a system that both has novelty appeal and lowers the barrier of entry:
- A controller with a built-in screen that promises new gameplay perks
- A controller that doesn't hog the TV - solves conflicts/saves money
- Backward compatibility of games and peripherals
- Low-cost design for future price adjustments

The novelty seeker, the upgrader, the cheapskate - they will all be catered to.
The current pricing is to skim the cream this holiday season. The $349 SKU is likely to be $279 or even $249 come this time next year. (But then, the late buyer missed out on a year of gaming on the new console.) And it will drop from there.

Put another way, game consoles are toys, or if you want to be pretentious, entertainment devices. The success of a design is properly measured by
- Absolute level/volume/uniqueness of entertainment (If you're not cash constrained)
- level/volume/uniqueness of entertainment/$ (If money is a limited resource)
- level/volume/uniqueness of entertainment at a given maximum price point (For those where money is tight.)
Nintendo wants to be able to compete for purchases in all three categories. Their hardware designs reflect this.
 
If Nintendo had released a true next-generation system it would have played into their favor: they would have got the jump on Sony and Microsoft, maybe forced them to release earlier, and they would have prevented sales of other next-generation systems.

People who paid for a true next-gen Wii U would maybe think twice about dropping big money on another console as well...
 
That's not awfully fair for a console that has a totally different GPU. The only place the popular "X predecessors duct taped together" language really works to any extent at all is DS.
I think Wii's a very good fit to that description, in its spirit rather than its literal sense. Wii is Gamecube hardware.
 
See here is my take on what nintendo is doing;

- Apparently Nintendo has some hardcore code monkeys that don't mind coding every game to the metal.
- Nintendo doesn't want their hardware to overheat and die after 5 years of dust and heavy use.
- Nintendo wants to sell their consoles for at most $400 at launch without needing to wait on game sales to make a profit.
- Backward compatibility
- 5 year console life span

All these things lead to barebones hardware, requiring third parties to spend lots of time optimizing their games for the hardware instead of dropping in random ports. Everything is made on the cheap such that it's low heat, low cost, and lasts, unlike the Dreamcast disc drives.

Now imagine if Sony had gone the same route, with the PS3 being a PS2 upgrade. No one would have complained and PS3 would have won. No need for HD remixes of old games. Everybody would already be up to date. Instead we have the PS3. The PS3 is like the Wii but opposite in every way that counts.

Anyway, Super Mario Galaxy is the perfect example of a beautiful Wii game that was clearly coded in assembly from scratch using every ounce of the Wii pipeline. No third-party company is going to waste their time doing that. Frankly, Nintendo doesn't care if third-party ports suck.
 
I think Wii's a very good fit to that description, in its spirit rather than its literal sense. Wii is Gamecube hardware.

Exactly, so it's not fair to call it two Gamecubes duct-taped together; it doesn't even deserve that honor :p

Now imagine if Sony had gone the same route, with the PS3 being a PS2 upgrade. No one would have complained and PS3 would have won. No need for HD remixes of old games. Everybody would already be up to date. Instead we have the PS3. The PS3 is like the Wii but opposite in every way that counts.

I find that very, very unlikely. Sony didn't have the Wiimote gimmick and they didn't have strong first-party exclusives. And I don't think PS2 would have scaled that well (although more VRAM would have helped) - it had some big core design deficiencies, like the lack of an L2 cache and a very feature-light GPU. So a Wii-like treatment, where they merely overclock it, change the memory layout a little, and add a glue/security processor, would have worked pretty poorly for a PS2 successor. XBox360 would have slaughtered it, and possibly beaten Wii in the long run.
 