I think we can all agree at this point (and probably would have for years) that Nintendo makes console hardware for Nintendo. Or at least this is clearly the case for Wii, DS, 3DS, and now Wii U. This has two obvious implications - the hardware gets the new peripherals that support their game ideas, and the raw power is only as much as they feel comfortable utilizing for that generation (or maybe just a little more). But it seems like there could be more nuances to this.
Nintendo, as a first party software developer, is very used to the Broadway core at this point. And a lot of their developers probably have little experience developing on Xbox 360 and PS3, since their work is exclusive. Giving them three Broadway-class cores clocked twice as fast and with a better L2 cache is going to look like a huge improvement, and more than enough to facilitate a generational improvement in their games.
It's totally reasonable that Nintendo would want to stick with the level of tech they're at for their own games, since AAA titles on other platforms cost far more to develop. So right now Nintendo is probably making much more profit per game sold, taking much less risk (although most of their titles are inherently pretty safe anyway), and selling almost as many copies. Given that, they could be dead last in the console race and probably still consider the console an overwhelming success if they can keep selling the same quantity of first party titles. And I don't think weaker graphics and CPU are threatening that; if anything threatens it, it'd be Nintendo losing franchise appeal and stagnating. But that's a very different problem.
In this light the alleged design actually makes a ton of sense. It's enough to support the budget levels they want to spend on games. It's an architecture they're very familiar with and have a lot of tool support in place for, so it costs little to move to. You get cheaper BC (putting a separate Broadway on the die might not be a huge area expense, but it'd definitely cost some engineering time), and you get a good licensing deal from IBM, who seems about as eager to promote eDRAM on CPUs as to push newer CPUs.
I could actually see going for better hardware being a minor disadvantage in some ways, even ignoring costs. Letting third parties produce much better-looking games would put more pressure on Nintendo to increase their own development efforts. And keeping the generational boosts in check gives them more room to grow next time. Eventually everyone will hit a wall, so it makes sense that Nintendo would want to take things as slowly as possible. It could be that Nintendo has a backup plan to release the next console early if this one is in trouble and they think it'll help. It's possible they even had such a plan for Wii but stuck with it because they thought they could get away with it (and did).
Sure, Nintendo could probably have offered the same performance using even smaller, lower-power, more modern cores, but if more performance isn't even desirable, why bother? The power consumption difference is negligible, with the GPU taking the bulk of it, and the die area is also pretty negligible and possibly already pad limited (which could in fact be a reason why they stuck with a 64-bit DRAM interface).
But yeah, this is all a big downer for third parties and probably not good for the industry at large. But none of that necessarily matters that much to Nintendo. What I find really mind-boggling is why they'd hamper the battery life on the controller. They seem really determined to screw up battery life these days, which is pretty disappointing given the history with their older handhelds. There is no way the dollar they saved on the battery is justifiable given the bad PR and bad reaction they'll get from users. People want devices they don't have to constantly remember to charge. No one wants to reserve a wall socket for their controller (or be tethered to the wall for that matter). All it'll take is the thing dying enough times to make some owners want to use it less, and while they're using it less they end up buying fewer games.