Right now the console market is built around having bleeding-edge (and expensive) hardware at release. Then they shrink it, integrate parts, etc., which lowers the cost and price (though the price doesn't decline as quickly as the cost). Then, in the last few years of the generation, all the competitors are sitting on four-year-old designs, so the consoles become cash cows for everyone at that point.
This is something of a recent phenomenon. Successful consoles of earlier generations did not try to compete at quite as high a level.
There are other dynamics as well, such as Microsoft and Sony trying to push beyond gaming-specialized consoles and take more control of the home media center, which encouraged heftier hardware.
The Wii is more economical in that regard because remaining specialized allowed it to skip what wasn't critical for its role.
I guess what I'm trying to say is: what if Microsoft moves away from this "revolution every five years" model to an "evolution every 2.5 years" model? It may just be that the dynamics of the console industry would prevent that from happening (I'm not sure what those dynamics would be, but perhaps some of you can comment). Such a model would free consoles from needing to be on the bleeding edge at launch, which is required right now because you want the system to last several years. In fact, it would be more like PCs (and mobile phones, and iPods, and everything else), something Microsoft would be pretty comfortable with. Perhaps automobiles are a counterexample, but automotive technology develops much more slowly than computer technology.
PC hardware is sold as a product in and of itself. Its own profit margins finance the R&D needed for its evolution, and it is expected to make money almost from day one.
In addition, PC hardware carries a certain level of flexibility to allow for different hardware combinations; that flexibility adds cost, but it can survive on the fatter margins of that market.
Console hardware is often a loss leader or a modest earner compared to the software licensing and sales that make the money needed for design work.
What seems to be happening is that the fixed cost of a design, whether new or evolutionary, is only a fraction of ongoing expenses next to the cost of manufacturing millions of consoles.
Saving X million dollars up front on the design effort can be a bad thing if it means the manufacturing lines cannot shave X dollars off of each of tens of millions of units; at that volume, the per-unit savings dwarf the up-front ones.
It's one thing to manufacture hardware of a given level of performance more cheaply with process shrinks and hardware revisions, and an entirely different can of worms to do the same with hardware that has even higher specifications.
Even cost-saving revisions carry a design cost of their own, so it's no money saver to pay for a revision that winds up costing more unless it can pull in more cash.
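To put some rough numbers on that trade-off, here's a quick back-of-the-envelope sketch in Python. Every figure in it is invented purely for illustration; the point is just that the design cost is paid once while the manufacturing savings scale with volume.

design_cost = 50e6      # one-time cost of engineering the revision (made-up figure)
per_unit_saving = 10.0  # dollars shaved off each console's build cost (made-up)
units = 20e6            # consoles produced on the revised design (made-up)

# The design cost is paid once; the per-unit saving is earned on every console built.
net = per_unit_saving * units - design_cost
print(f"Net effect of the revision: {net / 1e6:+.0f} million dollars")

With those numbers the revision nets +150 million, but rerun it with units = 2e6 and the very same revision loses 30 million, which is why this math only works at console-scale volumes.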
The crux of the matter is that the cash is tied to the software being sold.
Unless the evolutionary hardware allows for a tangible improvement of the software that leads to consumers spending more money on the games, the console maker doesn't really win out.
The Wii's success is in part due to the fact that its interface allows for tangibly different software.
The others suffered initially (and possibly still do in some genres) from the problem that their new hardware didn't lead to noticeably different or improved software.
Slightly better hardware isn't particularly conducive to software that people buy more of or pay more for, and that's what pushes the industry toward bigger generational changes.