Radeon 8500 "R200" compared to Wii's Hollywood

Of course an R300 or R400 wouldn't have matched the Xenos, and the R400 could actually have been more powerful in the long run than what went into the PS3. It was Nintendo's fault that they didn't take advantage of the years of tech progress since the GameCube was made.

Do you honestly believe that a more powerful GPU would have moved more hardware? I think N did the right thing. If they pay $10 for the GPU instead of $30 (whatever the real numbers are) that's a lot of extra profit they make.
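The cost argument is simple arithmetic; here is a back-of-envelope sketch (all figures are assumptions for illustration, echoing the hypothetical $10 vs $30 above, not real BOM numbers):

```python
# Hypothetical per-unit GPU cost savings scaled by lifetime console sales.
# None of these are real BOM figures; they just illustrate the argument.
cheap_gpu = 10.0            # assumed cost of a Hollywood-class GPU, USD
pricey_gpu = 30.0           # assumed cost of a more powerful GPU, USD
units_sold = 100_000_000    # roughly the Wii's lifetime unit sales

extra_profit = (pricey_gpu - cheap_gpu) * units_sold
print(f"${extra_profit / 1e9:.0f} billion in extra margin")  # → $2 billion
```

Even at an assumed $20/unit delta, the savings compound into billions over a console's lifetime, which is why the per-chip cost dominates the decision.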
 
Do you honestly believe that a more powerful GPU would have moved more hardware? I think N did the right thing. If they pay $10 for the GPU instead of $30 (whatever the real numbers are) that's a lot of extra profit they make.

Well, they could still have got a much more powerful GPU. Something like an X1300 XT would have been a monster at 480p compared to Hollywood, and would hardly have raised the manufacturing cost by any meaningful amount. Hollywood was chosen for BC, both with GCN software and in-house tools, to ease the generational transition.
 
I don't think the R400 would have been a match for RSX at all; it was in NV40 performance territory and didn't even support SM3. R520 would probably have been more than a match for RSX, though, and certainly highly competitive with Xenos. R580 would have put the Wii ahead of both.

Sorry, I meant the R520 and R580. As we see in games like CoD: MW2, they perform much better than the 7800 and 7900 series.

Do you honestly believe that a more powerful GPU would have moved more hardware? I think N did the right thing. If they pay $10 for the GPU instead of $30 (whatever the real numbers are) that's a lot of extra profit they make.

The R300 is 107M transistors. It's tiny, and it's a full DX9 GPU, so it would have put them at least in the same league as the 360 and PS3. Staying at 640x480 would have let them stay more competitive, and an R420 at 160M transistors would have boosted performance again.

I have no idea how big Hollywood is. But you also have to understand that the other end of the spectrum could hurt them too: if Hollywood is too small, then at 65nm and 40nm they may not be able to shrink it, due to being pad-limited.

In the end the system did great, but the R300 or any of the other chips would have given them a much bigger graphics boost.
 
I have no idea how big Hollywood is
Probably about the same size as Flipper, ~26m logic transistors, ~25m eDRAM transistors. People forget how tiny fixed-function GPUs are in comparison to modern chips.

Nothing is free. MS learned the hard way that you can't just slap off-the-shelf PC parts in a box, mass produce it, sell it for under $250, and make money. We do know there are lots of chips more powerful than Hollywood that Nintendo could have put into something as small as the Wii. What we don't know is how well that would have fit into Nintendo's strategy, or what the cost/benefit tradeoffs were. It's not like they never looked at the possibility of a different chip. These are professional engineers.
 
Considering it was released in 2006, as I said earlier, they could have gotten away with a GPU all the way up to the Radeon X1900. I doubt Nintendo would even have considered it.

Of course an R300 or R400 wouldn't have matched the Xenos, and the R400 could actually have been more powerful in the long run than what went into the PS3. It was Nintendo's fault that they didn't take advantage of the years of tech progress since the GameCube was made.

If they had opted for the R580, it would have imposed a PS3/360-ish form factor, as well as the cooling to match. I highly doubt they would have gone down that road. And of course, the price would have been too steep for what they wanted.
 
These are professional engineers.
Undoubtedly. I also believe Nintendo's professional engineers are being ridden very hard by top management to produce something of very low cost that can be sold at a high premium. Otherwise they wouldn't have schlepped a more than half-decade-old hardware design into their latest console when the Wii was being put together.
 
I have no idea how big Hollywood is. But you also have to understand that the other end of the spectrum could hurt them too: if Hollywood is too small, then at 65nm and 40nm they may not be able to shrink it, due to being pad-limited.

They don't need to shrink it. I'm not sure that would save more money than it costs, and there aren't the same heat-reduction benefits as on considerably bigger chips.
 
I think that it's plainly clear that the blame falls on N's desire to jump on the "value added" backwards compatibility bandwagon. You take that requirement away and the sky's the limit on what they could've done, even for cheap hardware. It was one hell of a ball and chain.
 
I think that it's plainly clear that the blame falls on N's desire to jump on the "value added" backwards compatibility bandwagon. You take that requirement away and the sky's the limit on what they could've done, even for cheap hardware. It was one hell of a ball and chain.

Absolutely, it's the only logical reasoning behind going with such an antiquated fixed-function design, imo. I don't think performance was ever even a serious consideration, other than the fact that it had to outperform the GCN; to what degree was simply inconsequential.

Hopefully, Nintendo being stuck on such low-powered hardware should mean that even a very conservative approach would allow for very comprehensive BC through software emulation. They need their next hardware design to be capable of HD rendering as well; surely that alone is reason enough to ditch their current design? Investing R&D into scaling up Flipper/Hollywood to be capable of that seems like such a monumental waste of time when any low-end off-the-shelf design would comfortably outperform it anyway. Heck, their handheld hardware is going to be much more feature-rich than their home console very soon; I just can't foresee such a ridiculous scenario continuing forever.
 
Heck, their handheld hardware is going to be much more feature-rich than their home console very soon; I just can't foresee such a ridiculous scenario continuing forever.

Do we even know if the Tegra II rumor can be trusted? Who's to say Nintendo won't get a custom stripped-down version of Tegra II so they can pay pennies for it?
 
Do we even know if the Tegra II rumor can be trusted? Who's to say Nintendo won't get a custom stripped-down version of Tegra II so they can pay pennies for it?

When was the last time Nvidia designed a fixed-function GPU? Heck, when was the last time any major GPU player designed a fixed-function unit similar to Flipper? Even if Nintendo do get a low-end Tegra or SGX or similar, it'll still be much more feature-rich and flexible than Hollywood.
 
To answer the OP's question, the Wii doesn't even come close.

I agree with fearsomepirate. They're professional engineers, and a $10 cost (vs. $30) gives a lot of extra profit. Hopefully, Nintendo will give a lot of it back with their next generation: a $250-300 powerhouse and a traditional controller. I'm hoping they'll use a DX11-class GPU.
 