TEV a keeper or a loser?

You'll have to convince me that there's a sane category in which fixed-function TEVs make sense in a Wii2 piece of hardware.
That's going to be difficult, considering that I'm not sure you even considered Wii 1 to be a sane decision, despite its obvious success. However, as I said before, if Wii 2 is to Wii as DSi is to DS, then continuing with some variant of the current hardware is obviously in their best interest. If they go with that kind of strategy, they don't want to confuse the customer about whether or not she can continue using that Wii Fit she spent $80 on.

To answer your question, Iwata wanted power consumption minimized in order to minimize the profile of the machine and allow it to be always on without a significant energy footprint. Programmability simply requires more transistors. I've read estimates of Flipper's logic transistor count as being around 26M. By comparison, the X300 has 75M, and the X1300 has over 100M. Features require transistors, pure and simple.
Spending a few more watts on drawing to get substantially better graphics is a choice I think pretty much everyone would make.
But it's a choice a lot of people didn't make. A lot of Wiis have been sold so far. And, by the way, I don't think programmability gives you "substantially" better graphics, not in the eyes of most people. You have to have programmability plus enough fillrate plus enough RAM to make a big difference.
Prove me wrong with a convincing example of a games system that'll be competitive while being tied to decade-old technology!
Fixed-function GPUs will turn 10 in August. So, I'll show you a Wii around then, assuming Nintendo can stay competitive for another few months.
Technology moves forwards too fast.
The only thing that matters is how fast customer demand moves.

Don't get me wrong. I do think that eventually, Nintendo will include a more advanced GPU in its products. I don't think that time has arrived. Demand for HD consoles is still stagnant, suggesting that the current, brief dip in Wii sales is not due to an accelerated demand for new graphical technology. I don't think that improved graphics will be the buzz-generator for Nintendo's next console, not at this point. Maybe in another few years it'll be time for a substantial shift.
 
It's dated; why Nintendo did it is beyond me.

You can get a 6600GT really, really cheap these days; all Ninty had to do was strike a deal with Nvidia or ATI and stick a last-gen DX9 card in the Wii.

A Wii with a 6600GT, a 9800 Pro, or a similar card would have been a lot better, and because they're 'old' PC cards they would have been cheap enough to produce and license :)
 
The Wii's current success has no relation to the technical merits of Hollywood. TEV was superseded a long time ago by a model that could do everything it could do and much more, and do it better, faster, more flexibly, and more easily.
 
The more modern pixel shader is pretty much what the TEV has become, and there's really no need to keep it. The only reason the same h/w was kept was for quick, simple manufacturing; if Nintendo had wanted, they could have had a more advanced, h/w shader-based console and still sold it on the cheap.

I have no reason to believe the TEV still exists today for anything more than convenience. It's a neat piece of tech and can pull off a lot of great effects, more than what we're seeing even, but for the sake of far more power and flexibility, it's a unit to lose for the next generation.
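To make the TEV-versus-shader comparison concrete, here's a rough sketch of what a single fixed-function stage looks like to a programmer. It's written with libogc-style GX calls from memory (the function name setup_modulate_stage is mine), so treat the exact names and signatures as approximate rather than gospel:

[CODE]
#include <ogc/gx.h>

/* One TEV stage configured to modulate a texture by the rasterised
 * vertex colour -- the classic "diffuse * texture" setup. */
static void setup_modulate_stage(void)
{
    GX_SetNumTevStages(1);

    /* Feed texcoord 0 / texmap 0 and vertex colour 0 into stage 0. */
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLOR0A0);

    /* The combiner evaluates d + (1-c)*a + c*b; with a = 0, b = TEXC,
     * c = RASC and d = 0 that collapses to TEXC * RASC. */
    GX_SetTevColorIn(GX_TEVSTAGE0, GX_CC_ZERO, GX_CC_TEXC, GX_CC_RASC, GX_CC_ZERO);
    GX_SetTevColorOp(GX_TEVSTAGE0, GX_TEV_ADD, GX_TB_ZERO, GX_CS_SCALE_1,
                     GX_TRUE, GX_TEVPREV);

    /* A pixel shader expresses this whole stage as roughly one line,
     * e.g. colour = texColour * vertexColour, with arbitrary maths
     * available where TEV only offers its fixed lerp/add/sub forms. */
}
[/CODE]

Everything a stage like this does fits into a shader instruction or two; the reverse is obviously not true.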
 
Since Nintendo effectively skipped a whole technology generation, surely creating a machine that is fully capable of emulating the Wii in software is going to be relatively trivial, no?

Heck, they had a very impressive N64 emulator running on the GCN (which has now been expanded upon for the Wii's VC functionality) which even improved the graphics by rendering at a higher resolution. Surely even if they're conservative in their hardware and go with something like an RV730 on the GPU side, this isn't going to prove a problem?
 
Since Nintendo effectively skipped a whole technology generation, surely creating a machine that is fully capable of emulating the Wii in software is going to be relatively trivial, no?

Heck, they had a very impressive N64 emulator running on the GCN (which has now been expanded upon for the Wii's VC functionality) which even improved the graphics by rendering at a higher resolution. Surely even if they're conservative in their hardware and go with something like an RV730 on the GPU side, this isn't going to prove a problem?
Even the Wii emulator surprises me with how it's able to play certain games with little to no bugs.

But why does everybody keep saying the next Nintendo system should use DirectX 9? I'm pretty sure they can't and won't support a Microsoft-specific API.
 
TEV is very neat for late-90s technology, but it's clearly outdated even compared to Shader Model 2.0 GPUs such as the R300 (Radeon 9700) from 2002.

For Nintendo's next console, they need to have at least modern DX9+ capabilities, if not DX10/DX11.

Flipper, the original GameCube graphics processing unit, was designed for a console released in 2001. If Hollywood is virtually identical to Flipper, then some of us are evangelising 8-year-old technology!

Flipper (therefore everything in it, including the TEV) was thought up/brainstormed in 1998, the design was hashed out mainly in 1999. Flipper was prepared for mass production in 2000. So Flipper is a late 90s design.

IGNcube: You say you began talking to Nintendo in 1998. So from white paper designs and initial design to final mass production silicon, how long was the development process?

Greg Buchner: Well, there was a period of time where we were in the brainstorm period, figuring out what to build, what's the right thing to create. We spent a reasonable amount of time on that, a really big chunk of 1998 was spent doing that, figuring out just what [Flipper] was going to be. In 1999 we pretty much cranked out the gates, cranked out the silicon and produced the first part. In 2000 we got it ready for production, so what you saw at Space World last year was basically what became final silicon.

http://cube.ign.com/articles/099/099520p1.html


GameCube was originally going to be released in 2000 but Nintendo pushed it to 2001 because they needed more time to finish games.

So the TEV and the entire Flipper are, literally, 10-year-old technology. Hollywood is really the same part. Other than a 50% (1.5x) faster clock (162 MHz up to 243 MHz), some minor tweaks, and a smaller manufacturing process, Hollywood has an architecture identical to Flipper's. It's little more than a speed bump, not even a typical GPU 'refresh'.

It'll be interesting to see what Nintendo chooses to use to drive HD graphics in the next console.
 
That's going to be difficult, considering that I'm not sure you even considered Wii 1 to be a sane decision, despite its obvious success.
That's arguing the wrong point. Unless you believe either 1) people bought Wii for the TEV or 2) superior graphics would have diminished Wii's sales, you can't use Wii as an example of where old tech leads to a more successful product. An alternative GPU would have lost Nintendo a lot of money in aggregate profits, but gamers and developers would have benefited.

To answer your question, Iwata wanted power consumption minimized in order to minimize the profile of the machine and allow it to be always on without a significant energy footprint. Programmability simply requires more transistors. I've read estimates of Flipper's logic transistor count as being around 26M. By comparison, the X300 has 75M, and the X1300 has over 100M. Features require transistors, pure and simple.
Okay, so you're saying that had they used a larger GPU, the form-factor and power consumption would have turned away far more customers than the improved visuals would have attracted, and so going Hollywood was the best compromise?

But it's a choice a lot of people didn't make.
And you're saying the deciding factor was that they chose a low-power fixed-function GPU over an HD system?! All things being equal, if the only difference was the GPU, do you think people would pick a low-power-consumption TEV-based console or a beefy HD console?

IMO Wii's success cannot be attributed in any part to the choice of GPU. The appeal comes from the interface and software library. The appeal would be increased with better visuals. The only success Hollywood brings to Nintendo is more money per unit than if they had a more powerful GPU. The downside to that decision is harder programming, developers not trying, and a lack of eye-candy meaning an inferior experience for gamers compared to what they could have had in a cheap, small, low-power package.
 
They could extend TEV with additional modern operations; instead of just having TEV_ADD, TEV_BLEND and TEV_SUBTRACT they could add stuff like TEV_DOT3, TEV_NORMALIZE, TEV_TRANSFORM, TEV_CONDITIONAL_JUMP etc. If they provide read/write access to more registers, such as texture coordinates, I guess it would be comparable to modern hw.

But I go with what most people are saying: current-generation GPU architectures are a lot more flexible since they usually consist of a set of general-purpose processing units. I think such an architecture is cheaper to construct than extending Hollywood with vertex shaders and additional shader operations.
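For anyone unfamiliar with how rigid the current unit is, here's a toy model (my own simplification, not anything from Nintendo's documentation) of what one TEV colour stage computes per channel. The hardware only lets you pick the four inputs, an op, a bias and a scale; the extension idea above amounts to growing the list of ops, whereas a shader ALU just runs arbitrary instructions:

[CODE]
/* Toy model of a single TEV colour stage (8-bit maths simplified to float).
 * Real hardware restricts a, b, c, d to a small set of sources (texture,
 * rasterised colour, constant registers, previous stage output). */
typedef enum { TEV_ADD, TEV_SUB } tev_op;

static float tev_stage(float a, float b, float c, float d,
                       tev_op op, float bias, float scale)
{
    float lerp = (1.0f - c) * a + c * b;            /* the fixed lerp core   */
    float v = (op == TEV_ADD) ? d + lerp + bias     /* op: add or subtract   */
                              : d - lerp + bias;
    return v * scale;                               /* scale: 0.5, 1, 2 or 4 */
}
[/CODE]

Chaining up to 16 of these stages is how Flipper/Hollywood builds its effects; a hypothetical TEV_DOT3 or TEV_NORMALIZE would just be another hard-wired case in that list, which is exactly why a programmable ALU ends up being the simpler design.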
 
The next AMD integrated graphics part will have 320 stream processors (like those found on the 4670; probably, like the current one, it will be the equal of its discrete counterpart, the way the 3200 matches the 3450), 3x the 360.

It "can't get" any cheaper or lower power, and there is no other reason to go and spend a lot of money trying to get old, discarded tech (similar to the GeForce2/4 MX) to do what would end up being the same as this (after all, you want better normal mapping etc...).

Is there any reason to try to update the TEV to do the same when it was abandoned years ago?
 
An alternative GPU would have lost Nintendo a lot of money
Losing money is a bad business model. The Wii is a successful product. Nintendo has R&D resources, and the way they chose to spend them clearly paid off. I can think of two other companies that threw all their weight behind massive power upgrades, and neither of them are as successful as Nintendo.
And you're saying the deciding factor was that they chose a low-power fixed-function GPU over an HD system?!
I'm repeating what Iwata said. Do you not read those "Iwata Asks" interviews? They're pretty interesting and give a lot of insight into Nintendo development and marketing philosophy. I'm not going to keep repeating them. He lists a few main goals behind the design:
1. Minimize energy consumption.
2. Minimize physical size.
3. Constrain development budgets.
All things being equal, if the only difference was the GPU, do you think people would pick a low-power-consumption TEV-based console or a beefy HD console?
All things aren't equal.
IMO Wii's success cannot be attributed in any part to the choice of GPU. The appeal comes from the interface and software library.
What makes you think the interface and software library (and, let us not forget, price) would be the same if Nintendo had thrown considerable development resources behind developing cutting-edge processing hardware?
The downside to that decision is harder programming
You mean easier, right? I don't recall EA having to spend big bucks developing all-new graphics engines for Wii, considering they already have Renderware. By contrast, look at how much has been invested in UE3. Also, going with old hardware probably made it easier to launch with Twilight Princess, not harder.
developers not trying,
This is typical gamer logic: When developers do good things, it's to their credit. When they do bad things, it's Nintendo's fault. Developers have no one to blame but themselves for half-assed shovelware. Note that even when developers do try, asset creation budgets are a fraction of what they are on the HD consoles.
and a lack of eye-candy meaning an inferior experience for gamers compared to what they could have had in a cheap, small, low-power package.
You are simply assuming that Nintendo could have designed a machine to their cost, size, and energy specifications using new, larger, more energy-hungry chips, and still met all their development targets and budget constraints. I doubt you have done the kind of thorough analysis of corporate internals needed to make that kind of assumption. You're just going with kind of a vague impression based on the price of video cards on pricewatch.com or something.

I suggest you start here:
http://www.nintendo.com/wii/what/iw...1;jsessionid=8B32F2727E5895E5A397F8AEA16CE0A4
 
Okay.
Have you SEEN the Wii library? Clearly even a 50% clock bump was a waste of money and effort when taking the bestselling bulk of titles into account. For even more economical power savings, greater hardware revenue, smaller size, and to avoid the minimal graphical R&D cost incurred by the Wii, Nintendo could have easily gone with a straight process-shrunk Flipper, or an even lower-specced part for bigger wins in those apparently key areas. So in that light, Hollywood has in fact been an overengineered failure.
 
I remember that the rumors going around for Hollywood originally suggested something similar to an X1600. That would've been a monster compared to what Wii really is. An X1600 pushing 640x480 would be quite nice.
 
Haha, yeah, those times were great! I remember people even thought maybe Radeon 9500/9700-level specs, figuring that at 250 euros you'd still make a profit and have at least somewhat decent hardware. Well, if the internet didn't hide everyone's faces, I'd love to have seen the faces (including mine) of most people when we got a real idea of the Wii's specs. ''Like, you're fucking kidding me, right?'' :LOL:
 
Megadrive1988 said:
Flipper (therefore everything in it, including the TEV) was thought up/brainstormed in 1998, the design was hashed out mainly in 1999. Flipper was prepared for mass production in 2000. So Flipper is a late 90s design.

And you could quite easily state that the GT300 and R870 are 2006/07 designs.

In my view it is best to stick to actual release dates ;)
 
Losing money is a bad business model. The Wii is a successful product. Nintendo has R&D resources, and the way they chose to spend them clearly paid off. I can think of two other companies that threw all their weight behind massive power upgrades, and neither of them are as successful as Nintendo.
The GPU isn't why, though! If they had Wiimote-type advances and fabulous graphics, versus Wii with last-gen graphics, things could very well be different.

1. Minimize energy consumption.
Why?
2. Minimize physical size.
Why?
3. Constrain development budgets.
Why?
How many of Wii's sales can be attributed to the low energy consumption? How many due to the small size? And why do you need limited hardware to constrain budgets, when you can set whatever budget you want?

What makes you think the interface and software library (and, let us not forget, price) would be the same if Nintendo had thrown considerable development resources behind developing cutting-edge processing hardware?
No-one said cutting edge. No-one said it needed considerable investment. An existing, slightly modified ATi part would have done the job cheaply, and Nintendo could have created the same software with the same design decisions, only with AA and better visuals overall. The choices weren't 'last-gen console tech' and 'cutting-edge tech'. There was a range of options in between offering different cost/performance balances.

You mean easier, right? I don't recall EA having to spend big bucks developing all-new graphics engines for Wii, considering they already have Renderware. By contrast, look at how much has been invested in UE3. Also, going with old hardware probably made it easier to launch with Twilight Princess, not harder.
UE3 was new tech for new tech. For Wii, Nintendo could have used old, existing PC tech. The result would be developers writing code for ATi or nVidia shaders instead of trying to understand and work around TEV's limits. They could have used UE2 practically off the shelf.

This is typical gamer logic: When developers do good things, it's to their credit. When they do bad things, it's Nintendo's fault. Developers have no one to blame but themselves for half-assed shovelware. Note that even when developers do try, asset creation budgets are a fraction of what they are on the HD consoles.
Do you disagree that if developers could write Wii graphics code using the PC techniques and code they had used years earlier, instead of requiring the use of GC code they never really developed, it would be easier and they'd get better results?

You are simply assuming that Nintendo could have designed a machine to their cost, size, and energy specifications using new, larger, more energy-hungry chips, and still met all their development targets and budget constraints.
It's not an assumption but an educated guess. ATi had mobile parts that were low-power, high-performance, and not too expensive, and that would have done a better job than TEV. Maybe they wouldn't have hit Nintendo's energy targets, but I'd say that if that's true, their energy targets were wrong. Likewise, given the known cost of the hardware, Nintendo losing an extra $20 and still making many billions in profit while giving Wii owners a better experience would be a better choice IMO (it doesn't have to be a choice between 'total economic failure' and 'more profit than anything else'; there is such a thing as providing a good product/service at a fair price so the company profits and the consumers get a good deal).



Let's put it another way with a few simple questions:
  1. Would it have been possible and of reasonable cost for Nintendo to create a Wii with a more sophisticated GPU than Hollywood?
  2. Would making that choice lead to an insanely expensive machine, one with an insane power draw, a stupidly large console, or require many times the development budgets compared to current budgets?
  3. If Nintendo had made that choice, would they be struggling to make money now, or would they still be making multi-billion dollar profits?
 
The GPU isn't why, though! If they had Wiimote-type advances and fabulous graphics, versus Wii with last-gen graphics, things could very well be different.

1. Minimize energy consumption.
Why?
2. Minimize physical size.
Why?
3. Constrain development budgets.
Why?
How many of Wii's sales can be attributed to the low energy consumption? How many due to the small size? And why do you need limited hardware to constrain budgets, when you can set whatever budget you want?

You forget that Nintendo is a conservative Japanese company making a product for a Japanese market. What is that business rule? You have to please the 20% of your clients who are 80% of your business.

The trend in Japan is that smaller and quieter is better.
So the Wii and GameCube were designed to please that market.
It just so happens that it's also popular in other markets.

This is also why Nintendo has not been as gung-ho about online as Microsoft has been: the Japanese market is not as online as western markets are.
 
I think the aspects of the Wii that have caused it to be successful are its fairly novel control scheme, certain games (Wii Fit), and the way that it is marketed. I think they've proven that the underlying hardware doesn't matter to those who buy it. At least for Wii 1, anyway, as lots of people (me too) bought it unaware of how gimpy it really is and how extensively the library would suck.
 