WiiGeePeeYou (Hollywood) what IS it?

Please don't take the devs' names in vain! The resolution was 1080i lines, but the rendering was not true 1080i. It was still 640 columns, or in other words rendering at 640x540. The image was stretched horizontally and interlaced vertically. That's little more effort than rendering 480p progressive. That's in contrast to rendering 720p on Xbox versus 480p, which is a real increase in rendered resolution (at 4:3; was 1280x720 rendering even supported?).
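For a rough sense of the workloads being compared here, a small C sketch of the raw pixel counts follows (the 640x540 figure is the post's own claim; the others are standard frame sizes):

#include <stdio.h>

int main(void)
{
    /* Render-target sizes under discussion: 480p baseline, the claimed
     * 640x540 "1080i" framebuffer, and a true 720p target. */
    const int res[3][2] = {
        {  640, 480 },
        {  640, 540 },
        { 1280, 720 },
    };
    const char *label[3] = { "640x480", "640x540", "1280x720" };
    const double base = 640.0 * 480.0;

    for (int i = 0; i < 3; i++) {
        double px = (double)res[i][0] * (double)res[i][1];
        printf("%-8s  %8.0f pixels  (%.2fx the 640x480 workload)\n",
               label[i], px, px / base);
    }
    return 0;
}

Stretching 640x540 out to 1080i is only about a 1.13x pixel load over 480p, while a true 720p target is 3x, which is the contrast the post is drawing.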
IIRC the Xbox never rendered anything over 1024*768. Something about the TV-out chip (scaler?) being limited to input resolutions of 1024*768 or less. I'll see if I can find a link; I'm pretty sure I read it on this board.

Hmm... I can't find the link I was looking for. I could have sworn it was mentioned and then detailed in the PGR3 framebuffer thread, but I did find mention of it in a post from 2003.
 
From this translated article,
http://72.30.186.56/babelfish/translate_url_content?lp=ja_en&trurl=http%3a%2f%2fpc.watch.impress.co.jp%2fdocs%2f2006%2f0920%2fkaigai301.htm
It seems to imply that the
GPU has no programmable shaders in hardware because...
Nintendo wanted to keep the power consumption down-

Circumstance concerning electric power consumption and development readiness GPU is the same. As for the GPU core of Wii, in strict sense as for the Programable Shader hardware you say that it does not have. Hard it is it can program Shader in the parameter base has, but program characteristic about of hardware Shader of PC graphics is said that it does not have. When of electrical efficiency and development burden are thought, you can agree upon also this design philosophy.

If the Programable Shader hardware whose program characteristic is high in GPU is placed, that much rich graphic expression becomes possible. But, electricity consumption rises as a trade-off. Because generally programmable as for hard, fixed hard compared to, electricity consumption per processing performance becomes large.
 
Why are they so bothered about power consumption? Portable GPUs with shaders do pretty well with the thermal envelope and power consumption. Take a Mobility Radeon 9800 and put it on the latest manufacturing, and I can't see any problems.
 
IIRC the Xbox never rendered anything over 1024*768. Something about the TV-out chip (scaler?) being limited to input resolutions of 1024*768 or less. I'll see if I can find a link; I'm pretty sure I read it on this board.

Hmm... I can't find the link I was looking for. I could have sworn it was mentioned and then detailed in the PGR3 framebuffer thread, but I did find mention of it in a post from 2003.

I heard something about that too; I think the GeForce 3 cards may have had the same limitation for TV out. However, the Xbox definitely did 1080i, now whether that was 1080i upscaled or native I don't know. (I'm guessing upscaled though, since even 720p Xbox still looked rather blurry, but what would have done the scaling? Programmable graphics hardware?)
 
Why are they so bothered about power consumption? Portable GPUs with shaders do pretty well with the thermal envelope and power consumption. Take a Mobility Radeon 9800 and put it on the latest manufacturing, and I can't see any problems.

Well, they've still got that big chunk of 1T-SRAM on the die, with additional transistors besides. If you look at some specs, you'll notice that the introduction of fully programmable shaders (and yes, some other features as well) added a huge number of transistors, and we see a big jump every time a new shader model is introduced. The GF3 Ti 200 has more than twice as many transistors as the GF2 Ultra (57M vs 25M), and Flipper is supposed to have only around 26M logic transistors (51M total). So perhaps going to programmable shaders would have been a sufficiently large leap in transistor count as to make their desired power consumption unattainable.

Also what's the power consumption of those cards at peak 3D performance? Remember, the Wii's graphics chip will be running 3D almost the entire time the machine is on. I can't find data for the specific cards you mentioned earlier, but the GF 7300 GS consumes 16.1 W while running at peak 3D, which is almost as much as the entire Wii system, which only uses 17W (Zelda):
http://www.xbitlabs.com/articles/video/display/power-noise_7.html

Even if something like the 7100 only consumes 15W, it would appear that it is still way more than Nintendo was shooting for.
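To put the figures quoted above side by side, here is a quick C sketch that just crunches the cited numbers (the transistor counts and wattages are the ones named in the posts, not independently verified):

#include <stdio.h>

int main(void)
{
    /* Transistor counts in millions, as cited above */
    double gf2_ultra   = 25.0;  /* GeForce 2 Ultra                 */
    double gf3_ti200   = 57.0;  /* GeForce 3 Ti 200                */
    double flipper_all = 51.0;  /* Flipper total                   */
    double flipper_log = 26.0;  /* Flipper logic (excl. 1T-SRAM)   */

    printf("GF3 Ti 200 vs GF2 Ultra: %.2fx the transistors\n",
           gf3_ti200 / gf2_ultra);
    printf("Flipper logic share:     %.0f%% of its transistor budget\n",
           100.0 * flipper_log / flipper_all);

    /* Power figures in watts, as cited above */
    double gf7300gs_3d = 16.1;  /* GeForce 7300 GS at peak 3D load */
    double wii_total   = 17.0;  /* whole Wii system running Zelda  */

    printf("A 7300 GS alone draws %.0f%% of the whole Wii's measured load\n",
           100.0 * gf7300gs_3d / wii_total);
    return 0;
}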
 
I can't find data for the specific cards you mentioned earlier, but the GF 7300 GS consumes 16.1 W while running at peak 3D, which is almost as much as the entire Wii system, which only uses 17W (Zelda):
http://www.xbitlabs.com/articles/video/display/power-noise_7.html

Even if something like the 7100 only consumes 15W, it would appear that it is still way more than Nintendo was shooting for.
Yes, but why?! What's the concern about power consumption? Why couldn't Nintendo stretch to 30 Watts, or 50 Watts? That's still a fairly low amount and would allow for some powerful hardware. I can't see power concerns as valid for reining in the GPU performance so much. Is the concern the cost for users to run the Wii? Or are they trying to minimize the environmental impact of their hardware? It'd be nice to know the particular reasons for their choice of target power consumption, from which they determined the appropriate target performance.
 
Yes, but why?! What's the concern about power consumption? Why couldn't Nintendo stretch to 30 Watts, or 50 Watts? That's still a fairly low amount and would allow for some powerful hardware. I can't see power concerns as valid for reining in the GPU performance so much. Is the concern the cost for users to run the Wii? Or are they trying to minimize the environmental impact of their hardware? It'd be nice to know the particular reasons for their choice of target power consumption, from which they determined the appropriate target performance.

Those are questions we may never know the answer to this side of glory. Probably so that you can stick it in an enclosed space without worrying, and so that they could use the cheapest PSU and cooling solution they could find.
 
I heard something about that too; I think the GeForce 3 cards may have had the same limitation for TV out. However, the Xbox definitely did 1080i, now whether that was 1080i upscaled or native I don't know. (I'm guessing upscaled though, since even 720p Xbox still looked rather blurry, but what would have done the scaling? Programmable graphics hardware?)

I can attest to real 1080i. On my modded Xbox, the UI for Avalaunch and XBMC is totally smooth at 720p, but at 1080i it is noticeably slower. Image quality is better at 1080i too (especially still images/UI).
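As a rough illustration of why a UI would feel heavier at 1080i than at 720p, here is a small C sketch of the raw pixel counts; it assumes the UI really is composited at the full output resolution, which the post doesn't confirm:

#include <stdio.h>

int main(void)
{
    double p720  = 1280.0 * 720.0;    /* full progressive 720p frame */
    double i1080 = 1920.0 * 1080.0;   /* full 1080i frame            */
    double field = 1920.0 * 540.0;    /* one 1080i field             */

    printf("720p frame:  %8.0f pixels\n", p720);
    printf("1080i frame: %8.0f pixels (%.2fx of 720p)\n", i1080, i1080 / p720);
    printf("1080i field: %8.0f pixels per 60Hz field\n", field);
    return 0;
}

A full 1080i frame is 2.25x the pixels of a 720p frame, which would account for the slower but sharper UI described above.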
 
From this translated article,
http://72.30.186.56/babelfish/translate_url_content?lp=ja_en&trurl=http%3a%2f%2fpc.watch.impress.co.jp%2fdocs%2f2006%2f0920%2fkaigai301.htm
It seems to imply that the
GPU has no programmable shaders in hardware because...
Nintendo wanted to keep the power consumption down-

Circumstance concerning electric power consumption and development readiness GPU is the same. As for the GPU core of Wii, in strict sense as for the Programable Shader hardware you say that it does not have. Hard it is it can program Shader in the parameter base has, but program characteristic about of hardware Shader of PC graphics is said that it does not have. When of electrical efficiency and development burden are thought, you can agree upon also this design philosophy.

If the Programable Shader hardware whose program characteristic is high in GPU is placed, that much rich graphic expression becomes possible. But, electricity consumption rises as a trade-off. Because generally programmable as for hard, fixed hard compared to, electricity consumption per processing performance becomes large.

For starters, this is not English. This automatic translation is so bad that the sentences just don't make sense.

But I totally disagree that this is in any way conclusive. What I think is said is this:

As for the GPU core of the Wii, in the strict sense it has no programmable shader hardware.
It can program shaders, but it does not have the programming characteristics that exist in PC graphics.

Then he goes on to explain why Nintendo's choice for the shaders is better when it comes to power consumption.

And all of this is correct. In the strict sense, what the Wii GPU has is nothing similar to the programmable shader hardware we know and are used to on our PC cards.
There is no other way to interpret this, since we know that even the GameCube could do shaders, but using the GPU's TEV, not a normal TMU.

The Wii has shaders and, trusting some leaked information, it has 2 TEV units and they are now programmable. But even if this information is wrong, at least one TEV unit is present, and shaders, even if limited, are possible.
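To make the "configured by parameters rather than programmed" idea concrete, here is a minimal, hypothetical C model of a TEV-style combiner stage. The struct, field and function names are invented for illustration and are not the real GameCube/Wii GX API; the point is only that each stage is fixed hardware whose inputs, operation and scale are selected by register-like settings instead of an arbitrary shader program.

/* Hypothetical model of a TEV-style fixed-function combiner stage.
 * Names and the exact formula are illustrative, NOT the real GX API.
 * Each stage computes roughly: out = (d OP lerp(a, b, c)) * scale,
 * and the only "programming" is choosing inputs, op and scale. */
#include <stdio.h>

typedef struct { float r, g, b; } Color;

typedef enum { TEV_ADD, TEV_SUB } TevOp;

typedef struct {
    int   a, b, c, d;   /* indices into a small input register file */
    TevOp op;
    float scale;        /* fixed choices only, e.g. 0.5, 1, 2, 4    */
} TevStage;

static float lerp(float a, float b, float c) { return a * (1.0f - c) + b * c; }

static Color run_stage(const TevStage *s, const Color in[])
{
    Color a = in[s->a], b = in[s->b], c = in[s->c], d = in[s->d];
    float sign = (s->op == TEV_ADD) ? 1.0f : -1.0f;
    Color o;
    o.r = (d.r + sign * lerp(a.r, b.r, c.r)) * s->scale;
    o.g = (d.g + sign * lerp(a.g, b.g, c.g)) * s->scale;
    o.b = (d.b + sign * lerp(a.b, b.b, c.b)) * s->scale;
    return o;
}

int main(void)
{
    /* "Register file": texture colour, vertex colour, white, zero. */
    Color regs[4] = { {0.8f,0.2f,0.2f}, {0.5f,0.5f,0.5f}, {1,1,1}, {0,0,0} };

    /* The stage config *is* the whole "shader": modulate texture by
     * vertex colour, i.e. out = tex * vtxcolor.                     */
    TevStage modulate = { .a = 3, .b = 0, .c = 1, .d = 3,
                          .op = TEV_ADD, .scale = 1.0f };

    Color out = run_stage(&modulate, regs);
    printf("stage output: %.2f %.2f %.2f\n", out.r, out.g, out.b);
    return 0;
}

Chaining a number of such configurable stages is, broadly, the kind of flexibility a TEV-style pipeline gives, which is presumably what the translated article means by programming the shader "in the parameter base".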
 
Yes, but why?! What's the concern about power consumption? Why couldn't Nintendo stretch to 30 Watts, or 50 Watts? That's still a fairly low amount and would allow for some powerful hardware. I can't see power concerns as valid for reining in the GPU performance so much. Is the concern the cost for users to run the Wii? Or are they trying to minimize the environmental impact of their hardware? It'd be nice to know the particular reasons for their choice of target power consumption, from which they determined the appropriate target performance.

I don't think they were concerned about power efficiency. They wanted cheap. It's Nintendo. They did it cuz they can't compete with the super loss consoles (N is not diversified enough) and so have to adopt a different business model. 90nm hardware that's tiny just happens to suck little power. Easy way to claim power efficiency on low cost hardware.

MS and Sony wanted seriously powerful machines that leveraged cutting edge tech. Even though the machines undoubtedly have power saving techniques in use, they probably even took advantage of that to scrape out more performance. It was all about performance. They even obviously take huge losses for that speed. It's not that they don't care about power usage. It is still a problem because more heat = more needed cooling and the need for a smarter design overall. It also raises concerns about long term reliability. It's just that power usage and cutting edge don't work out well together. And cost was almost not a priority. Not initially anyway. Die size reduction and hardware simplification will save them $$ in the future.
 
Keep in mind also that MS and Sony have been getting a lot of flak for their overheating problems. The #2 selling 360 accessory at Circuit City is the NYKO cooling fan attachment.

Nintendo almost literally did a little backflip and perfectly evaded any kind of overheating issues that the other systems may have had.

Also, note that the Wii in "standby" is still running, just without the fan, for WiiC24. Huge power consumption for a few hours a day isn't so bad, but it can add up when you're running at full power almost 100% of the time. That may have been another big factor in Nintendo's reasoning.
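A quick worked example of how an always-on mode magnifies wattage: the 17 W figure is the Zelda gameplay measurement cited earlier (so an upper bound, since WiiConnect24 standby should draw less), and 50 W is a purely hypothetical alternative.

#include <stdio.h>

int main(void)
{
    const double hours_per_year = 24.0 * 365.0;
    /* Measured Wii draw vs a hypothetical 50 W design */
    const double watts[2] = { 17.0, 50.0 };

    for (int i = 0; i < 2; i++) {
        double kwh = watts[i] * hours_per_year / 1000.0;
        printf("%4.0f W around the clock = %6.0f kWh per year\n",
               watts[i], kwh);
    }
    return 0;
}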
 
Yes, but why?! What's the concern about power consumption? Why couldn't Nintendo stretch to 30 Watts, or 50 Watts? That's still a fairly low amount and would allow for some powerful hardware. I can't see power concerns as valid for reining in the GPU performance so much. Is the concern the cost for users to run the Wii? Or are they trying to minimize the environmental impact of their hardware? It'd be nice to know the particular reasons for their choice of target power consumption, from which they determined the appropriate target performance.

That is a very good question, and the most interesting thing is that they don't do any marketing about it (if it is good for the environment and for people's wallets, then you have two good selling points).

Maybe they just had better options (in their mind) from a price/power/performance/BC/noise standpoint, something overall best for whatever they planned to have in terms of HW.

But it is interesting to see that they also point to it being easier to dev without (new) shaders.

Anyway, IMO this makes me even more interested in what is inside the Wii than before.
 
Nintendo almost literally did a little backflip and perfectly evaded any kind of overheating issues that the other systems may have had.

Also, note that the Wii in "standby" is still running, just without the fan, for WiiC24. Huge power consumption for a few hours a day isn't so bad, but it can add up when you're running at full power almost 100% of the time. That may have been another big factor in Nintendo's reasoning.

Say that to the reports of Wii running WiiConnect24 ending up broken (such as mine), with "sparks" of colored pixels appearing on the screen. :)
There are a couple others in the "Wii issues" thread on GAF. Nothing widespread, though (yet).
 
Say that to the reports of Wii running WiiConnect24 ending up broken (such as mine), with "sparks" of colored pixels appearing on the screen. :)
There are a couple others in the "Wii issues" thread on GAF. Nothing widespread, though (yet).

Was thinking more along the lines of "overheating issues while running", but this seems far more insidious. o_O My Wii has no issues with WiiC24 so far and it's been running that way since I got it two weeks ago, though it does get pretty warm. Hopefully Nintendo can, like, hotfix it to run the fan in a slower setting while idling in WiiC24 mode or something.
 
I don't think they were concerned about power efficiency.

Well, you're just not correct there. Iwata did a series of like 7 interviews with the lead engineers for Wii, and they said that miniaturization and reduction of power consumption were top concerns from very early on.

They did it cuz they can't compete with the super loss consoles (N is not diversified enough)

Now that's just patently absurd. Nintendo's engineers simply have never shown an inability to put together a competitive piece of hardware. If anything, their 2 decades' engineering experience has made them a lot better at putting together a slick little gaming box without losing their shirts, unlike a certain gigantic corporation that designs OS's, but not the PCs they run on, or a certain other corporation that puts everything plus the kitchen sink in their consoles. You honestly think that this time around, Nintendo, ATI, and IBM were completely unable to come up with something that had modern technology in it?

According to Iwata, the problem wasn't "We're too stupid to design this kind of machine" or "We're too poor," it was "Such a machine won't take us to commercial success." Seeing as their last machine's primary differentiation from the PS2 was its superior image quality and graphics, yet it was unable to gain much popularity in Japan (where Xbox was irrelevant), it would in fact seem that even were Nintendo to release the most powerful system this time, they still would come in 3rd.
 
Hopefully Nintendo can, like, hotfix it to run the fan in a slower setting while idling in WiiC24 mode or something.

That's what I was wondering: can they just get the fan to turn on during WiiC24? Though I'm not sure they can regulate the voltage like that.

I'm guessing that during normal gameplay, the fan comes on and remains on? So it's not thermally activated? If so, it might be a bit tricky to regulate the fan that granularly. Once on, it may not turn off.
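Purely to illustrate what "thermally activated" would mean, as opposed to "on whenever a game is running", here is a hypothetical hysteresis loop in C; the thresholds and the control scheme are invented and are not known Wii behaviour.

/* Hypothetical thermostat-style fan control with hysteresis.
 * Thresholds and behaviour are invented for illustration only. */
#include <stdio.h>

typedef struct { double on_c, off_c; int fan_on; } FanCtrl;

static void fan_update(FanCtrl *f, double temp_c)
{
    if (!f->fan_on && temp_c >= f->on_c)
        f->fan_on = 1;                 /* too warm: spin the fan up */
    else if (f->fan_on && temp_c <= f->off_c)
        f->fan_on = 0;                 /* cooled back down: stop    */
}

int main(void)
{
    FanCtrl fc = { .on_c = 55.0, .off_c = 45.0, .fan_on = 0 };
    /* Made-up temperature samples for an idling, always-on box */
    double samples[7] = { 40, 48, 56, 58, 50, 44, 42 };

    for (int i = 0; i < 7; i++) {
        fan_update(&fc, samples[i]);
        printf("%.0f C -> fan %s\n", samples[i], fc.fan_on ? "ON" : "off");
    }
    return 0;
}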
 
Well, you're just not correct there. Iwata did a series of like 7 interviews with the lead engineers for Wii, and they said that miniaturization and reduction of power consumption were top concerns from very early on.
I don't really think interviews say much. They can say what they want, right? I don't buy it; saying it any other way would be plain stupid. And of course, with a smaller form factor, power usage might be a concern. But it didn't need to be 20W. It could have been 30-40W like the Cube. There's just not much hardware in the Wii, and 90nm makes what is there very low power.

It's a great drum for Nintendo to beat: that "power consciousness" conveniently brought on by low-cost hardware. More power to them. It only shows they have brains.

If anything, their 2 decades' engineering experience has made them a lot better at putting together a slick little gaming box without losing their shirts, unlike a certain gigantic corporation that designs OS's, but not the PC's they run on, or a certain other corporation that puts everything plus the kitchen sink in their consoles.
That is precisely what I mean. They can build whatever they want. Of course they can. They all just outsource anyway and hire IBM/ATI/NV to build the machine for them. But, Nintendo has to make a profit on the machine because Nintendo does not have a zillion other money makers to cover a huge loss on each console. MS and Sony can do that and so they can build much more powerful machines and go on the loss model and still not be hurting whatsoever.

I do not believe Nintendo could build the same level of machine as Sony and MS and succeed on the business level. Economies of scale are not on Nintendo's side.

360 is not loaded with extra stuff. It is a pure game machine. PS3's Blu-ray drive is still just a different storage medium, even if it is currently very expensive.

I also believe MS and Sony have totally different future plans for their consoles than Nintendo. I think Nintendo is following the online lead right now and is unsure of what to do with it. Sony and MS aren't thinking of game machines in the far future. They want to own you with their platform and have it useful for damn near everything. They will do whatever it takes to establish themselves. It's unlike anything that happened in the '90s (well, other than PS1's arrival and Sony making it clear what they wanted to do with it). 3DO was an example of the business plan I'm describing here too, but 3DO didn't have the clout/money/power.
 
I don't really think interviews say much.
Did you read them? They were pretty extensive and gave a lot of details of the development process and philosophy of Wii. http://wii.nintendo.com/iwataasks.jsp
They can say what they want, right?
Aye, there's the rub. I've seen this process many times before:
  1. Random guy on a forum speculates.
  2. Clear developer statements shown to contradict him.
  3. Random guy accuses the developer of lying.
There's no point in even having a discussion in this situation, because the only possible place we could even get data is the team that designed the hardware, and if you're going to accuse them of lying, that means every piece of "information" you put forward is just something you pulled out of a hat.
Kou Shiota said:
While you could use such cutting-edge semiconductor technology in order to facilitate this kind of extravagance, you can choose to apply this technology in other ways, such as making chips smaller. We have utilised the technology in this way so that we could minimise the power consumption of Wii.
But he's a lying liar making stuff up after the fact to save face. We have the genuine speculation of some guy from the Internet as proof!
But, Nintendo has to make a profit on the machine
They did it with Gamecube.
because Nintendo does not have a zillion other money makers to cover a huge loss on each console.
Right now, neither does Sony.
I do not believe Nintendo could build the same level of machine as Sony and MS and succeed on the business level. Economies of scale are not on Nintendo's side.
It has nothing to do with economies of scale. That didn't hurt them the last three console generations, so why should it now? Like I said, if anyone can build a console about as powerful as the X360 and still come out in the black, it's Nintendo. Remember the Gamecube? Spec-for-spec, it was around 1/3 as powerful as the Xbox, sold for $100 less, was a small loss-leader (thus turned profitable after only 1 or 2 games sold), and yet its graphics often were compared favorably with Xbox titles. Nintendo has very, very good engineers. They are both a software publishing giant and a designer of gaming hardware. Yes, other companies play a big role in the design process, but they hardly sit idly by while they just buy the best parts they can and slap them in a giant, black box with a huge green dot on it.

However, given three basically similar consoles, people are simply not going to choose Nintendo anymore. It's not about technical competence, as Nintendo's engineers appear to be a good clip sharper than Microsoft's and competitive with Sony's. It's not about finances, as half of Nintendo's engineering is about saving money, and they're a very, very rich company and are perpetually jockeying with EA to be the world's #1 software publisher. The problem is that Nintendo doesn't have the brand power to sell a console that's just like the other 2. Given 3 identical products, people default to the current leader. And right now, that's not Nintendo. It's not that they couldn't build a machine to compete graphically with X360 and PS3. It's that if they built such a machine, Iwata knew that they couldn't successfully market it or win developers to it. They needed to differentiate themselves, reimagine the brand, and go after a different market.

But yeah, you do have a point. If you ignore basically every piece of actual evidence in favor of the ranting of various forum hounds, Nintendo built the Wii the way they did because they were just incapable of making a powerful console.
 
Why did Nintendo go so cheap with the hardware? Because this is the company that removed the digital out port on the gamecube, despite the DAC circuitry being contained within the component cable base, just to save a penny or two on each system sold.
 
Why did Nintendo go so cheap with the hardware? Because this is the company that removed the digital out port on the gamecube, despite the DAC circuitry being contained within the component cable base, just to save a penny or two on each system sold.

They probably weren't selling enough component cables to make it worth keeping.
 