F-Zero GC possibly using bumpmapping at 60fps.

I don't think SFA uses much, if any, bump-mapping; it just has a lot of really, really nice textures.

In all honesty, I have not noticed much bump-mapping on the GameCube across the board. It pops up here and there: Hoth in Rogue Leader, a single brick wall in Luigi's Mansion, and a few trophies in Smash Bros. Melee, but it's not used to any great extent.
 
Why is bump mapping such a big deal? I thought it was just one of those computer things from 1998, like 32-bit color, that was a big deal then but is expected now. Then again, I don't think the GameCube does 32-bit color either (not enough memory, I think).

Anyhow, I was more impressed by the blur effects in SFA than anything else.
 
Why do things do this, downsampling before it goes to the screen? If it's processed in 32-bit, why not output it in 32-bit? The Voodoo 3 did that too; I was kinda irritated that it did.
 
Peppermonkey said:
Why do things do this, downsampling before it goes to the screen? If it's processed in 32-bit, why not output it in 32-bit? The Voodoo 3 did that too; I was kinda irritated that it did.

Voodoo3 doesn't do things in 32-bit internally.

Oh, and it downsamples because (a) there isn't enough bandwidth in a coax or S-Video cable, IIRC, and (b) TVs don't display a full 24 bits of colour.
 
Let's see
AVault: Is it true that the Voodoo3 chipset can do full 32-bit rendering? If so, why is the 16-bit color depth limit imposed?

Bruning: Voodoo 3 processes internal rendering at 32-bit. In order to save memory in the frame buffer and not exceed 16 MB, we have a special algorithm to dither the final image to 16-bit, which is much better than standard 16-bit.

http://www.avault.com/ea/interview.asp?name=3dfx

Rendering is consistent with Voodoo2 standards: 32-bit internal calculations dithered to 16 bit externally.
http://www.billsworkshop.com/comdex/voodoo3.html

We actually do the rendering calculations internally at 32 bits to have full precision with the 16-bit operands. Then, instead of simply truncating the results to 16 bits for saving in the frame buffer we use a proprietary filtering algorithm that retains nearly the full precision of the color value.
http://www.guru3d.com/voodoo3.htm

The Voodoo 3's pipeline did things in 32-bit color and dithered them down to 16-bit.
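For anyone wondering what "dithering 32-bit down to 16-bit" actually means in practice, here's a minimal sketch. It assumes a generic 4x4 ordered (Bayer) dither, not 3dfx's proprietary filter, and is purely illustrative:

[code]
# Rough sketch of quantising 8-bit-per-channel colour down to 16-bit RGB565
# with a 4x4 ordered (Bayer) dither. NOT 3dfx's proprietary filter, just a
# generic illustration of trading visible banding for fine spatial noise.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_to_rgb565(r, g, b, x, y):
    """Quantise one RGB888 pixel to RGB565, dithered by screen position."""
    t = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0   # threshold in (0, 1)

    def quantise(value, bits):
        levels = (1 << bits) - 1        # 31 for 5-bit channels, 63 for 6-bit
        step = 255.0 / levels           # width of one output step in 8-bit terms
        # Nudge the value by up to half a step either way before rounding,
        # so neighbouring pixels fall on different sides of the boundary.
        q = int(round((value + (t - 0.5) * step) * levels / 255.0))
        return max(0, min(levels, q))

    return quantise(r, 5), quantise(g, 6), quantise(b, 5)

# A smooth grey ramp that would band visibly if simply truncated:
for x in range(8):
    print(dither_to_rgb565(x * 10, x * 10, x * 10, x, 0))
[/code]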
 
So does 32-bit color hurt performance, or only if there isn't enough memory? If Voodoo cards could go twice as fast by just doing 16-bit internally, that would make more sense to me. Anyhow, I still see color banding in GameCube games, such as the clouds in the Bespin level in Rogue Leader or the dark areas in Metroid Prime. BTW, component cables should have the bandwidth for 32-bit color, shouldn't they? Also, if you're just doing one color per pixel, a TV at 640x480 (and most TVs aren't even at that) wouldn't have enough pixels to use every color, even in 16-bit.
 
Fox5 said:
So does 32-bit color hurt performance, or only if there isn't enough memory? If Voodoo cards could go twice as fast by just doing 16-bit internally, that would make more sense to me. Anyhow, I still see color banding in GameCube games, such as the clouds in the Bespin level in Rogue Leader or the dark areas in Metroid Prime. BTW, component cables should have the bandwidth for 32-bit color, shouldn't they? Also, if you're just doing one color per pixel, a TV at 640x480 (and most TVs aren't even at that) wouldn't have enough pixels to use every color, even in 16-bit.

Wouldn't have enough pixels? So what; it still takes bandwidth to push 16- and 32-bit pixels through. Just multiply 640 * 480 * 16, 24, or 32 to get the bandwidth requirement (rough numbers in the sketch below).

Component cables DO show slightly less banding, but most of the component bandwidth is used to display twice as much data per second to maintain progressive output.

Sidenote: My GCN with Component-out looks just short of identical in 480i and 480p, hooray for powerful flicker filters and kickass de-interlacing!
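To put rough numbers on the "multiply 640 * 480 * bpp" point above, here's a quick sketch. The figures are back-of-the-envelope only: they ignore blanking intervals and the fact that analogue component video carries YPbPr rather than raw RGB bits, but they do show why 480p needs roughly twice the data rate of 480i:

[code]
# Back-of-the-envelope pixel data rates for 640x480 at different colour depths.
# Ignores blanking intervals and the analogue YPbPr encoding of component video;
# this only illustrates the "multiply it out" point.

WIDTH, HEIGHT = 640, 480

def megabits_per_second(bits_per_pixel, fields_per_second, lines_fraction):
    """lines_fraction: 0.5 for interlaced fields, 1.0 for progressive frames."""
    bits = WIDTH * HEIGHT * lines_fraction * bits_per_pixel * fields_per_second
    return bits / 1_000_000

for bpp in (16, 24, 32):
    i480 = megabits_per_second(bpp, 60, 0.5)   # 480i: 60 fields/s, half the lines each
    p480 = megabits_per_second(bpp, 60, 1.0)   # 480p: 60 full frames/s
    print(f"{bpp}-bit: 480i ~{i480:.0f} Mbit/s, 480p ~{p480:.0f} Mbit/s")
[/code]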
 
I have component cables (but no progressive scan or any really good TV features), and I still notice significant banding. BTW, if it's interlaced, shouldn't the extra bandwidth be used for the colors, or is the bandwidth just wasted?
 
Fox5 said:
I have component cables (but no progressive scan or any really good TV features), and I still notice significant banding. BTW, if it's interlaced, shouldn't the extra bandwidth be used for the colors, or is the bandwidth just wasted?

Wasted.

The system isn't designed to use the extra bandwidth for colour data.

Also, again, the TV itself isn't capable of displaying the full range anyway...
 
I purchased a Matrox G400 entirely because it featured bump mapping and 32-bit colour. It was as fast as the Voodoo 3, but the games looked beautiful on it, if they were DirectX compatible.
Yup, I was a sucker for that bump mapping too; loved this demo:

[youtube]
 
Yup, I was a sucker for that bump mapping too; loved this demo:

[youtube]
Yup, me too. The G400 graphics card (I had the basic version; the G400 MAX was a bit more powerful) was bundled with some kind of RTS game with an isometric camera that featured bump mapping, and it was amazing at the time.
 