32-bit colours on Voodoos

nevidimka

Why do the Voodoos' colours, whether in games or on the 2D desktop, look so much richer and more vibrant than the GeForces' colours, even when both are 32-bit cards? To my eye the colours on GeForce cards are much paler in comparison.
 
Most probably because your monitor is properly calibrated for the Voodoo and not for the GeForce.

In 3D there are other things that can lead to apparently better colour vibrance (better texture filtering, for example), but the Voodoo5 has none of those.
 
GeForce cards are famous for using subpar 2D output filters. Board manufacturers had to cut corners to offer cheaper cards; 3dfx built its own Voodoo5 boards and thus had more control over the quality of the filters used.
All the 3D filtering capabilities in the world can't help a poorly manufactured card.
 
Both my GF3 and GF4 have/had better image quality than my old Voodoo5. No question about it. Sharper and better colors.
Every time I think about my V5 I only get pictures of dithering and 16-bit color in my head :( What a lousy card. Besides good AA it didn't really have anything.
 
Also, I believe the Voodoo series of cards had hopped-up gamma curves, which made for a brighter picture, which apparently looks "better", or at least sells more TVs, according to market research. (Not that TVs have much to do with Voodoos, of course, but the idea is the same.)
 
I'll have to chime in here and disagree with Galilee. I think the V5 had much better 2D than the GF3 or R8500. At 1600x1200 it was much sharper and had better color saturation. As for "16-bit color and dithering" I'll also say nay. The V5's 16-bit 3D was quite good with the 22-bit post filter and it did have 32-bit capability. Perhaps you're thinking of the V3?

Finally, that "lousy card" (which I still have in my kids' computer long after I *don't* have my GF3) ushered in FSAA and - three years later - still has the best implementation as far as IQ goes (although the 9700 looks to have matched or beaten it now).

My two cents at least.

Mize
 
It's not that my monitor is calibrated differently, because the GeForce colours I saw were on my friends' PCs, and all of them have the same pale look. My Voodoo on my monitor is always more vibrant, even with the same desktop wallpaper, one on my computer and another on a friend's computer using a GeForce: the blue is bluer, the green is greener, and so on. :D

I hate to think that someday, if I move to some other graphics card (of course not NVIDIA), I'll be missing those vibrant, beautiful colours. Could it be because the V4/V5 have higher internal precision than other cards, even higher than the 32-bit colour output?
 
Uh, no. The Voodoo cards had a weird gamma setting; 90% of this "brighter and more saturated colour" stuff comes from that. You can easily see it by putting a V5 and a GF2 together and using a tweak tool to set the GF2's gamma to be the same as the V5's. (Some of the PowerStrip tools actually had a "Voodoo gamma" setting.)
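To put a number on the gamma point, here is a minimal C sketch of how a driver-style gamma ramp lifts midtones. The 1.3 curve is an illustrative stand-in, not a measured 3dfx value:

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Build an 8-bit gamma ramp: out = 255 * (in/255)^(1/gamma). */
static void build_gamma_ramp(uint8_t ramp[256], double gamma)
{
    for (int i = 0; i < 256; i++)
        ramp[i] = (uint8_t)(pow(i / 255.0, 1.0 / gamma) * 255.0 + 0.5);
}

int main(void)
{
    uint8_t neutral[256], lifted[256];
    build_gamma_ramp(neutral, 1.0);  /* straight-through ramp           */
    build_gamma_ramp(lifted, 1.3);   /* hypothetical "Voodoo-ish" curve */

    /* Mid-grey 128 comes out around 150 on the lifted ramp, which
       reads as a brighter, more "vibrant" picture on screen. */
    printf("input 128 -> neutral %u, lifted %u\n", neutral[128], lifted[128]);
    return 0;
}
```

Set both cards to the same ramp, the way that PowerStrip preset does, and most of the visible difference should disappear.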

Most of the 2D arguments going around nowadays are purely subjective. It's like CD vs. SACD vs. LP, or audiophile arguments. Unless you put an oscilloscope on the output, you're not likely to see the effects of these so-called "cheap filters". Nor will you hear the difference between a tube amp and a digital amp on some newfangled audio card (but people will attest to it).

Yes, I'm sure that if you put different cards side by side, they look different. But before you start talking about filters, you had better equalize the driver settings. If you saw someone comparing a card with -2 LOD bias against a card at 0, or 16x anisotropic against 2x, you'd say it was unfair, but very few of the people who talk about 2D deal with the gamma issue or the colour-profile issue. They happily post screenshots from different cards and assume they are an accurate comparison everywhere.


All of these IQ comparisons should use equal settings and double-blind testing scenarios.
 
nevidimka said:
It's not that my monitor is calibrated differently, because the GeForce colours I saw were on my friends' PCs...

Maybe your friends don't have properly calibrated monitors, or your Voodoo just has a weird gamma setting that you have grown to like. It's certainly possible to adjust the gamma on any GeForce card to make it look almost indistinguishable from a Voodoo's display. The only differences may come in clarity at high resolutions (low-quality filters will result in blurry or wavy images).

Could it be because the V4/V5 have higher internal precision than other cards, even higher than the 32-bit colour output?

First of all, all video cards have (slightly) higher internal precision for normal texture-filtering ops and the like. For standard desktop display, however, the output cannot possibly be better than the input, so a 32-bit source is not going to look any better in 64-bit colour. In other words, it's absolutely impossible for any sort of higher internal precision to affect the output quality when the art on your computer maxes out at 32-bit (8 bits per channel).
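A toy C example of that point, with made-up values: however precise the internal arithmetic, the result is quantized back to 8 bits per channel on the way out, so the extra precision never reaches the screen.

```c
#include <stdint.h>
#include <stdio.h>

/* Average two 8-bit channel values in plain 8-bit integer math. */
static uint8_t blend_8bit(uint8_t a, uint8_t b)
{
    return (uint8_t)(((unsigned)a + b) / 2);
}

/* Same blend at float precision, then quantized back to 8 bits,
   as any display output stage must do. */
static uint8_t blend_float(uint8_t a, uint8_t b)
{
    float r = (a / 255.0f + b / 255.0f) / 2.0f;
    return (uint8_t)(r * 255.0f + 0.5f);
}

int main(void)
{
    /* Both paths print 128: the 8-bit output is the bottleneck,
       not the precision of the internal math. */
    printf("8-bit: %u, float: %u\n", blend_8bit(200, 56), blend_float(200, 56));
    return 0;
}
```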
 
I found the Windows Voodoo 2D quality slightly better than that of the Radeon.
As for 3D, does the Voodoo actually do true 32-bit, or is it 22-bit dithered? I might be thinking of the 22-bit postfilter for 16-bit. But I also seem to remember reading that the VSA-100's 32-bit was not true 32-bit.
As for the FSAA, it was good but expensive. I am glad to have anisotropic filtering, which does not blur textures into submission: MSAA for the edges and anisotropic for the textures. At least for modern games; for older ones the V5 is very good.
 
Hmm, about calibrating monitors: it's not that I didn't try. I tried maximizing the monitor's contrast and reducing the brightness, even changing the alpha values on the GeForce and then reducing the settings on my monitor, and even dropping to 16-bit output, but the colours still don't match.
I thought I read somewhere that the V5's internal precision is 48-bit, dithered down to 32-bit. No?


And on textures blurring due to FSAA: when you turn on FSAA, to avoid blurred textures set the LOD bias to its sharpest maximum of -8. That will do the trick.
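For reference, here is roughly how such a bias would be set in OpenGL through the EXT_texture_lod_bias extension. The -8 comes from the post above; drivers clamp to their own GL_MAX_TEXTURE_LOD_BIAS_EXT, and a bias that strong causes heavy texture aliasing, which the supersampling then has to mask.

```c
#include <GL/gl.h>

#ifndef GL_TEXTURE_FILTER_CONTROL_EXT
#define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
#define GL_TEXTURE_LOD_BIAS_EXT       0x8501
#endif

/* Call with a current GL context; a negative bias selects more
   detailed mipmap levels, sharpening (and aliasing) textures. */
void set_sharp_lod_bias(void)
{
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT,
              GL_TEXTURE_LOD_BIAS_EXT, -8.0f);
}
```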
 
IIRC the VSA-100 chips had 40-bit internal precision. I thought the image quality from my Voodoo5 was excellent as well, but if I'm honest I would say that in 16- or 32-bit colour my Kyro II has better output, FSAA aside.
 
Above,

A couple of good old articles on 22bit post filtering techniques on Voodoos:

http://www.beyond3d.com/articles/3dfx22bit/index1.php

The VSA-100 was capable of true 32bpp.

--------------------------------------

As already pointed out, internal colour rendering precision is irrelevant here, whatever the value may be.

Maybe what people are actually referring to here is dithering?
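Since dithering keeps coming up, here is the write-side half of it as a minimal C sketch: textbook ordered (Bayer) dithering of an 8-bit value down to a 5-bit channel, as when rendering into an RGB565 buffer. Nothing here is 3dfx-specific.

```c
#include <stdint.h>
#include <stdio.h>

/* Classic 4x4 Bayer threshold matrix, values 0..15. */
static const int bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Ordered-dither an 8-bit value down to 5 bits.  The threshold is
   scaled to span one 5-bit quantization step (8 input levels), so
   a flat input becomes a fine pattern instead of a banded value. */
static uint8_t dither_8to5(uint8_t v, int x, int y)
{
    int t = bayer4[y & 3][x & 3] >> 1;   /* threshold 0..7 */
    int q = (v + t) >> 3;
    return (uint8_t)(q > 31 ? 31 : q);
}

int main(void)
{
    /* A flat 8-bit grey of 100 (5-bit ideal: 12.5) comes out as a
       half-and-half mix of 12s and 13s. */
    for (int y = 0; y < 4; y++) {
        for (int x = 0; x < 4; x++)
            printf("%2u ", dither_8to5(100, x, y));
        printf("\n");
    }
    return 0;
}
```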
 
Ailuros said:
The VSA-100 was capable of true 32bpp.

Unfortunately, performance did not often permit that setting.

I think it was a GF1 DDR I had prior to my V5, and it did not feel like an upgrade :( Well well, water under the bridge. I have the noisemaker (damn two noisy fans) in a locker; maybe it can be used for something someday.
 
I noticed my 8500 has much more vibrant colours than my GeForce SDR ever did. Not sure what it was, but everything just seemed washed out on the GF in comparison.
 
The VSA-100 did not dither when rendering in 32-bit.

Also, from experimentation, much of the blurring with FSAA on the Voodoo5 was actually caused by incorrect sample positions. The samples were shifted half a pixel toward the bottom-right of the screen. Manually shifting the samples half a pixel back up to the top-left cleans up a lot of the blurring.
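To illustrate what correcting the sample positions means, here is a hypothetical C sketch. These offsets are not the VSA-100's documented pattern, and on the real hardware the shift was applied through driver or register tweaks, not code like this:

```c
#include <stdio.h>

typedef struct { float x, y; } sample_pos;

int main(void)
{
    /* Four rotated-grid subsample positions relative to the pixel
       centre (illustrative values only). */
    sample_pos s[4] = {
        { 0.125f,  0.375f}, { 0.375f, -0.125f},
        {-0.125f, -0.375f}, {-0.375f,  0.125f},
    };

    /* If the hardware lands every sample half a pixel toward the
       bottom-right, the resolve averages the wrong neighbourhood
       and the whole image blurs.  The fix is to shift each sample
       half a pixel back up and to the left. */
    const float err = 0.5f;
    for (int i = 0; i < 4; i++) {
        s[i].x -= err;
        s[i].y -= err;
        printf("sample %d: (%+.3f, %+.3f)\n", i, s[i].x, s[i].y);
    }
    return 0;
}
```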
 
I've seen the Radeon's screenshots as well; they're the closest to the Voodoo's colours. It seems both retail cards have better-quality colours than the OEM GeForces. I've also seen the GF4's colours, and they have a similar washed-out look, although better than the GF2's.

BTW, Colourless, nice to see you again. What do you mean by an error in how FSAA takes samples? Do you mean the FSAA implementation is wrong? And how do you change it manually?
 