The Official S3 Discrete GPU Rumours & Speculation Thread

I always thought the t-buffer as applied to AA was nothing more than rotated grid supersampling. The hardware devoted to it was probably very minimal.
My memory must be really rusty; I thought 3dfx's AA implementation was just multisampling over 2 or 4 split buffers, but yeah... they were not reusing subsamples, so I guess the hw was just supersampling the scene.
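To make the ordered-grid vs rotated-grid distinction concrete, here's a toy CPU-side sketch of the sampling idea only. It has nothing to do with how 3dfx's hardware actually worked; the shade() scene function and the exact rotated offsets are just made-up placeholders.

```cpp
#include <cstdio>

struct Color { float r, g, b; };

// Stand-in "scene": white above a diagonal edge, black below. Purely
// illustrative, just something whose edge would alias without AA.
static Color shade(float x, float y) {
    return (y < 0.6f * x + 3.0f) ? Color{1, 1, 1} : Color{0, 0, 0};
}

// 4x ordered-grid offsets: a regular 2x2 grid inside the pixel.
static const float kOrderedGrid[4][2] = {
    {0.25f, 0.25f}, {0.75f, 0.25f}, {0.25f, 0.75f}, {0.75f, 0.75f}
};

// 4x rotated-grid offsets (approximate values): the same grid rotated so that
// no two samples share an x or y coordinate, which is what helps
// near-horizontal and near-vertical edges.
static const float kRotatedGrid[4][2] = {
    {0.375f, 0.125f}, {0.875f, 0.375f}, {0.625f, 0.875f}, {0.125f, 0.625f}
};

// Supersample one pixel: shade at every offset, then average. With SSAA the
// full shading/texturing cost is paid per sample; MSAA reuses one shaded color
// across the coverage samples instead.
static Color supersample(int px, int py, const float offs[4][2]) {
    Color sum{0, 0, 0};
    for (int i = 0; i < 4; ++i) {
        Color c = shade(px + offs[i][0], py + offs[i][1]);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    return {sum.r / 4, sum.g / 4, sum.b / 4};
}

int main() {
    Color og = supersample(5, 6, kOrderedGrid);
    Color rg = supersample(5, 6, kRotatedGrid);
    std::printf("ordered grid: %.2f  rotated grid: %.2f\n", og.r, rg.r);
}
```

The only difference between the two modes is where the four offsets sit; with supersampling, shade() runs for every one of them, which is where the cost comes from.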
 
ballsweat, I don't know why you think so highly of 3dfx. The most important IQ feature is AF, as it affects such a huge percentage of the pixels on the screen. I will never play a game without enabling AF if possible, even if the performance hit is big (which generally means its importance is big). 3dfx never had AF in any of its chips.

With AA, the real innovation was MSAA with colour compression, and 3dfx had neither. 4xAA with a >75% performance hit is rather useless. I saw games take a 70% hit on a GF3 since it had no color compression, and being ordered grid, that was rather useless too.

Lossless texture compression is useless for a graphics chip. You'll never see it happen.

R300 is the first chip to really deliver on IQ at a usable cost. So in summary, I concur with Tim Murray and wonder what the fuck you're babbling about.
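Just to spell out the arithmetic behind figures like that: 4x supersampling fills and shades four times as many samples, so in a purely fill-rate-limited case the frame rate drops to roughly a quarter, i.e. a 75% hit before any bandwidth overhead. A trivial sketch with made-up numbers:

```cpp
#include <cstdio>

int main() {
    // Toy fill-rate-bound model with hypothetical numbers, not measurements:
    // frame time scales with the number of samples rendered per pixel.
    const double base_fps = 120.0;  // assumed frame rate with no AA
    const int    samples  = 4;      // 4x supersampling

    double ssaa_fps = base_fps / samples;           // every sample is fully shaded
    double hit_pct  = (1.0 - ssaa_fps / base_fps) * 100.0;

    std::printf("4x SSAA: %.0f fps -> %.0f fps, a %.0f%% hit in the worst case\n",
                base_fps, ssaa_fps, hit_pct);       // 120 -> 30 fps, 75% hit
}
```

Real games are rarely 100% fill- or bandwidth-bound, which is why measured hits usually come in somewhat under that worst case.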

No, no, no, the T-buffer was magical, invented by Gary Tarolli whilst channeling the Supreme Power of Saidin, and it had jacksquat to do with the accumulation buffer that had existed prior to that for...umm..."a while". And it was magical supersampling, because even today, when we've had rotated grid since forever, it's still far better due to the fact that it was used by 3dfx. And don't mention compression...it's like saying the Dark One's name out loud (yes, I like Wheel of Time, in case nobody noticed :D).
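For reference, the accumulation-buffer trick being alluded to looks roughly like this in classic fixed-function OpenGL. This is a from-memory sketch, not a complete program: it assumes a GLUT window created with an accumulation buffer and a draw_scene() defined elsewhere, and the jitter offsets are placeholder values rather than a canonical pattern.

```cpp
#include <GL/glut.h>   // also pulls in gl.h and glu.h on most setups

extern void draw_scene();   // hypothetical: renders one frame of geometry

static const int   kPasses       = 4;
static const float kJitter[4][2] = {            // sub-pixel offsets, in pixels
    {0.375f, 0.125f}, {0.875f, 0.375f}, {0.625f, 0.875f}, {0.125f, 0.625f}
};

void display_accumulated(int width, int height) {
    glClear(GL_ACCUM_BUFFER_BIT);

    for (int i = 0; i < kPasses; ++i) {
        // Shift the whole image by a fraction of a pixel. A translation applied
        // after the projection moves everything uniformly in NDC, and NDC spans
        // 2 units across the viewport, hence the 2/width and 2/height scaling.
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glTranslatef(2.0f * kJitter[i][0] / width,
                     2.0f * kJitter[i][1] / height, 0.0f);
        gluPerspective(60.0, (double)width / height, 0.1, 100.0);

        glMatrixMode(GL_MODELVIEW);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_scene();

        // Add this pass into the accumulation buffer with weight 1/N...
        glAccum(GL_ACCUM, 1.0f / kPasses);
    }
    // ...and write the averaged result back to the color buffer.
    glAccum(GL_RETURN, 1.0f);
    glutSwapBuffers();
}
```

Roughly speaking, the T-buffer's pitch was doing this kind of multi-buffer jitter-and-combine in hardware at speed, instead of burning N full passes through a slow accumulation buffer.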
 

It's not that I think so highly of 3dfx itself, it's just that I liked their AA, which may have a large performance hit, but not in all cases. The benchmarks of the Voodoo 5 6000 showed little performance hit, and the 5500 wasn't meant to be their flagship product.

Plus, keep in mind that another reason the 5500 wasn't as fast was because of its lack of T&L. It would have come much closer to competing products' benchmarks had it had T&L.

Plus, even though there may be a 75% performance hit, in games that use the Source engine (all of which are relatively new), for example, dual 8800 Ultras could in many cases do T-buffer-style AA at a fluid, smooth frame rate. The benchmark numbers might look smaller, but if your frame rate is 100% fluid and the image quality is better, then why worry about the theoretical performance hit being 75%?

Trust me, I'm not biased against NVIDIA, and if they make their products better, then I'll like them better than I liked 3dfx.

And I definitely don't think highly of the Voodoo 5 forcing bilinear in some cases, so I don't think 3dfx was perfect. To tell you the honest truth, in late 2000 I recommended a friend get a GeForce 2 GTS over a Voodoo 5, but he disagreed and said that the T-buffer AA would make the Voodoo 5 better. Flashing forward to today, I think differently, because I want AA that works with all games and that anti-aliases everything (specular shimmering, and handles shadows and geometry better).

I'm generally not a biased person. I used to hate the Wii as much as anyone, and Nintendo because of it, but I've seen that the upcoming games have much better graphics than the previous ones, so I think slightly less of Sony in a way, because they don't offer Super Mario Galaxy.

I've boycotted EA products for 8 years, but I saw a trailer for MoH Airborne and now might buy it because it looks so good.

If a company's products blow me away, then I definitely commend them for it.

Add-in: I didn't actually ever expect lossless TC; I more so had in mind no TC. That's more than possible. I must have been misunderstood. Sorry. =)
 
G80 has basically solved all remaining NV image quality issues. You can get almost free AF, way better trilinear than G7x, and AA performance at 4X is quite impressive to say the least. G7x and earlier NV stuff certainly had IQ issues, tho. ATI has been pretty awesome all the way back to R300. Certainly vastly superior to anything 3dfx did. ATI's gamma-corrected MSAA was some incredible stuff in 2002 and was better than what NV could do until G80, I think.

3dfx never even achieved DirectX 7 level sophistication. So, seeing them as some prodigy of 3D acceleration is definitely missing the boat.
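For anyone wondering what "gamma-corrected" actually buys you: the samples get averaged in linear light instead of directly on the gamma-encoded framebuffer values, which keeps bright edges over dark backgrounds from resolving too dark and looking thin. A toy sketch of the resolve step, my own code rather than anything resembling ATI's hardware path:

```cpp
#include <cmath>
#include <cstdio>

// Standard piecewise sRGB <-> linear conversions.
static float srgb_to_linear(float c) {
    return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
}
static float linear_to_srgb(float c) {
    return (c <= 0.0031308f) ? c * 12.92f : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

// Resolve N sRGB-encoded coverage samples into one output value.
static float resolve(const float* samples, int n, bool gamma_correct) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i)
        sum += gamma_correct ? srgb_to_linear(samples[i]) : samples[i];
    float avg = sum / n;
    return gamma_correct ? linear_to_srgb(avg) : avg;
}

int main() {
    // An edge pixel half covered by white geometry over a black background.
    const float samples[4] = {1.0f, 1.0f, 0.0f, 0.0f};
    std::printf("naive resolve:           %.3f\n", resolve(samples, 4, false)); // 0.500
    std::printf("gamma-corrected resolve: %.3f\n", resolve(samples, 4, true));  // ~0.735
}
```

That 0.500 vs ~0.735 gap on a half-covered bright edge is the kind of thing gamma correction is fixing.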
 

ATI has been doing filtering cheats since the 8500.

ATI's IQ has always sucked because of that.

It really kind of depends upon what one considers an issue.
 
8500 certainly "cheats" significantly, but it blew NV's AF performance totally out of the water by doing it. AF was so much faster on 8500 than GF3/4. You got usable AF in exchange for some serious limitations (bilinear and huge angle dependency). But hey, it's still much better than no AF. The original Radeon was even worse, but hell with GF2 you could hardly even have AF at all.

And, again, 3dfx had zip AF.

So, you are making a big stink because you want absolutely mathematically by-the-book correct filtering? What would be the point? It would cost performance and would probably be imperceptibly better, especially compared to what G80 does. And on the ATI side, all the way back to R300 you can see some quality filtering even with their optimizations. I can't really tell the difference between G80 and ATI cards, honestly. Unless you want to sit around and look at screenshots for minute details.

G80 has the best filtering of any card ever made, I believe.
http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=6
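Since "angle dependency" keeps coming up: by-the-book AF estimates how stretched the pixel's footprint is in texture space from the texcoord derivatives and takes that many probes along the long axis; the old angle-dependent hardware effectively lowered that limit for surfaces not aligned with the screen axes. A rough sketch following the EXT_texture_filter_anisotropic math, with invented derivative values:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Inputs are the texture-coordinate derivatives (in texels) with respect to
// screen x and y, i.e. how far the texture moves per one-pixel step.
static void aniso_estimate(float dudx, float dvdx, float dudy, float dvdy,
                           float max_aniso) {
    float px = std::sqrt(dudx * dudx + dvdx * dvdx);   // footprint length along x
    float py = std::sqrt(dudy * dudy + dvdy * dvdy);   // footprint length along y
    float pmax = std::max(px, py);
    float pmin = std::max(std::min(px, py), 1e-6f);

    float n   = std::min(std::ceil(pmax / pmin), max_aniso);  // probes to take
    float lod = std::log2(pmax / n);                          // mip level per probe

    std::printf("ratio %.1f -> %2.0f probes, lod %.2f\n", pmax / pmin, n, lod);
}

int main() {
    // A floor at a grazing angle, aligned with the screen axes: the footprint
    // is ~16x longer in one direction, so 16x AF takes 16 probes from a sharp mip.
    aniso_estimate(16.0f, 0.0f, 0.0f, 1.0f, 16.0f);

    // Roughly the same surface rotated 45 degrees on screen, but with the
    // effective anisotropy clamped to 2x, as the angle-dependent designs did:
    // the sampler falls back to a much blurrier mip level.
    aniso_estimate(11.3f, 11.3f, -0.7f, 0.7f, 2.0f);
}
```

That clamp in the second case is basically what the "flower" shapes in AF tester screenshots are showing.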
 
No, no, no, the T-buffer was magical, invented by Gary Tarolli whilst channeling the Supreme Power of Saidin, and it had jacksquat to do with the accumulation buffer that had existed prior to that for...umm..."a while". And it was magical supersampling, because even today, when we've had rotated grid since forever, it's still far better due to the fact that it was used by 3dfx. And don't mention compression...it's like saying the Dark One's name out loud (yes, I like Wheel of Time, in case nobody noticed :D).

If he was channeling Saidin, was he insane, or was he able to hang on until the taint was cleansed from the Source? :D
 

The G80 does have the best AF.

I just think, though, that the difference in B3D's AF tester between what would be perfect and the G80's output might be enough of a theoretical difference to equal a real-world difference, but I don't know. The AF tester looked significantly different.

I had always thought ATI's AF sucked with the 9700 Pro at quality 16x, and the optimisations bother some people while they don't bother others. They bother me, but I guess they don't bother you that much, and that's fine. But there is definitely a difference, maybe not perceived by everyone, between the 2900 XT and the G80, so I think ATI's image quality sucks, because they only use 5-bit brilinear instead of true 8-bit trilinear, plus they have more angle variance.

Also, I had a 9700 Pro and a GeForce 4 Ti 4400 (VisionTek board) back in 2002, and I thought that with the same games the GeForce 4's AF looked much better than the 9700 Pro's, plus I couldn't tell any difference in frame rate with the games I played. I was also using AA on the GeForce 4 and 6x AA on the 9700 Pro, and the GeForce 4 often seemed to me to have a more fluid frame rate, so I guess I can't discern frame rate very well, while some may not be able to discern image quality as well. Everyone's different.
 
16 bit output not internally rendered @ 32 bit
Actually, 16 bpp output is rendered at 128 bpp internally. That's far better than anything 3dfx ever did. Maybe you miss 3dfx's "22-bit color" mode, but then you could just change your game to render at 32 bpp on any modern card. There aren't many games that only render in 16 bpp, the few examples I can think of being old console ports.

Regarding texture compression, you do know that applications provide compressed textures, right? You do know that 3dfx also supported texture compression, right? They even had their "own" compression format (which was pretty much a clone of s3tc). Yes, s3tc/dxtc is lossy, it's by design. Applications use these formats to save memory where quality can be sacrificed.
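For anyone who hasn't looked at how S3TC/DXT1 actually stores things: each 4x4 texel block is two RGB565 endpoint colors plus sixteen 2-bit indices into a four-entry palette interpolated between them, i.e. 8 bytes per block, or 4 bits per texel instead of 24 or 32. A quick decode sketch of the four-color mode only (ignoring the punch-through alpha mode); the sample block bytes are made up:

```cpp
#include <cstdint>
#include <cstdio>

struct RGB { uint8_t r, g, b; };

// Expand an RGB565 endpoint to 8 bits per channel.
static RGB unpack565(uint16_t c) {
    RGB out;
    out.r = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    out.g = (uint8_t)(((c >> 5)  & 0x3F) * 255 / 63);
    out.b = (uint8_t)(( c        & 0x1F) * 255 / 31);
    return out;
}

// Decode one 8-byte DXT1 block (four-color mode) into 16 texels.
// block[0..3]: two little-endian RGB565 endpoints; block[4..7]: 2-bit indices.
static void decode_dxt1_block(const uint8_t block[8], RGB out[16]) {
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));

    RGB palette[4];
    palette[0] = unpack565(c0);
    palette[1] = unpack565(c1);
    // The other two palette entries sit 1/3 and 2/3 of the way between them.
    palette[2] = RGB{ (uint8_t)((2 * palette[0].r + palette[1].r) / 3),
                      (uint8_t)((2 * palette[0].g + palette[1].g) / 3),
                      (uint8_t)((2 * palette[0].b + palette[1].b) / 3) };
    palette[3] = RGB{ (uint8_t)((palette[0].r + 2 * palette[1].r) / 3),
                      (uint8_t)((palette[0].g + 2 * palette[1].g) / 3),
                      (uint8_t)((palette[0].b + 2 * palette[1].b) / 3) };

    for (int t = 0; t < 16; ++t) {
        int idx = (block[4 + t / 4] >> ((t % 4) * 2)) & 0x3;   // 2 bits per texel
        out[t] = palette[idx];
    }
}

int main() {
    // Made-up block: pure red and pure blue endpoints, indices cycling 0..3.
    const uint8_t block[8] = {0x00, 0xF8, 0x1F, 0x00, 0xE4, 0xE4, 0xE4, 0xE4};
    RGB texels[16];
    decode_dxt1_block(block, texels);
    std::printf("texel 0: %d %d %d\n", texels[0].r, texels[0].g, texels[0].b);
}
```

The loss is obvious from the layout: sixteen texels share four colors on a line between two endpoints, which is exactly the trade-off applications opt into to get textures 6:1 or 8:1 smaller.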
 

9700 PRO was sometimes more than 2 times faster than GF4 when using AA. So yah, you must be missing some frame rate eyeball detection there. ;) NV25 was no match for R300 and I think others on this forum will happily agree with this.
http://www.anandtech.com/showdoc.aspx?i=1656&p=23

As for GF4's image quality, I dunno. I never owned one myself. Went from 8500 to 9700. I never once had a problem with the image quality produced by 9700 though. Its AA quality is superior to anything from NV short of G80. NV cards pre-G80 can't even do 6X MSAA. I've used plenty of NV4x and G7x cards.

I'd like to see an example of where R300-R600's default filtering is ugly. I could pull together more than a few examples of where NV falls apart on NV40-G7x cards. But with ATI's R300 onward, I can't recall being unhappy at any point in time.
 

ATI has the equivalent of 3dfx's 22-bit color mode; it's just that I want NVIDIA to render 16-bit output at 32 bits internally, like ATI does, and it shouldn't be that hard to do.

About the TC, yes, I did know all of that, and I might try RivaTuner and disable the TC.

Also, there were quite a few games from '98 and '99 that don't have the option for 32-bit color, so it would be nice if NVIDIA fixed things and did what ATI does about that.

16-bit color games may be old, but they shouldn't be discriminated against when it comes to making drivers.
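What "render internally at 32-bit, output 16-bit" boils down to: keep the pipeline at 8 bits per channel or more and only quantise down to RGB565 at the end, ideally with a bit of ordered dithering so gradients don't collapse into visible bands. A rough sketch of the idea (my own toy code, not how any particular driver handles it):

```cpp
#include <cstdio>

// 4x4 Bayer threshold matrix (values 0..15) for ordered dithering.
static const int kBayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5}
};

// Quantise one 8-bit channel down to `bits` bits, optionally adding a
// position-dependent dither offset of up to one output LSB first, so that
// smooth gradients dissolve into a fine pattern instead of hard bands.
static int quantise(int value8, int bits, int x, int y, bool dither) {
    int levels = (1 << bits) - 1;
    float v = value8 / 255.0f;
    if (dither)
        v += (kBayer4[y & 3][x & 3] / 16.0f - 0.5f) / levels;
    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return (int)(v * levels + 0.5f);
}

int main() {
    // A smooth 8-bit ramp squeezed into the 5-bit red channel of RGB565:
    // without dithering, runs of neighbouring values snap to the same level.
    for (int x = 100; x < 108; ++x) {
        int plain    = quantise(x, 5, x, 0, false);
        int dithered = quantise(x, 5, x, 0, true);
        std::printf("8-bit %3d -> 5-bit plain %2d, dithered %2d\n",
                    x, plain, dithered);
    }
}
```

As far as I understand it, 3dfx's "22-bit" output and ATI's handling of 16-bit games both come down to being smarter about where in the pipeline that quantisation and filtering happens.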
 
Texture compression was (is) used to allow much higher resolution textures within the same amount of RAM. A good example is Unreal Tournament's S3TC texture pack on CD2. They are incredibly higher resolution textures than those used by cards without S3TC support. People went so far as to like S3's Savage cards simply because of the increased quality. ;) No other cards had S3TC support initially.

So, texture compression may be some sort of mixed blessing, but it's done way more good than bad as far as I can recall.

And yeah, ATI's quality on 16-bit games is great. But, so is NV's prior to G80. My notebook's 7800 produces fine quality on System Shock 2. You've seen what my desktop's 8800 does. lol. :LOL:
 
I'm pretty certain that NVIDIA cards from the 6800 up render internally at FP16 per component (64 bpp) or better; that's still far beyond what 3dfx offered. I don't know what your specific complaint is.
 
He's referring to what G80 renders if you run a game with only 16-bit support.

I made some screenshots of System Shock 2 a while back:

[screenshots]

And Quake 3 on 16-bit color depth:

[screenshots]

As far as I can remember, no card has ever rendered an uglier 16-bit color depth. But hey, it's not exactly a popular game limitation these days. Still sort of a weird anomaly.
 
Actually swaaye, the 7600 GT renders System Shock 2 exactly the same way, and those textures aren't 16-bit, they are 8-bit palettised. I tried changing the floor texture to 24-bit and it still looked the same.

PS: how did you get Shock 2 to run at that res?
 
You can run shock at any 4:3 res by editing one of the .cfg files. cam.cfg, I think.

Weird that you see issues with 7600. My 7800 seems fine. Running the 84.69 drivers (latest mobile drivers). The game is still going to show some dithering and banding even on the best card for it, simply because of the texture quality and color depth... But on G80 it definitely is worse.
 
Still not perfect AF (see the G80 IQ analysis),
CSAA and MSAA instead of T-buffer style AA

T-buffer = another implementation of multisampled rendering, you do know this, right?


T-buffer AA = RGSS

As for ballsweat's constant digs @ ATI IQ: pass the bong man.

No, no, no, the T-buffer was magical, invented by Gary Tarolli whilst channeling the Supreme Power of Saidin, and it had jacksquat to do with the accumulation buffer that had existed prior to that for...umm..."a while". And it was magical supersampling, because even today, when we've had rotated grid since forever, it's still far better due to the fact that it was used by 3dfx. And don't mention compression...it's like saying the Dark One's name out loud (yes, I like Wheel of Time, in case nobody noticed :D).

:LOL: Even though I haven't read WoT in years, my name requires that I respond. Funny stuff.
 
And everyone knows Wheel of Time looked better under glide


PS: System Shock 2:

[screenshot]

Edit: I just studied your picture properly (I thought you were just referring to the circles on the floor tiles), and the 7600 GT is a lot better. You could try changing the depth of the floor texture to 16-bit or 24-bit (Paint Shop Pro won't let me do 16-bit PCX).
 