Does the nVidia architecture still give you the 16-bit edge?

What about using an S3 Savage4 GT? Wouldn't that be good for 16-bit rendering in games like Q3?
Why would you run Quake 3 in 16-bit mode when it supports 32-bit? Any modern card would be 10000000000000 times faster in 32-bit than a Savage 4 in 16-bit. (Slight exaggeration.) Savage 4 was clocked at 125 MHz I believe, and did 1 pixel per clock. Even today's mobile chips are about 16 times as fast.
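Quick back-of-envelope on those numbers (a sketch using the figures quoted above, not datasheet-verified):

Code:
#include <stdio.h>

/* Sketch: peak fillrate from the figures quoted above
   (125 MHz, 1 pixel/clock) vs. what 1024x768@60 needs. */
int main(void)
{
    double fill_mpix = 125.0 * 1.0;                 /* ~125 Mpixel/s peak */
    double needed    = 1024.0 * 768.0 * 60.0 / 1e6; /* ~47 Mpixel/s */

    printf("Savage4 peak fill: %.0f Mpixel/s\n", fill_mpix);
    printf("1024x768@60 needs: %.1f Mpixel/s per layer of overdraw\n", needed);
    return 0;
}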
 

It's OC'able... and from what I have read (see below) it's a little more advanced than most think for certain games... most notably Q3.

"They added single-pass multi-texturing, meaning the board could sample 2 textures per pixel in one pass (not one clock cycle) through the rendering engine instead of halving its texture fillrate in dual-textured games like Savage 3D."
also
"Only the high-quality texture capability from its S3TC support gave it good mind share with the gaming community. Unreal Tournament and Quake III Arena, two popular games at the time, shipped with built-in support for S3TC. The compressed textures were a vast improvement over the standard textures used on all other cards. Not only that, but S3TC allowed these much higher quality textures to be rendered with negligible performance impact."

With that I end my case....
 
most desktop LCD panels are 18-bit as well :)

Which is one of those market deceptions, because those 6-bit (18-bit) panels only do ~262k colors natively, but with "hardware dithering" do 16.2M colors... which led to some of them being labelled as 16.7M... and now no one even bothers to distinguish :???:
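The arithmetic behind those marketing numbers, for anyone curious (a sketch; the 253-levels-per-channel figure for 6-bit+FRC is the commonly quoted one, not from any particular panel spec):

Code:
#include <stdio.h>
#include <math.h>

/* Where the panel marketing numbers come from. The 253-level figure
   for 6-bit + FRC dithering is the commonly quoted one. */
int main(void)
{
    printf("6-bit native: %.0f colors\n", pow(64.0, 3.0));          /* 262,144 */
    printf("6-bit + FRC:  %.2fM colors\n", pow(253.0, 3.0) / 1e6);  /* ~16.19M */
    printf("8-bit native: %.2fM colors\n", pow(256.0, 3.0) / 1e6);  /* ~16.78M */
    return 0;
}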
 
All cards today have S3TC support. Everything from NV and ATI since Radeon "R100" and GeForce 256. There is no advantage whatsoever to running a buggy and slow Savage4, unless you've discovered the one game that requires S3 Metal (doesn't exist AFAIK).
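For the record, this is about all it takes to use S3TC textures on any of those cards through OpenGL (a sketch; extension-function loading is platform-specific and omitted):

Code:
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_COMPRESSED_RGB_S3TC_DXT1_EXT */

/* Sketch: detect S3TC and upload one pre-compressed DXT1 mip level.
   Assumes glCompressedTexImage2D was fetched through the platform's
   extension loader (omitted here). */
int has_s3tc(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, "GL_EXT_texture_compression_s3tc") != NULL;
}

void upload_dxt1(int w, int h, const void *blocks, int nbytes)
{
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                           w, h, 0, nbytes, blocks);
}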

You don't want to own any ATI cards newer than X1950 or NV cards newer than GF 7950, though, because they don't have dithering support for 16-bit depth. This causes lots of ugly color banding. But anything from X1950 and GF 7950 on down is perfectly fine with regards to visual quality, and you can force on anti-aliasing and anisotropic filtering and make the game look way better than a Savage4 ever will. You could buy a $30 GF6+ or Radeon 9+ card and it would outperform any card from those times in every way.
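For anyone wondering what the missing dithering actually does, here's a minimal sketch of ordered (Bayer) dithering when quantizing an 8-bit channel down to the 5 bits of a 16-bit target (illustrative only, not any vendor's actual hardware):

Code:
#include <stdint.h>

/* Minimal sketch of ordered (Bayer) dithering: quantize an 8-bit
   channel to 5 bits (one channel of RGB565), the kind of dithering
   the newer cards dropped for 16-bit targets. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Add a position-dependent bias before truncating, trading smooth
   banding for high-frequency noise the eye averages out. */
uint8_t dither_8_to_5(uint8_t v, int x, int y)
{
    int bias   = bayer4[y & 3][x & 3] / 2;  /* 0..7; one 5-bit step is 8 */
    int biased = v + bias;
    if (biased > 255) biased = 255;
    return (uint8_t)(biased >> 3);          /* 0..31 */
}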
 
LOL! You really think a 10 year old chip can stand a chance against any modern chip? And OCable?? C'mon... 125 MHz base engine clock, 143 MHz memory clock, 64-bit memory interface, 1 pixel per clock.
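Those specs put peak memory bandwidth at barely over a gigabyte per second (a sketch, assuming SDR memory):

Code:
#include <stdio.h>

/* Peak memory bandwidth from those specs: 64-bit bus, 143 MHz SDR. */
int main(void)
{
    double bytes_per_clock = 64.0 / 8.0;        /* 64-bit bus = 8 bytes */
    double gbs = 143e6 * bytes_per_clock / 1e9; /* ~1.14 GB/s */
    printf("Savage4 peak bandwidth: %.2f GB/s\n", gbs);
    return 0;
}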
"They added single-pass multi-texturing, meaning the board could sample 2 textures per pixel in one pass (not one clock cycle) through the rendering engine instead of halving its texture fillrate in dual-textured games like Savage 3D."
Savage 4 could do 2 textures per pass... BIG DEAL! It took two clocks anyway. Any DX9 chip can do up to EIGHT textures per pass in fixed function mode (16 in programmable pixel shader mode). Please, stop living in the past.
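For reference, single-pass multitexturing is just the GL_ARB_multitexture path a Q3-era engine uses for its base-texture + lightmap pass; a minimal sketch (extension loading omitted):

Code:
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE0_ARB, GL_TEXTURE1_ARB */

/* Minimal sketch of single-pass dual texturing the way a Q3-style
   engine does it: base texture on unit 0, lightmap modulated on
   unit 1. Assumes glActiveTextureARB was fetched via the platform's
   extension loader (omitted here). */
void bind_base_and_lightmap(GLuint base_tex, GLuint lightmap_tex)
{
    glActiveTextureARB(GL_TEXTURE0_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, base_tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    glActiveTextureARB(GL_TEXTURE1_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, lightmap_tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* multiply */
}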
"Only the high-quality texture capability from its S3TC support gave it good mind share with the gaming community. Unreal Tournament and Quake III Arena, two popular games at the time, shipped with built-in support for S3TC. The compressed textures were a vast improvement over the standard textures used on all other cards. Not only that, but S3TC allowed these much higher quality textures to be rendered with negligible performance impact."

With that I end my case....
All chips support DXTC these days, so your case doesn't hold water. In any event, modern cards can run Q3 without texture compression for even higher image quality. Savage 4 only had up to a 32 MB framebuffer, so it needed compression so that everything would fit in memory, not to mention it only had a 64-bit memory interface, so it also benefited from the bandwidth savings.
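The memory math backs that up; DXT1's standard block formula is ((w+3)/4) * ((h+3)/4) * 8 bytes per mip level:

Code:
#include <stdio.h>

/* Why compression mattered on a 32 MB card: one 1024x1024 texture,
   uncompressed vs. DXT1 (standard 8-bytes-per-4x4-block formula). */
static size_t dxt1_size(size_t w, size_t h)
{
    return ((w + 3) / 4) * ((h + 3) / 4) * 8;     /* 4 bits per texel */
}

int main(void)
{
    size_t w = 1024, h = 1024;
    printf("RGBA8888: %zu KB\n", w * h * 4 / 1024);       /* 4096 KB */
    printf("DXT1:     %zu KB\n", dxt1_size(w, h) / 1024); /* 512 KB, 8:1 */
    return 0;
}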

Take any modern card, run it in 32-bit mode, disable texture compression and it will still be 10x as fast as a Savage 4 and look better as well.
 
I personally don't use any S3 cards, I just have an AGP 4x lying around.
If I want to play Q3, I use my ATI All-In-Wonder Radeon GT (200 MHz core / 200 MHz DDR).
 

Oh I guess I should give the specs...
ATI All-In-Wonder Radeon GT
Chipset- R100
Core Speed- 200 MHz
Memory Clock- 200 MHz DDR
Memory Interface- 128-bit
Memory- 32 MB
Bus- AGP 2x/4x
That's what I use for older games that require T&L but don't require a crapload of processing power....
 
So I tried the old Counter-Strike on a friend's 8500GT, running in Triple 16™ mode :D (16x, 16x, 16-bit) and didn't find a deal-breaking quality issue; maybe it was slightly like the Quake and System Shock screenshots, but on a much smaller scale. But I didn't try 8xS or 16xS (useful for alpha-blended vegetation and fences). I remember reading that the old-style (GeForce 1/2 era) oversampling used there made things worse with 16-bit... though I didn't have a problem with a Ti 4200 or 6800GT.

I'll therefore likely build an MCP78-based system.
 
Here's the practical result of ATI's 16- to 32-bit rendering conversion mentioned by OpenGL guy:


[screenshot: CRIMSONSkies.jpg]
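For the curious, widening 565 to 888 is usually done by bit replication so white stays white; a sketch of the principle (not claimed to be ATI's exact hardware path):

Code:
#include <stdint.h>

/* Sketch of RGB565 -> RGB888 expansion by bit replication, the usual
   way a 16-bit value is widened without darkening. */
void rgb565_to_rgb888(uint16_t c, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r5 = (c >> 11) & 0x1F;
    uint8_t g6 = (c >> 5)  & 0x3F;
    uint8_t b5 =  c        & 0x1F;

    /* Replicate the top bits into the low bits so 0x1F maps to 0xFF. */
    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}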
 
3dfx (RIP) had 22bpp post-processing for that.
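Loosely, that worked by dithering the 16-bit framebuffer and then low-pass filtering at scanout to reconstruct intermediate shades; a sketch of the principle only (the real hardware used its own small kernel, which this doesn't reproduce):

Code:
#include <stdint.h>

/* Loose sketch of the "22-bit" idea: the framebuffer is dithered
   16-bit, and a small low-pass filter on scanout averages neighboring
   pixels, recovering shades the dither pattern encoded. */
uint8_t scanout_filter(const uint8_t *chan, int x, int y, int stride)
{
    /* Average a 2x2 neighborhood of one already-widened 8-bit channel. */
    int sum = chan[y * stride + x]       + chan[y * stride + x + 1]
            + chan[(y + 1) * stride + x] + chan[(y + 1) * stride + x + 1];
    return (uint8_t)((sum + 2) / 4);
}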

OT: what colour depth are today's games for PDA/mobile devices using? Mostly 16bpp?

Well, WinMo6's pesky color limit for one, and for another, when you're dithering and designing for a limited color palette in the first place, the impact isn't so big. :LOL:


Oh, and wouldn't nearly every 16bpp game support Glide? And I do wonder if the wrappers help at any point too.
 
Aye, I too wish they would do something about that. I for one would still rather play Thief 1/2 with horrible graphics and horrible banding than the best of today's games at the highest quality. God I miss Looking Glass* :(

*I still can't believe they were in the process of hiring me as a graphics programmer 2 weeks before the whole company was shut down. My idols were finally hiring me... and then they got fired.
 