Does the nVidia architecture still give you the 16bit edge?

Interesting and revealing. Video card manufacturers are getting a little ahead of themselves. It's like Abit when they tried to dump PS/2 from their "IT7" mobos. (Yeah, yeah. I know. I'm living in the past.)

The immediate solution that comes to mind is to use a supplementary PCI card for older games. Of course, future motherboards will not have PCI slots, and to my knowledge there are no PCIe x8 or x4 video cards; I don't envision any in the future.

So, in retrospect, I'm glad I brought this topic up even in the face of some well-intentioned derisive heckling. Want to play older games? Your new computer is less effective at this task than a Tualatin or Coppermine with a GeForce2 GTS...
You're kidding, I hope. 16-bit is unlikely to gain any performance over 32-bit with any mid-range to high-end card in today's market, but I certainly hope any of those cards is plenty fast with 32-bit rendering for older games that do single- or dual-texturing at most.

Just because your GeForce2 GTS gains a lot of performance with 16-bit color doesn't mean it's even remotely as fast as any of today's cards. Most likely the main bottleneck with old games paired with a new card will be the CPU.

Why don't you buy an X1950 XT (or any ATI card from the 9500 to the X1950, for that matter) and force AA on your favorite old game? Then it will be rendered in 32-bit color and have AA to boot. Since the vast majority of old games do all of their rendering to the flip chain, forced-on AA should work fine.
 
Like I said, my bro has a Radeon X1400 and it is simply not good enough to play Jedi Academy - which is a Quake 3 engine game (though Outcast was also Q3-based and not nearly as visually appealing as Academy) - at his LCD's native resolution of 1280x800.

Is it a mobility thing? Maybe they don't make mobile chips like they used to. My Mobility Radeon 7500 plays it reasonably well on a P4 2.0 laptop.

Maybe you have something set wrong, like AA enabled, on the Mobility X1400 because there's no way a Mobility Radeon 7500 could be as fast.

Here's a quick breakdown:

X1400 (M54 chip)
- 4 pixels per clock
- 445 MHz engine clock
- 250 MHz memory clock
Check here for more info and benchmark numbers. A 3DMark01 score of 15000 is way higher than the 7500 could hope to achieve even if it had DX 8.1 support (the 7500 got 4137 in 3DMark01).

7500 (R100 chip)
- 2 pixels per clock
- 280 MHz engine clock
- 200 MHz memory clock
Check here for more information.

Perhaps your X1400 is running in 2D clock mode or something, because something is fishy. Even if you factor in that the 7500 could do up to 3 textures at once, it lacks a lot of performance features compared to the X1400.
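Just to put rough numbers on that, here's a little back-of-the-envelope C snippet using the clocks and pipeline counts quoted above (theoretical peak fillrate only - it ignores bandwidth, drivers and everything else, so take it as a sketch, not a benchmark):

[code]
#include <stdio.h>

/* Back-of-the-envelope peak fillrate: pixels per clock x engine clock.
   Uses the figures quoted above; purely theoretical, ignores bandwidth. */
int main(void)
{
    double x1400_mpix = 4 * 445.0;  /* X1400 (M54): 4 px/clk @ 445 MHz          */
    double m7500_mpix = 2 * 280.0;  /* Mobility 7500 (R100): 2 px/clk @ 280 MHz */

    printf("X1400 : %.0f Mpixels/s peak\n", x1400_mpix);   /* 1780  */
    printf("M7500 : %.0f Mpixels/s peak\n", m7500_mpix);   /*  560  */
    printf("ratio : %.1fx\n", x1400_mpix / m7500_mpix);    /* ~3.2x */
    return 0;
}
[/code]

Roughly a 3x gap on paper, before you even count the X1400's newer feature set.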
 
Hmmm. So the GeForce was superior all along. The GeForce2 MX (admittedly released a fair bit later than the Radeon 7200) crushed the Radeon 7200 in 16-bit, was almost on par with it in 32-bit, and cost less than half as much. Yeah, yeah, I know, ancient news.

Boy, ATI was overpriced from the start. Too bad 3dfx was bankrupted by some unknown entity.
That's not quite how I remember it. IIRC the GeForce2 MX was about on par with or slightly faster than a 7200 DDR in 16-bit, but quite a bit slower than even a 7200 SDR in 32-bit (and slower still than a 7200 DDR, of course). The 7200 DDR was clearly more expensive back then, but the 7200 SDR (which was only 10% or so slower than the DDR version) cost about the same as a GF2MX. At least, that's why I ended up with a 7200 SDR :). And the 7200 SDR was released even after the GF2MX, IIRC.
 
The GF2MX was very bandwidth-limited, not unlike its big brother the GF2 GTS; NV didn't have its bandwidth-saving tech going yet... And unlike the GF2MX, the Radeon SDR was not a crippled value board at all. It just had SDRAM instead of DDR but used the same R100 GPU. ATI had an efficient (if underpowered) little GPU that didn't mind having less bandwidth; DDR was probably more than it needed. On the other hand, the R100 couldn't really compete with the GF2 GTS.

Radeon 7500 is just a Radeon DDR shrunk to 150nm. It was released alongside R8500 as a value option.

A Mobility X1400 should really mop up any of these old GPUs. In a big way.... Heck, the 2-generation-older Mobility 9600 in my laptop does.
 
IMHO, the only cards that did good 16-bit were PowerVR ones, simply because the 32->16 bit conversion was only done once per pixel.
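To see why converting only once matters, here's a deliberately extreme toy sketch in C (one colour channel, made-up numbers, not a model of PowerVR or any other chip): a stack of dim additive layers, once with the value re-quantised to 5 bits after every blend the way a 16-bit framebuffer forces, and once blended at 8-bit precision and quantised at the very end.

[code]
#include <stdio.h>

/* One colour channel, 20 dim additive layers (think faint particles).
   fb16 is re-quantised to 5 bits after every blend, as a 16-bit
   framebuffer would be; fb32 keeps 8-bit precision and is quantised
   only once, at the end. Made-up numbers, not a model of any chip. */
static unsigned quant5(unsigned v)
{
    unsigned v5 = v >> 3;            /* 8-bit value through a 5-bit store */
    return (v5 << 3) | (v5 >> 2);    /* expand back by bit replication    */
}

int main(void)
{
    unsigned layer = 5;              /* each layer adds 5/255 of intensity */
    unsigned fb16 = 0, fb32 = 0;

    for (int i = 0; i < 20; i++) {
        fb16 = quant5(fb16 + layer); /* converted on every framebuffer write */
        fb32 = fb32 + layer;         /* kept at full precision               */
    }
    printf("blend in 16-bit, convert every write: %u\n", fb16);         /* 0  */
    printf("blend in 32-bit, convert once:        %u\n", quant5(fb32)); /* 99 */
    return 0;
}
[/code]

The per-write path loses the dim layers entirely, while converting once at the end keeps them; dithering or a post-filter can only paper over the first case, not undo it.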
 
Are you that Simon that used to work for Videologic and posted on the PowerVR forums?
I still work for Imagination Technologies (née Videologic) and yes, I did post there and at Dimension3D (may it rest in peace).
 
IMHO, the only cards that did good 16-bit were PowerVR ones, simply because the 32->16 bit conversion was only done once per pixel.

3dfx (RIP) had 22bpp post-processing for that.

OT: what colour depth are today's games for PDA/mobile devices using? Mostly 16bpp?

On topic: 16bpp output would only be relevant for ancient applications that don't support 32bpp. Whenever a game supports 32bpp, it's utterly ridiculous on a recent GPU to use 16bpp, since there truly isn't a speed advantage anymore.
 
3dfx (RIP) had 22bpp post-processing for that.
But once the damage is done, you can't really magically fix it.

OT: what colour depth are today's games for PDA/mobile devices using? Mostly 16bpp?
Good question - I don't actually know the answer to that. I suspect it's "as little as possible".
 
3dfx (RIP) had 22bpp post-processing for that.
Which is quite good at smoothing out the dithering, but it's post-processing, so it won't correct any errors due to framebuffer blending (toy sketch of the idea below).

OT: what colour depth are today's games for PDA/mobile devices using? Mostly 16bpp?
Depends on the device/screen. If you only have an 18bpp screen (and internal truecolor rendering :)) it may not be worth using a 24/32 bit framebuffer.
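For anyone curious, here's a toy C sketch of what that kind of scanout post-filter conceptually does - just a 4-pixel box average over a dithered buffer, certainly not 3dfx's actual kernel:

[code]
#include <stdio.h>

/* One channel of one scanline. Values are stored through a dithered
   5-bit quantiser (as in a 16-bit framebuffer), then a 4-pixel box
   average is applied at "scanout". Not 3dfx's real kernel, just the
   concept: the filter recovers shades the 5-bit store can't hold. */
static unsigned store5(unsigned v, unsigned bias)
{
    unsigned v5 = (v + bias) >> 3;   /* dithered 8-bit -> 5-bit store      */
    if (v5 > 31) v5 = 31;
    return v5 << 3;                  /* expanded back for display (coarse) */
}

int main(void)
{
    static const unsigned dither[4] = { 0, 4, 2, 6 };  /* tiny ordered dither */
    unsigned src[16], fb[16];

    for (int x = 0; x < 16; x++) {
        src[x] = 100 + x;                           /* smooth 8-bit ramp     */
        fb[x]  = store5(src[x], dither[x & 3]);     /* dithered 16-bit store */
    }
    for (int x = 0; x <= 12; x += 4) {              /* filter at scanout     */
        unsigned avg = (fb[x] + fb[x + 1] + fb[x + 2] + fb[x + 3]) / 4;
        printf("true ~%u   stored %u %u %u %u   filtered %u\n",
               src[x] + 1, fb[x], fb[x + 1], fb[x + 2], fb[x + 3], avg);
    }
    return 0;
}
[/code]

The stored values only come in steps of 8, but averaging the dither pattern at scanout recovers the in-between shades of the original ramp. As said above, though, it can't restore precision already lost in framebuffer blending.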
 
3dfx (RIP) had 22bpp post-processing for that.

OT: what colour depth are today's games for PDA/mobile devices using? Mostly 16bpp?

On topic: 16bpp output would only be relevant for ancient applications that don't support 32bpp. Whenever a game supports 32bpp, it's utterly ridiculous on a recent GPU to use 16bpp, since there truly isn't a speed advantage anymore.
My Dell Axim x51v's LCD is 16-bit. I'm pretty sure all games would also use 16-bit colour.
 
My Dell Axim x51v's LCD is 16-bit. I'm pretty sure all games would also use 16-bit colour.
Doubtful. Your LCD may only have 4-bit color per channel, but the video card won't see that. Games should still be able to render 32-bit color, but the LCD will convert the final output to 16-bit.
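For what it's worth, that final step is just the usual RGB888 -> RGB565 pack (whether the Axim's display controller literally does this is my assumption); a minimal C sketch:

[code]
#include <stdio.h>

/* Standard RGB888 -> RGB565 pack: keep the top 5/6/5 bits per channel.
   A once-per-frame conversion like this is all the display path needs;
   the game can keep rendering and blending in 32-bit the whole time. */
static unsigned short pack565(unsigned char r, unsigned char g, unsigned char b)
{
    return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    printf("0x%04X\n", (unsigned)pack565(200, 100, 50));   /* prints 0xCB26 */
    return 0;
}
[/code]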
 
Doubtful. Your LCD may only have 4-bit color per channel, but the video card won't see that. Games should still be able to render 32-bit color, but the LCD will convert the final output to 16-bit.
Isn't it more likely to be 18-bit on a reasonable PDA device?
 