Does the nVidia architecture still give you the 16bit edge?

Discussion in 'Architecture and Products' started by mockingbird, Jan 15, 2008.

  1. mockingbird

    Newcomer

    Joined:
    Jan 15, 2008
    Messages:
    7
    Likes Received:
    0
    I remember back in the day I hated my Radeon, because one of the follies of the Radeon architecture was that if you lowered the color depth to 16bpp, you would still get the same speed as playing in 32bpp.

    ATI touted this as a "feature". With nVidia cards, you were able to get a huge speed boost with demanding games.

    Is this still the case?

    I'm planning a new system in the future, which will probably be a Kuma+AM3, and was wondering what my GPU options would be. Too bad; I really had faith in Trident's "surprise", which turned out to be bullshit. Looks like it's still ATI and nVidia.

    ATI has always had shoddy product support. The original Rage Theatre would actually "deinterlace" the video-in port by just removing one of the fields altogether, and DSCALER wouldn't help because this was hardwired :???:

    OTOH I hate nVidia because they've turned from the underdog into the evil narcissistic emperors of the GPU world. I'll probably still go with an ATI card, but it would be just nice to know.

    Also, in anticipation of Carmack's Rage: is ATI's OpenGL performance still on par with nVidia's?

    Thanks
     
  2. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    Do games still offer the option to use 16bit depth? Today with all the HDR craze most internally use 64bit anyways :p
     
  3. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,533
    Likes Received:
    492
    Location:
    Varna, Bulgaria
    If you are referring to the ALU processing precision for the shaders, G71 was the last part where 16-bit components were faster (or 32-bit slower, depending on your PoV). Since G80 and the mandatory requirements at the API level regarding this issue, it's all a 32-bit realm, and hence there's no need to spend time and resources on this kind of optimization.
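
    For context on what "16-bit components" means here, a quick sketch (illustration only, not from the post above): shader fp16 follows the IEEE 754 half-precision layout, and with a 10-bit mantissa it stops representing consecutive integers above 2048 — one reason full fp32 precision became the API baseline. Python's `struct` module can round-trip through half precision to show this:

    ```python
    import struct

    def roundtrip_fp16(x):
        # pack to IEEE 754 half precision (binary16) and unpack again,
        # as a partial-precision shader calculation effectively would
        return struct.unpack('<e', struct.pack('<e', x))[0]

    # fp16 has a 10-bit mantissa: above 2048 the spacing between
    # representable values is already 2, so 2049.0 cannot survive
    assert roundtrip_fp16(2048.0) == 2048.0
    assert roundtrip_fp16(2049.0) == 2048.0  # rounds to nearest representable
    # fp32 (full precision) holds integers exactly up to 2**24
    ```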
     
  4. mockingbird

    Newcomer

    Joined:
    Jan 15, 2008
    Messages:
    7
    Likes Received:
    0
    I still play some Quake3 based games like Jedi Academy and Jedi Outcast.

    But it looks like I'm way behind ;) ;) ;) Oh well, what can I say, money's been a hard thing to come by lately.
     
  5. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    Uhm, you do realize that with a modern high-end GPU, FPS in Quake3 will be capped at 999, right? I don't think 16-bit support for these games makes any sense whatsoever anymore.
     
  6. mockingbird

    Newcomer

    Joined:
    Jan 15, 2008
    Messages:
    7
    Likes Received:
    0
    I would disagree with this.

    My brother has a Radeon X1450 on his C2D laptop, and it's not powerful enough to drive Jedi Academy at the native LCD resolution of 1280x800.

    I'm referring to OpenGL 1.5 and DirectX 5-8.1 games.

     
  7. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    What kind of GPU are you planning to get? It might be better if you tell us your budget and preferences so someone can make a possible configuration for you.
     
  8. mockingbird

    Newcomer

    Joined:
    Jan 15, 2008
    Messages:
    7
    Likes Received:
    0
    I'm planning to get a mid-range system, no SLI. What does ATI's r7xx hold in store?
     
  9. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1

    As no one else has tackled these points yet, I will.
    1. Still living in 2000, are ya?
    2. Stupid reason for not going with the best available, but whatever floats your boat.
    3. For the most part, ATI has improved from the Q3A days.
     
  10. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    X1450 is pretty damn far from 'modern high-end'. And the bottlenecks are elsewhere nowadays so it feels pretty unlikely that 16bpp would give you a significant performance boost anyway (although I'm sure it would - just not much).

    Let's be realistic though: nowadays, if you want to play recent games, you either buy something on the level of the HD3850/8800GS as a strict minimum, or you don't and only play old stuff. It's a sad situation to be in, but whadya want. The perf/$ at that pricepoint (>$150) is incredibly higher than in the low-end, anyway. Just to give you an order of magnitude to think about: I would estimate a HD3850 to be 6 to 10 times faster than a Mobility Radeon X1450.
     
  11. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,020
    Likes Received:
    115
    No, you've got that backwards. Even then, both GeForce and Radeon cards did the same amount of work internally at 32bpp and 16bpp. So technically, the nVidia cards (TNT through GeForce2) didn't get faster with 16bpp; they got a lot slower with 32bpp. The reason a GeForce2 slowed down so much at 32bpp was that (a) it was faster to begin with (it could output 4 pixels per clock versus the Radeon's 2) and (b) it didn't have any bandwidth-saving features. Thus it was completely and utterly limited by memory bandwidth in 32bpp mode, in contrast to the Radeon (whose HyperZ features were probably actually a bit overkill for the raw performance it had...).
    But anyway, that's certainly not relevant any longer. Both vendors have bandwidth-saving methods, and I bet those aren't optimized for 16bpp...
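
    The bandwidth argument above can be put in back-of-envelope numbers (a sketch only — the clock, pipeline and bandwidth figures are approximate GeForce2 GTS-era values assumed for illustration, and real traffic also includes texture reads):

    ```python
    # Rough fill-rate vs. memory bandwidth for a pre-HyperZ-style card:
    # no framebuffer compression, no early Z rejection.
    CORE_CLOCK_HZ  = 200e6   # ~200 MHz core (assumed)
    PIXELS_PER_CLK = 4       # 4 pixel pipelines
    MEM_BANDWIDTH  = 5.3e9   # ~128-bit DDR-166, bytes/s (assumed)

    def required_bandwidth(color_bytes, z_bytes):
        # per pixel: one colour write plus a Z read and a Z write
        traffic = color_bytes + 2 * z_bytes
        return CORE_CLOCK_HZ * PIXELS_PER_CLK * traffic

    bw_32bpp = required_bandwidth(4, 4)   # 32-bit colour, 32-bit Z
    bw_16bpp = required_bandwidth(2, 2)   # 16-bit colour, 16-bit Z

    print(f"32bpp wants {bw_32bpp/1e9:.1f} GB/s, 16bpp wants "
          f"{bw_16bpp/1e9:.1f} GB/s, card has {MEM_BANDWIDTH/1e9:.1f} GB/s")
    ```

    With these numbers the card can't feed its four pipelines at 32bpp (it wants roughly twice the memory bandwidth it has) but roughly can at 16bpp — which is exactly "slower at 32bpp", not "faster at 16bpp".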
     
  12. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,180
    Likes Received:
    2,791
    Location:
    Winfield, IN USA
    Thanks McZak, I was just gonna post up about that bit too. The "speed boost" with 16bpp bit cracked me up pretty hard as a rewrite of history.

    Also gotta agree ATi had crap support back in the day, but they totally turned that around years ago and have been excellent for a long time now.

    Double-also; my C2D laptop has an X1400 and while I think it's a great card for viddy/web cruising on a 17" laptop it is by no means a gaming card.....not by a long stretch!
     
  13. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,780
    Likes Received:
    851
    Location:
    WI, USA
    Both G80/G9x and R600/RV6x0 look horrible at 16-bit color depth. They don't dither at all and you get loads of banding as a result.

    The crazed fan(atic)s of the Thief games and System Shock 2 are "up in arms" over this issue because their treasured games look terrible on the new cards. Local hero (and Oblivion hero), Timeslip, has written a hack to force D3D to render Dark Engine-powered games at 32-bit depth. It's pretty buggy though. Crashes Shock 2 when the inventory comes up.
    http://www.ttlg.com/forums/showthread.php?t=113501

    This is what Quake 3 looks like on G80 if you set the game to 16-bit color depth. :)
    [screenshot]

    System Shock 2 with 16X AA and 16X AF, in addition to loads of banding!
    [screenshot]


    So if you really want to play a game in 16-bit color only, you'd better stick to the GF 7900 and X1950 and down. I believe the NV folks have admitted that they are aware of this limitation of the GPUs, but I doubt they consider it a major issue.
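
    The banding complaint above comes down to quantization without dithering. A toy Python sketch (illustrative only — not how any of these GPUs actually implement it): truncating a smooth gradient to 5 bits per channel leaves hard 8-value steps, while an ordered (Bayer) dither trades per-pixel noise for a spatially averaged result much closer to the original ramp.

    ```python
    def quantize5(v):
        # truncate an 8-bit channel to 5 bits and expand back,
        # as an undithered RGB565 framebuffer write would
        return (v >> 3) << 3

    BAYER4 = [[ 0,  8,  2, 10],
              [12,  4, 14,  6],
              [ 3, 11,  1,  9],
              [15,  7, 13,  5]]

    def dither_quantize5(v, x, y):
        # add a position-dependent threshold (scaled to the 8-unit
        # quantization step) before truncating
        t = BAYER4[y % 4][x % 4] * 8 // 16   # 0..7
        return quantize5(min(255, v + t))

    rows, width = 4, 256
    # a smooth horizontal gradient, with and without dithering
    plain    = [[quantize5(x) for x in range(width)] for _ in range(rows)]
    dithered = [[dither_quantize5(x, x, y) for x in range(width)]
                for y in range(rows)]

    def col_avg_err(img):
        # distance of the column average (roughly what the eye blends)
        # from the true ramp value
        return sum(abs(sum(img[y][x] for y in range(rows)) / rows - x)
                   for x in range(width)) / width

    print(f"undithered error: {col_avg_err(plain):.2f}, "
          f"dithered error: {col_avg_err(dithered):.2f}")
    ```

    The undithered gradient collapses 256 input values into 32 flat bands; the dithered version's local average lands much nearer the original, which is why the old 3dfx/NV1x-era output looked acceptable and the non-dithering G80/R600 paths don't.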
     
    #13 swaaye, Jan 15, 2008
    Last edited by a moderator: Jan 15, 2008
  14. Aerows

    Regular Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    317
    Likes Received:
    6
    Why do I feel as though I stumbled into a thread written by Rip Van Winkle?

    I half expected someone to come chiming in about a Voodoo card.
     
  15. PsychoZA

    Newcomer

    Joined:
    Mar 1, 2007
    Messages:
    75
    Likes Received:
    0
    Back in the 16-bit days, I always preferred the dithering from 3Dfx cards to what the nVidia cards did. ;)
     
  16. Aerows

    Regular Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    317
    Likes Received:
    6
    LOL thank you for not disappointing me :lol:

    Swaaye, my eyes are burning from those screen shots.
     
  17. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    16-bit mode is very useful on a GeForce 6100, where you're awfully bandwidth limited, even (or especially) in older games, and you can still use 32-bit compressed textures. For the same reason you may want to run 16bpp on IGPs and laptop GPUs.

    With a future (or current) midrange card, though, you're looking at an HD 3850/70, 9600GT or RV770: fast RAM on a 256-bit bus. From the 9700 Pro level onwards, 16bpp is pretty much irrelevant (on my Ti 4200 it was only good for running 8xS AA in some older games).
     
  18. Aerows

    Regular Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    317
    Likes Received:
    6
    Come to think of it, I may actually have been affected by this. I installed Might & Magic 6 recently, due to a nostalgia thread, and it looked awful being rendered by my 8800 GTS. I don't remember it looking quite that bad. Maybe I need to put it on lower end hardware LOL
     
  19. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    And on the Voodoo5 you have a driver option to force 32-bit rendering, even though you never really needed it. The 16-to-22-bit postfilter is great (improved on the V5), and most of the time 16-bit looks good in old games, especially when 32-bit isn't even supported.
     
  20. mockingbird

    Newcomer

    Joined:
    Jan 15, 2008
    Messages:
    7
    Likes Received:
    0
    Interestingly revealing. Video card manufacturers getting a little ahead of themselves. It's like Abit when they tried to dump PS/2 from their "IT7" mobos. (Yea, yea. I know. I'm living in the past)

    The immediate solution that comes to mind is to use a supplementary PCI card for older games. Of course, future motherboards will not have PCI slots, and to my knowledge there are no PCIe x8 or x4 video cards; I don't envision any in the future.

    So, in retrospect, I'm glad I brought this topic up even in the face of some well intentioned derisive heckling. Want to play older games? Your new computer is less effective at this task than a Tualatin or Coppermine with a Geforce2 GTS...

    Hmmm. So the Geforce was superior all along. The Geforce2 MX (admittedly released at a significantly later time than the Radeon 7200) crushed the Radeon 7200 in 16-bit, was almost on par with it at 32-bit, and cost less than half as much. Yea yea, I know, ancient news.

    Boy, ATI was overpriced from the start. Too bad 3dfx was bankrupted by some unknown entity.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.