ATI GPU image quality just looks better! But why?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Apr 5, 2009.

  1. ChrisRay

    ChrisRay (R.I.P. 1983-)
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    While I'm aware that the ALUs/TMUs have been decoupled, the texturing increase from G80 onward was quite significant. The GT200 reduced the ratio somewhat, but texturing throughput is still in great abundance.

    The G7x could lose around 30% of its performance going from High Performance to High Quality, while the Geforce 8 and above don't come anywhere close to that kind of loss. Decoupling may have helped, but it still isn't free.
     
  2. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    That depends on the shader ;)
     
  3. ChrisRay

    ChrisRay (R.I.P. 1983-)
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    I found it to be the case in texture-heavy games with low shader usage, such as UT2003, as well; there the performance loss was around 30%.

    But yes, it can also depend on usage. For instance, in Oblivion (with HDR disabled) a 7900GT SLI setup went from 33 FPS at High Quality to 57 FPS at High Performance @ 1280x1024 with 4xAA/16xAF, while an 8800GTX only went from 53 FPS at High Quality to 61 FPS at High Performance at the same settings. So it's not free, even on G80 and above. I haven't done a lot of recent testing on the issue because, in all honesty, no one uses anything but High Quality mode on Geforce 8 cards and up. I don't even know why they leave the performance modes in, considering their redundancy.
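    For what it's worth, the relative cost works out like this (just arithmetic on the FPS figures quoted above, not new measurements):

    ```python
    # Relative cost of High Quality vs High Performance filtering, using the
    # Oblivion numbers quoted above (same resolution and AA/AF settings).
    def hq_cost(fps_high_perf, fps_high_quality):
        """Percentage of frame rate lost when switching to High Quality."""
        return 100.0 * (fps_high_perf - fps_high_quality) / fps_high_perf

    print(f"7900GT SLI: {hq_cost(57, 33):.0f}% slower in High Quality")  # ~42%
    print(f"8800GTX:    {hq_cost(61, 53):.0f}% slower in High Quality")  # ~13%
    ```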
     
  4. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    UT2003 doesn't (*) use shaders, so there is no way to hide TMU latency.

    (*) There is one PS1.4 shader for terrain blending, but it also just does TMU fetches and then blends the results together, so there should likewise be no way to hide latency.

    I know it's not free. But it depends on the shader ;)

    The CineFX architecture would be very inefficient today. It was a much more worthwhile target for sample-count optimization than anything we have today. That's all I want to point out.
     
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,157
    Likes Received:
    5,095
    Nvidia marketing still likes it for benchmark comparisons versus the competition; otherwise they would have done what ATI did a few years back and just removed the lower-quality options.

    Regards,
    SB
     
  6. ChrisRay

    ChrisRay (R.I.P. 1983-)
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Nobody sets "High Performance" or "Performance" mode anymore on Nvidia drivers; I have never seen a reviewer do it. At worst they'll leave it at Quality, which has a slightly more aggressive LoD and some quad trilinear/bilinear filtering optimizations enabled by default (a rough sketch of what an LoD bias does is at the end of this post), resulting in maybe a 1-2 FPS gain. I'm sure future architectures will probably remove the modes altogether.

    Every time a new architecture comes out, Nvidia will look at what's being used and what's not, and whether supporting it is still worth the transistor count (see things like paletted textures, 16-bit dithering, etc.). Right now there's no real marketing advantage to pushing the performance modes, as they are redundant in most cases.
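    For anyone wondering what a "more aggressive LoD" actually does, here is a minimal sketch of textbook mip-level selection with an LOD bias; the log2-of-footprint formula is standard, but the bias values are purely illustrative, not the driver's real numbers.

    ```python
    import math

    def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
        """Pick a mipmap level from the screen-space texel footprint.
        A positive bias selects smaller (blurrier) mips sooner, which saves
        texture bandwidth at the cost of sharpness."""
        lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
        return min(max(lam, 0.0), max_level)

    footprint = 6.0  # roughly 6 texels map onto one pixel at this distance
    print(mip_level(footprint, lod_bias=0.0))  # ~2.58: trilinear blends mips 2 and 3
    print(mip_level(footprint, lod_bias=0.5))  # ~3.08: shifted toward the blurrier mip 3
    ```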
     
  7. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,899
    Likes Received:
    2,311
    That's sad; in the future we are going to see even more games stop working.
    Not good times if you're a retro gamer.
     
  8. ChrisRay

    ChrisRay (R.I.P. 1983-)
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    I think you'll find it's mostly relevant to pre-DirectX things, or things like "Quincunx" AA.
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Yes, I was talking about the physical drive mechanism. It looked and felt flimsy to me, but that's not necessarily a criticism, because at that price neither the PS3 nor any other device can be all things to all people...;) I was primarily interested in a BluRay drive, but I picked up the PS3 simply because it offered more utility than a stand-alone BluRay player. I'm not a console gamer though--the computer is my game box--and to me the PS3's BluRay drive was just not comparable to the physical drive in a stand-alone BluRay player.
     
  10. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I hope they'll try to support this stuff in software. I mean, if you don't want to do 16-bit dithering in hardware, then please do it with shaders.
    I feel like someday we will end up using a GPGPU software rasteriser to run those elderly games :lol:.
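    As a rough idea of what "doing 16-bit dithering with shaders" would amount to, here is a minimal sketch of classic 4x4 ordered (Bayer) dithering when truncating an 8-bit channel down to 5 bits; it's the generic textbook technique, not any vendor's actual implementation.

    ```python
    # Standard 4x4 Bayer matrix, values 0..15.
    BAYER4 = [
        [ 0,  8,  2, 10],
        [12,  4, 14,  6],
        [ 3, 11,  1,  9],
        [15,  7, 13,  5],
    ]

    def dither_8_to_5(value, x, y):
        """Truncate an 8-bit channel to 5 bits with ordered dithering.
        The threshold varies per pixel, so over a 4x4 block the average
        output approximates the original 8-bit value."""
        threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0  # 0..1
        return min(31, int(value / 255.0 * 31.0 + threshold))

    # A flat mid-grey dithers into a mix of neighbouring 5-bit levels:
    print([dither_8_to_5(128, x, 0) for x in range(4)])  # [15, 16, 15, 16]
    ```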
     
  11. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    In most cases you can just silently replace all 16-bit operations with 32-bit operations or vice-versa. So removing actual 16-bit support shouldn't impact compatibility, other than games running in 32-bit even though they think they run in 16-bit.
    As far back as the old PowerVR/Kyro cards, blending was done in 32-bit even for 16-bit modes (Kyro clearly looked better than contemporary cards that still used dithering).
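    A minimal sketch of the kind of silent promotion described above: a nominally 16-bit R5G6B5 pixel expanded to 8 bits per channel so all blending can run at 32-bit precision. The bit-replication used to fill the low bits is the common convention; whether a given driver does exactly this is an assumption.

    ```python
    def rgb565_to_rgb888(pixel16):
        """Expand a 16-bit R5G6B5 pixel to 8 bits per channel.
        The top bits are replicated into the low bits so that pure white
        (0xFFFF) maps to (255, 255, 255) rather than (248, 252, 248)."""
        r5 = (pixel16 >> 11) & 0x1F
        g6 = (pixel16 >> 5) & 0x3F
        b5 = pixel16 & 0x1F
        return ((r5 << 3) | (r5 >> 2),
                (g6 << 2) | (g6 >> 4),
                (b5 << 3) | (b5 >> 2))

    print(rgb565_to_rgb888(0xFFFF))  # (255, 255, 255)
    print(rgb565_to_rgb888(0xF800))  # (255, 0, 0) -- pure red
    ```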
     
  12. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    But they don't care to do that either; I've seen the horrors of 16-bit Quake 3 on a recent Nvidia board.
     
  13. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Doesn't Quake 3 normally run in 32-bit though? In which case I see no reason why anyone would bother wasting time on fixing issues with the 16-bit mode.

    I guess you need a REALLY old game to find one that is strictly 16-bit and would actually *require* good compatibility in the first place. In fact, that may be around the time when OpenGL and D3D weren't widely used yet and games used custom APIs like MiniGL and Glide, which haven't worked on any modern hardware in ages anyway.
     
  14. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,895
    Likes Received:
    344
    Location:
    PA USA
    That really is not true. The Leadtek Geforce 4 cards were some of the very best for color fidelity, easily on par with Matrox.
     
  15. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, unless he means the GF4MX cards, which were really just GF2-based cards. But 'proper' GF4 cards had no image quality problems iirc. Around that time the problem pretty much seemed to be 'solved' with most big vendors.
     
  16. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Well, someone had configured the X11 server as 16-bit on that particular machine.
    There were threads about that issue, a notable example being System Shock 2.
    Compatibility is not a big issue; it's the norm rather than the exception with old Windows games (and Windows apps in general). There are Glide wrappers, full OpenGL can be used in place of MiniGL, and modern drivers still support the DirectX 3/5 shit.

    Granted, that's still a niche issue. The hardest part is finding something worth playing from 1997 or 1998 :razz:; that era had a pretty low signal-to-noise ratio.

    Another niche is running in 16-bit mode with a lowest-end card or an IGP. It saves bandwidth (some rough numbers at the end of this post), so in an older game you can spend it on something more useful like doubling the anti-aliasing or getting a higher framerate. It's niche only because most people can't set up their games, and what we call gamers usually have better hardware anyway. But it would be useful on a laptop or netbook.

    An example: running Quake 3 on a Geforce 6100. At 1024 with 32-bit colour it looks like shit and runs like shit (though it would have been good nine or ten years ago). Drop it to 800x600 16-bit, but with 2x AA and 4x or 8x AF, and it looks much better and runs much faster.
    I would actually care about this when buying an Nvidia ION 2 + VIA CPU netbook.
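    Some rough numbers for the bandwidth argument, counting colour-buffer traffic only (one write per sample per frame, ignoring textures and Z, and assuming the 1024 case means 1024x768; purely illustrative):

    ```python
    def colour_buffer_mb_per_s(width, height, bytes_per_pixel, fps, samples=1):
        """Very rough colour-buffer traffic: one write per sample per frame."""
        return width * height * bytes_per_pixel * samples * fps / 1e6

    # 1024x768, 32-bit, no AA, 60 fps:
    print(f"{colour_buffer_mb_per_s(1024, 768, 4, 60):.0f} MB/s")  # ~189 MB/s
    # 800x600, 16-bit, 2x AA, 60 fps:
    print(f"{colour_buffer_mb_per_s(800, 600, 2, 60, samples=2):.0f} MB/s")  # ~115 MB/s
    ```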
     
  17. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    The problem is that the app could lock and read back color surfaces. The driver would then need to copy (and convert) the surface when you lock it; there's a toy illustration at the end of this post.

    While this is possible, it is also an effort just for legacy support.

    Didn't AMD also remove dithering in the HD series? Perhaps D3D10 specifies that you should not do dithering anymore.
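    A toy illustration of the copy-on-lock problem mentioned above: if the driver secretly stores a "16-bit" surface as 32-bit, it has to convert the data back into the format the app expects every time the surface is locked. The names and packing rule below are made up for illustration, not any real driver's code path.

    ```python
    def rgb888_to_rgb565(r8, g8, b8):
        """Pack an 8-bit-per-channel pixel back into R5G6B5."""
        return ((r8 >> 3) << 11) | ((g8 >> 2) << 5) | (b8 >> 3)

    class FakeSurface16:
        """A '16-bit' surface that is really stored as 32-bit internally."""
        def __init__(self, pixels_rgb888):
            self._internal = list(pixels_rgb888)  # list of (r, g, b) tuples

        def lock(self):
            # The app expects raw 16-bit data, so the whole surface has to be
            # copied and down-converted here -- that copy is the extra cost.
            return [rgb888_to_rgb565(r, g, b) for (r, g, b) in self._internal]

    surf = FakeSurface16([(255, 0, 0), (0, 255, 0), (128, 128, 128)])
    print([hex(p) for p in surf.lock()])  # ['0xf800', '0x7e0', '0x8410']
    ```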
     
  18. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    In OpenGL this generally happens anyway.
    Besides, there should still be hardware-accelerated colour-conversion routines, because you also need those for many other applications (blitting from one surface format to another).

    In D3D you have to specify that a texture can be read back at creation time, so the driver can take special care of it (generally textures are stored in a swizzled way for better cache efficiency, so you can't read them back directly anyway; the actual format is hardware-dependent).
    Most textures will be read-only, so those could just be converted to a 32-bit format on creation.
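    To illustrate why you can't read a texture back directly: one common (hardware-dependent) layout is Morton/Z-order swizzling, where the x and y bits of a texel address are interleaved so that nearby texels stay close in memory. A minimal sketch, assuming a simple power-of-two texture:

    ```python
    def morton_offset(x, y, bits=16):
        """Interleave the bits of x and y (Z-order / Morton index).
        Neighbouring texels in 2D end up near each other in memory,
        which is friendlier to the texture cache than row-major order."""
        offset = 0
        for i in range(bits):
            offset |= ((x >> i) & 1) << (2 * i)
            offset |= ((y >> i) & 1) << (2 * i + 1)
        return offset

    # Row-major order would put (0,1) a whole row away; Morton keeps it at offset 2.
    for (x, y) in [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0), (0, 2)]:
        print((x, y), "->", morton_offset(x, y))
    ```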
     
  19. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    Generally, I find Nvidia to have too many features for their own good, while I find ATi to have too few.
     
  20. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,157
    Likes Received:
    5,095
    Actually no. I had a GF4 Ti 4600 at the time. Compared to ATI, Matrox, and even the Voodoo 5500 (which was not as good as the ATI or Matrox), it was quite noticeably worse for 2D at 1600x1200 and up, and it just got even worse at 1800x1440.

    At 1280x1024 however, the difference wasn't really that noticeable, if at all.

    Regards,
    SB
     