ATI GPU image quality just looks better! But why?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Apr 5, 2009.

  1. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Yea well, I think that's beside the point. The point wasn't about default settings, but rather that one videocard gives better image quality than the other, even after calibrating. Historically, it's been like that.
     
  2. MDolenc

    Regular

    Joined:
    May 26, 2002
    Messages:
    690
    Likes Received:
    425
    Location:
    Slovenia
    It's not beside the point. If you find such a difference, you didn't calibrate well enough. Every single time I stumble across a topic like this, people are comparing PQ without ANY calibration, and even if you use the same monitor and the same DVI cable, you still have driver settings to match between boards. You can't just dump the gamma table...
    Even if the application you use to test this uses its own gamma table, I'm not 100% sure you'll bypass all the driver tweaking.
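    As a rough illustration of the "driver settings to match" point: on Windows, the per-channel gamma table the driver applies to the output can be read back with the documented GDI call GetDeviceGammaRamp. Below is a minimal sketch (Windows-only; the CSV file name is made up for the example) that dumps it, so the ramps from two boards can be diffed once both drivers are set to the same values.

    Code:
    # Dump the current 3x256-entry gamma ramp so it can be compared
    # between two cards or driver versions. Windows-only sketch.
    import ctypes

    def dump_gamma_ramp(path="gamma_ramp.csv"):
        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32

        hdc = user32.GetDC(None)                  # device context for the whole screen
        ramp = ((ctypes.c_ushort * 256) * 3)()    # red, green and blue tables
        if not gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
            raise OSError("GetDeviceGammaRamp failed")
        user32.ReleaseDC(None, hdc)

        with open(path, "w") as f:
            f.write("index,red,green,blue\n")
            for i in range(256):
                f.write(f"{i},{ramp[0][i]},{ramp[1][i]},{ramp[2][i]}\n")

    if __name__ == "__main__":
        dump_gamma_ramp()

    Identical dumps from both cards mean the driver-side gamma tables at least start out the same; any remaining difference would have to come from further down the output path.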
     
  3. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, that's what I was trying to say.
    Historically (when we used analog connectors and RAMDACs of varying quality), it was quite common to notice differences between cards.
    But these days, when the output path is all digital and videocards are capable of very high accuracy in RGB levels, I don't think a properly calibrated system will look visually different (even though there may be tiny differences at bit level here and there).

    I've had various ATi and nVidia cards with DVI over the years anyway, and I haven't noticed a difference anymore. Purely talking about 2D image quality here, of course. E.g. video playback would differ, and generally the newer card would have better quality with video acceleration, regardless of which brand it was. And 3D would also be slightly different, most notably because of slightly different mipmapping, AF and AA algorithms (and of course the various cheating... erm, driver optimizations applied). But these differences get smaller and smaller every generation.
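    For anyone who wants to check the "tiny differences at bit level" claim rather than eyeball it: capture the same static desktop losslessly on each card (same resolution, same colour depth) and diff the two images. A minimal sketch, assuming PNG captures with made-up file names, using Pillow and NumPy:

    Code:
    # Count the pixels that differ between two lossless captures of the
    # same static content, taken on two different cards.
    import numpy as np
    from PIL import Image

    def bit_level_diff(path_a, path_b):
        a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
        b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
        diff = np.abs(a - b)
        changed = np.count_nonzero(diff.any(axis=-1))
        total = a.shape[0] * a.shape[1]
        print(f"differing pixels: {changed} / {total}")
        print(f"largest per-channel delta: {diff.max()} (out of 255)")

    if __name__ == "__main__":
        bit_level_diff("capture_card_a.png", "capture_card_b.png")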
     
    #43 Scali, Apr 9, 2009
    Last edited by a moderator: Apr 9, 2009
  4. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,067
    Likes Received:
    5,017
    Not quite true, at least with older Nvidia cards (pre-FX). 2D quality was seriously lacking compared to even something as lowly as an i740, and no amount of fiddling with it could bring it up to par. The GeForce 4 was quite significantly better than previous cards, but still not as good as competing cards.

    As well, no amount of fiddling I did back then could prevent them from having a "washed out" look compared to competing cards, especially in 32-bit color. Again, the GeForce 4 looked quite a bit better than previous entries.

    Canopus probably made the best of the lot, with quite good 2D quality but still not as good as ATI/3dfx/Matrox...

    With that said, I really didn't notice any glaring differences going from Geforce 7xxx and up (again I skipped the FX and 6xxx series).

    So, yes, I'd certainly agree that a properly calibrated card (of fairly recent origin) and monitor should have basically the same color accuracy and 2D quality.

    Regards,
    SB
     
  5. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I actually did the 'filter mod' on my GeForce2 GTS back in the day.
    It was my first nVidia card after a bunch of Matroxes and other decent video cards, and I couldn't stand the blurry output.
    Removing one stage of the low-pass output filter did wonders for the sharpness and contrast of the image. It put it right up there with ATi and Matrox.
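    For a rough idea of why removing a stage helps: each stage of the analog output filter is essentially a low-pass, and at high resolutions its cutoff can end up below the pixel clock, which is exactly what smears sharp pixel-to-pixel transitions. A back-of-the-envelope sketch with hypothetical component values (not measured from any actual card):

    Code:
    # Cutoff of a single hypothetical RC filter stage vs. a high-res pixel clock.
    import math

    R = 75          # ohms, VGA line impedance
    C = 22e-12      # farads, hypothetical per-stage capacitance
    f_cutoff = 1.0 / (2.0 * math.pi * R * C)

    print(f"filter cutoff ~ {f_cutoff / 1e6:.0f} MHz")    # ~96 MHz
    print("pixel clock at 1600x1200@75Hz ~ 203 MHz")      # standard VESA timing

    Anything near or above the cutoff gets attenuated, so with a ~200 MHz pixel clock the edges between adjacent pixels are visibly softened; dropping one stage raises the effective cutoff and brings the sharpness back.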

    But these days... Heck, even the Intel X3100 in my laptop, via VGA, has very good image quality. And the old Intel IGPs used to be even worse than the nVidia cards :)
    I know that when ATi decided to sell their GPUs to third-party board makers, they also decided to put the filter inside the GPU itself, so third parties couldn't mess up the image quality. I wonder if all companies do that now, even Intel.
    If not, I guess Fujitsu-Siemens just did something very right in the VGA output of my X3100 :)
     
  6. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,456
    Likes Received:
    578
    Location:
    WI, USA
    I have a pair of GF2MX OEM cards that put out very nice 2D. But yeah there's no denying that there are zillions of GF256, GF2 and GF3 cards that are seriously fugblurry. :)

    Any card can be beautiful on VGA as long as it has decent circuitry powering the output.

    I also have a Matrox Millennium G200 that gets blurry above 1152x864. Shocking, isn't it? I'll probably be flamed for claiming such.
     
  7. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    I had a GF2MX that was pretty good in 2D output. However, it was a model made years after the card debuted, so it's quite possible that things changed...
     
  8. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    It's not from scaling. More like different gamma curves.
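    To put a number on how little it takes: two correction curves that differ only slightly in exponent already shift the mid-tones enough to read as "brighter" or "more washed out". A quick illustration (the gamma values are arbitrary, just to show the size of the shift):

    Code:
    # Mid-grey (128) through two correction LUTs of the form
    # out = 255 * (x/255) ** (2.2 / panel_gamma), i.e. curves that aim for
    # an overall 2.2 response on panels with slightly different native gamma.
    x = 128
    for panel_gamma in (2.2, 2.4):
        out = round(255 * (x / 255) ** (2.2 / panel_gamma))
        print(f"panel gamma {panel_gamma}: {x} -> {out}")
    # panel gamma 2.2: 128 -> 128
    # panel gamma 2.4: 128 -> 136  (a clearly visible mid-tone shift)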
     
  9. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    Something to consider is that the PS3 is regarded as the best Blu-ray player on the market. If Nvidia really did have problems, wouldn't one of the anal A/V enthusiasts have noticed something by now?
     
  10. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Not really. Hypothetically, any problems RSX had would have been fixed. Sony really wanted Blu-ray to sell via the PS3. Do you think they'd have left any image quality problems in the chip?
     
  11. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,079
    Likes Received:
    648
    Location:
    O Canada!
    Is the PS3 even using RSX for the display pipeline element?

    http://www.beyond3d.com/content/articles/16/2
     
  12. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    I never noticed a difference.

    It also would be really bad for professional graphics if the default settings in Windows did not result in linear output from the framebuffer over DVI.

    You would have to redo all the calibration every time you changed cards. That doesn't sound very likely to me.

    If there is a difference, then I assume one IHV is "enhancing" the output curves to suggest "better" colors, but as I said, I never saw a difference over DVI.
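    One way to sanity-check the "linear output" assumption on a given card is to look at whether the driver's installed ramp is the identity mapping. A minimal Windows-only sketch, using the same documented GetDeviceGammaRamp call mentioned earlier in the thread:

    Code:
    # Check whether the driver's current gamma ramp is the identity,
    # i.e. framebuffer values pass through unmodified. Windows-only sketch.
    import ctypes

    user32, gdi32 = ctypes.windll.user32, ctypes.windll.gdi32
    hdc = user32.GetDC(None)
    ramp = ((ctypes.c_ushort * 256) * 3)()
    gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(None, hdc)

    # Some drivers store the identity as i*256 rather than i*257, so allow
    # a small tolerance instead of demanding exact equality.
    identity = all(abs(ramp[ch][i] - i * 257) <= 257
                   for ch in range(3) for i in range(256))
    print("linear (identity) ramp" if identity else "driver is reshaping the output curve")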
     
  13. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    All I know is that a friend of mine is a photographer, and she only uses ATi, on recommendation from Kodak.
    However, she's done this for ages (long before DVI and floating-point colour processing and all), so this recommendation may just be a relic of the past.
    She uses Macs by the way, because these have a better, standardized way of calibrating displays, printers and other devices.

    I recall asking her if photographers ever used Matrox cards... This was in the Parhelia age, when Matrox was the only one offering 10-bit-per-component colour resolution. Matrox also had a good reputation for image quality... but somehow they were never adopted by photographers on a large scale.
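    For reference, "10 bits per component" means 1024 levels per channel instead of the usual 256, while the pixel can still fit in 32 bits by shrinking alpha to 2 bits. A small sketch of one common 10:10:10:2 packing (the exact bit layout varies by format; this follows the R-in-the-low-bits convention):

    Code:
    # 10 bits per component vs. the usual 8: four times as many steps per
    # channel, still packed into a 32-bit pixel (10:10:10 colour + 2-bit alpha).
    def pack_rgb10a2(r, g, b, a=3):
        # r, g, b in 0..1023, a in 0..3
        return (a << 30) | (b << 20) | (g << 10) | r

    print(f"8-bit steps per channel : {2 ** 8}")    # 256
    print(f"10-bit steps per channel: {2 ** 10}")   # 1024
    print(hex(pack_rgb10a2(1023, 512, 0)))          # 0xc00803ff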
     
  14. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I returned my PS3 precisely because I was so unimpressed with the flimsiness of its Blu-ray drive. It looked to me like a problem fest waiting to happen. I bought a Sony stand-alone Blu-ray player instead, and am wondering why you think Sony's PS3 Blu-ray player might be better than Sony's stand-alone Blu-ray players...? The drive mechanism in the stand-alone unit seems to me to be far superior in terms of durability.
     
  15. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,133
    Likes Received:
    905
    Location:
    still camping with a mauler
    Flimsy? Maybe you got a faulty system; mine feels pretty much like every other slot loading drive I've ever used...
     
  16. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    The PS3 has been widely heralded as the best BD player since it came out. Criterion uses the PS3 as their reference player.
     
  17. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I think WaltC is talking about the physical drive more than the actual decoding of the movie data and transferring it to a screen.
    The PS3 hardware does look a tad flimsy, but perhaps the decoder is best-in-class.
     
  18. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    Nah, I don't see anything to indicate the Blu-ray drive is flimsy. The console has been infinitely more reliable than the 360. The PS3's Blu-ray drive also loads discs faster than any other Blu-ray player, btw.
     
    #58 vazel, Apr 15, 2009
    Last edited by a moderator: Apr 15, 2009
  19. Skinner

    Regular

    Joined:
    Sep 13, 2003
    Messages:
    871
    Likes Received:
    9
    Location:
    Zwijndrecht/Rotterdam, Netherlands and Phobos
    Although this topic is mainly about hardware IQ, I wanted to share my story ;)

    I'm experiencing the best IQ (render IQ) now that I have gone back to nVidia and am enjoying the pleasure of SSAA. Damn, this is so beautiful.
    It not only anti-aliases better, but textures are much crisper and the detail stretches much further into the distance (like 32xAF).

    It's definitely not an efficient way to do it (I have two 700 MHz GTX 285s, so that isn't a problem anyway), and not all games let you enable it, but when it works, oh dear...
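    For anyone curious what SSAA actually does: the whole frame is rendered at a multiple of the display resolution and then filtered down, so textures and shader output get extra samples too, not just polygon edges, which is presumably why the textures look crisper as well. A minimal sketch of the downsampling half (file names are made up; a real driver does this per frame, of course):

    Code:
    # The idea behind ordered-grid SSAA in one step: take an image rendered
    # at 2x2 the target resolution and average each 2x2 block down to one
    # output pixel. Every sample is a full shading/texture sample.
    import numpy as np
    from PIL import Image

    def downsample_2x2(path_in, path_out):
        hi = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
        h, w, _ = hi.shape
        hi = hi[:h - h % 2, :w - w % 2]                     # crop to even size
        lo = hi.reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))
        Image.fromarray(np.rint(lo).astype(np.uint8)).save(path_out)

    if __name__ == "__main__":
        downsample_2x2("render_3200x2400.png", "output_1600x1200.png")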
     
  20. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    Actually I think ATI has the better AF. Nvidia's SSAA modes are awesome.
     