ATI GPU image quality just looks better! But why?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Apr 5, 2009.

  1. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,147
    Likes Received:
    472
    Location:
    en.gb.uk
    Nope, I disabled that specifically for that photo.
     
  2. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    Yes, but that doesn't mean it's necessarily accurate. This is especially true for output devices, in particular TVs, but there's no guarantee that there aren't shenanigans going on at the other end of the cable too. For instance, if you compare consoles you'll find that the same values in the framebuffer do not produce the same output over HDMI.

    Also, TVs generally do major processing of the digital signal. Sharpening filters are common and IMHO destroy the image. I would never buy a TV that wouldn't at least let me disable the digital sharpening. If you walk around any electronics store and compare TVs you'll find that many of them exhibit halos around contrast-rich areas, especially text. I'm sure the sharper look sells to the average consumer, though. Another common thing they do is boost the contrast, and consequently often clip the darkest blacks and brightest whites. Computer monitors are better in general and much more accurate, which is natural given that they are primarily used for work. Although many come with stupid "game" or "movie" modes that will destroy the image just like on any TV.

    Many TVs boost the saturation as well. And on the GPU side you have color controls in the control panel. These affect the output even if you have a digital connection; they simply modify the digital signal. There's nothing saying that, at their defaults, these will output a digital signal that's exactly identical to the values stored in the framebuffer (even though IMHO they should).
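
    To make the point concrete, here is a minimal sketch (plain numpy, with made-up coefficients; not any TV's or driver's actual processing) of how a naive contrast and saturation boost clips and shifts 8-bit framebuffer values:

    ```python
    # Illustrative only: a crude "enhancement" stage of the kind a TV or a
    # control-panel setting might apply between the framebuffer and the panel.
    import numpy as np

    def enhance(rgb, contrast=1.15, saturation=1.25):
        """Stretch contrast around mid-grey, push channels away from luma,
        then clip back to 8 bits."""
        x = rgb.astype(np.float32)
        x = (x - 128.0) * contrast + 128.0                       # contrast stretch
        luma = x @ np.array([0.299, 0.587, 0.114], np.float32)   # per-pixel luma
        x = luma[..., None] + (x - luma[..., None]) * saturation
        return np.clip(x, 0, 255).astype(np.uint8)

    # Near-white clips to 255 and near-black to 0, so detail that was distinct
    # in the framebuffer is gone by the time it reaches the panel; the red
    # pixel comes out oversaturated.
    pixels = np.array([[250, 250, 250], [5, 5, 5], [180, 60, 60]], dtype=np.uint8)
    print(enhance(pixels))
    ```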
     
  3. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    I agree, color accuracy has to be tested with a calibrator or such.

    But I think it could very well show us the issue with "2D stability": no demosaicing or AA filters screwing around with it, and using a prime Zeiss would minimize lens issues.


    I did notice quite a striking 2D difference when moving to my R600 from an FX5200 and even later on a 780G (analog signals for all) - the ATI output just looked much sharper, though it took a little tweaking to get it consistent over the 780G.

    I have to say though, it's a real PITA to quantify stuff like this.
     
  4. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,934
    Likes Received:
    487
    I recall mention way back of reference spec differences between ATI & NV: NV let the board seller decide whether to use a high-quality or a cheaper part, so some cards would be poor quality and others excellent, while ATI specified a high-quality part and got uniformly high quality.

    But I thought those circuits had been integrated onto the GPU or something later on.

    Could be that ATI just did a better job with those circuits & NV has been too busy making 3d glasses etc to spend a bit of time tidying up such a basic functional bit.

    Also possible that ATI just has a good default setting that works nicely with LCDs/Windows/ClearType? (But again, what is NV's excuse? Digital Vibrance?)
     
  5. LunchBox

    Regular

    Joined:
    Mar 13, 2002
    Messages:
    901
    Likes Received:
    8
    Location:
    California
    I could kind of relate... It's hard to explain this without being accused of being an ATI fanboy but it's just that the colors seem to "pop out" more on ATI cards.
     
  6. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    I am getting obsessed with these visual quirks, so I went to this famous site and did a test I had done before when I used the NV 6800GT.

    http://www.lagom.nl/lcd-test/white.php

    What's changed? I am surprised once more by the HD 3300! Aside from the white saturation test, where I cannot see the 253 and 254 squares, all tests look as good as my calibrated PS3 over HDMI!! Unbelievable! Okies, there seems to be a slight dark crush where the darker shades appear slightly darker than they should, but they are still visible.

    When I tested the NV card over VGA, I failed the gamma test and the gradient banding test; the contrast test passed after tweaking via NVCP. I used to hear that at 1920x1200 you have to use the digital interface, but from what I've seen, I can say I am fairly comfortable with ATI over VGA!
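
    For reference, here is a minimal sketch (my own layout, only loosely modelled on the lagom.nl white saturation page; assumes Pillow is installed) of what such a test pattern measures: squares at values 250-254 on a pure-white 255 background. If the output stage or the display clips near white, the brightest squares become indistinguishable from the background.

    ```python
    # Generate a simple white-saturation test pattern: greyscale squares at
    # 250..254 on a 255 background. Squares you cannot see indicate clipping
    # near white somewhere between the framebuffer and your eyes.
    from PIL import Image, ImageDraw

    W, H, SIZE, GAP = 640, 160, 80, 20
    img = Image.new("L", (W, H), 255)      # 8-bit greyscale, white background
    draw = ImageDraw.Draw(img)

    for i, value in enumerate(range(250, 255)):
        x = GAP + i * (SIZE + GAP)
        y = (H - SIZE) // 2
        draw.rectangle([x, y, x + SIZE, y + SIZE], fill=value)

    img.save("white_saturation_test.png")
    ```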
     
  7. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,859
    Likes Received:
    2,277
    I've a 260 with a dirt-cheap LCD and mine passed :)
     
  8. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    The GeForce FX line was noticeably inferior to ATI in colors. With the introduction of the GeForce 6 series, Nvidia reached parity with ATI, but some still think ATI has an edge.
     
    #28 vazel, Apr 6, 2009
    Last edited by a moderator: Apr 12, 2009
  9. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    nVidia has probably gotten better in PQ since the GeForce 6 series. That would explain both gongo and Davros' posts.
     
  10. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,071
    Likes Received:
    5,018
    Interesting that you say that, as the GF4 and earlier always seemed more "washed out" than other vendors' cards (Matrox, ATI, 3dfx, etc.). I skipped both the FX series and the 6 series, but with the 7 series I didn't really notice it being much worse than ATI. Then again, I wasn't comparing them side by side at the time either. And the 7 series board was a Quadro, so maybe the colors were tweaked a bit differently there for the professional market.

    Regards,
    SB
     
  11. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,456
    Likes Received:
    578
    Location:
    WI, USA
    I have a Radeon SDR PCI that puts out a fantastic S-Video image. It looks better than the eternally praised Matrox G400, and it's definitely better than some older GeForce cards that I've used.
     
    #31 swaaye, Apr 7, 2009
    Last edited by a moderator: Apr 7, 2009
  12. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    You do know that, when it comes to digital cameras, cheaper models usually have more 'pop' than more expensive ones, right? That's because more color saturation is often perceived as more pleasant to the eye, even if you're over-processing the picture.

    The same thing goes for post-processing of audio: the most accurate reproduction doesn't necessarily equal the best perceived listening experience.

    I don't know how relevant this is wrt video cards, but it's something to keep in mind.

    I've always color-calibrated my monitors with a hardware calibrator, both with Nvidia and ATI cards, and what always amazes me is how much one needs to modify a monitor's default settings before it's considered calibrated, no matter which card I've used.

    Once calibrated, I can't say I'm seeing a lot of differences between brands, but that may be just me...

    Anyway, my personal opinion on this whole matter is that, as long as you're not comparing calibrated setups, these discussions aren't very meaningful.

    BTW: DVI is not necessarily digital. A DVI-I connector carries both digital and analog signals on its pins. It's up to the monitor to decide which pins to use...
     
  13. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Are there still differences then?
    I thought the DX10 spec had rules not only for rasterization accuracy but also for colour accuracy and texture filtering... which should mean that any DX10 card, be it ATi, nVidia or whatever other brand, should give the EXACT same colours on a 2D desktop (especially in the case of Aero, which uses 3D acceleration).

    Also, I thought Digital Vibrance was abandoned by nVidia years ago, with the 8800 series.
     
  14. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    I think the specs determine what ends up in RGB format in the frame buffer. (Even there, it's not bit-exact, especially wrt texture filtering.)

    But the question is what happens on the way from the frame buffer to the output connector. I doubt that's heavily regulated by the spec (if only because it can't be tested independently anyway...)
     
  15. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, in this age of DVI and HDMI, nothing should happen.
     
  16. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Because the introduction of digital audio CDs made the output of all audio systems identical?

    How are you going to handle:

    Color calibration profiles?
    Gamma correction?
    Video scaling?

    DVI doesn't magically solve the fact that all monitors display incoming data differently.
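
    As one concrete example of the gamma-correction point above, here is a minimal sketch (my own illustration, not any driver's actual code) of a 256-entry gamma ramp of the kind a calibration profile loads into the card's LUT. It sits between the framebuffer and the output connector, so the same framebuffer value can leave the card as a different digital code depending on the ramp:

    ```python
    # Build 8-bit gamma-ramp lookup tables and show that the same framebuffer
    # value maps to different output codes once a non-identity ramp is loaded.
    import numpy as np

    def gamma_ramp(gamma):
        """Return a 256-entry 8-bit LUT for the given gamma exponent."""
        x = np.arange(256, dtype=np.float32) / 255.0
        return np.round((x ** gamma) * 255.0).astype(np.uint8)

    identity = gamma_ramp(1.0)          # pass-through: output code == framebuffer value
    calibrated = gamma_ramp(1.0 / 1.1)  # e.g. a profile that slightly brightens mid-tones

    fb_value = 128
    print(identity[fb_value], calibrated[fb_value])   # 128 vs. 136
    ```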
     
  17. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, I was under the assumption that we were talking about different image quality of GPUs on the *same* monitor.

    In which case your analogy would be: the introduction of S/PDIF output makes the output of all CD players equal (of course assuming that they are not doing any special processing).
     
  18. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    In reality though, stuff is happening after the framebuffer. I don't think there's much of a difference in general between GPUs on PCs, but there's certainly a difference between consoles, even with identical framebuffers and across HDMI.
     
  19. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,166
    Likes Received:
    1,836
    Location:
    Finland
    Don't both of them use scalers after the framebuffer, before output? That explains it quite easily.
     
  20. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    The same objections apply:
    - gamma correction will be present in both.
    - support for more detailed calibration
    - video scaling

    Right out of the box, it's likely that default settings will be different.

    (I never did the test though, so this is all IMHO...)
     