ATI GPU image quality just looks better! But why?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Apr 5, 2009.

  1. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    I have always heard about how the ATI image is more vibrant and striking, but I was always skeptical myself. How could this be possible in this day and age, when the imaging hardware should be of the same quality? The RAMDACs run at speeds well past what normal monitors need, DVI is digital and digital is digital, 2D rendering is cheap, etc. etc.
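
    As a rough back-of-envelope check of that point (the ~20% blanking overhead below is an assumption; exact CVT timings differ):

        # Approximate pixel clock for this monitor's native 1920x1200 @ 60 Hz.
        # The 20% blanking overhead is a rough assumption, not an exact timing.
        h_active, v_active, refresh = 1920, 1200, 60
        blanking_overhead = 1.20
        pixel_clock_mhz = h_active * v_active * refresh * blanking_overhead / 1e6
        print(f"~{pixel_clock_mhz:.0f} MHz needed vs. a 400 MHz RAMDAC")  # ~166 MHz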

    That is, until I experienced something that made me unable to believe my eyes at first - but it is true! Comparing the colors and 2D stability between my 6800GT and HD3300 on my Dell 2407WS monitor via VGA, the ATI integrated chipset actually provides clean 2D. With text especially, there are no faint double images; the colors are indeed more vibrant, and the contrast is more balanced - and all without any calibration! With the old 6800GT I had to constantly go into the Nvidia image calibrator and tweak my contrast and gamma, and even then the image on screen was never this clear!

    I have tried DVI, and it's the same thing: the 6800GT, while no longer showing faint double images, exhibits a faint white glow overlay, while the HD3300 just looks right! I have not touched any calibration at all in the month I have been using my new PC!

    So here comes my question, a simple one that goes back to the start of my thread: how is this possible? Driver-level problems, or Nvidia cheaping out on imaging hardware?
     
  2. air_ii

    Newcomer

    Joined:
    May 2, 2007
    Messages:
    134
    Likes Received:
    0
    Without commenting on the validity of your claim, you might want to compare adapters from more or less the same generation.
     
  3. Florin

    Florin Merrily dodgy
    Veteran

    Joined:
    Aug 27, 2003
    Messages:
    1,644
    Likes Received:
    214
    Location:
    The colonies
    Or it's the subjective perception of a test group with a sample size of 1 so far (but c'mon, B3D, here's a chance to testify!).
     
  4. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    True. 7900GT/8800GTX -> 4870. Improved color representation and image clarity right out of the box. With NV I had to calibrate the sharpness and colors in the NVCP.
     
  5. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,852
    Likes Received:
    2,268
    Dave, is that you...
     
  6. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,297
    Likes Received:
    247
    I know a professional photographer who says that Radeons at default have more accurate colours than anything he can get out of a GeForce, even with custom tweaks in the CP (so he uses only Radeon cards, no matter whether they are faster or slower in 3D).

    I don't know what the cause of the IQ difference is, but whoever is the engineer responsible for it does his job right :smile:
     
  7. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    If I get what you are saying, then that is exactly my question: just how much of it is hardware related? And how come? The 6800GT is by no means a budget SKU, while my onboard video probably counts as one...
     
  8. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    I swear I am not seeing things! There is a perceptible difference in clarity, especially over VGA. The faint double image caused me a lot of headaches when I switched to VGA with the 6800GT. I tried tweaking NV digital vibrance too, but the vibrance just felt... not as nice... as ATI's default video out!
     
  9. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,146
    Likes Received:
    472
    Location:
    en.gb.uk
    Photos, or it didn't happen.

    Seriously. Get a decent SLR (Canon/Nikon), a decent macro lens (a *macro* lens, not a lens with a macro setting), and a tripod. Shoot RAW, convert to TIFF/PNG, crop hard and post the results.
     
  10. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    Hahaha... I am stumped by your request, but honestly, I can tell a difference and I stand by that. I was very skeptical about such "fanboy" claims, but I am a convert.

    Could it be that my 6800GT's VGA controller is bad? And what would have caused it to be "bad"? As for the colors and contrast, I really have no idea what the cause of ATI's default superiority is! The image engineers must have been paid very well.
     
  11. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,428
    Likes Received:
    426
    Location:
    New York
    There is a big difference between accurate and "looks good". You're never going to find a concrete answer for why you think something looks better. It's all subjective; just stick with what you like.
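
    To put a number on "accurate" (a minimal sketch, not anyone's actual tooling; the "measured" colour below is made up): calibration tools score a display by the CIE76 delta-E distance between the colour requested and the colour read back with a probe. "Looks good" has no such number.

        import math

        def srgb_to_linear(c):
            # Undo the sRGB transfer curve (c in 0..1).
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        def srgb_to_lab(r, g, b):
            # sRGB (0..255) -> linear RGB -> XYZ (D65) -> CIELAB.
            rl, gl, bl = (srgb_to_linear(v / 255.0) for v in (r, g, b))
            x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) / 0.95047
            y = (0.2126 * rl + 0.7152 * gl + 0.0722 * bl) / 1.00000
            z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) / 1.08883
            f = lambda t: t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
            fx, fy, fz = f(x), f(y), f(z)
            return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

        def delta_e76(lab1, lab2):
            # CIE76: plain Euclidean distance in Lab space.
            return math.dist(lab1, lab2)

        requested = srgb_to_lab(255, 128, 0)   # the orange the app asked for
        measured = srgb_to_lab(250, 120, 10)   # made-up colorimeter reading
        print(f"delta-E: {delta_e76(requested, measured):.2f}")
        # Roughly: delta-E below ~1 is invisible, ~2-3 is just noticeable.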
     
  12. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,852
    Likes Received:
    2,268
    The point is a bit moot now, as most people use DVI.
     
  13. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    Not only that, but some "subjective" testing of various audio compression schemes that I did on this forum a few months back shows that accurate != preferable.

    I can't imagine it being any different for video.
     
  14. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,297
    Likes Received:
    247
    Sorry, this is nonsense. How can you judge per-pixel image quality using a tool that is built around a low-pass optical filter and a Bayer mask, which creates 2/3 of the color information by digital interpolation? (And the rest is further processed by several algorithms that reduce aliasing and noise and adjust levels, dynamic range, and other imperfections of the CMOS/CCD.)

    Davros: There is a difference in colours even when using DVI. I don't know what the two companies do differently, but the difference in colours is mentioned persistently and often by people who work as photographers, designers, etc., so they are definitely not fanboys.
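
    To illustrate the Bayer point above, a minimal sketch (naive bilinear demosaicing, far simpler than what real cameras do): each sensor pixel measures one channel, and the other two thirds of the output colour data are interpolated, not measured.

        import numpy as np

        def bayer_pattern(h, w):
            # Which channel (0=R, 1=G, 2=B) each sensor pixel actually
            # measures, for an RGGB layout.
            ch = np.empty((h, w), dtype=int)
            ch[0::2, 0::2] = 0  # R
            ch[0::2, 1::2] = 1  # G
            ch[1::2, 0::2] = 1  # G
            ch[1::2, 1::2] = 2  # B
            return ch

        def demosaic_bilinear(mosaic):
            # mosaic: (h, w) raw sensor values in 0..1. Returns (h, w, 3)
            # RGB where every missing sample is the mean of the known
            # same-channel samples in its 3x3 neighbourhood (wrapping at
            # the edges, which is fine for a toy example).
            h, w = mosaic.shape
            ch = bayer_pattern(h, w)
            out = np.zeros((h, w, 3))
            for c in range(3):
                known = (ch == c)
                vals = np.where(known, mosaic, 0.0)
                acc = np.zeros((h, w))
                cnt = np.zeros((h, w))
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        acc += np.roll(np.roll(vals, dy, 0), dx, 1)
                        cnt += np.roll(np.roll(known.astype(float), dy, 0), dx, 1)
                out[..., c] = np.where(known, mosaic, acc / np.maximum(cnt, 1))
            return out

        raw = np.full((6, 6), 0.5)   # a flat mid-grey "scene"
        rgb = demosaic_bilinear(raw)
        print(rgb[2, 2])             # 2 of these 3 values are interpolated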
     
  15. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    ^

    Foveon X3 to the rescue! :grin:

    Anyway, the Chiphell guys were particular about pre-G80 nVidia adapters in 2D. Before the F word gets thrown around: it seems the whole issue was resolved post-G80. Does the 2D display engine being completely virtualized in R600+ chips have an effect on this, too?
     
  16. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,146
    Likes Received:
    472
    Location:
    en.gb.uk
    It would be nice though, don't you think, if folks could make at least *some* attempt to document and demonstrate what they think they're seeing? Obviously this isn't possible for all elements of the image quality discussion (particularly colour-related), but for things like ghosting and glows it's not *that* hard.

    But if someone is claiming, for example: "oh, my 40" Samsung HDTV seems to show a sort of halo/glow/smearing effect around text even when I'm using HDMI - here's a photo I took to try to demonstrate it..."

    [image: de-focused macro photo of on-screen text, showing the halo/glow effect around the characters]

    Obviously I've had to de-focus a little to avoid resolving the individual colour elements, but I think the effect is pretty clear to see, especially if you stand back and/or use averted viewing.
     
  17. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,297
    Likes Received:
    247
    Foveon is good for capturing per-pixel detail and per-pixel colour information, but because of its specific approach, the sensitivity of the 3 colour layers isn't identical (their curves are anything but aligned, and a lot of corrections are applied, too) - so I don't think DSLRs are optimal tools for measuring colour accuracy :)
     
  18. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,456
    Likes Received:
    578
    Location:
    WI, USA
    Well, there's no denying that GFFX through GF7 were deficient in 3D quality. But for 2D, I don't think I've ever seen a difference when dealing with DVI. Even VGA output from modern mid/high-end cards is great. If you look at IGPs or cheap cards, VGA can get ugly because signal quality wasn't a big concern, but DVI still works great.
     
  19. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Reminds me of years ago when, in 3D games of course, my 3dfx card would display proper oranges and reds so that you could discern the difference between the two colors, whereas my TNT displayed both orange and red as red, on the same monitor. In fact, at first I thought there was something wrong with the 3dfx card, until I realized that there was no orange to see on the TNT display - just red! I tinkered with the TNT's color controls and alleviated it to some degree, but doing so threw off other colors in the display.
     
  20. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Do you have ClearType fonts enabled?
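
    (ClearType's subpixel rendering can read as colour fringing around text, especially over VGA. A quick sketch for checking the setting on Windows, via the Win32 SystemParametersInfo call through ctypes:)

        import ctypes

        SPI_GETFONTSMOOTHING = 0x004A
        SPI_GETFONTSMOOTHINGTYPE = 0x200A
        FE_FONTSMOOTHINGCLEARTYPE = 0x0002

        user32 = ctypes.windll.user32  # Windows-only

        smoothing = ctypes.c_uint()
        user32.SystemParametersInfoW(SPI_GETFONTSMOOTHING, 0,
                                     ctypes.byref(smoothing), 0)

        smoothing_type = ctypes.c_uint()
        user32.SystemParametersInfoW(SPI_GETFONTSMOOTHINGTYPE, 0,
                                     ctypes.byref(smoothing_type), 0)

        cleartype = bool(smoothing.value) and (
            smoothing_type.value == FE_FONTSMOOTHINGCLEARTYPE)
        print("ClearType enabled:", cleartype)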
     