Which DX11 video card meets the best price/performance/power efficiency?

Discussion in 'Architecture and Products' started by Shtal, Dec 19, 2010.

Which DX11 video card meets the best price/performance/power efficiency?

  1. AMD Radeon 6970

    1 vote(s)
    1.1%
  2. AMD Radeon 6950

    34 vote(s)
    37.4%
  3. AMD Radeon 6870

    9 vote(s)
    9.9%
  4. AMD Radeon 6850

    23 vote(s)
    25.3%
  5. ATI Radeon 5970

    0 vote(s)
    0.0%
  6. ATI Radeon 5870

    2 vote(s)
    2.2%
  7. ATI Radeon 5850

    1 vote(s)
    1.1%
  8. ATI Radeon 5830

    1 vote(s)
    1.1%
  9. ATI Radeon 5770

    2 vote(s)
    2.2%
  10. ATI Radeon 5750

    0 vote(s)
    0.0%
  11. Nvidia GeForce GTX580

    1 vote(s)
    1.1%
  12. Nvidia GeForce GTX570

    8 vote(s)
    8.8%
  13. Nvidia GeForce GTX480

    0 vote(s)
    0.0%
  14. Nvidia GeForce GTX470

    0 vote(s)
    0.0%
  15. Nvidia GeForce GTX 460 1GB

    8 vote(s)
    8.8%
  16. Nvidia GeForce GTX 460 768MB

    0 vote(s)
    0.0%
  17. Nvidia GeForce GTS 450

    0 vote(s)
    0.0%
  18. Other

    1 vote(s)
    1.1%
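The percentages above are simply each option's share of the 91 votes cast. A quick sketch (with the nonzero tallies transcribed from the poll) reproduces them:

```python
# Nonzero poll tallies from the thread (card name -> votes).
votes = {
    "AMD Radeon 6970": 1,
    "AMD Radeon 6950": 34,
    "AMD Radeon 6870": 9,
    "AMD Radeon 6850": 23,
    "ATI Radeon 5870": 2,
    "ATI Radeon 5850": 1,
    "ATI Radeon 5830": 1,
    "ATI Radeon 5770": 2,
    "Nvidia GeForce GTX580": 1,
    "Nvidia GeForce GTX570": 8,
    "Nvidia GeForce GTX 460 1GB": 8,
    "Other": 1,
}

total = sum(votes.values())  # zero-vote options omitted; total is 91
for card, n in votes.items():
    print(f"{card}: {n} vote(s), {100 * n / total:.1f}%")
```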
  1. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
  2. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
  3. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    *Checks that FS review*

Jesus Christ, that is Tom-esque levels of fail. They don't list their full system, for one thing.

    Also, much of the power difference can be explained by the fact that they're using Intel's top of the line CPU, which has six cores. The CPU is also massively overclocked.
     
  4. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    I chose the GTX 460 1GB.
     
  5. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
But for some reason it ONLY affects Nvidia with massive power draw; ATI does NOT suffer in this configuration.
     
  6. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    *checks*

Correct on the load power measurement, but at idle it's much higher than TR's, no matter which company's card they're using.

    http://www.techreport.com/articles.x/20088/12 TR's Radeon HD 6970/6950 review, the power consumption page. Their test setup can be found on page five.

    FS also does not list what they were testing exactly when they got that measurement.

I suspect it's Furmark, given the insanely high numbers; few if any apps other than Furmark get that high. And since basically nothing else gets that high, Furmark power consumption numbers can't be relied upon for either AMD or NVIDIA. Find a few reviews that test games or commonly used benchmarks, and that'll give you the best idea of power consumption.

Edit: To best see how the massively OCed, six-core CPU affects things, compare the no-card idle result in FS's review to the GTX 580 SLI result in the link I gave above. It's quite illuminating.
     
  7. chavvdarrr

    Veteran

    Joined:
    Feb 25, 2003
    Messages:
    1,165
    Likes Received:
    34
    Location:
    Sofia, BG
  8. CRoland

    Newcomer

    Joined:
    Jan 19, 2010
    Messages:
    114
    Likes Received:
    0
This. If the 6950 were a bit cheaper, they'd be tied IMHO.
     
  9. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
This is a good point. I wonder if a 1GB 6950 won't end up winning; it would save a few watts and a few dollars.
     
  10. caveman-jim

    Regular

    Joined:
    Sep 19, 2005
    Messages:
    305
    Likes Received:
    0
    Location:
    Austin, TX
    how?
     
  11. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    Memory uses power.
     
  12. nyt

    nyt
    Newcomer

    Joined:
    May 14, 2003
    Messages:
    80
    Likes Received:
    0
    Location:
    Mtl
    #32 nyt, Dec 22, 2010
    Last edited by a moderator: Dec 22, 2010
  13. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    30W? Where do you get that figure?
     
  14. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
People are looking at power differences between the HD 5870 1GB and 2GB. The 2GB card uses 16 1Gbit chips instead of 8, unlike the HD 69xx cards, which use 8 2Gbit chips, so that's really comparing apples and oranges.
    Too bad Hynix hasn't released the datasheet for the 2Gbit parts, so we could finally put to rest the rumor that 2GB uses (much) more power. For all I know, it could use less power if it's made on a newer process...
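To make the chip-count arithmetic explicit, here is a quick sketch (chip counts and densities as described in the post above):

```python
def total_gb(chips: int, gbit_per_chip: int) -> float:
    """Total frame buffer in gigabytes from chip count x chip density."""
    return chips * gbit_per_chip / 8  # 8 gigabits = 1 gigabyte

print(total_gb(8, 1))   # HD 5870 1GB: 8 x 1Gbit chips  -> 1.0 GB
print(total_gb(16, 1))  # HD 5870 2GB: 16 x 1Gbit chips -> 2.0 GB
print(total_gb(8, 2))   # HD 69xx 2GB: 8 x 2Gbit chips  -> 2.0 GB
```

Same 2GB capacity, but twice as many chips on the 5870 2GB, which is why the power comparison is apples and oranges.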
     
  15. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
Nvidia have been capping Furmark since at least the GTX 2xx series, and ATI starting with the 4xxx series (via app detection).

    With the GTX 580, Nvidia's hardware power monitoring and containment wasn't sufficient, so they added app detection to cope with runaway power.

    ATI, with the 5xxx series, put in hardware monitoring and containment similar to what Nvidia had on the GTX 2xx, then evolved that further in the 6xxx series with more elegant power containment algorithms and monitoring, in addition to letting end users adjust the aggressiveness of the containment.

    Furmark should never, ever be used as a benchmark of power usage. Its ONLY benefit is showing how well various companies' power monitoring and containment solutions deal with runaway power situations. But even that benefit is completely gone if you have to rely on app detection (GTX 580) to prevent runaway power use.

    Regards,
    SB
     
  16. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
I appreciate the votes; they helped me make the right decision on upgrading my video card. I bought a Radeon 6950 :)

The only small problem I have: running the Radeon 6950 on an Intel Quad Q9650 OC'd @ 3.6GHz is still too slow, plus my slot is PCI-Express x16 version 1.1 and NOT 2.0, since I still have an Asus P5K-Deluxe motherboard.
     
  17. chiadog

    Newcomer

    Joined:
    May 21, 2008
    Messages:
    21
    Likes Received:
    0
^Too slow for what? I don't think PCI-E v1.1 makes much difference in frame rates. Did you try unlocking your 6950 to a 6970? It may give you enough of a performance gain to be happy (given that your chip can be successfully unlocked).
     
  18. Sxotty

    Legend

    Joined:
    Dec 11, 2002
    Messages:
    5,496
    Likes Received:
    866
    Location:
    PA USA
Well, they don't necessarily need to be the same, though, and then you would need to weigh performance against power efficiency. It could be that a cheap card has cruddy power characteristics, or one with superior power characteristics might end up a dog-slow performer.
     
  19. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    The HD 4850 has Tessellation. :razz:
     
  20. Babel-17

    Veteran

    Joined:
    Apr 24, 2002
    Messages:
    1,073
    Likes Received:
    307
    Huh, within a minute of reading your post I stumbled on this.

    http://www.rage3d.com/board/showthread.php?t=33972379
     