HD 4870 X2 (R700) review thread

Discussion in '3D Hardware, Software & Output Devices' started by willardjuice, Jul 14, 2008.

  1. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,793
    Likes Received:
    1,076
    Location:
    Guess...
    Huh? No it's not. The 9800GX2 should be quite a lot faster than the 4870 in most situations. It's faster than the GTX280 after all, which is in turn faster than the 4870.

    Comparing the 9800GX2 to the 4870 really is no different than comparing the 4870X2 to the GTX 280.

    Except that when it comes to pricing, NV looks much better:

    HD 4870 = $270
    9800GX2 = $285

    GTX 280 = $430
    HD 4870X2 = $550

    Those prices are from Newegg, based on the lowest-cost GPU in each category.

    In both cases the dual-GPU solution beats the single-GPU solution, but in the 4870/GX2 comparison the price difference is a mere $15, while in the 280/X2 comparison the difference is a pretty huge (and IMO completely unjustified) $120!

    What is perhaps most interesting from the above is the price premium both companies charge for having the fastest solution. In NV's case, the fastest single GPU premium = $160 and in ATI's case the fastest dual GPU premium = $265. That's nearly twice the cost of a GX2 for nowhere near twice the performance.
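
    Spelled out, using the Newegg prices above:

    Fastest single GPU premium (NV):  $430 (GTX 280) - $270 (HD 4870)  = $160
    Fastest dual GPU premium (ATI):   $550 (4870X2)  - $285 (9800GX2)  = $265
    4870X2 price vs. 9800GX2 price:   $550 / $285 ≈ 1.9x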

    ATI have clearly done a great job with the 4xxx series and are now completely competitive with NV again; however, I think people are getting way too carried away with their enthusiasm for this part. The pricing and performance are now, at best, at parity with NV, but to read most of the threads around the topic you would think NV was dead and buried already!
     
  2. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,793
    Likes Received:
    1,076
    Location:
    Guess...
    Another important point to add to the above is this:

    GTX260 = $255

    Again, lowest price at Newegg.

    That's $15 cheaper than a single 4870 for a GPU which is arguably at least as fast across the board.

    Or you could put a pair of them in SLI for $510, which is $40 less than the X2 and, again, is likely to be about the same speed.
     
  3. Chris123234

    Regular

    Joined:
    Jan 22, 2003
    Messages:
    306
    Likes Received:
    0
    There is nothing interesting about that, because there is no such thing as a "fastest dual GPU premium" or a "fastest single GPU premium". The only premium that exists is the "fastest card premium".

    Also, the GTX 280 was something like $650 when it was released and was only reduced because the X2 is beating the hell out of it while the 4870 is matching it. AMD know they can charge the money, so they are going to, and considering how much cards like the 8800 Ultra cost under nVidia's reign, I don't think it's too unreasonable.
     
  4. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,793
    Likes Received:
    1,076
    Location:
    Guess...
    Plenty of people aren't interested in dual GPU configs because of the hassles associated with them, so there very much is a distinction to be made between dual-GPU and single-GPU setups, even when both use a single board.

    Which has zero bearing on the current situation. When the GTX280 was released, the fastest single GPU from ATI was the 3870, so of course NV could afford to charge a huge premium.

    No, the 4870 is quite a bit slower than the GTX280, that's a fact. The X2 is faster, but hardly to the point of "beating the hell out of it" (although that's obviously a matter of interpretation).

    Yes, there's nothing unreasonable about charging more for a faster GPU. I'm just saying that for me personally, I don't think the X2 is worth such a large price premium when its performance isn't that far ahead and it comes with all the problems of a dual GPU setup.

    Especially when you can get a GTX 260 SLI setup for $40 less that would be just as fast.
     
  5. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    3,022
    Likes Received:
    547
    OK, maybe not 'mostly', but in many benchmarks the 4870 is the winner vs. the 9800GX2.
    The GTX260 beats the 9800GX2 in plenty of benchmarks too.

    Pricing-wise, the cheapest available in my part of the world (NZ$):
    9800GX2 $655.88
    GTX260 $465.53
    GTX280 $683.10
    4870 $414.59
    4870X2 $898.99

    My 3870 was $650. :cry:
    The GTX280 was over $1000 at launch (someone is still trying to sell one for $1200!) & the X2 is still being price-gouged.

    When the design team chose from the start (& included optimisations) to go dual-chip at the top end, & even the two-chip card uses less silicon area than the single-chip opposition, I see no reason not to compare whether that worked out better than the monolithic path.
     
  6. Dooby

    Regular

    Joined:
    Jul 21, 2003
    Messages:
    478
    Likes Received:
    3
    I have had what would be bordering on "a lot" of gfx cards over the last few years (and played with MANY more), including all variations of SLI and CF. The X2 doesn't come close to being as painful to use as those solutions. In fact, it's not even a little bit more painful than a single card. I took out an 8800GTX, pushed this in, installed drivers, loaded up WoW, put the settings to twice what they were and still got 124fps vs. 35fps.

    I have since played Crysis, COD4, WoW, WIC, STALKER, CnC3, HL2, SupCom and GoW. Every game (bar Crysis) runs ridiculously fast, even at my settings (2560x1600, 8+xFSAA, 16xAF, max game detail). Also, Crysis runs at 1920x1200, 4xFSAA, 16xAF, Very High at 25-30fps, which is absolutely playable. Please explain these "problems of a dual GPU setup" to me, someone who uses one every day.

    When you bear in mind that people who are looking at expensive gfx cards are also looking for the best performance, most people are gonna go with an X48 or similar chipset, which means no SLI. Also, looking around, a GTX260 is £184, and I picked up an X2 for £343. They were £375 elsewhere at first, which could signal a price drop coming soon.

    There are lots of tests out there that show the X2 beating GTX 280 SLI at best, and drawing with GTX 260 SLI at worst. I'm not fully sure PowerPlay was enabled in the online reviews, cos they seem to be using the 8.52.2 driver, yet I'm on 8.52.6, and there's *no* way my card is as loud as they are reporting. As I've said, it's quieter under full load than my single 8800GTX, and if my gf turns on her X1900, you can hear that over the X2. 2x GTX260 is gonna sound a HECK of a lot louder, even more so at idle, from what I can see online.

    The drivers are a heck of a lot better than Nvidia’s. I suffered while I had my 8800GTX. Settings that didn’t stick, image scaling that only worked 1 time in 10, NVCP asking me to confirm something every ten minutes dragging me out of games, nnnnnnnnnngh.

    If I had to be fussy, my one gripe about ATI/AMD is that for some reason FSAA doesn't work on windowed programs. As I only run WoW in a window, it's not that much of a problem for me, but it's still a niggle. I can set FSAA from within WoW, but it's much slower than the AMD CCC FSAA.
     
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,793
    Likes Received:
    1,076
    Location:
    Guess...
    Don't get me wrong, it's a valid comparison. I'm just saying it's no more or less valid than 9800GX2 vs. 4870, which often goes forgotten in these types of discussions.

    At US prices anyway, your prices obviously change things quite a bit.
     
  8. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    CCC FSAA does work on windowed apps, but you can't have any 3D apps running if you want the setting to stick. Make sure you exit all 3D apps, change the CCC setting, then run your app and AA should be working fine.

    Note that we don't allow CCC AA for WoW, but you're free to enable custom filters and such.
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,051
    Likes Received:
    2,924
    Location:
    Finland
    AA speed shouldn't be any different when you set it from the game settings rather than CCC; it's the same hardware doing the same MSAA anyway?
     
  10. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    It's likely that CCC AA is faster in WoW because CCC AA isn't allowed on WoW ;) Use in-game AA.
     
  11. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    I was under the impression that CCC AA worked in WoW as long as you had full screen glow disabled in the video options, or do you guys now disable CCC AA with Catalyst AI?

    Speaking of WoW, I've noticed there is a fair bit of crawling in the textures. I'm not sure if it's due to bad textures or to the filter optimizations in Catalyst AI. I really wish the game detection/bug fixes were separate from the filter optimizations; for a card as fast as the 4800 series, it's a shame we have to choose between bug fixes and image quality. Nvidia users don't have to make that choice since it's a separate option.
     
  12. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    There is a bug [?] I am finding in Crysis with Cat 8.7 and a 4870: AA settings do not affect FPS when set from CCC, but do work when set from in-game.
    :???:

    At any rate, I am finding a lot of weirdness when changing settings in Crysis and am going to retest with the 8.8s.

    I also bought a VT HD4870X2 from Best Buy [$469!!] which will arrive next week while I am at Nvision08. So I will be benching with CrossfireX-3 and changing out my MB from P35 to X48 and my CPU from an E4300 [@3.25] to an E0 stepping of the Q9550. Should make for some improvement at 19x12. I will also post some test results.
     
  13. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    It's disabled with Catalyst AI.
     
  14. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Yes, we don't allow forced-AA in Crysis. The app's rendering path is complicated and since the app offers in-game AA, there's no need for us to spend time trying to make forced-AA work.
     
  15. Arnold Beckenbauer

    Veteran

    Joined:
    Oct 11, 2006
    Messages:
    1,433
    Likes Received:
    368
    Location:
    Germany
    + if you enable AA in Crysis, the game disables its EdgeAA (r_useEdgeAA).
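
    For anyone who wants to check that themselves, a rough sketch of the relevant console/autoexec.cfg lines (my best recollection of the cvars, so treat it as an assumption; the game may still force it off once in-game MSAA is active):

    r_UseEdgeAA = 1
    r_DisplayInfo = 1

    1 should turn EdgeAA on (2 is supposedly a stronger/blurrier mode, 0 is off), and r_DisplayInfo just brings up the FPS counter so you can see whether toggling it actually costs anything.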
     
  16. Dooby

    Regular

    Joined:
    Jul 21, 2003
    Messages:
    478
    Likes Received:
    3
    This is, and has been, a major complaint about WoW for a LONG time, with both card makers. It's an engine problem IIRC.

    TBH EdgeAA sucks. It "blurs" edges by adding a texture halo-type thing, which looks nowhere near as good as FSAA. With my 4870X2 at 1920x1200, 4xFSAA set from in-game takes less of a hit but provides better OVERALL image quality, if you don't mind the leaves being un-AA'd.
     
  17. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    The sad thing is that in the WotLK beta the problem seems worse, if anything. Though there does seem to be an improvement with AI disabled.
     
  18. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
  19. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    Is there any way to enable Edge AA in Crysis? Off or on, it does not appear to impact FPS [testing with either 4x or 8x AA enabled in-game].

    I am finding about a 5% across-the-board improvement with Cat 8.8 over Cat 8.7 on a single HD4870, and some of the glitchy visuals are improved.

    My HD4870X2 will be here next week when I am back from Nvision. I am planning on testing CrossfireX-3 as I upgrade my MB from P35 to X48 and my E4300@3.33GHz to hopefully a Q9550 near 4.0GHz.
     
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,051
    Likes Received:
    2,924
    Location:
    Finland
    Before that, someone needs to go adjust the PCI Express specs to allow over 300W of consumption.
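
    (If I remember the spec right, that 300W ceiling is simply 75W from the slot + 75W from a 6-pin connector + 150W from an 8-pin = 300W for a board with one of each.)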

    http://www.xbitlabs.com/articles/video/display/radeon-hd4870-x2_14.html#sect1
    Interesting results btw, what's limiting the Radeons' FPS there? I mean, at lower resolutions the GeForces are far beyond the Radeon speeds, but the CF solutions lose practically no performance at all no matter how much you raise the res, while the GeForces' FPS is cut in half or below.
     
    #340 Kaotik, Aug 23, 2008
    Last edited by a moderator: Aug 23, 2008