AMD Radeon R9 Fury X Reviews

Discussion in 'Architecture and Products' started by gamervivek, Jul 18, 2015.

  1. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    What's interesting to me is how the Fury X will behave with newer games. Usually AMD cards do better several years into their lifecycle.
    Imagine the picture in 2017-2018.

    BTW, if we add to those 12 games (in which the R9 Fury X wins outright or wins slightly) these others:

    Batman: Origins
    Battlefield 4
    GTAV
    Metro LL
    Watch Dogs

    five games in which the Fury X is only slightly behind, then we have 17 out of 22 games in which it either wins or is only very slightly slower.

    The remaining games are troublesome on the driver-optimisation side and need urgent work from AMD. Once that happens, the Fury X might turn into the undisputed leader.
     
  2. Dr Evil

    Dr Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,766
    Likes Received:
    774
    Location:
    Finland
    But who cares what the reference model 980ti does at stock settings?


    I'm pretty sure the large majority of 980 Tis sold are the third-party models. AMD needs to find quite a bit in the drivers to catch up to those, and that's at 4K, the most favorable resolution for the Fury X.
     
    gamervivek likes this.
  3. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    And what about the five games in that suite where the Ti is slightly behind the Fury X (3% or less)? Will you give those the same stigma you just gave the five games you listed?

    For people who buy high-end cards, the life cycle is usually one generation and that's it. So if a card doesn't perform well for this generation of games, forget about waiting for another generation of games to come out to show the true power of the hardware. AMD might as well throw themselves into a grave along with the shovel if that's what they expect of their customers, because no one will wait on them; they've already waited nine months...
     
  4. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Nothing, I do not know.

    I want to see a very deep, detailed review of image quality, because I think Nvidia cheats there to enable higher performance through compromises.
     
  5. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Why don't you do it then? I don't think anyone truly believes that anymore. It happened with the FX, but since the 68xx series both companies have had similar IQ (adaptive AF), and the G80 was better than the R600 with full AF all the time, which AMD equalized a generation later.
     
  6. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    Razor1, I've completely lost who you're replying to. Sorry.

    In any case, I generally agree the Fury X seems only marginally different from the 980 Ti at 4K resolutions. Unfortunately, purely for my own use cases, 4K is not interesting to me. Given that the majority of games will need to come down from "uber" quality settings at that resolution to reach fluid framerates, I still do not concede that 4K is reliably attainable with any single card on the newest games.

    As such, I personally find the 1440p results a better reference point (for me), as I can keep every graphics option at the top edge while maintaining a good framerate. And at that point, the Ti is outpacing the Fury X. Combined with the extra $30 I paid for a Gigabyte G1-flavored 980 Ti, there isn't a Fury X out there that's going to touch it.
     
  7. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Sorry, I was replying to UniversalTruth.

    Yeah, AMD just came out too late with the Fury X (possibly due to HBM's mass-production schedule). If it had come out, say, three months before the 980 Ti, which would have put it up against the Titan X, and still been priced as it is right now, this would have been a different picture. But with the 980 Ti and its overclocked versions available, the 980 Tis are the better cards right now.
     
  8. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,865
    Likes Received:
    192
    Location:
    Seattle, WA
    What do you mean?

    DX12 incorporates many of the advantages found in AMD's Mantle API and makes them available to all video cards. AMD cards show very little difference between running Mantle and running DX12, while nVidia gains a significant performance boost going from DX11 to DX12. I don't expect the relative difference between AMD and nVidia cards to change all that much between DX11 and DX12.
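
    For context, here is a minimal, headless sketch of the submission model Mantle and DX12 share: worker threads each record their own command list, and the app submits them to the GPU queue in a single call instead of funnelling every draw through one driver thread as under DX11. It is written against the public D3D12 headers (Windows 10 SDK, link d3d12.lib); the thread count and names are illustrative assumptions, not anything from this thread.

    Code:
    // Illustrative D3D12 sketch: record command lists on several threads, submit once.
    // Requires the Windows 10 SDK; link against d3d12.lib.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;  // no D3D12-capable adapter

        D3D12_COMMAND_QUEUE_DESC qdesc = {};  // direct queue, default priority
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

        const int kThreads = 4;  // illustrative worker count
        std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kThreads);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
        std::vector<std::thread> workers;

        for (int i = 0; i < kThreads; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            // Each worker records only into its own list; a real renderer would
            // issue draws and state changes here before closing the list.
            workers.emplace_back([&lists, i] { lists[i]->Close(); });
        }
        for (auto& t : workers) t.join();

        // Single submission point: per-draw driver work is no longer serialized
        // on one immediate context the way it is under DX11.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists((UINT)raw.size(), raw.data());

        // Fence so the process can exit after the GPU has consumed the lists.
        ComPtr<ID3D12Fence> fence;
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
        HANDLE evt = CreateEvent(nullptr, FALSE, FALSE, nullptr);
        queue->Signal(fence.Get(), 1);
        fence->SetEventOnCompletion(1, evt);
        WaitForSingleObject(evt, INFINITE);
        CloseHandle(evt);

        std::puts("Recorded command lists on 4 threads, executed in one submit.");
        return 0;
    }

    The record-in-parallel, submit-once pattern is what Mantle exposed first and what DX12 now makes available across vendors.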

    Considering that Microsoft is giving Windows 10 away for free to current Windows 7/8 users, whether legal or not, and that all indications are that it will be one of the better Windows releases, I expect DX12 adoption to be very fast, which will hopefully mean good things for games going forward.
     
  9. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,865
    Likes Received:
    192
    Location:
    Seattle, WA
    I don't think you can generalize like that. I usually don't buy a new video card every generation, for example. I only upgrade when there's a big enough performance boost to see a large difference in my games.
     
    Razor1 likes this.
  10. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    True, it's mostly personal preference when upgrading. I tend to do the same, but usually every generation brings around an 80% increase in performance, outside of a few exceptions here and there. The GTX 980 wasn't enough for me to upgrade, but the Titan X was; even though it didn't bring the typical 80%, more like 50%, the need for VRAM pushed me to upgrade.
     
  11. Rys

    Rys PowerVR
    Moderator Veteran Alpha

    Joined:
    Oct 9, 2003
    Messages:
    4,156
    Likes Received:
    1,430
    Location:
    Beyond3D HQ
    NVIDIA will support DX12 too, levelling the low-overhead API playing field. Unlike Mantle, it's not an exclusive advantage for AMD.
     
  12. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    To be honest, the mini-ITX standard is becoming rather mainstream. Boards and cases aren't much more expensive than their micro-ATX or ATX equivalents anymore, which makes sense because the manufacturers end up spending less money on materials and connectors, and the PCBs are smaller too. One could argue that a more compact motherboard needs more PCB layers, but ever since the northbridge moved into the CPU there's a whole lot less to put on it anyway.
    Indeed, SFX PSUs are more expensive (maybe ~20-30% more than the ATX equivalent?), but most mini-ITX cases take ATX PSUs anyway.

    Also, I do question the need for a very small graphics card for a mini-ITX case. It seems to me that most mini-ITX cases nowadays are rather long, in order to fit larger graphics cards.
     
    homerdog likes this.
  13. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
    While I believe the Fury's performance will improve with driver tweaks, I'm a bit more excited for the real next-gen parts: no more of this 28nm nonsense, and no more high-end cards with twice the RAM of the halo product. It's a bit odd to double the memory on a re-badge (with some tweaks, some of which make it even worse on the perf/watt scale) to make it "future proof" and suitable for high-res gaming, while the halo product has half the RAM of the high-end card and everyone is assured there's a driver guy working on VRAM usage optimization.
     
    BRiT, pharma and homerdog like this.
  14. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,865
    Likes Received:
    192
    Location:
    Seattle, WA
    Right. I'm just trying to say that DX12 erases most, if not all, of the advantage AMD has with Mantle. Of course, we'll know for sure whether or not this is true within a few months once we have more DX12 benchmarks, but overall I'm just saying that I don't think it makes sense to weight Mantle too highly when making a purchase decision.
     
  15. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,124
    Likes Received:
    902
    Location:
    still camping with a mauler
    This may be a chicken and egg problem. The GPU is basically the only reason why an mITX case would need to be that long (PSU maybe to a lesser extent), so if there were more small and powerful GPUs on the market we may see more short mITX cases.
     
    Lightman likes this.
  16. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    715
    Likes Received:
    220
    Location:
    india
    Using mantle itself?

    I do.

    I don't. Maybe Nvidia's Maxwell will mean that they use the feature levels more, but DX11 will continue for quite a while. WDDM 2.0 might be a boon though, at least in some games.
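
    As an aside on feature levels, here is a minimal sketch of how an application can ask the driver which D3D12 feature level the hardware exposes, via ID3D12Device::CheckFeatureSupport. It assumes the Windows 10 SDK headers and a D3D12-capable adapter; the code is illustrative, not something posted in this thread.

    Code:
    // Illustrative D3D12 feature-level probe. Requires the Windows 10 SDK; link d3d12.lib.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Creating the device at 11_0 (the minimum D3D12 accepts) succeeds on any
        // D3D12-capable adapter; afterwards we ask what the hardware really supports.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12-capable adapter found.");
            return 1;
        }

        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
            D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
        };
        D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
        levels.NumFeatureLevels        = (UINT)(sizeof(requested) / sizeof(requested[0]));
        levels.pFeatureLevelsRequested = requested;

        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                                  &levels, sizeof(levels)))) {
            std::printf("Max supported feature level: 0x%X\n",
                        levels.MaxSupportedFeatureLevel);
        }
        return 0;
    }

    If I recall correctly, Maxwell 2 parts report 12_1 here while Fiji and other GCN parts of this era report 12_0, which is the sort of feature-level gap the post above is gesturing at.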
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,865
    Likes Received:
    192
    Location:
    Seattle, WA
    I seriously doubt that many game devs will bother to support Mantle when DX12 offers almost all of the benefits. Besides, Mantle's performance benefits are often pretty small in real games today. Those gains will be almost non-existent when DX12 comes around.

    Why?

    To clarify: I meant in terms of people having systems that support DX12. I expect that will occur much faster than the ramp-up time for DX10. Hopefully this will influence game devs to roll out DX12 support sooner.
     
  18. Infinisearch

    Veteran Regular

    Joined:
    Jul 22, 2004
    Messages:
    739
    Likes Received:
    139
    Location:
    USA
    I remember that in a presentation concerning BF4 and Mantle, they mentioned that they didn't rewrite the engine around Mantle, and that there would have been more performance to gain had they done so. Have any of the Mantle games been Mantle-exclusive, or been said in a dev diary or postmortem to have been designed around Mantle first?
     
  19. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,865
    Likes Received:
    192
    Location:
    Seattle, WA
    No, no Mantle-exclusive games, I don't think. The closest is probably Star Swarm, which shows a huge benefit (but for a situation that is pretty contrived). In Star Swarm, nVidia does far better than AMD in DX11. The race is much closer in DX12 or Mantle.

    http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/5
     
  20. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,796
    Likes Received:
    2,054
    Location:
    Germany
    I say unknown because they are unknown.
    Do you know the actual power consumption? Do you know the retail price? Do you know the noise output? Do you know the gaming performance? I don't, so I say unknown. Quite easy.

    Now, don't get me wrong - I'm a big fan of unusual graphics cards, having bought many for unintuitive reasons myself, and a Fury Nano with decent performance, modest noise levels and a price that does not reflect its targeting of a very niche market would be an interesting product.

    It's just that I have no more than a layman's idea of how I would design such a product: select fully functioning dice, bin them for the lowest leakage, and enforce (and by that I mean not only state it in a PDF, but actually enforce it through systems already at hand to AMD) a strict power limit on them to keep noise bearable. Hence: unknown characteristics.

    And I see neither Steam Machine-like gaming rigs really taking off at the moment, nor a large number of people who might have been holding back a purchase decision on a fast but very small gaming card over the last year or so, given that they already had the choice between an R9 285 and a GTX 970 in small form factors.
     
    #440 CarstenS, Jul 21, 2015
    Last edited: Jul 21, 2015