AMD Radeon R9 Fury X Reviews

What's interesting to me is how Fury X will behave with new modern games. Usually AMD cards do better several years into their lifecycle.
Imagine the picture in 2017-2018.

BTW, if we add to those 12 games (in which the R9 Fury X wins, outright or narrowly) the other:

Batman: Origins
Battlefield 4
GTAV
Metro LL
Watch Dogs

5 games, in which the Fury X is only slightly behind, then we have 17 out of 22 games in which it either wins or is only very slightly slower.

The remaining games seem troubled by driver optimisation and need urgent work from AMD. Once that happens, the Fury X might turn into the undisputed leader.
 
But who cares what the reference model 980ti does at stock settings?

[Image: perfrel_3840.gif — relative performance chart at 3840×2160]


I'm pretty sure a large majority of 980tis sold are the 3rd party models. They need to find quite a bit in the drivers to catch up to that, and that's at 4k, the most favorable resolution for Fury X.
 
What's interesting to me is how Fury X will behave with new modern games. Usually AMD cards do better several years into their lifecycle.
Imagine the picture in 2017-2018.

BTW, if we add to those 12 games (in which the R9 Fury X wins, outright or narrowly) the other:

Batman: Origins
Battlefield 4
GTAV
Metro LL
Watch Dogs

5 games, in which the Fury X is only slightly behind, then we have 17 out of 22 games in which it either wins or is only very slightly slower.

The remaining games seem troubled by driver optimisation and need urgent work from AMD. Once that happens, the Fury X might turn into the undisputed leader.


And what about the 5 games in that suite where the Ti is slightly behind the Fury X (3% or less)? Will you give those the same stigma you just gave the 5 games you listed?

The life cycle of a high-end card, for the people who buy high-end cards, is usually one generation and that's it. So if it doesn't perform well for this generation of games, forget about waiting for another generation of games to come out and show the hardware's true power. AMD might as well throw itself into a grave, shovel and all, if that's what they expect from their customers, because no one will wait on them; they already waited 9 months....
 
Nothing, I do not know.

I want to see a very deep detailed review of image quality because I think Nvidia cheats with this to enable higher performance through compromises.
 
Why don't you do it then? I don't think anyone truly believes that anymore. It happened with the FX, but since the 68xx series both companies have had similar IQ (adaptive AF), and the G80 was better than the R600 with full AF all the time, which AMD equalized a generation later.
 
Razer, I've completely lost who you're replying to. Sorry.

In any case, I generally agree the Fury X seems to be only marginally different than the 980Ti at 4K resolutions. Unfortunately, purely for my own use cases, 4K is not interesting to me. Given the majority of games will need to come down out of "uber" quality settings at that resolution to have fluid framerates, I still do not concede that 4K is reliably attainable with any single card on the newest games.

As such, I personally find the results at 1440p a better point (for me), as I can keep every graphics option at the top edge while maintaining good framerate. And at that point, the Ti is outpacing the Fury X. Combined with the extra $30 I paid for a Gigabyte G1 flavored 980 Ti, there isn't a Fury X out there that's going to touch me.
 
Sorry, I was replying to universaltruth.

Yeah, AMD just came out too late with the Fury X (possibly due to HBM's mass-production schedule). If it had come out, let's say, 3 months before the 980 Ti, which would have put it up against the Titan X while still priced as it is right now, this would have been a different picture. But with the 980 Ti and its overclocked versions available, the 980 Ti is the better card right now.
 
Then the general disadvantage will likely disappear as well.
What do you mean?

DX12 incorporates many of the advantages found in AMD's Mantle API for use with all video cards. AMD cards have very little difference between running Mantle and running DX12, while nVidia gains a significant performance boost. I don't expect the relative difference between AMD and nVidia cards to change all that much between DX11 and DX12.

Considering that Microsoft is giving out Windows 10 for free to current Windows 7/8 users, whether legal or not, and that all indications are that it will be one of the better Windows releases, I expect DX12 adoption to be very fast, which will hopefully mean good things for games going forward.
 
The life cycle of a high-end card, for the people who buy high-end cards, is usually one generation and that's it. So if it doesn't perform well for this generation of games, forget about waiting for another generation of games to come out and show the hardware's true power. AMD might as well throw itself into a grave, shovel and all, if that's what they expect from their customers, because no one will wait on them; they already waited 9 months....
I don't think you can generalize like that. I usually don't buy a new video card every generation, for example. I only upgrade when there's a big enough performance boost to see a large difference in my games.
 
True, it's mostly personal preference when upgrading. I tend to do the same, but usually every generation brings an 80% increase in performance, outside of a few exceptions here and there. The GTX 980 wasn't enough for me to upgrade, but the Titan X was, even though it delivered more like 50% instead of the typical 80%; the need for VRAM pushed me to upgrade.
 
What do you mean?

DX12 incorporates many of the advantages found in AMD's Mantle API for use with all video cards. AMD cards have very little difference between running Mantle and running DX12, while nVidia gains a significant performance boost. I don't expect the relative difference between AMD and nVidia cards to change all that much between DX11 and DX12.

Considering that Microsoft is giving out Windows 10 for free to current Windows 7/8 users, whether legal or not, and that all indications are that it will be one of the better Windows releases, I expect DX12 adoption to be very fast, which will hopefully mean good things for games going forward.
NVIDIA will support DX12 too, levelling the low-overhead API playing field. Unlike Mantle, it's not an exclusive advantage for AMD.
 
People in that space are ready to accommodate premiums (see pricing for mITX motherboards, SFX PSUs, etc.).

To be honest, the mini-ITX standard is becoming rather mainstream. Boards and cases aren't that much more expensive than their micro-ATX or ATX equivalents anymore, which makes sense because the manufacturers end up spending less money on materials and connectors, and the PCBs are smaller too. One could argue that the motherboards need to be more compact so the PCB takes more layers, but ever since the northbridge went into the CPUs there's a whole lot less to put in there.
Indeed, the SFX PSUs are more expensive (some ~20-30% more than the ATX equivalent?), but most mini-ITX cases take ATX PSUs anyway.

Also, I do question the need for a very small graphics card for a mini-ITX case. It seems to me that most mini-ITX cases nowadays are rather long, in order to fit larger graphics cards.
 
While I believe the Fury's performance will improve with driver tweaks, I'm a bit more excited for the real next-gen parts: no more of this 28nm nonsense, or of their high-end cards having twice the RAM of their halo product. It's a bit odd to double the memory on a re-badge (with some tweaks, some of which make it even worse on the perf/watt scale) to make it "future proof" for high-res gaming, while their halo product has half the RAM of the high-end card but everyone is assured they have a driver guy working on VRAM usage optimization.
 
NVIDIA will support DX12 too, levelling the low-overhead API playing field. Unlike Mantle, it's not an exclusive advantage for AMD.
Right. I'm just trying to say that DX12 erases most, if not all, of the advantage AMD has with Mantle. Of course, we'll know for sure whether or not this is true within a few months once we have more DX12 benchmarks, but overall I'm just saying that I don't think it makes sense to weigh Mantle too heavily when making a purchase decision.
 
Also, I do question the need for a very small graphics card for a mini-ITX case. It seems to me that most mini-ITX cases nowadays are rather long, in order to fit larger graphics cards.
This may be a chicken and egg problem. The GPU is basically the only reason why an mITX case would need to be that long (PSU maybe to a lesser extent), so if there were more small and powerful GPUs on the market we may see more short mITX cases.
 
What do you mean?

DX12 incorporates many of the advantages found in AMD's Mantle API for use with all video cards.

Using mantle itself?

AMD cards have very little difference between running Mantle and running DX12, while nVidia gains a significant performance boost. I don't expect the relative difference between AMD and nVidia cards to change all that much between DX11 and DX12.

I do.

Considering that Microsoft is giving out Windows 10 for free for current Windows 7/8 users, whether legal are not, and that all indications are that it will be one of the better Windows releases, I expect DX12 adoption to be very fast, which will hopefully mean good things for games going forward.

I don't. Maybe Nvidia's Maxwell will mean that devs use the feature levels more, but DX11 will continue for quite a while. WDDM 2.0 might be a boon though, at least in some games.
 
Using mantle itself?
I seriously doubt that many game devs will bother to support Mantle when DX12 offers almost all of the benefits. Besides, Mantle's performance benefits are often pretty small in real games today. Those gains will be almost non-existent when DX12 comes around.

Why?

I don't. Maybe Nvidia's Maxwell will mean that devs use the feature levels more, but DX11 will continue for quite a while. WDDM 2.0 might be a boon though, at least in some games.
To clarify: I meant in terms of people having systems that support DX12. I expect that will occur much faster than the ramp-up time for DX10. Hopefully this will influence game devs to roll out DX12 support sooner.
 
Besides, Mantle's performance benefits are often pretty small in real games today.
I remember that in a presentation concerning BF4 and Mantle, they mentioned they didn't rewrite the engine around Mantle and that there was more performance to be had if they did. Have any of the Mantle games been Mantle-exclusive, or said in a dev diary or postmortem to have been designed around Mantle first?
 
I remember that in a presentation concerning BF4 and Mantle, they mentioned they didn't rewrite the engine around Mantle and that there was more performance to be had if they did. Have any of the Mantle games been Mantle-exclusive, or said in a dev diary or postmortem to have been designed around Mantle first?
No, no Mantle-exclusive games, I don't think. The closest is probably Star Swarm, which shows a huge benefit (but for a situation that is pretty contrived). In Star Swarm, nVidia does far better than AMD in DX11. The race is much closer in DX12 or Mantle.

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/5
 
Hmm, why do you say "unknown"?
I say unknown because they are unknown.
Do you know the actual power consumption? Do you know the retail price? Do you know the noise output? Do you know the gaming performance? I don't, so I say unknown. Quite easy.

Now, don't get me wrong - I'm a big fan of unusual graphics cards, having bought many for unintuitive reasons myself, and a Fury Nano with decent performance, modest noise levels, and a price that doesn't reflect its targeting a very niche market would be an interesting product.

It's just: I have no more than a layman's idea of how I would design such a product (select fully functioning dies, bin them for the lowest leakage, and enforce (and by that I mean not only state it in a PDF, but actually enforce it through systems already at hand to AMD) a strict power limit on them to keep noise bearable). Hence the unknown characteristics.

And I don't see Steam Machine-like gaming rigs really taking off at the moment, nor do I see a large number of people who might have been holding back a purchase of a fast but very small gaming card over the last year or so, after having the choice between an R9 285 and a GTX 970 in small form factor.
 