AMD Radeon R9 Fury X Reviews

@flopper said:

It's hard not to compare the Fury X to the 980 Ti when that is its competitor. I'd say it will be quite difficult to get it out of the thread.

AMD messed up its marketing big time; how bad can they be at their jobs?
Total amateurism.
 
@Chalnoth said:

I have found this video card launch to be more interesting than any launch in a while. The HBM tech is really amazing. The cooler design is also pretty innovative.

But ultimately, this launch was extremely disappointing. The Fury X at best trades blows with the 980 Ti (in some reviews, it does consistently worse), and with higher power consumption. The cooler's noise problems have also been disappointing (though solvable: buy a card in a few weeks to months and I'm sure it won't be an issue). It's awesome that we have a card that is not significantly louder when it is in heavy operation than when it is at idle, and that the idle noise isn't too terrible (when the cooler is working properly). But combining HBM and this amazing cooler and not being able to beat the 980 Ti is really unfortunate. It makes me think that AMD spent too much time on the design of the memory and not enough time figuring out how to make best use of the additional memory bandwidth. Here's hoping that the next designs from AMD and nVidia that use this kind of memory technology are much better-balanced.
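
For a sense of scale on the bandwidth gap being discussed, here's a quick back-of-the-envelope comparison in Python based on the published specs (Fury X: four 1024-bit HBM1 stacks at 1 Gbps effective per pin; 980 Ti: 384-bit GDDR5 at 7 Gbps); the helper function is just for illustration.

```python
# Peak theoretical bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Back-of-the-envelope peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

fury_x     = peak_bandwidth_gbs(4 * 1024, 1.0)  # four HBM1 stacks -> 512 GB/s
gtx_980_ti = peak_bandwidth_gbs(384, 7.0)       # 384-bit GDDR5    -> 336 GB/s

print(f"Fury X:     {fury_x:.0f} GB/s")
print(f"GTX 980 Ti: {gtx_980_ti:.0f} GB/s")
print(f"Ratio:      {fury_x / gtx_980_ti:.2f}x")  # roughly 1.52x
```

Roughly 50% more raw bandwidth for roughly equal performance is exactly what makes the card look poorly balanced.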
 
@Gregster said:

Guys, I was the author of the original video and after speaking with several people, there is something up with my setup, or at least there was. Everything is correct and present now and all settings are identical. I was very new to using a capture card as well, hence a few silly errors.

I have since redone the videos and the IQ is identical as far as I can tell.


That is one of the many vids I have done, so feel free to give it a once over :)
 
@3dilettante said:

But combining HBM and this amazing cooler and not being able to beat the 980 Ti is really unfortunate. It makes me think that AMD spent too much time on the design of the memory and not enough time figuring out how to make best use of the additional memory bandwidth. Here's hoping that the next designs from AMD and nVidia that use this kind of memory technology are much better-balanced.

The design process for the memory standard would need to be decoupled from graphics development. Hynix wouldn't have appreciated AMD saying it had to make a standard intended for industry-wide use less useful because AMD needed time to figure out its graphics IP--unless we count HBM1 as being that compromise. The foundations laid down for HBM would have been decided before the realities of GPU stagnation at 28nm really hit.
 
@Razor1 said:

Guys, I was the author of the original video and after speaking with several people, there is something up with my setup, or at least there was. Everything is correct and present now and all settings are identical. I was very new to using a capture card as well, hence a few silly errors.

I have since redone the videos and the IQ is identical as far as I can tell.


That is one of the many vids I have done, so feel free to give it a once over :)


Thank you Gregster, nice overclock btw on the Titan X.
 
@Rys said:

I have found this video card launch to be more interesting than any launch in a while. The HBM tech is really amazing. The cooler design is also pretty innovative.

But ultimately, this launch was extremely disappointing. The Fury X at best trades blows with the 980 Ti (in some reviews, it does consistently worse), and with higher power consumption. The cooler's noise problems have also been disappointing (though solvable: buy a card in a few weeks to months and I'm sure it won't be an issue). It's awesome that we have a card that is not significantly louder when it is in heavy operation than when it is at idle, and that the idle noise isn't too terrible (when the cooler is working properly). But combining HBM and this amazing cooler and not being able to beat the 980 Ti is really unfortunate. It makes me think that AMD spent too much time on the design of the memory and not enough time figuring out how to make best use of the additional memory bandwidth. Here's hoping that the next designs from AMD and nVidia that use this kind of memory technology are much better-balanced.
Off-topic, but hey Chalnoth, good to see you posting again. It's been a while! :smile:
 
@Gregster said:

Thank you Gregster, nice overclock btw on the Titan X.
Thanks, but there isn't any overclock on it. It is the EVGA SC Titan X. It will sit at 1430 MHz 24/7 and my Fury X will sit at 1100 MHz (both at stock volts). I will get round to doing some overclock results soon :)
 
@CarstenS said:

So, which reviewers are making sure that they are testing with equivalent IQ?
I didn't even know it was necessary to mention something so self-evident, but: of course we do, and we test all cards at the High Quality filtering settings in the respective drivers.
 
@3dilettante said:

Guys, I was the author of the original video and after speaking with several people, there is something up with my setup, or at least there was. Everything is correct and present now and all settings are identical. I was very new to using a capture card as well, hence a few silly errors.

I have since redone the videos and the IQ is identical as far as I can tell.

Transient issues can hit any system with as many factors as a gaming rig. That you took feedback, invested the extra time to re-test, and provided the output is very much appreciated.
 
@Ryan Smith said:

Guys, I was the author of the original video and after speaking with several people, there is something up with my setup, or at least there was. Everything is correct and present now and all settings are identical. I was very new to using a capture card as well, hence a few silly errors.

I have since redone the videos and the IQ is identical as far as I can tell.


That is one of the many vids I have done, so feel free to give it a once over :)
It's not really your fault. The problem is that Battlefield 4 is a complete and utter pain to test. It does a lot of things behind the scenes that aren't obvious (or don't make a lot of sense), it automatically adjusts draw distance based on VRAM, its LOD system isn't great, and for that matter even when things work right its texture use can leave something to be desired. At the end of the day it's not a very good test for image quality, though it's always good to have someone checking it anyhow.
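
As an aside, here's a hypothetical sketch (in Python) of the kind of silent VRAM-based draw-distance scaling described above; the function name and thresholds are invented for illustration and are in no way Frostbite's actual logic.

```python
# Hypothetical illustration only -- not Battlefield 4 / Frostbite's real code.
# An engine that silently scales draw distance to available VRAM means two
# cards at "identical" in-game settings can render visibly different frames.

def draw_distance_scale(vram_mb: int) -> float:
    """Map available VRAM to a draw-distance multiplier (invented thresholds)."""
    if vram_mb >= 6144:
        return 1.00
    if vram_mb >= 4096:
        return 0.90
    if vram_mb >= 2048:
        return 0.75
    return 0.50

# Two cards, same in-game settings, different effective draw distance:
for name, vram_mb in [("6 GB card", 6144), ("4 GB card", 4096)]:
    print(f"{name}: draw distance x{draw_distance_scale(vram_mb):.2f}")
```

That kind of adaptive behaviour is why a capture-based IQ comparison can show differences even when every visible setting matches.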
 
@silent_guy said:

I have since redone the videos and the IQ is identical as far as I can tell.
Thanks for the clarification!

(I'm sure a bunch of webmasters are thankful as well: these kinds of controversies are great at creating traffic! ;) )
 
@Chalnoth said:

The design process for the memory standard would need to be decoupled from graphics development. Hynix wouldn't have appreciated AMD saying it had to make a standard intended for industry-wide use less useful because AMD needed time to figure out its graphics IP--unless we count HBM1 as being that compromise. The foundations laid down for HBM would have been decided before the realities of GPU stagnation at 28nm really hit.
I won't speculate as to what the impact on the memory standard might have been if AMD's engineering management had a somewhat different focus, but whatever the situation it seems really clear that they didn't make optimal use of the additional bandwidth. It might have been better, for example, if they had waited on implementing HBM until their next architecture and instead gone with a more traditional setup this time around.

I really hope that the next generation of HBM cards do a much better job of making use of the additional headroom.
 
@Rurouni said:

I won't speculate as to what the impact on the memory standard might have been if AMD's engineering management had a somewhat different focus, but whatever the situation it seems really clear that they didn't make optimal use of the additional bandwidth. It might have been better, for example, if they had waited on implementing HBM until their next architecture and instead gone with a more traditional setup this time around.

I really hope that the next generation of HBM cards do a much better job of making use of the additional headroom.
They already have; it's called the 3xx series, and the response is meh.
 
@pharma said:

So the ASUS STRIX Radeon R9 Fury, the non-X model, has surfaced on the web. This non-X model has a Fiji Pro GPU and should be hitting the streets by the end of the week.

The ASUS STRIX Radeon R9 Fury is based on the same DirectCU III cooler that has been featured on the STRIX GeForce GTX 980 Ti graphics cards. Interestingly, the leaked photo shows a DVI port, something lacking on the reference cards and the Sapphire model.
...
At this point it looks like only Sapphire and ASUS are set to release a Fiji Pro based Fury card; I've talked to Gigabyte and they are not planning this SKU. MSI will not release one in the short term either, but internally they are still discussing it.

http://www.guru3d.com/news-story/asus-strix-radeon-r9-fury-surfaces.html
 
@CarstenS said:

The picture looks suspiciously like the R9 390 Strix that I have lying here on my desk. That includes the placement and curvature of the heatpipes, which would most probably be somewhat different given Fiji's larger package compared to Hawaii. Additionally, I don't think we'll see a DVI-equipped Fiji card (at least in the first batch).

To be perfectly clear: the original source of these pictures has a third one available, which would make it perfectly, edit: no, make that painfully, obvious that this is indeed the R9 390 Strix, not Fury. But of course the wise guys at WTF tech have omitted that one for credibility.

So, please wait until it's at least double confirmed(tm).
 