AMD Radeon R9 Fury X Reviews

@gamervivek said:

Hmm, I just had a quick look at the 4K benchmarks from Hexus, and it's 4-4, with 1 fps being the decider in a few and the 980 Ti winning one game for having a 1 fps higher minimum. Maybe they mistook 3DMark for a game. :p
Or they are lumping the 1440p results together with the 4K ones. ;-)

Though they do echo TR's findings of Fury hiccuping.

Maybe that's Fury's famed HBM optimisation :runaway:


edit: and I see an overclocked 980 Ti in the Maximum PC review; are Ars Technica straight-up incompetent?
 
@Ethatron said:

Which leads to the question: why are so few games scaling by more than 2x? Crysis 3 and The Witcher 3 are the only games with real scaling.

CryEngine has a bandwidth- and texture-instruction-bound z-pass and an ALU-bound lighting pass, both part of a deferred shading pipeline. More incoming bandwidth - and possibly more TMUs - will help the first, and more CUs will help the second. AFAIK it has no big problems with outgoing bandwidth.
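
To make that split concrete, here is a minimal CPU-side sketch of the two passes as described; this is not CryEngine's actual code, and all type and function names are invented for illustration:

```cpp
// Sketch of a deferred shading frame, assuming the two-pass split described
// above. Not CryEngine code; types and functions are illustrative stubs.
#include <cstdio>

struct Scene   { /* geometry plus material textures */ };
struct GBuffer { /* albedo, normal, depth targets */ };

// Pass 1: z/g-buffer fill. Dominated by texture fetches (albedo, normal,
// gloss) and render-target writes, so incoming bandwidth and TMUs limit it.
void zpass_fill_gbuffer(const Scene&, GBuffer&) { /* draw calls ... */ }

// Pass 2: lighting. Reads the g-buffer once, then runs per-light shading
// math, so ALU throughput (CU count) limits it.
void lighting_pass(const GBuffer&) { /* full-screen pass ... */ }

int main() {
    Scene scene;
    GBuffer gbuffer;
    zpass_fill_gbuffer(scene, gbuffer); // bandwidth/TEX bound
    lighting_pass(gbuffer);             // ALU bound
    std::puts("frame done");
}
```

The point of the split is that each pass saturates a different hardware resource, which is why a chip like Fiji (more CUs, more incoming bandwidth) scales well in this engine.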
 
@CarstenS said:

I don't understand how a pre-z pass can be TEX-bound. Could you explain a bit further?
 
@silent_guy said:

I wonder how many people would be able to tell a Fury X from a GTX 980 Ti during gameplay. Probably not many? (Except in games that run out of memory.) How much of a frame rate difference do you really need to have a different experience?
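
As a back-of-the-envelope check (the frame rates below are hypothetical, just to show the scale of the difference):

```cpp
// Convert a hypothetical frame-rate gap into per-frame time; the numbers
// are made up to illustrate how small the perceptual difference can be.
#include <cstdio>

int main() {
    const double fps_a = 60.0;              // hypothetical slower card
    const double fps_b = 66.0;              // hypothetical 10% faster card
    const double ms_a  = 1000.0 / fps_a;    // ~16.7 ms per frame
    const double ms_b  = 1000.0 / fps_b;    // ~15.2 ms per frame
    std::printf("per-frame gap: %.1f ms\n", ms_a - ms_b); // ~1.5 ms
}
```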

From that point of view, for many, the Fury X is just as good a buy as a 980 Ti.

I get that AMD wants to get away from the label of being the cheaper alternative, but you need the raw performance credentials to be able to do that.

By not having the undisputed top-performance GPU, yet pretending that they did, they completely mismanaged expectations.

I think they should have kept the price at $650 (supply will probably not meet demand for a while anyway), stressed all the good points, and played down the "fastest ever" angle. I suspected at the time that this was a reference to the paper Fury X2 product, but they kept it deliberately vague.

What's the point of tricking an audience when the truth will inevitably come out a week later? It'd be fun to be a fly on the wall during their marketing post-mortems...
 
@UniversalTruth said:

I wonder how many people would be able to tell a Fury X from a GTX 980 Ti during gameplay. Probably not many?

I will, very easily, since the image quality is different.

AMD cards have always given better calibration - contrast, brightness, etc. - while the competition's image is always too bright and needs adjustment. :D
 
@silent_guy said:

I will, very easily, since the image quality is different.

AMD cards have always given better calibration - contrast, brightness, etc. - while the competition's image is always too bright and needs adjustment. :D
Newsflash: the Fury X doesn't have a VGA port anymore.
 
@jacozz said:

I wonder how many people would be able to tell a Fury X from a GTX 980 Ti during gameplay. Probably not many? (Except in games that run out of memory.) How much of a frame rate difference do you really need to have a different experience?

From that point of view, for many, the Fury X is just as good a buy as a 980 Ti.

I get that AMD wants to get away from the label of being the cheaper alternative, but you need the raw performance credentials to be able to do that.

By not having the undisputed top-performance GPU, yet pretending that they did, they completely mismanaged expectations.

I think they should have kept the price at $650 (supply will probably not meet demand for a while anyway), stressed all the good points, and played down the "fastest ever" angle. I suspected at the time that this was a reference to the paper Fury X2 product, but they kept it deliberately vague.

What's the point of tricking an audience when the truth will inevitably come out a week later? It'd be fun to be a fly on the wall during their marketing post-mortems...

I agree, but to be fair the AMD reps didn't say that Fury was the fastest GPU; they said that their dual-GPU card was "the world's fastest graphics card". At least that's what the PowerPoint slides said at the launch. But they did say that the Fury X was "the overclocker's dream". :(((
 
@Ethatron said:

I don't understand how a pre-z pass can be TEX-bound. Could you explain a bit further?

Not a pre-z pass, just the z-pass. In CE the z-pass is the one filling the g-buffers, doing pretty much every texture fetch necessary, in a tight spot.
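
A rough sketch of why that pass ends up TEX-bound: each g-buffer pixel needs several material fetches with almost no ALU work in between. Plain C++ stands in for shader code here, and all names are invented:

```cpp
// Per-pixel work of a g-buffer fill pass, mocked up in C++. Three fetches
// in, three writes out, almost no math: the TMUs become the limiter.
struct Vec4    { float x, y, z, w; };
struct Texture { Vec4 sample(float u, float v) const { return {u, v, 0, 1}; } };
struct GBufferOut { Vec4 albedo, normal, gloss; };

GBufferOut gbuffer_pixel(const Texture& albedo, const Texture& normal,
                         const Texture& gloss, float u, float v) {
    return { albedo.sample(u, v),   // fetch 1
             normal.sample(u, v),   // fetch 2
             gloss.sample(u, v) };  // fetch 3
}

int main() {
    Texture a, n, g;
    GBufferOut out = gbuffer_pixel(a, n, g, 0.5f, 0.5f);
    (void)out;
}
```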
 
@gamervivek said:

They also tested with Windows 7 instead of 8.1.

Nice catch! I read that Win8 does better than Win7 in CPU utilization, so that would seem to play in Nvidia's favor - or perhaps Nvidia got a great boost from 8.1?

Apparently, TechPowerUp are using Windows 7 as well; now I'm not sure.
 
@pharma said:

Nice catch! I read that Win8 does better than Win7 in CPU utilization, so that would seem to play in Nvidia's favor - or perhaps Nvidia got a great boost from 8.1?

Apparently, TechPowerUp are using Windows 7 as well; now I'm not sure.

TechPowerUp, I believe, used newer drivers (ver. 353.06), while iXBT used version 352.86 (ver. 352.90 for the GTX 980 Ti). Even though neither is the newest driver used by some other sites (ver. 353.30), I imagine there might be some performance differences.
 
@3dilettante said:

I wonder how many people would be able to tell a Fury X from a GTX 980 Ti during gameplay. Probably not many? (Except in games that run out of memory.) How much of a frame rate difference do you really need to have a different experience?
In general, Fury X marks a rather assertive change in product quality, aside from some inconsistencies (clarification: inconsistencies in mostly non-physical facets like the new drivers).
Its acoustics are a plus, as even Nvidia in these power ranges starts to lose its sonic polish.
That it's water-cooled could be a minus for some, but it seems like Fury X would have a problem reaching this parity if it were air-cooled.

Some demerits might have more weight for certain buyers, and since this is a smaller market it adds up.
Lapses like the outdated display output hurt it for some versus the competition, negating its benefit to buyers who would like a physically small card that can plug into a big TV.
There are potential difficulties in fitting the radiator in a case, or multiple radiators with CrossFire. The limited RAM capacity might solve the multiple-radiator problem in a negative way, and buyers in this range would notice.

The really limited tweaking options hurt at this tier, so hopefully AMD has a way of fixing this. I think it will be important that this gets fixed before the dual-GPU card launches if that card is to reach any higher in the price range.

What's the point of tricking an audience when the truth will inevitably come out a week later? It'd be fun to be a fly on the wall during their marketing post-mortems...
Since one of the most notable references to the dual-GPU Fastest Card ever involved a handoff to the CEO, it might be that disappointed upper-tier enthusiasts weren't the only audience. AMD had already painted a picture of turning its fortunes around with a refreshed product stack, and things like 8GB, 4K, "12K", and a 500W cooler taken in isolation might distract a few investors.
 
@swaaye said:

I am impressed by the cooling. AMD has put out some horrendous cooling solutions on just about every high-end card they've made after the Radeon 9700 - noisy, awful hair dryers. The 290X was probably the worst: noisy AND insufficient capacity to adequately cool the chip and let it fully turbo up. This water cooler, and their reliance on third-party coolers for the rest of the 300 series, appear to be a major improvement on the cooling front. What does an aftermarket water-cooling rig of this caliber cost? $100?

I think Fury X looks decent but too expensive, as others have said. Unfortunately it's bleeding-edge manufacturing and memory tech, so it most likely can't go cheaper. Two huge dies, too. I also read that Fiji isn't better equipped because there is simply no room for a larger die on the interposer.

I also wonder just how many they plan to produce. How long will they try to sell this? How far off is 16nm now?
 
@eastmen said:

A Kraken G10 is about $30, and then an AIO cooler is $50-$200 depending on what you want. I'm going to assume the cooler they used is similar to an H50 or H60, so like $60-$70 at most, and sometimes $30 or so on sale.

http://www.amazon.com/NZXT-Technolo...pebp=1435255962331&perid=1TSATAQ8ZKQZP5YJBC1K


http://www.amazon.com/Corsair-Hydro...TF8&qid=1435255949&sr=8-4&keywords=kraken+g10


The above is the setup I have on my 7950. I then got some small copper heatsinks to cool the VRMs.
 
@swaaye said:

A Kraken G10 is about $30, and then an AIO cooler is $50-$200 depending on what you want. I'm going to assume the cooler they used is similar to an H50 or H60, so like $60-$70 at most, and sometimes $30 or so on sale.

http://www.amazon.com/NZXT-Technolo...pebp=1435255962331&perid=1TSATAQ8ZKQZP5YJBC1K


http://www.amazon.com/Corsair-Hydro...TF8&qid=1435255949&sr=8-4&keywords=kraken+g10


The above is the setup I have on my 7950. I then got some small copper heatsinks to cool the VRMs.
You still need VRM cooling too.

How does the cooling capacity compare, I wonder...
 
@eastmen said:

You still need VRM cooling too.
Yeah, small copper heatsinks; I think I got them for $10 online. Seems to work fine. Remember, the Kraken comes with a fan that blows down onto the VRMs, so the small copper sinks work fine.
 
@swaaye said:

It makes one wonder how much the air coolers are worth. Even the visually fancy NV coolers. $20?
 
@Gandahar said:

GeForce FX 5800 Ultra vs. Radeon 9700 Pro

"The most stunning part of the GeForce FX 5800 Ultra is its utterly insane cooling system"

And it had that newfangled DDR2 memory! But with the limitation of the interface being only 128-bit. Hmmm.
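
A quick sanity check on why the fast DDR2 didn't help much (the clock figures here are from memory, so treat them as approximate):

```cpp
// Peak memory bandwidth = (bus width in bytes) * effective clock.
// Clock figures are approximate, quoted from memory.
#include <cstdio>

int main() {
    const double fx5800u = (128.0 / 8.0) * 1000e6 / 1e9; // 128-bit, ~1 GHz DDR2
    const double r9700p  = (256.0 / 8.0) *  620e6 / 1e9; // 256-bit, ~620 MHz DDR
    std::printf("FX 5800 Ultra: ~%.0f GB/s, 9700 Pro: ~%.0f GB/s\n",
                fx5800u, r9700p); // ~16 GB/s vs ~20 GB/s
}
```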

Summary

"So there you have it, NVIDIA's response to ATI's Radeon 9700 Pro - but does anyone else feel unfulfilled by the GeForce FX? A card that is several months late, that is able to outperform the Radeon 9700 Pro by 10% at best but in most cases manages to fall behind by a factor much greater than that."

Anandtech, 12 years back...

Cough.
 