AMD Radeon R9 Fury reviews

I was wondering, do you know why so many reviews show Fiji having issues with Mantle games? It seems to be a big plus in favor of the 390X if you compare the 390X and the R9 Fury...
GCN 1.2 has not received, and will not receive, optimized Mantle drivers.
 
...Well, that's unexpected (for me, anyway).

What about the sound DSPs? I think only Thief used them...
Mantle is no more (it sort of morphed into Vulkan).

The sound DSPs will likely only ever be used by a select few titles, or maybe none at all going forward, for obvious reasons (AMD's GPU market share is very low, and even on the brand-new 300 series not all of the cards support them).
 
Mantle question: did any titles use it for anything other than speedups? I'm thinking of any sort of graphical enhancement.
 
These examples of low-level-API games falling back on the legacy excuse of "no optimizing/bug-fix drivers" bring up the questions I raised earlier, every time a Mantle game needed a driver release.

Shouldn't it be the case with a low-level API, being the break from the bad old days where drivers had to fix bugs or provide performance, that developers could go back and do whatever an optimized driver would have done? That's how it keeps getting sold to me.

Where is that line again between the driver domain and the developer's code?
 
Well, I think that is an issue: it puts the burden on the developers even more with a lower-level API, but it doesn't remove the need for optimizing drivers. There is just less overhead due to the removal of abstraction layers and the like, and it might actually increase the need for driver optimizations, because more things can go wrong in the developers' hands. The upside is that, given the time, the developer can make the optimizations the driver teams usually take care of.

I think Humus actually mentioned this a while back when talking about last-gen consoles and their low-level APIs.
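To make that concrete, here's a toy sketch of what moves from the driver's codebase into the developer's when the abstraction layer goes away. This is a made-up mock API, not real Mantle/D3D12/Vulkan code:

```cpp
// Toy model of the driver/developer split being discussed. All names are
// made up for illustration; this is not real Mantle/D3D12/Vulkan code.
#include <cstdio>

enum class State { RenderTarget, ShaderRead };

struct Resource {
    const char* name;
    State state;
};

// "High-level driver" path: the driver inspects every use and transitions
// the resource automatically. Convenient, but it costs CPU time on every
// draw, and bugs here are the driver team's problem.
void driver_managed_read(Resource& r) {
    if (r.state != State::ShaderRead) {   // hidden per-use hazard check
        std::printf("driver: auto-transitioning %s\n", r.name);
        r.state = State::ShaderRead;
    }
    std::printf("draw reading %s\n", r.name);
}

// "Low-level API" path: the application promises the state is already
// correct. Cheaper per draw, but a forgotten barrier is now the
// developer's bug, the kind of thing a driver used to paper over.
void app_barrier(Resource& r, State next) {
    std::printf("app barrier on %s\n", r.name);
    r.state = next;
}
void app_managed_read(const Resource& r) {
    std::printf("draw reading %s (no driver check)\n", r.name);
}

int main() {
    Resource shadow_map{"shadow_map", State::RenderTarget};
    driver_managed_read(shadow_map);             // old world: driver fixes it up

    shadow_map.state = State::RenderTarget;
    app_barrier(shadow_map, State::ShaderRead);  // new world: app schedules it
    app_managed_read(shadow_map);
}
```

Either way somebody does that bookkeeping; a low-level API mostly decides whose codebase it lives in and whose bug tracker the mistakes land in.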
 
Well, I think that is an issue: it puts the burden on the developers even more with a lower-level API, but it doesn't remove the need for optimizing drivers. There is just less overhead due to the removal of abstraction layers and the like, and it might actually increase the need for driver optimizations, because more things can go wrong in the developers' hands.
Similar assessments have met some resistance in API discussions.
There seems to be an impression that game developers in general are being held back by IHVs that insist on working around their bugs or cleaning up after their neglect.
 
True, both sides have to work hand in hand; there's really no way around that. But at least this gives developers the access so they can optimize, while driver developers will have to do what they always do.

I don't think developers were really held back by IHVs; they were mostly held back by the API and the OS. Both of those dictate what features the hardware is required to have and how the drivers are created, and of course, on the developer's end, the API's programming rules hold them back.
 
These examples of low-level-API games falling back on the legacy excuse of "no optimizing/bug-fix drivers" bring up the questions I raised earlier, every time a Mantle game needed a driver release.

Shouldn't it be the case with a low-level API, being the break from the bad old days where drivers had to fix bugs or provide performance, that developers could go back and do whatever an optimized driver would have done? That's how it keeps getting sold to me.
That's the idea. It's hard to say whether the few Mantle-enabled games out there can be taken as representative of what we should expect with D3D12 and Vulkan, since Mantle was aborted before reaching v1.0.

And then, of course, you have to assume the drivers aren't full of bugs, like they pretty much always have been for every IHV.

On another note, I think NVIDIA should definitely lower the MSRP of the GTX 980 at this point. It was overpriced before, IMO, compared to the 290X, and now it shares an MSRP with the Fury, which is clearly faster. $450 would be much easier to stomach.
 
I will miss the unused potential of TrueAudio...
Do you think that with a fast Intel CPU, games could still emulate TrueAudio's performance?

AFAIK TrueAudio is about processing power, not sound quality...
 
I will miss the unused potential of TrueAudio...
Do you think that with a fast Intel CPU, games could still emulate TrueAudio's performance?

AFAIK TrueAudio is about processing power, not sound quality...

The DSPs in TrueAudio are very simple cores that provide very little processing power relative to a desktop core.
They might matter more in some kind of constrained environment, like a low-core-count mobile device or a console with comparatively weak CPUs, although early analysis of where Sony wanted to take its audio processing indicated it was still better off with the CPU.

Perhaps there are some latency benefits to where the hardware is positioned in a discrete setup, maybe for VR?
I honestly don't recall a real-world example being given of where this is beneficial, and we get so much more CPU headroom now with low-level APIs.
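For a rough sense of scale, here's a back-of-envelope sketch. Every number in it is an assumption on my part (a small audio DSP core doing ~2 MACs/cycle at ~500 MHz versus a desktop core with 8-wide FMA at ~4 GHz), not a measured TrueAudio or CPU spec:

```cpp
// Back-of-envelope only: every figure here is an assumption, not a
// measured TrueAudio or CPU spec.
#include <cstdio>

int main() {
    const double dsp_hz = 500e6, dsp_macs_per_cycle = 2.0;  // assumed DSP core
    const double cpu_hz = 4.0e9, cpu_macs_per_cycle = 8.0;  // assumed AVX FMA core

    const double dsp_macs = dsp_hz * dsp_macs_per_cycle;    // MACs per second
    const double cpu_macs = cpu_hz * cpu_macs_per_cycle;

    // Workload: one voice of naive convolution reverb at 48 kHz with a
    // 1-second impulse response, i.e. 48000 taps per output sample.
    // (Real implementations use partitioned FFT convolution, which is
    // far cheaper; this is just to compare raw headroom.)
    const double macs_needed = 48000.0 * 48000.0;           // ~2.3e9 MACs/s

    std::printf("assumed DSP core: %.1f GMAC/s -> %.0f%% busy\n",
                dsp_macs / 1e9, 100.0 * macs_needed / dsp_macs);
    std::printf("assumed CPU core: %.1f GMAC/s -> %.0f%% busy\n",
                cpu_macs / 1e9, 100.0 * macs_needed / cpu_macs);
}
```

Under those assumptions, a single desktop core has well over an order of magnitude more raw MAC throughput, which is the point: the DSPs buy offload and maybe latency, not horsepower.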
 
Sounds like TrueAudio is a leftover from console development... and it will be on its way out come the next product cycle... oh well.
 
There seems to be an impression that game developers in general are being held back by IHVs that insist on working around their bugs or cleaning up after their neglect.

It is not dissimilar from assuming that if you can hit the punching bag and handle the jump rope like a boss, you are going to be a street-fighting machine. History appears to indicate that, in general, this type of self-assessment tends to be... in error.
 
Sounds like TrueAudio is a leftover from console development... and it will be on its way out come the next product cycle... oh well.

The semicustom effort for the consoles appears to be significantly involved in it, although I don't know which is the chicken and which is the egg.
The overall area cost does not appear to be so prohibitive that it couldn't be kept along in some form, and AMD may have used the initial R&D as a longer-term play in integrating separate IP.

The PS4's major sticking point is that a significant fraction of its DSP resources were used up decoding media, and the audio resources were hidden behind a secure API that added latency beyond what could be achieved on a CPU, with the CPU generally providing the most flexibility.
Those were the early days, but the PS4 GPU's loaded latency was unacceptable for audio, and the audio DSPs were still too high-latency for the most demanding scenarios. It may be that, in the absence of the console platform's requirements, the DSP resources could have been useful.

For AMD's push with VR and with a discrete GPU setup, possibly there's still a use case AMD wants to sell, although if it were dropped I see few indications it would be mourned.
 
For Thief, the TrueAudio DSPs were only used to perform a not-so-impressive convolution reverb. Did another game do anything else that was actually beneficial?
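For anyone wondering what that workload actually is: a convolution reverb just convolves the dry signal with a recorded room impulse response. A minimal direct-form sketch with toy made-up data (shipping implementations use partitioned FFT convolution instead, which is far cheaper):

```cpp
// Minimal direct-form convolution reverb sketch. The IR and input are toy
// made-up data; real engines use partitioned FFT convolution for speed.
#include <cstddef>
#include <cstdio>
#include <vector>

std::vector<float> convolve(const std::vector<float>& dry,
                            const std::vector<float>& ir) {
    std::vector<float> wet(dry.size() + ir.size() - 1, 0.0f);
    for (std::size_t n = 0; n < dry.size(); ++n)
        for (std::size_t k = 0; k < ir.size(); ++k)
            wet[n + k] += dry[n] * ir[k];   // one MAC per (sample, tap) pair
    return wet;
}

int main() {
    const std::vector<float> dry = {1.0f, 0.0f, 0.5f};        // toy dry signal
    const std::vector<float> ir  = {0.8f, 0.3f, 0.1f, 0.05f}; // toy "room" IR
    for (float s : convolve(dry, ir))
        std::printf("%.3f ", s);
    std::printf("\n");
}
```

Every output sample costs one multiply-accumulate per IR tap, which is why a long impulse response gets expensive fast.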
 
There's a TrueAudio thread, with the most recent game-related data point being Lichdom: Battlemage.
I think Star Citizen was stated to be able to use it?

Not much really new in that thread since 2013. I think the officially dead Mantle has had more relevance to Fury thus far, since a game or two managed not to crash while running it, and that was mentioned or listed in a few reviews.
 
The real problem with technology like TrueAudio is that you need to get a lot of developers on board, or it will take forever to get started.
Say it had been part of DirectX or the PS4 API, developers had a year to start using it, and it was sold as a big feature of the new console generation's sound output; then you would see it explode. Then NVIDIA would use it, and Intel would build onboard audio that uses it.

Are there any games that support it on consoles right now?

But I'm pretty sure that, one way or another, we will see similar tech coming over the next few years... Sound needs a real reshape in gaming. We have gotten a few things from software audio processing (the 3-4 modes in Battlefield, for example), and some games that seem to have put a lot of work into audio (Alien: Isolation, maybe?...), but not much more than that.
 