AMD GPU gets better with 'age'

1. Basically, you need at least a GTX 980 or R9 290X to get a comfortable 60 FPS, and even then, probably not a steady 60 FPS, so you'd either need Free/G-Sync or a faster card for smooth gameplay. And that's with only 2x MSAA, albeit at a pretty decent 2560×1440 resolution.
While the chart certainly shows that, also keep in mind that they've moved a LOT of the sliders to the top (or quite near it) for this benchmark. There's a lot of room to scale downward from the settings used. With that said, I'm running without MSAA and trading it for extra detail and framerate. :)

2. I wonder if there's any link between the presence of GCN in console hardware and its performance in this console port.
I agree this is certainly something to consider.
 
I guess the engine really likes GCN. A couple of thoughts:

2. I wonder if there's any link between the presence of GCN in console hardware and its performance in this console port.

Well, that, and memory can surely play a role too; I see a similarity between the amount of VRAM and performance (280X - 780 / 3GB, 980 - 290X / 4GB, etc.).
 
My vanilla HD7970 works fine at 2560x1440 in GTAV, although I wasn't using any MSAA. I tried to benchmark various settings, but I wasn't exactly sure what I was doing as I thought I would just be watching the gameplay, but then I "failed" a mission and the benchmark stopped. I didn't see a score reported.
 
My vanilla HD7970 works fine at 2560x1440 in GTAV, although I wasn't using any MSAA. I tried to benchmark various settings, but I wasn't exactly sure what I was doing as I thought I would just be watching the gameplay, but then I "failed" a mission and the benchmark stopped. I didn't see a score reported.

Yeah it's a bug; you have to start the main story before benchmarking or it tries to start both simultaneously. You get a mission failure because you (well the benchmark) left the main story's mission area.
 
Yeah it's a bug; you have to start the main story before benchmarking or it tries to start both simultaneously. You get a mission failure because you (well the benchmark) left the main story's mission area.
Thanks for the tip. I'll try benchmarking again now that I've played the game some.
 
Digital Foundry has a feature on what it takes to run the game at 60fps. Apparently a 760 Ti can still run it at 1080p60 with settings equal to or higher than the PS4's.
 
I admit there may be individual games that expose higher driver CPU usage, but I'm not convinced that's a global driver "issue". And it goes both ways -- NVIDIA provided a higher-threaded optimization to their driver for a recent game (was it Civ? Or was it Star Swarm?) to deliver better performance, but the caveat was more CPU usage.
Actually, the same pattern repeats again in GTA V: the 750 Ti delivers better performance than the R9 280 when both are paired with a Core i3. In fact, recent DX11 vs DX12 tests have demonstrated how poorly AMD handles DX11 threads right now compared to NV. It is pretty much a fact now.
http://www.eurogamer.net/articles/digitalfoundry-2015-grand-theft-auto-5-pc-performance
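
To make that "more threads, but more CPU usage" tradeoff from the quote above a bit more concrete, here's a toy C++ simulation. It is purely illustrative and nothing like real driver code: the fake per-command busy-work and the command counts are made up. The point is just that spreading command-buffer encoding across worker threads shrinks the wall-clock submission time (which is what shows up as higher framerate), while the total CPU work, and therefore overall CPU utilisation, doesn't go down.

```cpp
// Toy simulation of multithreaded command submission. The "encode_command"
// busy-work and command counts are invented placeholders, not driver data.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the CPU cost of encoding one draw call into a command buffer.
static void encode_command() {
    volatile int x = 0;
    for (int i = 0; i < 2000; ++i) x = x + i;
}

// Encode `total` commands across `threads` workers and return wall-clock time.
static double submit(int total, int threads) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    int per_thread = total / threads;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back([per_thread] {
            for (int i = 0; i < per_thread; ++i) encode_command();
        });
    for (auto& th : pool) th.join();
    return std::chrono::duration<double, std::milli>(
               std::chrono::steady_clock::now() - start).count();
}

int main() {
    const int total_commands = 200000;
    for (int threads : {1, 2, 4}) {
        double wall_ms = submit(total_commands, threads);
        std::printf("%d thread(s): ~%.1f ms wall-clock submission time "
                    "(same total CPU work, spread over more cores)\n",
                    threads, wall_ms);
    }
    return 0;
}
```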
 
Actually, the same pattern repeats again in GTA V: the 750 Ti delivers better performance than the R9 280 when both are paired with a Core i3. In fact, recent DX11 vs DX12 tests have demonstrated how poorly AMD handles DX11 threads right now compared to NV. It is pretty much a fact now.
http://www.eurogamer.net/articles/digitalfoundry-2015-grand-theft-auto-5-pc-performance

I'm not saying you're wrong, but...
We know that the 7000-series GCN1 cards need to be fed; looking at the numbers, I ask myself whether this is more of a CPU bottleneck than anything else in this case. Look at the minimum fps, which dives by a factor of about 2. On another note, I won't be surprised to see drivers increase performance in some particular cases (older GPUs).

Looking at the video in more detail, a funny thing is when the fps increases on the 280 + 4790K after finishing the flight, before the camera zooms in on the black car. You see the 4790K system nearly double its fps, while strangely, at the same time, the fps on the Core i3 system doesn't increase much; it even decreases once the fighter plane is gone (80 to 50 fps, while at the same moment the other system goes from 80 to 100 fps). At this point the camera is fixed and the fighter plane disappears.

Here are my results with max settings and 2x MSAA, just to get the numbers.
(attached benchmark chart)
 
Wow, looking at that I realize it has been a long time since I owned a PC graphics card that was in the top 6 rather than the bottom 6 of these kinds of lists, ha ha.
 
Mine's in the top 4. Ha!
And it's a more-than-a-year-old graphics card to boot.

Come VR, I'm getting more and more convinced I should just get another 290X and upgrade the PSU.
 
Actually, the same pattern repeats again in GTA V: the 750 Ti delivers better performance than the R9 280 when both are paired with a Core i3. In fact, recent DX11 vs DX12 tests have demonstrated how poorly AMD handles DX11 threads right now compared to NV. It is pretty much a fact now.
http://www.eurogamer.net/articles/digitalfoundry-2015-grand-theft-auto-5-pc-performance

It looks like the opposite with regard to the DX11/DX12 benchmarks, though; AMD needs more/better threads here. Unless it's the clock-speed advantage of the i7 they used coming into play. Nevertheless, NVIDIA makes more sense for folks in the price range of those cards, who obviously don't have the latest Intel CPU overclocked to ~5GHz as in reviews.
 
Actually, the same pattern repeats again in GTA V: the 750 Ti delivers better performance than the R9 280 when both are paired with a Core i3. In fact, recent DX11 vs DX12 tests have demonstrated how poorly AMD handles DX11 threads right now compared to NV. It is pretty much a fact now.
http://www.eurogamer.net/articles/digitalfoundry-2015-grand-theft-auto-5-pc-performance
Directly where you quoted me, I mentioned that very specific games may exhibit unique performance challenges where one card is favored over another. Your reply focused specifically on GTAV, which fits the description I gave.

I provided multiple links with data that supported my claim of general ambivalence regarding CPU power versus GPU output. You provided one link specific to one game. Until you can properly support your position, I will continue to refute it.
 
I provided multiple links with data that supported my claim of general ambivalence regarding CPU power versus GPU output. You provided one link specific to one game. Until you can properly support your position, I will continue to refute it.
I urge you to read the Mantle thread, where this specific matter has been discussed and analyzed to death; the conclusion is clear: AMD has higher CPU overhead than NV in DX11.

Here you also have a DF article where they tested Ryse, Far Cry 4, The Crew, and COD Advanced Warfare; all showed the same issue. And now GTA V, of course.
www.eurogamer.net/articles/digitalfoundry-2015-graphics-card-upgrade-guide?page=2

And here you have DX11 Vs DX12 tests showcasing the aforementioned phenomenon:
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3
http://www.anandtech.com/show/9112/exploring-dx12-3dmark-api-overhead-feature-test/3
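
Just as a back-of-the-envelope sketch of what those API overhead tests are measuring (the per-call CPU costs below are invented placeholders, not figures from AnandTech or either driver): the cheaper each draw-call submission is on the CPU, the more calls fit into a 60fps frame budget before the CPU, rather than the GPU, becomes the framerate cap.

```cpp
// Back-of-the-envelope sketch, not the 3DMark or Star Swarm test itself.
// Per-call CPU costs are hypothetical placeholders for illustration only.
#include <cstdio>

int main() {
    const double frame_budget_us = 1e6 / 60.0;   // ~16667 us per frame at 60 fps

    struct Scenario { const char* label; double cost_us_per_call; };
    const Scenario scenarios[] = {
        { "low per-call overhead (hypothetical)",     5.0 },
        { "medium per-call overhead (hypothetical)", 10.0 },
        { "high per-call overhead (hypothetical)",   20.0 },
    };

    for (const auto& s : scenarios) {
        double calls_per_frame = frame_budget_us / s.cost_us_per_call;
        std::printf("%-40s -> ~%6.0f draw calls/frame before the CPU becomes the cap\n",
                    s.label, calls_per_frame);
    }
    return 0;
}
```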
 
I urge you to read the Mantle thread, where this specific matter has been discussed and analyzed to death; the conclusion is clear: AMD has higher CPU overhead than NV in DX11.
If you follow the history of those games, NVIDIA was losing all of those battles until Mantle was shown to help. Then, as if by magic, NVIDIA whips out some new drivers to "fix" the problems that they purportedly never had, because their drivers were always less CPU dependent? Care to look up the definition of "revisionist history"?

Here you also have a DF article where they tested Ryse, Far Cry 4, The Crew, and COD Advanced Warfare; all showed the same issue. And now GTA V, of course.
www.eurogamer.net/articles/digitalfoundry-2015-graphics-card-upgrade-guide?page=2
Interestingly enough, the R7 260X (and 270X) both end up matching (or beating) the 750 Ti in the lowest-CPU case, and then exceed the 750 Ti in the higher-CPU case. So somehow this proves that AMD's driver is "heavier" than NVIDIA's? The only game where AMD loses (on both cards) is Call of Duty: Advanced Warfare. One data point does not a trend make.

Oh hey, look, it's Star Swarm again. See also: NVIDIA's need to "fix" that application issue, because their driver had... uh... some sort of strange fluke that nobody paid attention to until Mantle showed up. I had earlier mentioned individual applications that get tuned for certain video card manufacturers; you found one that was quite prevalent in the press for a while. No wonder NVIDIA chose to focus on that...
 
Oh hey, look, it's Star Swarm again. See also: NVIDIA's need to "fix" that application issue, because their driver had... uh... some sort of strange fluke that nobody paid attention to until Mantle showed up. I had earlier mentioned individual applications that get tuned for certain video card manufacturers; you found one that was quite prevalent in the press for a while. No wonder NVIDIA chose to focus on that...

Yeah, this is something I've noticed. I consider Nvidia's and AMD's drivers to be roughly on par. Each has features the other doesn't. Each has times when they are just horribly buggy.

But for popular titles (popular being key here), they spend a lot more resources optimizing for that title.

This is very noticeable in the MMO space, where quite often an Nvidia card that should be faster than, or similar in speed to, an AMD card performs worse and with more bugs (crashes, black screens, hung systems, etc.), except when it's a popular MMO at the time (WoW, for instance), in which case it usually does better, as expected.

Regards,
SB
 
I would wager, all things considered, that NVIDIA's drivers are going to be further ahead of AMD's simply because they have the people and the budget to do more tightly-coupled work with the individual development houses. To be blunt, there are a large number of games with specific NVIDIA optimizations for no other reason than NV offered to allocate resources to make it happen.

Nevertheless, I still find that AMD's drivers will perform better in the "general case", where NVIDIA couldn't be bothered to focus on a specific game or engine. I also acknowledge that there are always outliers and exceptions.
 
For years it was the constant experience of emulator users (obviously the immensely CPU-demanding 6th/7th-gen console emulators) that AMD cards performed worse than NVIDIA cards. It was consistent and constant among AMD card owners. At least for prior generations of cards, AMD drivers unquestionably had higher overhead. For more recent generations I'm not so sure, but yes, for past generations there's no doubt AMD cards had higher overhead.
 