AMD Vega Hardware Reviews

Well, take it at face value then, or dismiss it if you don't agree. edit: Apparently, other players are not 100% in line with your assumptions. It was a game test, not a GPU test, so maybe that's why my colleague did not force the highest possible detail? Don't know. :)
I think part of the reason people argued with your post is that you somehow presented the results as a Vega advantage despite the myriad of problems in the comparison that definitely point to no Vega advantage:
1 - Testing with old drivers for Nvidia
2 - Testing with no Ultra settings
3 - Having almost all GeForce cards at basically the same performance at 1080p and 1440p
4 - Having Vega cards' performance fall off a cliff at 1440p and 4K
5 - Not investigating the issue further

I think there is a massive problem with your testing there, as it contradicts other sites out there; maybe a system bug or something similar.
Also, being a game test doesn't excuse these issues; your site always performs individual game benchmarks at the highest settings. Don't know what changed this time.
All in all, very sloppy work from PCGAMESHARDWARE, and very unusual for the site.
 
I'm sorry, but who says all tests have to be performed with Ultra settings? Or any given kind of settings in particular, for that matter?
Because ultra settings are the maximum visual quality available in a game? Because that's why people buy powerful graphics cards, to enjoy pretty graphics at good fps? Because pretty graphics are the reason we stuff GPUs and CPUs full of silicon? Because pretty graphics are the source of immersion in games?

If you won't test powerful GPUs at maximum graphics, then everyone will choose whatever mix of settings presents their hardware in the best light, and then what is the point of making GPU comparisons at all? Every vendor could simply present the mixture of settings that best suits their hardware. Card A can't do 60 fps in Destiny 2? Play at medium and get your 60 fps. Want 90? Play a mix of lows and mediums. Card B doesn't do tessellation/high poly well? Remove it from the game and play with blocky objects! Card C is bad at compute? Remove every compute effect from the game and play with no post-processing! Card F can't do MSAA? Play your game with flickering edges! Now it turns into a console with a predetermined visual experience! There would be no point in benching graphics cards against each other whatsoever.
 
I think part of the reason people argued with your post is that you somehow presented the results as a Vega advantage despite the myriad of problems in the comparison that definitely point to no Vega advantage:
1 - Testing with old drivers for Nvidia
2 - Testing with no Ultra settings
3 - Having almost all GeForce cards at basically the same performance at 1080p and 1440p
4 - Having Vega cards' performance fall off a cliff at 1440p and 4K
5 - Not investigating the issue further

I think there is a massive problem with your testing there, as it contradicts other sites out there; maybe a system bug or something similar.
Also, being a game test doesn't excuse these issues; your site always performs individual game benchmarks at the highest settings. Don't know what changed this time.
All in all, very sloppy work from PCGAMESHARDWARE, and very unusual for the site.
Thank you for your assessment.

Your points, basically, have all been addressed or at least honorably mentioned in the course of the discussion, with the exception maybe of your fifth point, so I won't go over them again.
Why this was not investigated further, I cannot tell. You know, I just work there; I don't do the timetables for everyone. Maybe there were other issues deemed more important at the time? As I said, I just handed over the cards and found the results interesting enough to post here, especially because they deviate from the norm.

And FWIW, PC Games Hardware is not the kind of company that needs all caps, except maybe when you use the shorthand PCGH; then we gladly accept big letters. ;)
 
Because ultra settings are the maximum visual quality available in a game? Because that's why people buy powerful graphics cards, to enjoy pretty graphics at good fps? Because pretty graphics are the reason we stuff GPUs and CPUs full of silicon? Because pretty graphics are the source of immersion in games?

If you won't test powerful GPUs at maximum graphics, then everyone will choose whatever mix of settings presents their hardware in the best light, and then what is the point of making GPU comparisons at all? Every vendor could simply present the mixture of settings that best suits their hardware. Card A can't do 60 fps in Destiny 2? Play at medium and get your 60 fps. Want 90? Play a mix of lows and mediums. Card B doesn't do tessellation/high poly well? Remove it from the game and play with blocky objects! Card C is bad at compute? Remove every compute effect from the game and play with no post-processing! Card F can't do MSAA? Play your game with flickering edges! Now it turns into a console with a predetermined visual experience! There would be no point in benching graphics cards against each other whatsoever.

I would agree if we were talking about vendor-supplied benchmarks, but we're talking about independent reviews.

If a reviewer feels that a specific set of settings provides the best quality/computational-cost ratio, testing at these settings is relevant. If a particular set of settings provides a competitive edge (best viewing distance, reduced ability for enemies to use graphical elements as camouflage, higher framerates, etc.), testing at these settings is relevant. If a game ships with quality presets (low, medium, high, ultra, whatever), testing at these settings is relevant.

In a nutshell, if a particular set of settings will be appealing enough to actual gamers that they will use it, testing it is relevant.
 
It certainly seems there is some CPU bottleneck on NV cards, but it is strange: notoriously it's the AMD drivers that have the higher overhead at 1080p, and in any case the AMD cards do not seem limited in the same way (which is twice as strange).
But as 1080p performance is quite good, the hit at 1440p is not really understandable; the game doesn't seem to be that taxing, even with AA, and even taking into account NV's efficiency in bandwidth usage, the RX Vega 64 has practically double the bandwidth of a 1070...
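
Back-of-the-envelope, using the commonly quoted reference specs (roughly 2048-bit HBM2 at ~1.89 Gbps per pin for Vega 64, 256-bit GDDR5 at 8 Gbps for the 1070; actual cards vary), the "practically double" figure checks out:

```python
# Rough peak-bandwidth arithmetic from bus width and per-pin data rate.
# These are the usual reference specs, not measured numbers.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8  # bits -> bytes

vega64 = bandwidth_gbs(2048, 1.89)  # HBM2, ~945 MHz DDR -> ~484 GB/s
gtx1070 = bandwidth_gbs(256, 8.0)   # GDDR5 at 8 Gbps    -> 256 GB/s

print(f"Vega 64 : {vega64:.0f} GB/s")
print(f"GTX 1070: {gtx1070:.0f} GB/s")
print(f"Ratio   : {vega64 / gtx1070:.2f}x")  # ~1.9x
```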

For myself, I've noticed a higher dependence on CPU speed for NV cards than for AMD cards as you reach CPU saturation or when there are other programs contending for CPU processing time. The thing is that NV does a great job of spreading the CPU load across available cores, meaning that no single core is particularly burdened. AMD appears to do less well at spreading the CPU load across multiple cores and ends up with a single core more heavily utilized.

So what ends up happening for me is that if all cores are loaded 100% with multiple programs running, performance drops significantly more on my 1070 than on my 290. Enough that if the game suddenly hits a CPU-heavy section, the 1070 will actually dip below the 290 in performance. This produces really unpleasant effects, ranging from really sluggish performance in all open apps on the 1070 versus the 290, to occasional juddering on the 1070, to feeling like you are moving through molasses on the 1070.

In GW2 WvWvW battles (extremely CPU dependent with many on-screen players), performance with the 1070 becomes bad enough that it actually interferes with network traffic through Ethernet, impacting both the game and other applications. It's extremely frustrating. Reduce CPU load to under 100% and suddenly the NV driver isn't somehow interfering with the network traffic. This never happens with the 290.

However, if there is a little bit of CPU headroom or if there are no other programs contending for CPU processing time then the 1070 is quite obviously significantly faster than the 290.

In other words, from my experience, NV cards (at least the 1070) use more CPU time, but use multiple cores to reduce the load on any given CPU core compared to AMD (at least with the 290).
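
If you want to sanity-check that on your own system, here's a minimal sketch that logs per-core utilization while a game is running (it assumes the third-party psutil package is installed; the sampling interval is just illustrative):

```python
# Minimal per-core CPU utilization logger (needs the third-party "psutil" package).
# Run it alongside a game to see whether the driver's load is spread across cores
# or piled onto a single one.
import psutil

try:
    while True:
        # Percent load per logical core, averaged over a 1-second window.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        busiest = max(per_core)
        average = sum(per_core) / len(per_core)
        print(" ".join(f"{p:5.1f}" for p in per_core),
              f"| max {busiest:5.1f}%  avg {average:5.1f}%")
except KeyboardInterrupt:
    pass  # Ctrl+C to stop logging
```

A big gap between max and avg suggests the load is concentrated on one core; similar numbers across all cores suggest it is being spread out.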

Regards,
SB
 
Vega looks much better than Fury at lower resolutions; that Destiny benchmark has the Fury X not even 10% ahead of the 580, while the Vega 56 is more than 30% ahead of the Fury X.

That might explain the higher MSAA hit for Vega cards than for Fury; the performance hit on the latter was masked by other bottlenecks.
 
Vega looks much better than Fury at lower resolutions; that Destiny benchmark has the Fury X not even 10% ahead of the 580, while the Vega 56 is more than 30% ahead of the Fury X.
Not really. That's all explained simply by clock speed.
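Rough numbers, assuming the rated reference clocks (1050 MHz for the Fury X, 1471 MHz boost for Vega 56; real sustained clocks will differ per card), the clock gap alone is bigger than that 30% lead:

```python
# Quick clock-ratio check for the Fury X vs. Vega 56 gap mentioned above.
# Rated reference clocks only; real sustained boost varies with cooling and power.
fury_x_mhz = 1050   # Fury X engine clock
vega56_mhz = 1471   # Vega 56 rated boost clock

print(f"Clock ratio: {vega56_mhz / fury_x_mhz:.2f}x")  # ~1.40x
```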
 
Primitive discard and clocks would be the difference. The MSAA performance is still unknown, but cache thrashing, register spilling, etc. could account for it. Driver tweaks for bin sizes, most likely; otherwise the L2 would be flushing almost the same as in prior generations.
 
if a particular set of settings will be appealing enough to actual gamers that they will use it, testing it is relevant.
Who would determine whether they are appealing or not? A popular vote? The safest bet is to test at the highest settings and work down from there. If you want to test High, Medium, and Low in addition to Ultra, be my guest, but to drop Ultra and go for a single preset because it might be popular? That's not good practice. What about people who want to run the game at max settings? How would they determine whether a GPU is sufficient to run the game as the developer intended? If a GPU runs Ultra at sufficient fps, then you know for sure it can handle lower settings at even higher fps. But if you only test lower settings, you leave the graphics enthusiasts out to dry. A more intelligent approach is to test Ultra next to whatever preset you want to choose.
If a reviewer feels that a specific set of settings provides the best quality/computational-cost ratio, testing at these settings is relevant.
I massively disagree here; in fact, this is extremely dangerous, as it opens the door for bias and personal preferences to predetermine the outcome of the test by selectively adding or removing settings. How would you feel if a reviewer tested Fallout 4 on AMD GPUs with PhysX on because it adds quality effects?
 
It's kind of the point of having lots of reviewers from lots of sites: they will all test (slightly) different things, so you can get a better idea of the big picture. If everyone tested the same games at the same settings using the same setup, well, we wouldn't learn much from reading more than one review now, would we?
 
It's kind of the point of having lots of reviewers from lots of sites: they will all test (slightly) different things, so you can get a better idea of the big picture. If everyone tested the same games at the same settings using the same setup, well, we wouldn't learn much from reading more than one review now, would we?
Data is only important if it shows your side winning. :)

In other news, my Vega 56 is now sitting next to me at work. :)
This is the most sparse GPU package I have ever seen, literally just the card and a 5-page booklet...
 
Question:
What is the point of benching games that are still in early access?
Given that the label „early access“ is getting more and more abused to excuse bugs („Hey, we're still in development and not yet in beta, it's early access!“), while more and more people buy before they try, I'd say the same point as with every other benchmark: inform the reader what kind of performance can be expected.

Of course, you'd need to re-evaluate rather often, making those benchmarks unsuitable for a long-term stable benchmark parcours from which ratings are derived.
 
This is the most sparse GPU package I have ever seen, literally just the card and a 5-page booklet...
Good. I can do without even more of those damn 4-pin Molex to PCIe power connector adaptors cluttering up my life.

Not to mention ancient drivers on that included CD which I will never ever even touch, except to move aside so I can grab the real goods... :p
 
Mine (XFX) had two 6-pin-to-8-pin PCIe adaptors. How many PSUs are out there that have four 6-pin PCIe connectors but no 8-pin?
 
It's kind of the point of having lots of reviewers from lots of sites: they will all test (slightly) different things, so you can get a better idea of the big picture. If everyone tested the same games at the same settings using the same setup, well, we wouldn't learn much from reading more than one review now, would we?
But multiple sites won't test with the exact same systems, nor the exact same areas of the game. And again, running games at console settings on high-end GPUs is not useful; we already know they will run the game fast enough. The smart thing to do in this case is to test multiple presets, but not to exclude Ultra altogether.
 