AMD Vega Hardware Reviews

Can anyone confirm whether those drivers are the same as the ones used on Vega FE, or newer?
The driver for this FireStrike result is 22.19.640.2.
It's seemingly newer than FE's current public driver (22.19.384.2), but older than the one behind the 3DMark11 scores that appeared three weeks ago (22.19.653.0).
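
For anyone sanity-checking the ordering, here's a minimal sketch, assuming these dotted build strings compare field-by-field as plain integers (the usual convention for Windows display-driver builds):

[code]
# Sketch: order the leaked Vega driver builds numerically, field by field.
def build_tuple(version):
    return tuple(int(part) for part in version.split("."))

fe_public   = build_tuple("22.19.384.2")  # current Vega FE public driver
fire_strike = build_tuple("22.19.640.2")  # driver behind this FireStrike result
leak_3dm11  = build_tuple("22.19.653.0")  # 3DMark11 leak from ~3 weeks ago

# Newer than the FE public driver, but older than the 3DMark11 leak:
assert fe_public < fire_strike < leak_3dm11
[/code]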


Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.
I'm surprised AMD is giving Kyle the time of day. His having the emotional maturity of a 12-year-old aside, how popular is HOCP these days anyway?

Regardless, a couple of hours after AMD leaves with the card, a pic appears:

[image: 8IBLKN8.jpg]
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.

Price isn't definitively known, but that aside I believe this is a case of being contrarian. Kyle made a statement to that effect. If AMD were up on Br'er Rabbit, they would have begged him not to test it against the 1080.

Regardless of whether a blind test is structured to make Vega look better than it should or stacked to potentially keep it inferior, it still seems like two sides of the same pointless coin.
 
Price isn't definitively known, but that aside I believe this is a case of being contrarian. Kyle made a statement to that effect. If AMD were up on Br'er Rabbit, they would have begged him not to test it against the 1080.

Regardless of whether a blind test is structured to make Vega look better than it should or stacked to potentially keep it inferior, it still seems like two sides of the same pointless coin.

If anything it's just an interesting experiment in terms of figuring out if gamers can actually recognize small variances in framerate on variable refresh monitors, or particular graphics options in real time. Depends if he keeps settings aligned in his experiment, I guess.

But yes, I don't know what the experiment is meant to prove in terms of value for $ if you're buying a GPU. You're still probably going to pick the one that gives you the best performance margin you can afford, or maybe weigh other factors like fan noise and power consumption to fit your room.
 
I think that's a good choice to compare it to from a raw technical point of view, matching the two closest chips in terms of die size and transistor count, while not forgetting AMD is at 14nm while NVidia is at 16nm. That's what actually makes it unfortunately "odder" for AMD: their chip is bigger and on a smaller node than NVidia's, yet everything so far seems to indicate it's behind the 1080 Ti in gaming.

Now from the consumer's point of view it's always price/performance/watt.
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.
Mostly because it's HardOCP, and its favoring of Nvidia is legendary.

Honestly though, if it does well against a 1080 and costs about the same, it may not be a huge problem. I think I may still upgrade my CPU first and wait till next year for a new card. I'm not sure what's on deck this year that would stress my 290 at 1080p that badly.
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.
It's not about the price. AMD used the 1080 on their road shows (and asked Kyle to use it as well), and there was hardly a difference there. I think Kyle wanted the 1080 Ti to accentuate the difference so that the blind-test tie can be broken.
 
If anything it's just an interesting experiment in terms of figuring out if gamers can actually recognize small variances in framerate on variable refresh monitors, or particular graphics options in real time. Depends if he keeps settings aligned in his experiment, I guess.
Once you start injecting human variability, there are a lot of potential gotchas, and it becomes difficult to provide reproducible or easily communicable results.

I do not know the methodology employed, but it sounds like a testing situation with limited time for analysis or rigor in sampling and controls. The probable outcome seems to have been predetermined by the tester's choices as well.
Tweaking things would be limited by the time it would take to redo observations, and how much the observers' perceptions vary over time.
It doesn't seem like AMD's events were double-blind, and it isn't clear how much data was gathered to determine if the outputs were consistent over time.
There could be ways to consciously and unconsciously influence results, and without a lot of transparency and attention to detail we may not get something that really helps us understand.
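
To make the double-blind point concrete, here's a toy sketch of what randomized, counterbalanced assignment could look like (the labels and counts are purely illustrative, not anything AMD or [H] actually did):

[code]
import random

# Toy double-blind setup: a third party maps opaque station IDs to GPUs,
# so neither the players nor the person running the session knows which
# card is which. Play order is shuffled per participant to counterbalance.
GPUS = ["card-A", "card-B"]  # real identities stay with the third party

def assign_sessions(n_participants, seed=1234):
    rng = random.Random(seed)
    sessions = []
    for pid in range(n_participants):
        order = GPUS[:]
        rng.shuffle(order)  # randomize which station each person sees first
        sessions.append(("participant-%d" % pid, order))
    return sessions

for participant, order in assign_sessions(4):
    print(participant, "->", " then ".join(order))
[/code]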

It would be nice if there weren't the confounding factor of incompatible VRR methods, where neither the monitors nor the standards can be switched.

One idea, if it were possible, would be a benchmark or playthrough recording with constant frame contents, coupled with something like a simple compute shader looping on a time check prior to the end of each frame, to get the cards to produce what would appear to be generally the same output.
Perhaps the timing mode could provide something that would be more broadly reproducible if that were carried across sessions.
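
Roughly, in CPU-side sketch form (the real version would live in a compute shader spinning on a GPU clock query, and the 60 Hz target here is just an assumption for illustration):

[code]
import time

TARGET_FRAME_TIME = 1.0 / 60.0  # assumed fixed cadence for the recording

def play_recording(render_frame, n_frames):
    # Render the fixed playthrough, then pad each frame out to the same
    # deadline so both cards present identical-looking frames at identical
    # times, regardless of how fast they finished the real work.
    deadline = time.perf_counter() + TARGET_FRAME_TIME
    for i in range(n_frames):
        render_frame(i)  # actual GPU work; duration varies per card
        while time.perf_counter() < deadline:
            pass  # busy-wait, standing in for the compute-shader time loop
        deadline += TARGET_FRAME_TIME  # fixed schedule, no drift
[/code]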
 
It's not about the price. AMD used the 1080 on their road shows (and asked Kyle to use it as well), and there was hardly a difference there. I think Kyle wanted the 1080 Ti to accentuate the difference so that the blind-test tie can be broken.

To me it sounds more like they are mending the relationship. Who will be able to tell the difference between 90 and 110 fps on a FreeSync/G-Sync monitor?
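(For scale: 1000 ms / 90 ≈ 11.1 ms per frame versus 1000 ms / 110 ≈ 9.1 ms, so the blind test is asking people to feel a roughly 2 ms per-frame gap that variable refresh is designed to smooth over anyway.)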

[H]'s legendary NV bias was mentioned. I remember it more as love-hate drama.

Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way.

Wouldn't be the first time

https://www.hardocp.com/article/2011/01/11/amd_69706950_cfx_nvidia_580570_sli_review/1
 
So the actual Vega driver is the same as the one used in leaks before launch?
Nope, it's older than that one. See post above yours.

Depends if he keeps settings aligned in his experiment, I guess.
Kyle/HardOCP are famous for not doing so. They usually present some "best achievable settings" throughout their reviews (they conspicuously change settings between systems to reach similar framerates), and only at the end do they do a couple of apples-to-apples comparisons.

I think that's a good choice to compare it to from a raw technical point of view, matching the two closest chips in terms of die size and transistor count
The discrete card market moves according to price/performance and brand recognition, not "raw technical points of view".


while not forgetting AMD is at 14nm while NVidia is at 16nm. That's what actually makes it unfortunately "odder" for AMD: their chip is bigger and on a smaller node than NVidia's, yet everything so far seems to indicate it's behind the 1080 Ti in gaming.
GF's "14FF" and TSMC's "16FF+" are actually similar in transistor/gate size.
The GP102 doesn't have 2×FP16 throughput, so it can't reach as many markets as Vega can. Tradeoffs were made because AMD doesn't have the resources to develop as many different chips as Nvidia.
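To put the 2×FP16 point in rough numbers (purely illustrative figures, not measured specs): a part rated at, say, 12 TFLOPS FP32 with packed half-precision math can issue about 24 TFLOPS of FP16, while a chip without packed FP16 gets no such doubling for half-precision workloads.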
 
I rather liked his explanation of "Because I wanted to. AMD wanted me to use a 1080." Nice to see some things never change, and probably never should.

I'm looking forward to the video, should be interesting if nothing else. :)

To me, with the information on hand at this point, it seems like this will be as informative as having two separately stacked decks, whose direction and inaccuracy are wholly up to the level of sliminess or bile of the respective testers.

If we think AMD's preferred testing scenario dubiously predetermines an outcome, then defiantly doing something that may just as strongly force the opposite outcome is, in absolute terms, equally disreputable.
The video would provide things of interest to the extent one wants to see defiance in place of discernment.

I'll take the opposite position, where I think that perhaps it's not that some things shouldn't change, but that more should change on more than one side.
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.

Kyle loves to stack the deck against AMD. Read any of the reviews and the tone of the wording is always against AMD. AMD slightly faster? "About the same" or "can't notice". AMD slower by 1-2 fps? "Nvidia is clearly faster here".

I mean, just look at the whole VR Leaderboard. Let me take Early Access games and test them on their first public release, then never remove those tests or re-test the games when they're better optimized. They still plaster it onto new articles like the 580 testing, even though the results are almost a year old and inaccurate. But it's a great image to show how poorly AMD GPUs did back then, and it will clearly influence people, as most people just skim articles and look at the pictures.

It's odd to me that so many people want Vega to fail. Do you guys like slower progress and higher prices? Competition is great for everyone. Nvidia supporting Adaptive-Sync because of pressure from Vega would be amazing for everyone.

The whole red-vs-green teams thing is just odd. PC gaming used to be about picking the best for the price, with no segregation in software/hardware. VR is starting to do it as well, which is sad (games locked to the Rift, for instance), and hardware features locked out by GPU vendors are just as bad, let alone things like PhysX or GameWorks features that are hardware-locked to specific vendors.
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.

To hammer the message "you can't even tell the difference between VEGA and a 1080Ti!...basically same card!" before the hard numbers come out in reviews.
 
Seems odd. Unless it's priced the same as the Ti, I don't know why you'd compare that way. I suppose the thought is best on best (aligning flops?), but really it comes down to price brackets.
It may be priced the same, but we should know shortly. Could possibly have something to do with Vega's introduction and "Poor Volta". Guess we will find out soon enough if there is any truth to that slogan when compared to the 1080 Ti.
 
Thought they would have tested at a higher resolution, considering the hardware...
 