AMD Vega Hardware Reviews

As I said, it's pure speculation: enabling those "features" may actually make the card slower, and that may be the reason they aren't working correctly. One would think AMD would want to introduce the card in the best possible light, but it seems unable to, perhaps due to architectural mishaps -- it's kind of stupid to develop technically superior features when you never know whether they'll work until consumers get the product.
Technically they haven't introduced the gaming card yet. That's in a few days, as you said. Tying gaming drivers to that release isn't all that unexpected, so it's hard to say the card was shown in a bad light, as it hasn't been shown beyond these FreeSync comparisons and some initial performance tests at the end of last year.

If features are slower, then don't use them. That doesn't change the fact that a product that's faster on paper would likely be faster until proven otherwise.

So they try to push the clocks as much as possible to be competitive, and with that, the power draw takes a hit.
Clocks don't seem pushed though. They needed 1600 MHz for the 25 FP16 TFLOPS figure that has been known for a while. Maybe voltages were pushed, but the clocks don't seem unreasonable.
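For what it's worth, the arithmetic behind that figure is simple: peak throughput is shader count x 2 FLOPs per FMA x clock, doubled again for Vega's packed FP16. A quick sketch (the 1600 MHz value is just the clock quoted above; run at that clock, the math actually lands a shade above the rounded 25 TFLOPS number):

```python
# Back-of-the-envelope peak-throughput check, not an official spec sheet:
# peak FLOPS = shader count * 2 FLOPs per FMA * clock,
# doubled again for Vega's packed FP16.
def peak_tflops(shaders, clock_ghz, flops_per_clock=2):
    return shaders * flops_per_clock * clock_ghz / 1000.0

vega_shaders = 4096   # 64 clusters * 64 shader processors
vega_clock = 1.6      # GHz, the 1600 MHz quoted above

fp32 = peak_tflops(vega_shaders, vega_clock)   # ~13.1 TFLOPS FP32
fp16 = 2 * fp32                                # packed FP16, ~26.2 TFLOPS
print(f"FP32: {fp32:.1f} TFLOPS, packed FP16: {fp16:.1f} TFLOPS")
```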
 
If features are slower, then don't use them. That doesn't change the fact that a product that's faster on paper would likely be faster until proven otherwise.

It would not be anything new, if true. Historically NVidia has been able to extract more performance from the same or fewer theoretical FLOPS than AMD. The gap narrowed when AMD moved from VLIW to GCN, but it's still more or less there.

Titan X at 6 TFLOPS
Fury X at 8 TFLOPS

Fury X performance was within 10% of Titan X despite having 33% more theoretical FLOPS.
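Turning that into performance per theoretical TFLOP, using the rounded numbers above and reading "within 10%" as Fury X delivering roughly 0.9x Titan X performance (an assumption made purely for illustration):

```python
titan_x_tflops, fury_x_tflops = 6.0, 8.0   # rounded theoretical peaks from above
titan_x_perf, fury_x_perf = 1.0, 0.9       # normalised to Titan X; 0.9 assumes "within 10%" means ~10% slower

print(f"Extra theoretical FLOPS on Fury X: {fury_x_tflops / titan_x_tflops - 1:.0%}")
print(f"Perf per theoretical TFLOP, Titan X: {titan_x_perf / titan_x_tflops:.3f}")
print(f"Perf per theoretical TFLOP, Fury X:  {fury_x_perf / fury_x_tflops:.3f}")
# Under these assumptions, Titan X extracts roughly 1.5x as much delivered
# performance per theoretical TFLOP as Fury X.
```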
 
It's also interesting in that it calls into question how much we spend on performance we cannot actually perceive. I liked that part.
I think that's a bit beside the point. You probably won't notice any difference between any cards sold right now while playing Minesweeper. You might start to notice some difference playing CS:GO, a bit more while playing Overwatch, more when playing Doom, and maybe even more when playing The Division, Wildlands or Ark.

Point is: of course there is a point where you have "enough" performance for given circumstances and you won't notice much difference. But that's testing with the brakes on. What if, of the two systems in question, one had been sufficient to feed a display at 144 Hz while the other only managed 100 Hz? Would they still feel equivalent? What if one system was enough for 100 Hz at UHD and the other only for 60 Hz at UHD?

We don't know that, and yet we are fed with impressions that will lead many to believe "all equal", because that's what most people will remember, not how much more the brakes were on for one system than for the other.
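To put numbers on the "brakes on" point, these are simply the frame-time budgets those refresh rates correspond to; whether a given system consistently fits inside each budget is exactly what the blind test doesn't tell us:

```python
# Frame-time budget per refresh rate: a card only makes a 144 Hz panel feel
# like 144 Hz if it can deliver frames inside this budget consistently.
for hz in (60, 100, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms, 144 Hz -> 6.9 ms
```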
 

Does the whole "this is not a review," and the overly-repeated 'this is just one stupid test for fun' part elude you? I mean literally all of that verbiage is up front and repeated ad nauseam and yet there are seemingly endless comments now about how they should have done this and it wasn't good because of that. Sheesh. AMD gave Kyle less than a workday to do a blind test. You might have set it up differently. I would have too. Both of us would have failed to appease the ifs and buts crowd.

If only Kyle had somehow slowed time down and also had a collection of comparable G-sync and Freesync monitors and ten different games, etc. etc. but, alas, physics held, time was not stopped and Kyle did something that pleased nobody in the time he was given. For shame!
 
No. But you seem quite upset about it. Maybe more than what's good for sanity? Cool down.
 
Sapphire Radeon RX Vega 64 Card SKU names surface
From what we can derive from the information, there will be two cooling solutions for Vega cards: liquid (LCS) and, obviously, air-cooled ones. Sapphire seems to be releasing three models:

  • SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP, LIQUID COOLING 2048-bit - Water Cooler
  • SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP LIMITED EDITION 2048-bit - 2 slot active
  • SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP 2048-bit - 2 slot active
The Limited Edition is probably a higher-clocked edition. We cannot really verify this information whatsoever, however it would make no sense to make up naming like that: Radeon RX Vega 64. If you think that "64" suffix is a little weird, it isn't really when you think about it. A full Vega chip has 64 shader processor clusters with 64 shader processors each = 4096 shader processors.

There is more info though: El Chapuzas mentions 699 Euro and 899 Euro pricing in between the cheapest model and that liquid-cooled edition. That would be ex VAT here in the EU, with prices roughly similar in USD. El Chapuzas states that their source is reliable. As always, take news like this with a grain of salt, but looking at the naming schema, it would not surprise me if it is correct.
http://www.guru3d.com/news-story/sapphire-radeon-rx-vega-64-card-sku-names-surface.html
 
Nope, if AMD had EnhancedSync enabled as well, then the experience should feel similar to NV's FastSync. But Kyle didn't enable EnhancedSync for AMD. It wasn't there in the drivers. Nobody knew it even existed for AMD till yesterday.

You do realize that RX Vega is using drivers that no one has access to, right? EnhancedSync was probably used, and if not, then it would be even worse, as normal VSync would have been used.

VRR technologies (FreeSync/GSync) do provide a massively better experience IMO, provided that you have the fps to stay under the monitor's refresh rate. Because if you exceed it, VRR is practically disabled and you are left with a regular panel: tearing will come up again.

FastSync / EnhancedSync supplement FreeSync/GSync. They aren't a replacement. They replace VSync for when you are rendering above the max refresh rate.
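A simplified sketch of how those pieces are usually described as fitting together; this reflects the commonly described behaviour rather than vendor documentation, and the function and flags are purely illustrative:

```python
# Illustrative decision logic: which sync mechanism ends up governing
# presentation, based on the commonly described behaviour.
def active_sync(fps, max_refresh_hz, vrr=True, fast_or_enhanced_sync=True):
    if vrr and fps <= max_refresh_hz:
        # Inside the VRR window: the panel follows the GPU, so no tearing
        # and no sync-induced latency.
        return "FreeSync/GSync (variable refresh)"
    if fast_or_enhanced_sync:
        # Above the window: the GPU keeps rendering flat out and only the
        # newest completed frame is scanned out -> no tearing, some latency.
        return "FastSync/EnhancedSync"
    # Otherwise it's classic VSync (stall and wait) or tearing.
    return "VSync on (or tearing with VSync off)"

print(active_sync(fps=90, max_refresh_hz=100))    # VRR handles it
print(active_sync(fps=180, max_refresh_hz=100))   # FastSync/EnhancedSync takes over
```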
 
Yes? Of course they are. So we should expect all the review sites to only test subjective playing experience and give a RealFeel™ score out of 10 on how they thought the performance was for them personally? Screw frame times, how did it feeeeeeeel??

You do realize that [H] wanted to completely remove FPS from their results last year?

Also FPS is not the same as frame times or actual FCAT or similar results.

FPS can lie; that was proven back in the mGPU microstuttering days. FPS was great, but games "felt" bad. It was because of bad frame pacing.
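A toy illustration of that last point: two frame-time traces with the same average FPS but very different pacing. The numbers are made up purely to show the effect:

```python
# Same average FPS, very different frame pacing.
smooth = [16.7] * 60            # steady ~60 FPS
stutter = [8.0, 25.4] * 30      # alternating fast/slow frames, mGPU-style

for name, frame_times_ms in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = max(frame_times_ms)
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {worst:.1f} ms")
# Both traces report ~60 FPS on average, but the second one "feels" far worse.
```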
 

Pft... they're copying the wrong console vendor.

[Image: Mg1oFPX.png]


(You now only read Vega like this.)
 
No. But you seem quite upset about it. Maybe more than what's good for sanity? Cool down.

??
Not in the least. I was literally laughing over the whole indignation thing after the video was so overtly couched as not serious. You have to realize that it takes a pretty monumental thing to upset me anymore. One of the blessings of a life turned upside down. I don't ruffle easily.
 
Seems unwise of AMD to encourage this, since their future plans for revenue in the higher segments are rather dependent on people spending as much as AMD can get them to...
Unless that's an indication of AMD's thoughts on their future competitiveness...? [runs away] ;-P for CarstenS
 
FastSync / EnhancedSync supplement FreeSync/GSync. They aren't a replacement. They replace VSync for when you are rendering above the max refresh rate.
Where did I imply they are a replacement for VRR? I'm seriously asking, because this is the second time you've misquoted me!

You do realize that RX Vega is using drivers that no one has access to, right? EnhancedSync was probably used, and if not, then it would be even worse, as normal VSync would have been used.
Nope, just regular VSync On.
 
I guess that indeed remains a possibility.

But then the discussion will turn into which technology offers lower latency: FastSync or EnhancedSync?

It's effectively a modified triple buffering solution. NV measured its latency and found it higher than with V-Sync disabled.

[Image: PascalEdDay_FINAL_NDA_1463156837-041_575px.png]


You would only notice it if you compared against another, otherwise identical system with neither VSync nor FastSync active.
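For anyone wondering what "modified triple buffering" means here, a rough sketch of how FastSync/EnhancedSync-style presentation is usually described: the render loop is decoupled from scan-out, and each refresh simply shows the most recently completed frame. This is an illustration of the concept, not NVIDIA's or AMD's actual implementation:

```python
import itertools

class FastSyncSketch:
    """Toy model: the GPU renders freely; scan-out grabs the newest finished frame."""

    def __init__(self):
        self._frame_ids = itertools.count()
        self._last_completed = None

    def render_frame(self):
        # Unlike classic VSync, the GPU never stalls waiting for vblank.
        self._last_completed = next(self._frame_ids)

    def scanout(self):
        # At each display refresh, present the newest completed frame;
        # anything rendered in between is silently dropped.
        return self._last_completed

sync = FastSyncSketch()
for _ in range(5):        # GPU renders five frames between two refreshes...
    sync.render_frame()
print(sync.scanout())     # ...but only the latest (frame 4) ever reaches the screen
```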

What is that chart from? What's the x-axis?
 
Kind of weird that this thread has circled around to what I'd summarize as: having a high-end GPU can adversely affect the user experience if your frame rate is too high for your monitor, because you'll either have to deal with micro-stutter and tearing, or with the increased latency of VSync or FastSync/EnhancedSync. I've never seen matching a GPU to a monitor mentioned in a review before, and I've also never had a GPU where I needed to worry about it. The last time I had a GPU delivering frame rates well above my monitor's refresh rate was CS 1.6 on a CRT ;)
 
I've never seen matching a GPU to a monitor mentioned in a review before, and I've also never had a GPU where I needed to worry about it. The last time I had a GPU delivering frame rates well above my monitor's refresh rate was CS 1.6 on a CRT ;)

It's matching a GPU to the monitor, system, application, graphics settings, game revision, driver revision, specific level, mods, temperature, ASIC variation, phase of moon, etc.
Even if Vega happens to latch to this specific threshold in this one game, we see enough variability based on level to potentially throw either system above or below the threshold in different combinations.
 
Kind of weird that this thread has circled around to what I'd summarize as: having a high-end GPU can adversely affect the user experience if your frame rate is too high for your monitor, because you'll either have to deal with micro-stutter and tearing, or with the increased latency of VSync or FastSync/EnhancedSync. I've never seen matching a GPU to a monitor mentioned in a review before, and I've also never had a GPU where I needed to worry about it. The last time I had a GPU delivering frame rates well above my monitor's refresh rate was CS 1.6 on a CRT ;)
Welp, with 1080p monitors still being the majority and hardware moving on to making 4K work, it's only a matter of time before even low-end cards run new games at crazy high frame rates at 1080p.
 
As mentioned above, all the FE gaming tests haven't shown any of the new features making a real difference. It also stands to reason the launch wouldn't be pushed back unless something wasn't working or AMD wanted more inventory.
FE tests have also shown that at least some of the major new features are not implemented in those FE drivers, like the rasterization path, which behaves identically to that of older models.
 
Welp, with 1080p monitors still being the majority and hardware moving on to making 4K work, it's only a matter of time before even low-end cards run new games at crazy high frame rates at 1080p.
Surely you can always switch on some poorly tuned "Ultra" setting, designed to sell new video cards, and remedy those high frame rates. ;)
 
Seriously? Think about that for a moment. 1080p used to work well with a Radeon 9700 too..
That's not what I'm saying. I'm saying it will work too well. You didn't have issues with 970s getting hundreds of frames per second at 1080p. The logical step for gamers is higher-resolution monitors with fluid refresh like G-Sync or FreeSync. But not everyone will jump to that; many gamers may have invested a lot into their monitors and don't want to do that again for a long time.
 