AMD Vega Hardware Reviews

Kind of weird that this thread has circled around to what I'd summarize as: having a high-end GPU can adversely affect the user experience if your frame rate is too high for your monitor, because you'll either have to deal with micro-stutters and tearing, or with the increased latency of vsync or fastsync/enhancedsync.
It has because when you buy a VRR monitor, you expect it to perform better than a regular monitor (no tearing, no stutter or lag). This will not happen if your GPU is outputting fps that are far higher than your refresh rate. So you need to take this into account before you buy a VRR monitor. Also when you are trying to subjectively judge experience on a VRR monitor, it's crucial to know whether VSync (or any other similar technique) is on or off.

For example, Kyle now confirmed he had VSync on for both cards. As previously stated, this will set the 1080 Ti back significantly, since its fps is always higher than 100 (in the range of 150, actually), which will manifest as noticeable lag. On the other hand, Vega will probably not suffer as badly (based on Vega FE results), since its fps is around 100 or lower during heavy action scenes.

In the end, the comparison is biased against the more powerful card: you are forcing it to operate at the level of the lower card. That will not reflect well even from a subjective point of view.
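
To put rough numbers on that, here's a back-of-envelope sketch (a toy model only: the 100 Hz ceiling and fps figures are illustrative, and it ignores render queue depth and game-side buffering):

Code:
# Toy model of worst-case added display lag. Within the VRR window the
# panel syncs to the GPU; above it, with VSync on, a finished frame can
# sit in the queue until the next scanout slot.
REFRESH_HZ = 100                          # illustrative VRR ceiling

for gpu_fps in (90, 150):
    frame_time = 1000.0 / gpu_fps         # ms to render one frame
    if gpu_fps <= REFRESH_HZ:
        lag = frame_time                  # VRR: refresh starts when the frame lands
    else:
        lag = frame_time + 1000.0 / REFRESH_HZ   # plus up to one full refresh
    print(f"{gpu_fps} fps -> worst-case added lag ~{lag:.1f} ms")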

What is that chart from? What's the x-axis?
NV's slide deck from Pascal launch, the X-axis is probably frame numbers.
http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/13
 
It has because when you buy a VRR monitor, you expect it to perform better than a regular monitor (no tearing, no stutter or lag). [...]

I find this whole thing kind of odd. So if someone owns a 3440x1440 monitor, or even something smaller, buying a high-end GPU could potentially be a very bad choice? This had basically never dawned on me. Or would fastsync/enhancedsync be the required solution? As far as I understand, those two don't have monitor dependencies, and I suppose there's not much difference at any framerate. Basically, if you outpace your variable refresh monitor, you're kind of stuck in the same boat as if your monitor didn't have variable refresh at all?
 
I find this whole thing kind of odd. So if someone owns a 3440x1440 monitor, or even something smaller, buying a high-end GPU could potentially be a very bad choice? [...]

The solution is obvious: play turn-based strategy games.
 
Performance is not always about the present; it's also about the future. And even if it's true that AMD GPUs age better than Nvidia's, not everyone wants to buy a GPU and wait years for it to reach its real potential.
There is also the settings thing: with a more powerful GPU you can use higher settings, ergo better visual quality, and even things like the noise a card produces change when a game pushes the GPU harder on one card than on the other.

Either way, we are not discussing the performance of the Vega card, since no one knows or has proof about it; we are discussing the uselessness of this blind test. It's like having a drag race with the more powerful car braking to keep pace with the other car.
 
Either way, we are not discussing the performance of the Vega card, since no one knows or has proof about it [...]
True, though I would bet my larger testicle that the "gaming" Vega card will hardly differ from the Vega FE. Ryan and the crew at PCPer seem fairly certain about this as well. It would take an epic level of dumbassery for AMD to release to market a significantly performance-hobbled FE Vega shortly before a much better performing mass-market card, especially when the FE is supposed to be their response to Titan. So if you expect the gaming Vega to be significantly better than the FE, you must assume AMD's Radeon team is staffed with some of the most horrifically incompetent imbeciles in the business world.
 
For example, Kyle now confirmed he had VSync on for both cards. As previously stated, this will set the 1080 Ti back significantly, since its fps is always higher than 100 (in the range of 150, actually), which will manifest as noticeable lag. [...]

So neither of these GPUs could get past V-sync? (V-sync set at 60 Hz: 60 fps, or 75-144 fps.) But you prefer to say this limited the 1080 Ti instead of the AMD one... fair...
 
That's not what I'm saying. I'm saying it will work too well. You didn't have issues with 970s getting hundreds of frames per second at 1080p. The logical step for gamers is higher-resolution monitors with fluid refresh like G-Sync or FreeSync. But not everyone will jump to that; many gamers may have invested a lot into their monitors and don't want to do that again for a long time.
Yeah, I'm not sure when I'll get around to buying one of those gaming monitors myself. I've got a 12-year-old 1920x1200 24" Dell and a 4-year-old 32" 1440p BenQ here. Just not very excited by the G-Sync/FreeSync thing, honestly. Need something else exciting to go with it. I thought I was gonna buy into VR, but that's going nowhere.
 
True, though I would bet my larger testicle that the "gaming" Vega card will hardly differ from the Vega FE. [...]
26 pages of speculation about it, though! Man, gossip overload!
 
Yeah, I'm not sure when I'll get around to buying one of those gaming monitors myself. [...]
Doesn't help that a lot of them are TN panels.
 
[...] Basically, if you outpace your variable refresh monitor, you're kind of stuck in the same boat as if your monitor didn't have variable refresh at all?
If the game doesn't have its own frame capping, yes.
The card could simply clock down when it detected itself blasting through frames. Stay in the upper variable sync range and gamers probably wouldn't notice.

Then there is Chill, which does something similar when high FPS isn't required.
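
For what it's worth, an fps cap is a trivially simple mechanism. A minimal sketch of the idea in Python (render_frame and the fixed iteration count are hypothetical stand-ins, not any actual limiter's code):

Code:
import time

CAP_FPS = 142                  # e.g. just under a 144 Hz panel's ceiling
TARGET = 1.0 / CAP_FPS         # seconds per frame at the cap

def render_frame():
    pass                       # hypothetical placeholder for the game's work

for _ in range(1000):          # stand-in for the main loop
    start = time.perf_counter()
    render_frame()
    spare = TARGET - (time.perf_counter() - start)
    if spare > 0:
        time.sleep(spare)      # burn off the excess so fps stays inside the VRR window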

True, though I would bet my larger testicle that the "gaming" Vega card will hardly differ from the Vega FE. Ryan and the crew at PCPer seem fairly certain about this as well.
That doesn't seem unreasonable. Half the RAM, maybe higher core and memory clocks, and probably better cooling and power limits.

The problem is the assumption that FE won't see an uplift in gaming performance from software and drivers, with compute and pro workloads favoring a different compiler toolchain. RX and FE can both see a driver boost, with RX still being faster.
 
On the other hand, Vega will probably not suffer as badly (based on Vega FE results), since its fps is around 100 or lower during heavy action scenes.

What results are these, exactly? I've only seen 4K results, which have it running at 60+.


What review has 3440x1440 Doom Vulkan testing?

Also Kyle was the one who picked a 1080 Ti instead of the 1080 AMD recommended.
 
Basically, yes.


If the game doesn't have its own frame capping, yes.

I have to say, as an owner of a 144 Hz 1440p G-Sync monitor: hitting vsync at 144 Hz isn't going to be something you notice. It's not a problem, though optimally you cap the framerate to 142 in-game. Vsync at 144 Hz adds about the same input lag as a 60 Hz monitor has without vsync.

Here: https://www.blurbusters.com/gsync/gsync101-input-lag/ (it's 44 ms vs 39 ms). I guess if you are a professional player you might notice input lag jumping from 21 ms to 44 ms.
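
The raw refresh-interval math backs that up (just scanout arithmetic, not the full input-to-photon chain those measurements cover):

Code:
# Extra wait a queued frame can pick up behind VSync at each refresh rate.
for hz in (60, 144):
    print(f"{hz} Hz -> {1000.0 / hz:.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms: at most ~7 ms of queueing at 144 Hz,
# which lines up with the measured 44 ms vs 39 ms totals.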
 
Kind of weird that this thread has circled around to what I'd summarize as: having a high-end GPU can adversely affect the user experience if your frame rate is too high for your monitor, because you'll either have to deal with micro-stutters and tearing, or with the increased latency of vsync or fastsync/enhancedsync. I've never seen matching a GPU to a monitor mentioned in a review before, and I've also never had a GPU where I needed to worry about it. The last time I had a GPU delivering framerates well above my monitor refresh was CS 1.6 on a CRT ;)
Can I throw in a bad car comparison? If you drive through the city at the allowed 50 kph (here in Germany), you might find yourself in the optimal rpm range of your car's engine in 3rd gear for a small car that only goes up to 150 kph max. When driving a more potent car, you might find that at the same 50 kph the rpm is slightly too high for 2nd gear and slightly too low for 3rd gear. When driving a near-racing car (a McLaren F1 or a nice Lambo, for example), you might find yourself in 1st gear all the time, giving you a less-than-optimal experience while driving through the city. OK, you might have enough torque in that case to switch to 2nd or 3rd gear anyway, but you'd be missing the optimum rpm range.

When driving on country roads, things change with 100 kph allowed, while only on "ze djerman autobahn" can you truly use all the horsepower you've got (Vsync off). But I guess you knew that anyway; I just wanted to brag about our autobahns. :)


---
edit: And yes, there ARE sweet spots of GPU performance, monitor refresh and all that stuff. But one truth is: you have multiple ways of lowering fps on your setup to match, for example, a 144 Hz display if you have excess fps. Your options are more limited when it's the other way around: reduce details, overclock.
 
The problem is the assumption that FE won't see an uplift in gaming performance from software and drivers, with compute and pro workloads favoring a different compiler toolchain. RX and FE can both see a driver boost, with RX still being faster.
The PCPer guys basically put this whole "driver not meant for gaming and/or buggy" line to rest. They even dedicated a whole page of the review to "Answering questions before you ask." A couple of relevant bits:

PCPer said:
This isn’t a gaming card.
Calling this "not a gaming card" is a fair statement, as long as you also agree that the GTX Titan, Titan X, and Titan Xp are also "not gaming cards."

The driver isn’t optimized for gaming.
First, that’s not the case and AMD has confirmed that. The driver has all the gaming optimizations that the other Radeon drivers would include up until at least the driver branching mentioned above. After that time, optimizations may or may not have made it in, as AMD tells it. The games we are using for this review were not released in the last 30 days or anything like that. GTA V, Rise of the Tomb Raider, Witcher 3; these are all games that have been out for some time, were around for AMD to address in both Radeon RX 500 and Vega-series drivers for many, many months.

There is also talk of the tiled rasterization not working at the moment, and that this will make a significant difference when it is turned on in the driver. First, we don't even know if they will ever decide to enable it. It could be broken in hardware (unlikely IMO, but similar things have happened before). More likely it will either work in future drivers and provide a slight benefit in some games at higher resolutions, or it actually works fine but the performance gained/power saved is so insignificant they've decided not to bother with it in Big Vega. I am interested to see how this pans out.
 
There is also talk of the tiled rasterization not working at the moment, and that this will make a significant difference when it is turned on in the driver. [...]
There has to be some sort of rabbit still hidden in the hat. Maybe it's the tiled rasterizer, maybe it's something else, but the fact is that the roughly €950-1100 prices are not placeholders (no, e-tailers don't order cards in at placeholder prices), and AMD certainly can't be stupid enough to think anyone would buy a gaming card at that price if it performed in gaming around the level of a GTX 1080.
 
The PCPer guys basically put this whole "driver not meant for gaming and/or buggy" line to rest.
They actually did not.
PCPer made a statement claiming "these are gaming drivers", but then they semi-quote AMD, who simply says "this current driver has all the features that are working reliably in Vega to date".
Meaning the FE driver had all the gaming optimizations AMD deemed stable to release as of a month ago.
What I think PCPer was trying to do was put to rest the theory that AMD was sandbagging on the Vega FE drivers on purpose.

Rewind back to the Computex events and expectations, and there's obviously a delay that took place regarding RX Vega. They knew the drivers in June would be good enough (i.e. functional) to release a "Frontier Edition", but not the gaming card.

Now, what we're two days from figuring out is whether the delay happened because the hardware is broken, or because the drivers wouldn't have been ready for a Computex release plus June availability. We know from the B3D suite tests that there's a bunch of stuff simply not working right (blatant geometry, texel throughput and effective bandwidth issues); it's just a matter of knowing which (if any) will have been successfully enabled with RX Vega's driver.
 
There is also talk of the tiled rasterization not working at the moment, and that this will make a significant difference when it is turned on in the driver. [...]
It's still important to distinguish between game-specific optimizations and higher-level scheduling through the drivers. It's possible the ordering of instructions matters more with Vega: on par with Nvidia's software scheduling, but executing code in short bursts as opposed to each subsequent instruction (outside the cadence) being a new, independent wave. A feature like that could be working in compute and not in graphics. Another possibility could be how divergence is handled, requiring new scalar code paths for synchronization, with compute being optimized around a wave size and graphics more dynamic when packing waves. It also stands to reason there are changes to their work distribution that would probably still work with old paths.

I'd agree the tiled rasterization has the potential to make a difference, but I'd characterize it as more than slight; it matters most for poorly optimized titles in regards to draw order. That seemed to be the big change for Maxwell, along with software scheduling. Of real significance is that tiled rasterization and NUMA optimizations are effectively the same thing. That could be a huge deal for multi-GPU setups, along with HBCC for memory management.
 
[...] That seemed to be the big change for Maxwell, along with software scheduling. [...]
Correct me if I'm wrong, but didn't NVIDIA go to software scheduling already with Kepler?
 