AMD Vega Hardware Reviews

Interesting! Some of the comments made me think that Vega must have very good and consistent frame times, even if it might output lower FPS.
In this test the FPS is irrelevant, since both GPUs would output over the 100Hz limit of the monitors at all times in Doom Vulkan at that resolution. Even if the Vega did dip below 100Hz on occasion, it wouldn't be noticeable with the adaptive-sync display.
 
If the 1080 Ti system was going above the 100Hz limit while Vega was still within the FreeSync range, the Vega system would appear smoother.

When I play Overwatch, 135 FPS + Freesync >>> 250 FPS, by a wide margin.
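
To put rough numbers on the refresh-window point above, here is a small illustration. The 40Hz lower bound of the FreeSync window is a hypothetical assumption; the thread only states the 100Hz upper limit:

```python
# Frame times vs. an assumed 40-100 Hz variable-refresh window.
freesync_range_hz = (40, 100)  # assumption: the 40 Hz lower bound is hypothetical

for fps in (90, 135, 170, 250):
    frame_time_ms = 1000.0 / fps
    status = "inside" if freesync_range_hz[0] <= fps <= freesync_range_hz[1] else "outside"
    print(f"{fps:>3} FPS -> {frame_time_ms:5.2f} ms/frame, {status} the variable-refresh window")
```

Anything outside the window falls back to fixed-refresh behaviour (tearing or V-Sync), which is the scenario the thread is arguing can make the card with higher raw FPS look less smooth.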
 
Well, at least this abortion of a marketing strategy should put to rest all the hopes of magical "valid drivers with all features enabled!!!1", because there is no way in hell AMD would be doing this if they could go toe-to-toe with Nvidia's lineup.

Surprising that they are testing it with a monitor and not a VR headset, after #radeonuprising swept the streets with a tidal wave of "affordable VR for the masses" last summer and all...
 
Surprising that they are testing it with a monitor and not a VR headset, after #radeonuprising swept the streets with a tidal wave of "affordable VR for the masses" last summer and all...
That marketing was specifically targeting the RX 480 as providing the performance target for VR. I don't believe people associate "affordable" GPUs with enthusiast $500+ cards.
 
Well, that's a useless test and video, except for the fact that these people have convinced themselves of massive differences between the two systems, even though they have most likely picked the one that is the lower performer. Blind tests can cause people to say a lot of stupid things.
 
100Hz monitor @ 3440x1440? At this resolution the 1080 Ti can do 170 FPS, which would exceed the monitor's refresh rate and cause stutter, making it appear worse. Adaptive V-Sync is needed in this situation.


We have stooped so low indeed; in the era of frame times and FPS analysis, we resort to blind tests! Fantastic!
 
100Hz monitor @ 3440x1440? At this resolution the 1080 Ti can do 170 FPS, which would exceed the monitor's refresh rate and cause stutter, making it appear worse. Adaptive V-Sync is needed in this situation.
Ideally wouldn't it be better to just set an fps cap?
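
A minimal sketch of what such a cap looks like in a game loop (the render_frame() stand-in and the exact cap value are hypothetical; the idea is simply to keep frame delivery just under the 100Hz refresh so it stays inside the adaptive-sync range):

```python
import time

CAP_FPS = 97                     # hypothetical cap, a few FPS under the 100 Hz refresh
FRAME_BUDGET = 1.0 / CAP_FPS     # seconds allotted per frame

def render_frame():
    """Placeholder for the game's actual simulation + rendering work."""
    time.sleep(0.004)            # pretend the GPU needs ~4 ms (i.e. ~250 FPS uncapped)

for _ in range(300):             # bounded loop for the sketch; a real game loops until exit
    start = time.perf_counter()
    render_frame()
    # Sleep off whatever budget is left so frames are never delivered faster
    # than the cap, keeping the output inside the variable-refresh window.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```

In practice most people would use the in-game limiter or a driver-level cap rather than rolling their own, but the effect is the same.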
 
All of this is so... WTF is happening. So now, if it "looks good" (with a bad setup for Nvidia, no sync and such), then it's a victory?
Is this how you sell GPUs in 2017?

I'm speechless with the Vega situation / marketing. Own the failure and move on instead of this...
 
Hats off to HardOCP for revealing the uselessness of blind gaming tests ... guess that's why they decided to use a 1080 Ti.
 
100Hz monitor @ 3440x1440? At this resolution the 1080 Ti can do 170 FPS, which would exceed the monitor's refresh rate and cause stutter, making it appear worse. Adaptive V-Sync is needed in this situation.


We have stooped so low indeed; in the era of frame times and FPS analysis, we resort to blind tests! Fantastic!

In that case shouldn't G-Sync work to avoid stuttering? There was a G-Sync monitor coupled with the 1080 Ti in the blind test.
 
All of this is so... WTF is happening. So now, if it "looks good" (with a bad setup for Nvidia, no sync and such), then it's a victory?
Is this how you sell GPUs in 2017?

I'm speechless with the Vega situation / marketing. Own the failure and move on instead of this...

How/why is this a bad setup for Nvidia? The HW was identical except for the gfx card and the G-Sync vs FreeSync monitors, and the OS was reinstalled by Brent.
 
How/why is this a bad setup for Nvidia? The HW was identical except for the gfx card and the G-Sync vs FreeSync monitors, and the OS was reinstalled by Brent.

He may be talking about the screen resolution being a bit low for what a GTX 1080 Ti can do.
 
AMD is just testing how much of a difference in GPU performance we need with adaptive sync before we start noticing it.

By the same logic we could make a blind test of a 480 vs a 1080 Ti and cap both at 60Hz (with games and settings we know the 480 can do at 60Hz), but that doesn't mean the 480 is equal to a 1080 Ti; it just shows how useless the methodology is.

I'm actually surprised that, with all the time AMD has had since they've known Vega's performance (6 months or more?), they couldn't come up with a better strategy... some heads in the marketing department need to roll if they want to at least compete in that regard.
It would be funny if Intel fired the marketing team responsible for slidegate and AMD hired them :LOL:
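
To illustrate the 60Hz cap point above with made-up numbers (the uncapped figures here are assumptions, not measurements):

```python
# With both cards capped at the display's 60 Hz, the viewer sees identical pacing
# regardless of how much headroom each card actually has.
uncapped_fps = {"RX 480": 70, "GTX 1080 Ti": 170}   # hypothetical uncapped throughput
cap_hz = 60

for card, fps in uncapped_fps.items():
    shown = min(fps, cap_hz)
    print(f"{card}: could render {fps} FPS, viewer sees {shown} FPS ({1000.0 / shown:.1f} ms/frame)")
```

Both systems present 16.7 ms frames, so a blind viewer has nothing to tell them apart by, which is the criticism of the methodology.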
 
He may be talking about the screen resolution being a bit low for what a GTX 1080 Ti can do.
1440p is a fairly popular resolution for gaming, and the blind tests AMD had been showing were against the 1080, where performance is currently equivalent. Regardless, the blind tests are representative of what was tested.

I'm actually surprised that, with all the time AMD has had since they've known Vega's performance (6 months or more?), they couldn't come up with a better strategy...
There is something to be said for being purposely chaotic so a competitor can't predict your actions.
So far they are presenting what appears to be a fair, neutral gameplay experience without bringing raw performance into the equation. Other than being subjective by definition, there is nothing wrong with that.

It would be funny if Intel fired the marketing team responsible for slidegate and AMD hired them
What would really be funny is if AMD had glue in the background of every Ryzen product demonstration and owned it.
 
As I said, I could make a blind test where people see no difference between a 480 and a 1080. That of course is not representative of the actual performance of either card, just me choosing a scenario where people can't tell the difference. Nothing more.

Sent from my 2PS64 using Tapatalk
 
As I said, I could make a blind test where people see no difference between a 480 and a 1080. That of course is not representative of the actual performance of either card, just me choosing a scenario where people can't tell the difference. Nothing more.

Sent from my 2PS64 using Tapatalk
Isn't that an argument for benchmarks being superior to actual experience? Essentially, image quality doesn't matter so long as framerates are high. Actual performance so far hasn't been the point of these demonstrations, as no framerates are disclosed.

On an unrelated note, can the pro vs gaming toggle on the FE switch between different versions of the gaming drivers? Selecting between 17.1 and 17.7, for example, without reinstalling.
 
I liked Kyle's video! It wasn't scientific, it was highly subjective and of dubious value regarding the card itself, but it was definitely interesting, and I gotta admit Kyle's interviews were a whole lot more balanced and fair than I thought they would be.

It doesn't prove anything, but damned if it wasn't fun for me to watch. :)
 
One person noted the AMD setup being more responsive. I wonder if there is any actual difference in input lag in this case, whether from the monitor side, G-Sync/FreeSync, the Doom Vulkan implementation, or AMD software vs Nvidia software?
 