The AMD Radeon RX Vega was tested on a Ryzen 7 system inside a closed PC case. Attendees were not able to see the graphics card at all and could play only one gaming title: DICE’s Battlefield 1, running at a resolution of 3440×1440. The card was pitted against a second PC running an NVIDIA GeForce GTX 1080 (reference card). This was confirmed by an AMD representative, who said that a GTX 1080 (non-Ti) was being used against the Radeon RX Vega graphics card.
Both systems were running a curved display, one with G-Sync and one with FreeSync, so it follows that the G-Sync system had the GeForce installed and the FreeSync system had the Radeon installed. However, there was no indication of which system was the FreeSync + Radeon setup and which was the G-Sync + GeForce setup, and both monitors were fully covered, so even the specific models weren’t revealed.
Moving on, one system did face a little hiccup and was performing worse. There is no way to tell which system that was, but we know that a single GeForce GTX 1080 averages over 60 FPS at 4K with Ultra settings. The tested resolution of 3440×1440 pushes roughly 40% fewer pixels than 4K, so the GTX 1080 was likely performing even better here. On top of that, AMD not showing any FPS counter and capping the frame rate through adaptive sync suggests there may have been a problem with the Radeon RX Vega system keeping up with the GeForce part.
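As a quick sanity check on that resolution argument, here is a minimal pixel-count sketch; the only inputs are the two resolutions mentioned above, and the 60+ FPS figure at 4K comes from published GTX 1080 benchmarks rather than anything shown at AMD's event.

```python
# Rough pixel-count comparison between the tested resolution and 4K.
ULTRAWIDE = 3440 * 1440   # resolution used in AMD's blind test
UHD_4K    = 3840 * 2160   # resolution of the 60+ FPS GTX 1080 benchmarks

ratio = ULTRAWIDE / UHD_4K
print(f"3440x1440 renders {ratio:.0%} of the pixels of 4K "
      f"(about {1 - ratio:.0%} fewer)")  # ~60% of 4K, i.e. ~40% fewer pixels
```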
- The AMD rep guy was asked and he said it’s a GTX 1080 non-Ti against the RX Vega
- We were given 2 systems with an RX and a GTX to play BF1 on.
- They do use FreeSync and G-Sync and yes, there were no FPS counters. From my experience there were no FPS drops on either system.
- There was a little hiccup, but they resolved it in an instant and from my experience and many others the difference was unnoticeable. Mind you we were not told and are not going to be told which setup is which.

— via Szunyogg on Reddit
Lastly, AMD reps told the public that the AMD system is $300 US cheaper overall, which implies two things. First, AMD is pricing the average FreeSync monitor at around $200 US less than a comparable G-Sync monitor. Subtract that $200 from the $300 AMD quoted and the Radeon RX Vega could launch as much as $100 US cheaper than the GeForce GTX 1080, which would be a good deal. However, AMD revealed nothing beyond that: no performance numbers in other titles, no power consumption figures, and, most importantly, no word on what clocks Vega was running at, which is a little disappointing.
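To make the arithmetic behind that claim explicit, here is a minimal sketch: the $300 system-price gap comes from AMD's statement, while the roughly $200 G-Sync-versus-FreeSync monitor premium is the assumed average used above, not a figure AMD confirmed.

```python
# Back-of-the-envelope math behind the claimed price gap.
SYSTEM_PRICE_GAP = 300   # AMD: the whole Vega system is $300 cheaper
MONITOR_PREMIUM  = 200   # assumed extra cost of a G-Sync monitor (not confirmed by AMD)

# Whatever the monitor doesn't explain must come from the GPU,
# assuming the rest of both builds is otherwise identical.
implied_gpu_gap = SYSTEM_PRICE_GAP - MONITOR_PREMIUM
print(f"Implied RX Vega discount vs. GTX 1080: ${implied_gpu_gap}")
```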