AMD Vega Hardware Reviews

Discussion in 'Architecture and Products' started by ArkeoTP, Jun 30, 2017.

  1. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,781
    Likes Received:
    2,568
    It has because when you buy a VRR monitor, you expect it to perform better than a regular monitor (no tearing, no stutter or lag). This will not happen if your GPU is outputting fps that are far higher than your refresh rate. So you need to take this into account before you buy a VRR monitor. Also when you are trying to subjectively judge experience on a VRR monitor, it's crucial to know whether VSync (or any other similar technique) is on or off.

    For example, Kyle has now confirmed he had VSync on for both cards. As previously stated, this will set the 1080 Ti back significantly, since its fps is always higher than 100 (around 150, actually), which will manifest as noticeable lag. On the other hand, Vega will probably not suffer as badly (based on Vega FE results), since its fps is around 100 or lower during heavy action scenes.

    In the end, the comparison is biased against the more powerful card: you are forcing it to operate at the level of the slower card. That will not reflect well on it, even from a subjective point of view.
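    The interaction being described can be sketched as a simple decision function. This is purely illustrative (the fps numbers and the 48-100 Hz VRR window are hypothetical, and real drivers also apply low-framerate compensation below the window):

```python
def sync_behavior(fps, vrr_min, vrr_max, vsync_on):
    """Rough classification of what a VRR display does at a given fps.

    Simplified model for illustration; real driver behavior (LFC,
    frame queuing) is more involved.
    """
    if fps > vrr_max:
        # Above the VRR window the display falls back to fixed refresh:
        # VSync on -> back-pressure latency, VSync off -> tearing.
        return "vsync lag" if vsync_on else "tearing"
    if fps >= vrr_min:
        return "vrr"  # inside the window: refresh tracks the frame rate
    return "lfc or stutter"  # below the window

# A card pushing ~150 fps on a 100 Hz VRR panel with VSync forced on:
print(sync_behavior(150, 48, 100, vsync_on=True))  # vsync lag
# A card at ~95 fps stays inside the VRR window:
print(sync_behavior(95, 48, 100, vsync_on=True))   # vrr
```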

    From NV's slide deck at the Pascal launch; the X-axis is probably frame numbers.
    http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/13
     
    #501 DavidGraham, Jul 27, 2017
    Last edited: Jul 27, 2017
    nnunn, homerdog, pharma and 1 other person like this.
  2. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,277
    Likes Received:
    3,725
    I find this whole thing kind of odd. So if someone owns a 3440x1440 monitor or even something smaller, buying a high-end GPU could potentially be a very bad choice? This basically never dawned on me. Or would fastsync/enhanced sync be the required solution? As far as I understand, those two don't have monitor dependencies. I suppose there's not much difference at any framerate. Basically, if you outpace your variable refresh monitor, you're kind of stuck in the same boat as if your monitor didn't have variable refresh at all?
     
  3. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,030
    Likes Received:
    3,101
    Location:
    Pennsylvania
    Basically, yes.

    If the game doesn't have its own frame capping, yes.
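    An in-game frame cap is conceptually just a pacing sleep at the end of each frame. A minimal sketch (the `render_frame` callable and the 142 fps target are illustrative; real engines busy-wait the last fraction of a millisecond for precision):

```python
import time

TARGET_FPS = 142            # just under a 144 Hz panel's VRR ceiling
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_frame, n_frames):
    """Render n_frames, sleeping off any time left in each frame's budget.

    Simplified: time.sleep() is coarse; production limiters combine a
    sleep with a short spin-wait to hit the budget exactly.
    """
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

# Even a trivially cheap "frame" paces out to roughly the target rate.
t0 = time.perf_counter()
run_capped(lambda: None, 142)
print(f"{time.perf_counter() - t0:.2f} s")  # roughly 1 second
```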
     
  4. Mize

    Mize 3dfx Fan
    Moderator Legend Veteran

    Joined:
    Feb 6, 2002
    Messages:
    5,048
    Likes Received:
    1,097
    Location:
    Cincinnati, Ohio USA
    The solution is obvious: play turns-based strategy games.
     
  5. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,030
    Likes Received:
    3,101
    Location:
    Pennsylvania
    Or find-an-object games!
     
    Mize likes this.
  6. xEx

    xEx
    Regular Newcomer

    Joined:
    Feb 2, 2012
    Messages:
    939
    Likes Received:
    398
    Performance is not always about the present; it's also about the future, and even if it's true that AMD GPUs age better than Nvidia's, not everyone wants to buy a GPU and wait years for it to reach its real potential.
    There is also the settings angle. With a more powerful GPU you can use higher settings, ergo better visual quality, and even things like the noise a card produces differ when a game pushes one GPU harder than the other.

    Either way, we are not discussing the performance of the Vega card, since no one knows or has proof about it; we are discussing the uselessness of this blind test. It's like having a drag race with the more powerful car braking to keep pace with the other car.
     
  7. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,153
    Likes Received:
    928
    Location:
    still camping with a mauler
    True, though I would bet my larger testicle that the "gaming" Vega card will hardly differ from the Vega FE. Ryan and the crew at PCPer seem fairly certain about this as well. It would take an epic level of dumbassery for AMD to release to market a significantly performance-hobbled Vega FE shortly before a much better performing mass-market card, especially when the FE is supposed to be their response to Titan. So if you expect the gaming Vega to be significantly better than the FE, you must assume AMD's Radeon team is staffed with some of the most horrifically incompetent imbeciles in the business world.
     
    #507 homerdog, Jul 28, 2017
    Last edited: Jul 28, 2017
  8. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    So neither of these GPUs could get past V-sync? (V-sync set at 60 Hz / 60 fps, or 75-144 fps.) But you prefer to say this limited the 1080 Ti rather than the AMD one... fair...
     
  9. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    Yeah I'm not sure when I'll get around to buying one of those gaming monitors myself. I've got a 12 year old 1920x1200 24" Dell and a 4 year old 32" 1440p Benq here. Just not very excited by the gsync/freesync thing honestly. Need something else exciting to go with it. I thought I was gonna buy into VR but that's going nowhere.
     
    pharma likes this.
  10. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    26 pages of speculation about it though! Man gossip overload!
     
  11. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,985
    Likes Received:
    1,497
    Doesn't help that a lot of them are TN panels.
     
  12. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    The card could simply clock down when it detected itself blasting through frames. Stay in the upper variable sync range and gamers probably wouldn't notice.

    Then there is Chill, which does something similar when high FPS isn't required.

    That doesn't seem unreasonable. Half the ram, maybe higher core and memory clocks, and probably better cooling and power limits.

    The problem is the assumption that FE won't see an uplift in gaming performance from software and drivers. Compute and pro workloads may favor a different compiler toolchain. RX and FE can both see a driver boost, with RX still being faster.
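    The clock-down idea above amounts to a trivial feedback loop. A hypothetical sketch (the clock states and thresholds are made up; this reflects nothing about AMD's actual DPM/Chill implementation):

```python
# Hypothetical DPM-style clock states in MHz; NOT AMD's real tables.
CLOCK_STATES = [852, 991, 1138, 1269, 1348, 1440, 1528, 1630]

def adjust_clock(state_idx, fps, refresh_hz, margin=0.95):
    """Step the clock state down when fps overshoots the VRR ceiling,
    up when it falls below a comfortable fraction of it."""
    if fps > refresh_hz and state_idx > 0:
        return state_idx - 1    # blasting past refresh: downclock
    if fps < margin * refresh_hz and state_idx < len(CLOCK_STATES) - 1:
        return state_idx + 1    # headroom needed: upclock
    return state_idx            # inside the sweet spot: hold

idx = 7                         # start at the top state
idx = adjust_clock(idx, fps=150, refresh_hz=100)
print(CLOCK_STATES[idx])        # 1528: stepped down one state
```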
     
    CarstenS likes this.
  13. BacBeyond

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    73
    Likes Received:
    43
    What results are these exactly? I've only seen 4K results, which have it running at >60 fps.

    What review has 3440x1440 Doom Vulkan testing?

    Also Kyle was the one who picked a 1080 Ti instead of the 1080 AMD recommended.
     
  14. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,004
    Likes Received:
    293
    Location:
    Finland
    I have to say, as an owner of a 144 Hz 1440p G-Sync monitor: hitting vsync at 144 Hz isn't going to be something you notice. It's not a problem, though optimally you cap the framerate to 142 in-game. Vsync at 144 Hz has about the same input lag as a 60 Hz monitor without vsync.

    Here: https://www.blurbusters.com/gsync/gsync101-input-lag/ it's 44 ms vs 39 ms. I guess if you are a professional player you might notice input lag jumping from 21 ms to 44 ms.
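    The frame-time arithmetic behind that claim is easy to check (the Blur Busters figures above are measured, not derived from this; the sketch just shows the refresh-period math):

```python
def frame_time_ms(hz):
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

# One refresh at each rate:
print(f"{frame_time_ms(144):.2f} ms")   # ~6.94 ms per 144 Hz refresh
print(f"{frame_time_ms(60):.2f} ms")    # ~16.67 ms per 60 Hz refresh

# Even two full frames of VSync back-pressure at 144 Hz (~13.89 ms)
# is still less than a single 60 Hz refresh, which is why 144 Hz VSync
# lag lands in the same ballpark as 60 Hz without VSync.
print(f"{2 * frame_time_ms(144):.2f} ms")
```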
     
    Lightman, pharma and DavidGraham like this.
  15. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
    Can I throw in a bad car comparison? If you drive through the city at the allowed 50 kph (here in Germany), you might find yourself in the optimal rpm range of your engine in 3rd gear in a small car that only goes up to 150 kph max. When driving a more potent car, you might find that at the same 50 kph the rpm is slightly too high for 2nd gear and slightly too low for 3rd gear. When driving a near-racing car (a McLaren F1 or a nice Lambo, for example), you might find yourself in 1st gear all the time, giving you a less-than-optimal experience while driving through the city. OK, you might have enough torque in that case to shift to 2nd or 3rd gear anyway, but you'd be missing the optimal rpm range.

    When driving on country roads, things change with 100 kph allowed, while only on "ze djerman autobahn" can you truly use all the horsepower you've got (Vsync off). But I guess you knew that anyway; I just wanted to brag about our autobahns. :)


    ---
    edit: And yes, there ARE sweet spots of GPU performance, monitor refresh and all that stuff. But one truth is: you have multiple ways of lowering fps on your setup to match, say, a 144 Hz display if you have excess fps. Your options are more limited when it's the other way around: reduce details, overclock.
     
    Lightman, pharma and Kaarlisk like this.
  16. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,153
    Likes Received:
    928
    Location:
    still camping with a mauler
    The PCPer guys basically put this whole "driver not meant for gaming and/or buggy" line to rest. They even dedicated a whole page of the review to answering questions before you ask. A couple of relevant bits:

    There is also talk of the tiled rasterization not working at the moment, and that this will make a significant difference when it is turned on in the driver. First, we don't even know if they will ever decide to enable it. It could be broken in hardware (unlikely IMO, but similar things have happened before). More likely it will either work in future drivers and provide a slight benefit in some games at higher resolutions, or it actually works fine but the performance gained/power saved is so insignificant they've decided not to bother with it in Big Vega. I am interested to see how this pans out.
     
    DavidGraham and pharma like this.
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    There has to be some sort of "rabbit in a hat" still hidden. Maybe it's the tiled rasterizer, maybe it's something else, but the fact is that the roughly 950-1100 € prices are not placeholders (no, etailers don't order cards in at placeholder prices), and AMD certainly can't be stupid enough to think anyone would buy a gaming card at that price if it performed in gaming around the level of a GTX 1080.
     
  18. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,985
    Likes Received:
    4,570
    They actually did not.
    PCPer made a statement claiming "these are gaming drivers". But then they semi-quote AMD, who simply says "this current driver has all the features that are working reliably in Vega to date".
    Meaning the FE driver had all the gaming optimizations AMD deemed stable enough to release as of a month ago.
    What I think PCPer was trying to do was put to rest the theory that AMD was deliberately sandbagging the Vega FE drivers.

    Rewind to the Computex events and expectations, and there's obviously a delay that took place with RX Vega. They knew the drivers in June would be good enough (i.e. functional) to release a "Frontier Edition", but not the gaming card.

    Now, what we're two days from finding out is whether the delay happened because the hardware is broken or because the drivers wouldn't be ready for a Computex release plus June availability. We know there's a bunch of stuff simply not working right from the B3D suite tests (blatant geometry, texel throughput and effective bandwidth issues); it's just a matter of knowing which (if any) will have been successfully enabled in RX Vega's driver.
     
    BacBeyond and Anarchist4000 like this.
  19. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    It's still important to distinguish between game-specific optimizations and higher-level scheduling through the drivers. It's possible the ordering of instructions matters more with Vega: on par with Nvidia's software scheduling, but executing code in short bursts, as opposed to each subsequent instruction (outside the cadence) being a new, independent wave. A feature like that could be working in compute and not graphics. Another possibility is how divergence is handled, requiring new scalar code paths for synchronization, with compute optimized around a wave size and graphics more dynamic when packing waves. It also stands to reason there are changes to their work distribution that would probably still work with the old paths.

    I'd agree the tiled rasterization has the potential to make a difference, but I'd characterize it as more than slight; it matters most for titles poorly optimized with regard to draw order. That seemed to be the big change for Maxwell, along with software scheduling. Of real significance is that tiled rasterization and NUMA optimizations are effectively the same thing; that could be a huge deal for multi-GPU setups, along with HBCC for memory management.
     
    BacBeyond likes this.
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    Correct me if I'm wrong, but didn't NVIDIA go to software scheduling already with Kepler?
     
    pharma likes this.