AMD Vega Hardware Reviews

Discussion in 'Architecture and Products' started by ArkeoTP, Jun 30, 2017.

  1. kalelovil

    Regular

    Joined:
    Sep 8, 2011
    Messages:
    555
    Likes Received:
    93
    If the HBM2 temperature throttling is the cause of the poor bandwidth results (on the B3D test suite), hopefully the 4-hi stacks on RX will have an easier job of dissipating heat.

    Is it possible to underclock the memory on the FE? I wonder what bandwidth results it would give at half the frequency. IIRC, like GDDR5 (but unlike HBM1), performance can actually drop if the frequency is pushed too high because of error-correction overhead.

    (Although since AMD are comparing RX Vega to the non-Ti GTX 1080 at the roadshow, I don't expect any massive performance increases.)
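
    For reference, a quick back-of-the-envelope sketch (Python) of the theoretical peaks involved, assuming the commonly reported Vega FE memory configuration of a 2048-bit HBM2 interface at 945 MHz; these figures are an assumption for illustration, not measurements:

        # Back-of-the-envelope peak-bandwidth figures, assuming the commonly
        # reported Vega FE memory config: 2048-bit HBM2 bus at 945 MHz (DDR).
        BUS_WIDTH_BITS = 2048   # two 4-hi stacks, 1024 bits each
        MEM_CLOCK_MHZ = 945     # stock memory clock; DDR = 2 transfers/clock

        def peak_bandwidth_gbs(clock_mhz, bus_bits=BUS_WIDTH_BITS):
            """Theoretical peak bandwidth in GB/s for a DDR memory interface."""
            transfers_per_sec = clock_mhz * 1e6 * 2          # double data rate
            return transfers_per_sec * (bus_bits / 8) / 1e9  # bytes/s -> GB/s

        print(f"stock  ({MEM_CLOCK_MHZ} MHz): {peak_bandwidth_gbs(MEM_CLOCK_MHZ):.0f} GB/s")
        print(f"halved ({MEM_CLOCK_MHZ // 2} MHz): {peak_bandwidth_gbs(MEM_CLOCK_MHZ // 2):.0f} GB/s")

    If the measured result at half the clock came in at much more than half of the stock figure, that would support the error-correction/retry overhead theory.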
     
  2. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,996
    Likes Received:
    4,570
    If AMD is using support for VESA's adaptive sync standard as leverage to sell Vega for a lower price/performance ratio than the competition, then shit's gonna hit the fan.
    At any moment nvidia could just enable adaptive sync in the drivers and suddenly every Freesync monitor with DP1.2 would support it.

    nvidia is being an ass for not supporting it at the moment (or rather, by only supporting it in laptops, meaning the hardware capability is there), but AMD is painting a big target on their backs if they're counting on this being an irreversible decision.


    In the end, AMD's marketing keeps doing the strangest things. Zero info on the RX Vega cards, but now they're showing them around in totally random places, inside sealed boxes.
    So here's this sealed box and that sealed box. You probably won't notice any difference between these two sealed boxes using these games and settings we cherry-picked, and you'll totally have to trust us that our sealed box is cheaper.
    What is this? A blind wine tasting of videocards? Because if it is, it's fucking ridiculous IMO.
    Carrying these cards around in vans doing dodgy stuff instead of just giving them to reviewers makes AMD look dodgy as hell.

    And they chose Budapest... do they have any idea of the minimum wage in Hungary? Why not Berlin, London or Paris?
     
    homerdog likes this.
  3. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
    Unfortunately, even with the cooler mod, which kept the HBM gen2 temperature (or whatever it is that HWInfo64 is reporting for this) at a maximum of 57 °C during a B3D-suite re-run, bandwidth figures did not improve. The duration of the individual tests is rather short in terms of heating up an actively cooled ASIC.
     
    #303 CarstenS, Jul 19, 2017
    Last edited: Jul 19, 2017
    T1beriu, pharma and kalelovil like this.
  4. Cat Merc

    Newcomer

    Joined:
    May 14, 2017
    Messages:
    124
    Likes Received:
    108
    On the plus side, it would be a win for consumers if this pushes NVIDIA to support VESA Adaptive Sync. I'd get a Volta in that case :lol:
     
    Silent_Buddha and Kyyla like this.
  5. BacBeyond

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    73
    Likes Received:
    43
    One of the videos I saw said that they can't say much because they are still under embargo. I'm sure we'll see actual reviews and such from people closer to, or just after, the launch at Siggraph (the 30th?).

    Exactly. It would be great for everyone if Nvidia would support Adaptive Sync. Hopefully they will be forced to, with HDMI 2.1 requiring VRR, though I'd need a new monitor for that :p
     
  6. xEx

    xEx
    Regular Newcomer

    Joined:
    Feb 2, 2012
    Messages:
    939
    Likes Received:
    398
    AMD's marketing has always been the worst part of the company. I thought it was fixed, because the Ryzen and Epyc presentations were pretty good in my opinion, but this whole thing with Vega just shows that they still have a lot of work to do if they want to step up and challenge the competition.

    If Vega is bad, just admit it's bad and move on; don't waste money and people's time and patience just to make them realize it's bad in the end. Every day Vega's image gets worse, and it had better be good with that 40% perf. improvement or people will get angry.
     
  7. Cat Merc

    Newcomer

    Joined:
    May 14, 2017
    Messages:
    124
    Likes Received:
    108
    Is it an actual requirement? As in NVIDIA needs to support it to claim 2.1?

    It's obvious AMD and RTG have completely different marketing people.
    That said, it's hard to do good marketing for a product with no redeeming qualities. With Ryzen AMD had a winner, so marketing was much easier.
     
  8. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,996
    Likes Received:
    4,570
    I think something like a Blu-ray player definitely doesn't have to support Game Mode VRR to claim HDMI 2.1 compliance, because it doesn't need to send anything at a variable refresh rate. So I guess it's perfectly possible that nvidia won't support VRR either and will keep pushing the G-Sync nonsense.
     
  9. xEx

    xEx
    Regular Newcomer

    Joined:
    Feb 2, 2012
    Messages:
    939
    Likes Received:
    398
    Good marketing can make sh** shine like gold; bad marketing can make a good product invisible (oh, hi HTC). When you have bad marketing and a not-so-good product (it's not like Vega is the worst GPU ever made, either), you're just asking for negative numbers.

    I think even the FX had better marketing and execution than Vega...

    One thing I cannot understand is all that "poor Volta" stuff in the [strike]forgotten video[/strike] video they released about a year ago... It's not like Vega was a performance champion back then and then got worse; they knew its performance. So unless something in the hardware was broken that they thought would give them a huge advantage and that they thought they could fix in time but couldn't, I can't understand why you would play the "I'll shit on you" card instead of the "damage control" one when you know you'll have an inferior product.
     
  10. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    178
    Likes Received:
    147
    xEx likes this.
  11. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    Not comparable, but most retail overclocked 1080 Tis hit between 310 and 350W (I'm not comparing performance with it, as we'll see what RX Vega delivers in that respect later)... The same goes for a lot of overclocked 1080s, which run well above the "standard version" and which are often the ones used in reviews (250W+).

    Of course it changes from one review to another, but still.
    [images: power-draw charts from reviews]


    The question there is not whether they are comparable performance-wise, but rather that going past 300W is not that "incredible" in itself. Of course, the problem is if they perform like a 1080 and need 300W+...
     
    #311 lanek, Jul 19, 2017
    Last edited: Jul 19, 2017
    BacBeyond likes this.
  12. BacBeyond

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    73
    Likes Received:
    43
    AFAIK yes; unlike Adaptive-Sync, which is an "optional" part of the DP 1.2a spec, VRR is a requirement for HDMI 2.1 output.

    http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
    Wouldn't those just stay at 1.4 or 2.0 or whatever?

    Well, even most 1080s use 210W+.

    [images: power-draw charts]

    And yeah, that's before any custom OC; that's just out of the box.

    All GPUs end up using a lot more power than people think when pushed above their clock-efficiency range. I have a feeling that RX Vega at 1600 MHz will have similar power usage to an OC'd 1080 while performing similarly.
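
    As a rough illustration of that "above the efficiency range" effect: dynamic power scales roughly with C·V²·f, and voltage usually has to rise along with clock, so a toy cube-law model (Python; the linear voltage-with-clock relation is purely an assumption, not a measured Vega or Pascal curve) already shows how quickly a modest OC inflates the power draw:

        # Toy model: dynamic power ~ C * V^2 * f, with V assumed to track f
        # linearly past the efficiency point (an assumption for illustration).
        def relative_power(clock_scale, voltage_scale=None):
            """Dynamic power relative to baseline for scaled clock/voltage."""
            if voltage_scale is None:
                voltage_scale = clock_scale          # assume V rises with f
            return clock_scale * voltage_scale ** 2

        for oc in (1.00, 1.05, 1.10, 1.15):
            extra = (relative_power(oc) - 1) * 100
            print(f"+{(oc - 1) * 100:.0f}% clock -> ~{extra:.0f}% more power")

    Under that assumption, a +10% clock bump already costs roughly a third more power.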
     
  13. xEx

    xEx
    Regular Newcomer

    Joined:
    Feb 2, 2012
    Messages:
    939
    Likes Received:
    398
    Well, the whole point of redesigning an architecture is precisely to improve its efficiency, because you just can't scale the same design to infinity. From what we know, in order to reach the same performance as a 1080 (somewhere between reference and custom cards) it eats more than 400W... I haven't done the math, but how much would it need to reach the 1080 Ti? 500W?

    The problem with Vega is that it eats close to 300W while performing close to a 1070, which draws about 200W or less.

    TDP alone doesn't tell you the whole story; TDP versus performance does, and in that respect Vega is just a "poor Volta".
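
    Putting the thread's ballpark numbers into a quick perf-per-watt comparison (Python; both the performance index and the power figures are the rough values quoted above, not benchmark results):

        # Rough perf/W comparison using the ballpark figures quoted in the
        # thread: Vega roughly matching a GTX 1070 at ~300 W vs ~200 W.
        cards = {
            "GTX 1070": {"perf": 1.00, "watts": 200},
            "Vega":     {"perf": 1.00, "watts": 300},
        }

        baseline = cards["GTX 1070"]["perf"] / cards["GTX 1070"]["watts"]

        for name, c in cards.items():
            ppw = c["perf"] / c["watts"]
            print(f"{name}: {ppw / baseline:.2f}x the 1070's perf/W")

    By those numbers Vega would deliver only about two-thirds of the 1070's performance per watt.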
     
    pharma and DavidGraham like this.
  14. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,032
    Likes Received:
    3,104
    Location:
    Pennsylvania
    I've seen that before and tried to look elsewhere as well; however, I've only ever found Game Mode VRR listed as a "feature", with nothing specifying it as an actual requirement.
     
  15. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    You can't require support for variable refresh rate because any technology that has some kind of flashing behavior (like CRTs) can't do variable refresh rate by definition.
     
  16. BacBeyond

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    73
    Likes Received:
    43
    What CRT has HDMI 2.1 support?
     
  17. Mize

    Mize 3dfx Fan
    Moderator Legend Veteran

    Joined:
    Feb 6, 2002
    Messages:
    5,048
    Likes Received:
    1,097
    Location:
    Cincinnati, Ohio USA
    Does anyone make CRTs anymore?
     
  18. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,728
    Likes Received:
    5,819
    Location:
    ಠ_ಠ
    Do gamers these days know what CRT is?

    ¯\_(ツ)_/¯
     
  19. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
  20. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,032
    Likes Received:
    3,104
    Location:
    Pennsylvania
    There's a CRT with a HDMI port? :shock:

    ok I think we've had enough fun :lol:
     