Whole lot of stuff that isn't about AMD Navi *cleanup*

Discussion in 'Architecture and Products' started by Bondrewd, May 23, 2020.

  1. Bondrewd

    Veteran Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    1,043
    Likes Received:
    441
    Idk, AMD isn't making any money while Intel is posting record quarters.
    ~Nothing changed~
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  2. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,246
    Likes Received:
    3,191
    Location:
    Finland
That's just bullcrap and you know it very well. AMD has clawed back a lot of market share, grown revenue substantially, and paid down two-thirds of its debt in three years thanks to Zen.
     
    Lightman, BlackAngus, Picao84 and 2 others like this.
  3. Bondrewd

    Veteran Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    1,043
    Likes Received:
    441
    Like 3% server and some 4-5% client over 3 years is about as unimpressive as it gets.
    Good margin expansion and debt control, yeah, but still.
    Meh.
    One cannot kill the mountain; you gotta grind year over year and all.
Yeah, it's an absolute killer; makes me wish for wider N23 adoption come early next year.
But it won't happen.
     
    Cuthalu and PSman1700 like this.
  4. techuse

    Regular Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    361
    Likes Received:
    191
GCN was better than Nvidia's alternatives at the majority of price points. Higher power consumption doesn't matter to consumers; performance and price do. AMD typically beat Nvidia at both whenever it competed in a given product segment.
     
  5. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,243
    Likes Received:
    1,244
That Goliath had drunk himself senseless on his spoils of victory, passed out, and was still sleeping when David nailed him with the rock. I mean, he was just lying there!
Nvidia is a Goliath who has given up on fighting, really; he wants to calculate the orbits of the stars in the sky instead, and there's just this annoying guy pestering him with slung rocks whom he has to slap down every so often to get back to tinkering with the stuff he's really interested in these days.

    Totally different situations.
     
  6. Bondrewd

    Veteran Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    1,043
    Likes Received:
    441
Yeah, the latter is more of a Qualcomm-tier happening, and we all know what happened to QC's GPU lead.
     
  7. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    439
    Likes Received:
    108
    Source?
     
  8. techuse

    Regular Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    361
    Likes Received:
    191
    GPU A uses 150 watts. GPU B uses 215 watts. If GPU B is faster and will maintain its performance much better going forward, what logical reason would any consumer have to care about power usage?
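To put rough numbers on that argument, here's a minimal sketch; the fps figures are invented purely for illustration:

[code]
# Hypothetical cards matching the scenario above; the fps numbers are
# invented for illustration only.
gpu_a = {"name": "GPU A", "watts": 150, "avg_fps": 60}
gpu_b = {"name": "GPU B", "watts": 215, "avg_fps": 95}

for gpu in (gpu_a, gpu_b):
    # Efficiency as average frames delivered per watt of board power.
    gpu["fps_per_watt"] = gpu["avg_fps"] / gpu["watts"]
    print(f"{gpu['name']}: {gpu['fps_per_watt']:.3f} fps/W")

# Output: GPU A: 0.400 fps/W, GPU B: 0.442 fps/W.
# GPU B draws ~43% more power but is ~58% faster here, so it is also the
# more efficient card -- absolute wattage alone says nothing about which
# card is the rational buy.
[/code]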
     
  9. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    344
    Likes Received:
    316
    Look, let's double down.

GPU A uses <150 watts, can only play most games at medium settings at 1080p, and misses the 60 fps mark fairly often.
GPU B uses 215 watts and is 50 times faster. Plays every game maxed, 16K resolution, 144+ fps.

I would take GPU A every day, because I don't give a fuuuge about how much faster card B is if I'm not going to be able to actually play on it without sweating like a pig nine months a year where I live. As simple as that.
     
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,683
    Likes Received:
    502
    Location:
    msk.ru/spb.ru
    The obvious catch here is that GPU B isn't really faster.
     
  11. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    16,927
    Likes Received:
    16,825
    Let's triple down...

GPU B can only be bought in one particular place...

     
    Picao84, pharma, PSman1700 and 2 others like this.
  12. techuse

    Regular Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    361
    Likes Received:
    191
In what reality do AMD GPUs heat up an entire room because they use 50-75 extra watts?

In the case of GCN vs Pascal/Maxwell/Kepler, they certainly are faster.
     
  13. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,683
    Likes Received:
    502
    Location:
    msk.ru/spb.ru
No, they aren't. In all of these cases NV parts were on the same performance level or actually faster (980 Ti etc.) at launch - which coincidentally is when the majority of PC gamers tend to make an upgrade.
Kepler fell back over time due to architectural issues, but both Maxwell and Pascal are doing just fine even today.
     
  14. techuse

    Regular Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    361
    Likes Received:
    191
The 980 Ti was the one price point Nvidia did well at. At the lower price points the AMD offerings were clearly better. Maxwell and Pascal are both exhibiting the same performance drop-off as Kepler, just to a lesser extent.
     
  15. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    344
    Likes Received:
    316
What kind of stupid question is that? All cards do; everything that produces heat does, in our one and only reality. The higher the wattage, the harder it becomes to cool the room down. And not everybody can have centralised AC systems, you know...
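For what it's worth, the physics is simple: essentially every watt the card draws ends up as heat in the room. A rough sketch, where the 150 W figure for the rest of the system is an assumption for illustration:

[code]
# Essentially all electrical power a PC draws is dissipated into the
# room as heat; 1 W of continuous draw adds ~3.412 BTU/h of cooling load.
BTU_PER_HOUR_PER_WATT = 3.412

def room_heat_load_btu(gpu_watts, rest_of_system_watts=150):
    """Rough cooling load (BTU/h) a gaming PC adds to a room.

    rest_of_system_watts (CPU, board, PSU losses, monitor) is an
    assumed figure for illustration.
    """
    return (gpu_watts + rest_of_system_watts) * BTU_PER_HOUR_PER_WATT

for gpu_watts in (150, 215):
    print(f"{gpu_watts} W GPU -> ~{room_heat_load_btu(gpu_watts):.0f} BTU/h")

# The 65 W delta between the two cards works out to ~220 BTU/h -- about
# one extra incandescent bulb running continuously. Whether that tips a
# room past comfortable depends on the room and the climate, which is
# why both sides of this argument can be right.
[/code]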
     
  16. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,683
    Likes Received:
    502
    Location:
    msk.ru/spb.ru
    1060 was faster than 480 at launch.
    1080 was faster than Vega 64 at launch (and now actually).
    970 was faster than 290X.
    2080 was faster than Radeon VII.
    Should I continue or do you get the picture?

No, they are not exhibiting anything close to what Kepler was showing at this point in its life.
     
  17. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,238
    Likes Received:
    823
If you had any sense and enough money, you'd take B and enable vsync.

Ease up on the stupid. Not too long ago we were using multiple 60 W and 100 W bulbs in a single room, and I don't think a light bulb was ever the tipping point for turning on the AC. But I agree that I'd prefer lower power draw, because that usually means quieter fans.
     
    #17 Pete, May 24, 2020
    Last edited: May 24, 2020
    w0lfram and xz321zx like this.
  18. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,780
    Likes Received:
    914
    Location:
    New York
4K gaming is already here, so a true disruption would require lowering the cost of entry. The 2080 Ti is the only card that consistently, but barely, delivers 4K 60 fps in today's games. Is AMD going to give us 4K 60 fps in Cyberpunk 2077 for $500?
     
    w0lfram likes this.
  19. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    344
    Likes Received:
    316
    r/whoosh

It can, actually. When we had incandescent bulbs (when was that, 20+ years ago?) we tended not to turn on the lights in summer unless strictly necessary, or we'd open doors/windows, etc. That's not an option while doing an activity that has the potential to wake up the entire building (i.e. MP chat).
     
  20. Leovinus

    Newcomer

    Joined:
    May 31, 2019
    Messages:
    125
    Likes Received:
    53
    Location:
    Sweden
Instead of bickering about acceptable power/heat under highly subjective circumstances, isn't it possible to agree that, as a general rule of thumb, the better the performance per watt, the better the product for the vast majority? Higher efficiency has a generally positive impact on both the graphics hardware itself and the system required around it.

As it stands, the power/performance difference between AMD and nVidia isn't extreme, which is well and good so long as node efficiency means little to the average consumer. But node efficiency will come to the fore once nVidia releases its 7nm product, unless AMD can close the assumed gap that's about to open between RDNA1 and Ampere.
     
    TheAlSpark likes this.