Speculation: GPU Performance Comparisons of 2020 *Spawn*

Discussion in 'Architecture and Products' started by eastmen, Jul 20, 2020.

  1. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,214
    Likes Received:
    1,202
    I was aware - I just pointed out that it doesn’t say anything meaningful without actual information. Of course there are faster products in the pipeline from all (2) pertinent players. Just saying that there is a faster prototype product in the wild - well duh. Which means that you can also pose as having one, since either AMD or nVidia is bound to release something vaguely like it and your credibility will be unharmed. In fact, you will have gained insider status, and can troll forums with cryptic statements forevermore.

    Edit: Removed irrelevant stuff.
     
  2. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    738
    Likes Received:
    229
    Location:
    india
    I'm not sure what maths they're doing, but the 2080Ti is 50% faster than the 5700XT at 4K, so even if "Big" Navi is twice as fast as a 5700XT it'd still only be about 33% faster than a 2080Ti.

    https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-tuf-evo/28.html
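
    As a quick sanity check of that ratio (a minimal sketch in Python; the rough 4K figures come from the review linked above):

        # Relative 4K performance, 5700XT normalised to 1.0 (rough figures from the review linked above)
        perf_5700xt = 1.0
        perf_2080ti = 1.5                    # 2080Ti ~50% faster than the 5700XT at 4K
        perf_big_navi = 2.0 * perf_5700xt    # hypothetical: Big Navi at exactly 2x 5700XT
        print(perf_big_navi / perf_2080ti)   # ~1.33 -> only ~33% ahead of the 2080Ti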

    The Big Navi specs also look a little too optimistic for 2x 5700XT performance (15:15) with just a 427mm2 die and a 384-bit bus.

    I can see RDNA2 cards doing really well on the clockspeed front, with the PS5 GPU running over 2.2GHz and Vega in Renoir doing 2.1GHz stock, with a 2.4GHz overclocked 3DMark result floating around. So performance could scale "linearly" with the increase in unit count.



    Otoh, leakers are saying the 3080 will be doing 2080Ti + 20%, and the 3080Ti/3090 should be another 20-30% on top of that, i.e. roughly 44-56% over the 2080Ti, though RT performance would be more relevant anyway.
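
    Compounding those leaked uplifts works out as follows (a minimal sketch; the leaked percentages themselves are of course unverified):

        # 3080 rumoured at 2080Ti + 20%; 3080Ti/3090 rumoured at another 20-30% on top of that
        rtx_3080 = 1.20
        rtx_3090_low, rtx_3090_high = rtx_3080 * 1.20, rtx_3080 * 1.30
        print(rtx_3090_low, rtx_3090_high)   # 1.44, 1.56 -> roughly 44-56% over the 2080Ti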
     
    Konan65 and pjbliverpool like this.
  3. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Way bigger.
    To put it rather bluntly, yeah.
    Those are real speed demons.
    I'm sorry to disappoint you but nV isn't breaking 400W in client this year.
    Maybe next one.
     
  4. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,683
    Likes Received:
    196
    GeForce Hopper in 2022.
     
  5. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,582
    Likes Received:
    2,309
    Thought the rumor for Hopper's release was sometime in 2021.
    https://wccftech.com/nvidias-hopper-architecture-will-be-made-on-tsmcs-5nm-process-launching-2021/
     
    Lightman likes this.
  6. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    738
    Likes Received:
    229
    Location:
    india
    A review by an MSI engineer has the iGPU in the 4750G overclocking to 2.65GHz:

    https://linustechtips.com/main/topic/1225149-amd-ryzen-7-pro-4750g-review/

    Over 500mm2 or over 700mm2?

    Maybe 400W won't be broken at stock configuration, but going close to 300W would be enough for it to be bad on perf/W compared to the 2080.
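
    For reference, the perf/W comparison is just a ratio of ratios (a minimal sketch; the 2080's ~215W board power and the 1.4x performance figure are placeholder assumptions of mine, not numbers from this thread):

        # Perf/W of a candidate card relative to a baseline (here an RTX 2080 assumed at ~215W)
        def perf_per_watt_ratio(rel_perf, power_w, base_perf=1.0, base_power_w=215.0):
            return (rel_perf / power_w) / (base_perf / base_power_w)

        # Placeholder example: a card 1.4x faster than a 2080 while drawing ~300W
        print(perf_per_watt_ratio(1.4, 300.0))   # ~1.0 -> whether that's "bad" depends entirely on the real uplift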
     
  7. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Yeah at this point idk what AMD GPU teams are snorting, but they've learned to design some speedy circuits recently.
    A bit over 500mm^2; >700mm^2 stuff is reserved for DC at AMD.
    It will be very hot, yes; but not bad in perf/W.
    nV just had to do something with what they had and this was the answer.
    At least Ampere will be cheaper, since both the xtors and the market demand it.
     
    Lightman likes this.
  8. JoshMST

    Regular

    Joined:
    Sep 2, 2002
    Messages:
    467
    Likes Received:
    25
    I'm certainly hoping that lessons learned from the integrated Vega in the 4000 series will have been ported into RDNA-2. Hard to say considering release dates, but cross pollination is certainly something that is stressed between design groups.

    So many questions though on RDNA-2 and how it can be another big jump from the original. Also curious how NV will be able to address challenges in 7nm design, since it is such a big jump from current 16/12.

    All I really want is competition in this space again and some parity in feature sets that gives consumers some real choices between the two.
     
    Lightman, CeeGee and BRiT like this.
  9. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Those were the training grounds.
    They already did with A100.
    Choice doesn't matter, 80% of the market will get nVidia anyway.
    AMD only ever exists to keep nV prices down.
     
  10. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    2,737
    Likes Received:
    1,843
    Can actually change, the same way it has on the CPU side.
    Will take a lot of good execution and stars aligning though. Something we haven't seen much of on the GPU side, but you never know.
     
    Lightman likes this.
  11. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Not even bargain-bin Polaris moved the market share % needle. People don't want to buy Radeon, they want them to exist and be "competitive", whatever that means in their minds.
    Let's see, they'll have 3 major uArch updates and 2 full node shrinks in 3 years, 2019-2021. Not accounting for the fancy DC stuff they're cooking too.
    Is that enough?
    I hope it's enough.
     
  12. Samwell

    Newcomer

    Joined:
    Dec 23, 2011
    Messages:
    127
    Likes Received:
    154
    Sounds interesting, but which company are you talking about that's doing this? Because it won't be AMD, at least outside of someone's dreams. Only if you count steps like GCN1.0 to GCN1.1 as major arch updates and extend the timeframe to 2022.
     
  13. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Yeah.
    Yeah.
    I ain't even talking DC roadmaps yet.
    Almost makes me wish their software was ready by now and not late '21, but alas.
    Roadmaps and no less than one taped out N5 die aren't dreams, unfortunately.
    Oh noes, no.
    One's gotta smash and fuck when they're behind.
     
    disco_ likes this.
  14. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    212
    Likes Received:
    189
    There was talk that the snorting stuff is coming over from the CPU know-how realms. It's *a bit* late given it's been 14 years since the AMD+ATi merger, but better late than never, I guess.
    The HD 4000 series did that trick, combining low cost, high performance and quite good drivers. People jumped on the AMD bandwagon rather quickly back then. Nowadays, Polaris is rebranded technology that's over 4 years old and wasn't stellar even at launch.
     
  15. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Yeah, they borrowed quite a lot of the CPU teams' design methodologies.
    And devised some new ones internally.
    It seems to be very much working.
    Yeah, this is very pre-Lisa AMD.
    But they're a wiser bunch now.
    Yeah, the problem is this was before the GTX970 permanently fucked this market up.
    Sorry, I just don't have high hopes for AMD gaining a lot of traction in DIY GPU these days.
    OEMs? That might be a fair shot.
     
  16. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,505
    Likes Received:
    251
    Location:
    msk.ru/spb.ru
    There's nothing I can imagine stopping this from happening besides the simple fact that their GPUs aren't actually that much better than NV's, despite what some people are saying about them.
     
    psurge likes this.
  17. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,739
    Likes Received:
    923
    I read today that a 3080 is going to be 50% faster than a 2080, and 20% faster than a 2080Ti. Sounds too good to be true?
     
  18. P_EQUALS_NP

    Joined:
    Jun 17, 2020
    Messages:
    8
    Likes Received:
    1
    It is a full node shrink, so 50% more seems realistic, but I also expect a lot of the performance to come from a frequency boost rather than from 50% more cores.
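
    Those two rumoured numbers are at least self-consistent (a minimal sketch; the ~25% 4K gap between the 2080Ti and the 2080 is my own rough assumption, not a figure from this thread):

        # Consistency check of the leaked uplifts
        rtx_3080_vs_2080 = 1.50      # rumour: 3080 = 2080 + 50%
        rtx_2080ti_vs_2080 = 1.25    # assumption: 2080Ti ~25% faster than a 2080 at 4K
        print(rtx_3080_vs_2080 / rtx_2080ti_vs_2080)   # ~1.2 -> matches the "+20% over the 2080Ti" claim

        # Rough decomposition assuming perfect scaling: total uplift ~= clock gain x unit-count gain
        print(1.25 * 1.20)   # e.g. +25% clocks with only +20% more units already lands near 1.5x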
     
    PSman1700 likes this.
  19. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    771
    Likes Received:
    361
    Their brand is very dead, and it'll probably take one more launch for people to start maybe considering Radeon as something besides an nVidia cost-cutting measure.
    No, but they've cranked the power up, which is maybe unwise.
    AMD? Turing backlash? Who knows.
     
  20. madhatter

    Joined:
    Jul 23, 2020
    Messages:
    1
    Likes Received:
    0
    I would hope that it’s true, given the rumor that it uses GA102 and the rumored power draw.
     