AMD: RDNA 3 Speculation, Rumours and Discussion

Discussion in 'Architecture and Products' started by Jawed, Oct 28, 2020.

  1. neckthrough

    neckthrough Newcomer

    That is not correct. It was *always* about cost. You could always do (and can still do) fancy things in a research lab that would never be commercially viable.
     
    Qesa likes this.
  2. DegustatoR

    DegustatoR Veteran

    Seems pretty straightforward: raytracing on an MSAAed render target. Dunno why it needs a patent though.
     
  3. Albuquerque

    Albuquerque Red-headed step child Veteran

    I get that you're quoting his 1965 paper. The observation he made while at Intel, which famously became his "law", was specifically transistors per IC.
     
  4. neckthrough

    neckthrough Newcomer

    But that paper (or rather article) *is* "Moore's Law". I agree that it's an observation, or a self-fulfilling prophecy. But it all stems from that quote. Transistors-per-IC doesn't really make much sense without cost being a factor. What's an IC anyway? The reticle limit? I'm not aware of any other formulation made by Moore that doesn't include cost. Now, as you correctly point out, there are a host of bastardizations/corollaries. And then there's Dennard scaling, which is about power.

    Regardless, our argument is moot. It's all going to shit. Dennard, Moore, everything.
     
  5. Bondrewd

    Bondrewd Veteran

    Yesssssss.
    And we don't have a clear way out of CMOS necrophilia much the same way DRAM may never truly actually die.
    Shit's pretty grim.
     
  6. trinibwoy

    trinibwoy Meh Legend

    Necessity is the mother of all invention.
     
    Father_Murphy and neckthrough like this.
  7. Scaling is no more possible...
     
    Lightman likes this.
  8. Rootax

    Rootax Veteran

    Maybe at some point this will force a rethink of the uarch instead of "just" increasing the number of units (I know it's not that simple), with maybe a longer life period for the products too, without a new arch every 2 years. My wet dream is somebody going full TBDR to help solve the bandwidth problem, but I guess ImgTec has too many patents for that to happen...
     
    Last edited: Sep 29, 2021
  9. Bondrewd

    Bondrewd Veteran

    Economies of scale demand that anything actually fancy dies.
    Every single CMOS replacement is exactly the kind of fancy that's badly suited to economies of scale.

    You know, DRAM should've been replaced like 30 years ago but alas!
    Still here with no signs of dying.
     
    Lightman likes this.
  10. trinibwoy

    trinibwoy Meh Legend

    Isn't that the case for every new technology? It always starts with poor economies of scale and improves from there. I think it’s way too early to lose faith.
     
  11. Bondrewd

    Bondrewd Veteran

    We've had like a bazillion promising DRAM replacements dying without ever reaching said scale.
    And that's like, replacing pretty basic memory tech.
    Imagine how miserable killing CMOS would be.
    We're basically bending physics over with EUV just to keep a semblance of CMOS scaling going.
    That should about tell you the chances actual CMOS replacements have in the wild.
     
  12. tsa1

    tsa1 Newcomer

    There are some fully optical switches and transistors in development, some of which can even be CMOS-compatible, but I'm not sure any of them will pan out (considering the "successes" of FeRAM, ReRAM and the like)... It'd be pretty naive to expect something as versatile and cheap as current-gen technology to appear without the same level of investment into its production and development (inflation-adjusted, of course). It also doesn't help that science now is mostly short-term / small-project oriented; big ideas and mega-science projects (to borrow from our agitprop language) are few and far between...
     
    Lightman likes this.
  13. Jawed

    Jawed Legend

    It seemed to me this document specifies discarding the ray query for a node when the node is not resident.

    Perhaps the next version of DXR is going to bring partially resident BVHs? Or maybe BVH Query Feedback, similar to Sampler Feedback?
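    As a rough illustration of what "discarding the query on a non-resident node" could mean, here is a hypothetical sketch (not any real DXR API — the node layout, names, and feedback idea are all assumptions) of a traversal that abandons the whole ray query when it touches a BVH node that isn't paged in, rather than faulting:

```python
# Hypothetical sketch, not a real DXR API: BVH traversal that discards the
# entire ray query when it reaches a node that is not memory-resident.
from dataclasses import dataclass
from typing import Optional

MISS, HIT, DISCARDED = "miss", "hit", "discarded"

@dataclass
class Node:
    resident: bool               # is this node's data paged in?
    ray_hits: bool               # stand-in for a real ray/box intersection test
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def trace(node: Optional[Node]) -> str:
    """Walk the BVH; abandon the query at a non-resident node instead of
    faulting, so the caller could report feedback and stream the node in
    (analogous to Sampler Feedback for textures)."""
    if node is None or not node.ray_hits:
        return MISS                      # ray misses this subtree
    if not node.resident:
        return DISCARDED                 # give up: node would need streaming in
    if node.left is None and node.right is None:
        return HIT                       # resident leaf: report a hit
    for child in (node.left, node.right):
        result = trace(child)
        if result in (HIT, DISCARDED):
            return result
    return MISS
```

    A "BVH Query Feedback" mechanism, if it existed, would presumably log which non-resident nodes caused discards so the app could page them in for later frames.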
     
  14. del42sa

    del42sa Newcomer

    Lightman likes this.
  15. Jawed

    Jawed Legend

    Ray tracing performance is all that's going to matter, and so Navi 33 achieving Navi 21 ray tracing performance at 1080p is not going to be enough, as a solid 120-144fps won't be reached.

    67fps for High ray tracing at 1080p (I don't think he found a good test scene though).

    34fps for Psycho ray tracing at 1080p.

    Regarding the second video, he comments that 21.9.1 improved ray tracing performance: "I remember getting way less FPS on 1080p with ray tracing. So something must have improved".
     
    Lightman likes this.
  16. Bondrewd

    Bondrewd Veteran

    Oh noes, no, not even close.
     
  17. Granath

    Granath Newcomer

  18. Jawed

    Jawed Legend

    6900XT is 58% faster than 6600XT at 1080p:

    Sapphire Radeon RX 6600 XT Pulse OC Review - Average FPS | TechPowerUp

    I really don't think AMD is going to put as many as 4096 SIMD ALU lanes in it. It doesn't make sense for 1080p at around 3GHz. Merely putting in twice as much Infinity Cache (i.e. 64MB for 7600XT) would cut that advantage for 6900XT substantially.

    Basically, the triangles have too few pixels on them!
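    A back-of-envelope sketch of the cache argument, with entirely made-up hit rates and bandwidth figures (these are illustrative assumptions, not AMD specs): if traffic served from Infinity Cache comes back at much higher bandwidth than DRAM, raising the hit rate by doubling the cache lifts effective bandwidth without touching the memory bus.

```python
# Back-of-envelope only: all numbers below are assumptions for illustration,
# not AMD specifications.
def effective_bandwidth(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Blend cache and DRAM bandwidth by the fraction of traffic each serves."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * dram_gbps

# Hypothetical 6600XT-class part: 32 MB cache, ~55% hit rate at 1080p.
small = effective_bandwidth(dram_gbps=256, cache_gbps=1000, hit_rate=0.55)
# Same part with the cache doubled to 64 MB, assumed ~70% hit rate at 1080p.
big = effective_bandwidth(dram_gbps=256, cache_gbps=1000, hit_rate=0.70)
print(round(small), round(big), round(100 * (big / small - 1)))  # prints: 665 777 17
```

    Even under these toy numbers, a modest hit-rate bump yields roughly a 17% effective-bandwidth gain, which is why doubling the cache could cut into the 6900XT's 58% lead without adding ALU lanes.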
     
    Lightman, PSman1700 and BRiT like this.
  19. I also don't get the "everything that matters will be RT performance" statements, for GPUs releasing next year.

    After UE5 with Lumen was shown, we started to see big AAA games actually moving away from RTRT or making light use of it. Battlefield V was one of the first showcases of ray tracing, but Battlefield 2042 is skipping it entirely. COD MW Remake and COD Black Ops had ray tracing, but COD Vanguard might be missing out after so many calls for disabling the feature in the previous games. Far Cry 6 apparently uses ray tracing as an afterthought; Halo Infinite AFAIK doesn't have any, nor does Age of Empires 4.

    I'm not suggesting Ray Tracing isn't the future or that it will eventually become the number one performance factor. It just doesn't seem all that important for the games being released within the next couple of years, which is what matters the most for people buying graphics cards in 2022.
     
    Wesker, Lightman and PSman1700 like this.
  20. Bondrewd

    Bondrewd Veteran

    New thing good old thing bad?
    Whatever, really.
    People talking RT in gfx11 context is ugh.
    Maybe a bit too early lads.
    Wait, really?
    damn...
    That's odd, you'd expect DICE to throw at least some visual bling in there; the lads have been doing it for eons now.
     