AMD CDNA Discussion Thread

Discussion in 'Architecture and Products' started by Frenetic Pony, Nov 16, 2020.

  1. [attached image: upload_2021-11-10_23-29-29.png]

    This is the one used by Frontier, not the one on page 8 (only the description is on page 8). The CPU connection uses the external IF links.
    One link goes to the Slingshot NIC at 200 Gbps.
     
    Last edited by a moderator: Nov 10, 2021
    Lightman likes this.
  2. trinibwoy

    trinibwoy Meh Legend

    Right, it seems the CPU and NIC connections ride the IF links. So GPU P2P can't use the full 800GB/s in practice.

    Another interesting thing is that IF tops out at 25Gbps per pin over an interposer while NVLink is at 50Gbps per pin over a PCB. Shouldn't IF be much, much faster on the interposer?
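
    Quick back-of-the-envelope on why per-pin rate alone doesn't settle it (the lane counts below are my assumptions for illustration, not confirmed specs):

    ```python
    # Per-pin signaling rate -> raw per-link bandwidth.
    # Assumed lane counts (illustrative, not confirmed specs):
    #   - an IF link: 16 lanes per direction at 25 Gbps per pin
    #   - an NVLink 3.0 link: 4 lanes per direction at 50 Gbps per pin

    def link_gbytes_per_s(gbps_per_pin: float, lanes_per_direction: int) -> float:
        """Raw unidirectional bandwidth of one link in GB/s (ignores encoding overhead)."""
        return gbps_per_pin * lanes_per_direction / 8  # 8 bits per byte

    if_link = link_gbytes_per_s(25, 16)  # 50.0 GB/s per direction
    nvlink = link_gbytes_per_s(50, 4)    # 25.0 GB/s per direction
    print(f"IF link: {if_link} GB/s/dir, NVLink: {nvlink} GB/s/dir")
    ```

    So a wider-but-slower interposer link can still move more data per link; the per-pin rate is a power/reach trade-off, not the whole bandwidth story.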
     
    Lightman likes this.
  3. Jawed

    Jawed Legend

    And NVidia calls its stuff open.

    LOL
     
  4. IF comes in a couple of types: IFOP and IFIS. IFIS is designed for long reach; IFOP is designed for low-energy inter-die links and has higher bandwidth than IFIS.
    Both are designed for low latency rather than bandwidth, owing to their CPU roots.
     
    Lightman, Krteq and trinibwoy like this.
  5. Kaotik

    Kaotik Drunk Member Legend

    NICs are behind MI200's extra PCIe links, not IF.
     
  6. CarstenS

    CarstenS Legend Subscriber

    The SERDES'es used can do both IF (IFIS) and PCIe, but only one at a time. So it's running the PCIe protocol, but blocking one of the eight IF links.
     
    pharma and trinibwoy like this.
  7. Bondrewd

    Bondrewd Veteran

    Nah, only one link contains the actual PCIe root per GCD.
     
  8. CarstenS

    CarstenS Legend Subscriber

    I didn't say otherwise ("The SERDES'es used"). Please refer to context.
     
  9. no-X

    no-X Veteran

    Well, I didn't mention any names, but if you ask… As for your original post about bandwidth:

    For CDNA you chose the lower, unidirectional value (without mentioning it) and a negative connotation:
    For A100 you chose the higher, bidirectional value (again without mentioning it) and a positive connotation:
    It's hard to keep a technical discussion going when half the posts have to be dedicated to deciphering biased and misleading content, mixing apples and oranges, etc.
     
    Wesker, Lightman, Krteq and 1 other person like this.
  10. DavidGraham

    DavidGraham Veteran

    This isn't positive or negative; it was in the context of explaining the power budget. And it definitely wasn't in the context of comparing inter-die link speeds, which have to be fast enough to handle communication between the two GPUs.


    Maybe you should stop digging through years-old posts and start discussing technical points on their merits alone.
     
  11. Granath

    Granath Newcomer

     
    Lightman likes this.
  12. Granath

    Granath Newcomer

  13. Lightman

    Lightman Veteran Subscriber

    Me wants!


    Also, me cannot afford or need one :p
     
    sonen likes this.
  14. xpea

    xpea Regular

    there:
    https://www.hpcwire.com/off-the-wir...-healthcare-research-with-exascale-computing/
     
  15. Bondrewd

    Bondrewd Veteran

    >meme flops
    where's the real deal
     
  16. Rootax

    Rootax Veteran

    Looks like 2 Fury X glued together.
     
  17. CarstenS

    CarstenS Legend Subscriber

    The glue transfers 400 GByte/s (both directions combined).
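
    The 400 GB/s figure checks out if you assume four GCD-to-GCD links, each at 50 GB/s per direction (both numbers are my assumptions for this sketch):

    ```python
    # Sanity-check the "glue" bandwidth between the two GCDs.
    # Assumed topology (illustrative): 4 in-package IF links,
    # each moving 50 GB/s per direction.
    links = 4
    gb_per_s_per_direction = 50

    # Both directions combined, summed over all links.
    bidirectional_total = links * gb_per_s_per_direction * 2
    print(bidirectional_total)  # 400
    ```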
     
    Lightman and digitalwanderer like this.
  18. Rootax

    Rootax Veteran

    That's good glue I'll give you that.
     
  19. Granath

    Granath Newcomer

    Lightman and Krteq like this.
  20. Bondrewd

    Bondrewd Veteran

    Lightman likes this.