Did anyone ever figure out why R600 has a 512-bit bus?

Discussion in 'Architecture and Products' started by nicolasb, Oct 3, 2007.

  1. nicolasb

    Regular

    Joined:
    Oct 21, 2006
    Messages:
    421
    Likes Received:
    4
    What's it for?

    It surely can't just be because ATi engineers lack the imagination to think in numbers that aren't powers of 2, can it?

    And yet, R600 really doesn't seem to get a great deal of benefit from all that bandwidth. For RV670 (according to rumours) we're looking at something like an R600 on a 256-bit bus running at close-to-R600 speeds. The things that you would expect R600 to be able to do with the bandwidth (high AA and AF) are actually its greatest weaknesses. So: what was ATI thinking? :)
     
  2. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    Marketing ? ;)
     
  3. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,443
    Likes Received:
    649
    Location:
    Japan
    Maybe to figure out how to do it right? Build a 512-bit card now, while the card doesn't really make use of it, so that your next card, the one that will actually need it, has fewer (or no) problems using a 512-bit bus?
     
  4. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    For feeding the very high frequency clocks that the R600 turned out not to be able to support. I suspect it might also end up being a proof of concept that we'll see again when they start putting multiple chips on a card. When the RV670 X2 arrives, it should be 2x256-bit if a single chip needs 1x256-bit, so maybe 1x512-bit won't be too far a stretch at some point in the future, maybe even for R700.
     
    #4 Bouncing Zabaglione Bros., Oct 3, 2007
    Last edited by a moderator: Oct 3, 2007
  5. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    That would be a very expensive (and very public...) way of conducting "experiments".
     
  6. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,443
    Likes Received:
    649
    Location:
    Japan
    Isn't that why they'll make an RV670 with a 256-bit bus? :p
     
  7. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    I don't follow...
    What does one have to do with the other?
    RV670 ("V" as in Value, I guess) is a mainstream product; it was designed that way.
    Not so with R600.
     
  8. fellix

    fellix Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,489
    Likes Received:
    400
    Location:
    Varna, Bulgaria
    As SirEric mentioned a while ago, the R600 marchitecture is optimized for 64-bit "granularity" in almost every general direction, so they opted for an 8*64-bit access pattern to the local memory. And, yes, there is/was a bit of benchmarketing involved here, as it's clear that pure fill-rate numbers aren't going to saturate this uber-wide interface, but could you really believe someone would design so complex a PCB just for the numbers on an advertising slide? ;)

    And now the geek's quiz: "Hey, how about that massive stream-out & virtualization thingie, that those 64*5D shaderz were supposed to tap through this 100+GB/s path..."? :D
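fellix's 8*64-bit layout can be sanity-checked with a little arithmetic. The channel count and width come from the post above; the GDDR3 effective data rate is an assumed figure for illustration only:

```python
# Eight independent 64-bit channels form R600's 512-bit external interface.
channels = 8
channel_bits = 64
total_bits = channels * channel_bits          # 512

# Assumed GDDR3 effective rate (~828 MHz DDR), for illustration only.
effective_mtps = 1656

# Bytes per transfer * transfers per second -> bandwidth in GB/s.
per_channel_gbs = (channel_bits / 8) * effective_mtps / 1000   # ~13.2 GB/s
total_gbs = (total_bits / 8) * effective_mtps / 1000           # ~106 GB/s

print(total_bits, round(per_channel_gbs, 1), round(total_gbs, 1))
```

The aggregate figure lines up with the "100+GB/s" number quoted in the post.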
     
  9. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    R600 is "optimized"? :wink:
     
  10. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,298
    Likes Received:
    247
    The question is not "why", but "why not". It seems that implementing a 512-bit memory interface wasn't any more difficult for ATi than implementing the 384-bit bus was for nVidia; the HD2900XT's PCB is even shorter. I think the main reason was the uncertain situation with fast GDDR4, which would have been necessary for a 256-bit part. And the last reason: a 512-bit bus isn't necessary, but I think it helps occasionally (e.g. when the XT is on par with the GTX/Ultra).
     
  11. dizietsma

    Banned

    Joined:
    Mar 1, 2004
    Messages:
    1,172
    Likes Received:
    13
    There's a very simple answer to this.

    The previous card had a 512-bit bus and GDDR4 memory, so if AMD had released a card with a 256-bit bus and GDDR3 memory, people would have thought what? Hello, Mr. Midrange!

    Big numbers sell cards at the end of the day, even if nobody actually knows what they relate to in real life.
     
  12. fellix

    fellix Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,489
    Likes Received:
    400
    Location:
    Varna, Bulgaria
    Err, which "previous" card?
     
  13. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,947
    Likes Received:
    493
    Surely it's because ATI hadn't expected GDDR3 to scale as high as it did, nor GDDR4 to arrive as soon as it did (R600 being as late as it was had a similar effect).

    Put lower-speed GDDR3 on an R600 and you would get total bandwidth similar to a high-end 256-bit card with GDDR3, or, in RV670's case, GDDR4.
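hoom's tradeoff can be illustrated numerically. The clock figures below are assumed round numbers for the sake of the example, not actual product specs:

```python
def bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return (bus_width_bits / 8) * effective_mtps / 1000

# A 512-bit bus with modest GDDR3 (assumed 1000 MT/s effective)...
wide_slow = bandwidth_gbs(512, 1000)    # 64.0 GB/s

# ...matches a 256-bit bus with memory clocked twice as fast (assumed 2000 MT/s).
narrow_fast = bandwidth_gbs(256, 2000)  # 64.0 GB/s

print(wide_slow, narrow_fast)
```

Doubling the bus width buys exactly the same headroom as doubling the memory data rate; the difference is whether the cost lands in the PCB and package or in the DRAM bill of materials.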
     
  14. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    You'd think this except that ATI was closely involved in the development of both GDDR3 and GDDR4 (and GDDR5, it seems) - judging by patents, anyway...

    Jawed
     
  15. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,947
    Likes Received:
    493
    Being heavily involved in writing up the spec doesn't necessarily mean they knew when, or how high, it would scale in manufacturing, as far back as when they would have made the decision to go for 512-bit.
     
  16. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    I agree.
    ATI having such deep involvement in the early stages of R&D of a particular type of memory doesn't mean they could have guessed that GDDR3 would end up reaching 2.4GHz+ (used as such in a competing product, in this case the 8800 Ultra), or that it would remain so pervasive against GDDR4 in the market for so long.
    I have my doubts that Nvidia will ever use GDDR4, rather than skipping from GDDR3 straight to GDDR5.
     
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    You're asserting both that GDDR3 wouldn't continue scaling and that GDDR4 would be late, when the trend was for GDDR to scale well (GDDR3 wasn't much faster than GDDR2 when it first appeared, because GDDR2 kept on scaling) and to arrive in good time (NVidia had GDDR3 graphics cards on the market before the 6800U/X800XT launched, the first cards that were ostensibly dependent on GDDR3 to get their required bandwidths).

    I think the best explanation for 512-bit is that ATI wanted to try out a high-density memory interface on the package - in my view ATI's going to want that high density when putting two RV670s onto one package. But maybe they'll be boring and keep the two on separate packages. Maybe it's R700 that's due to make that leap, putting both GPUs on one package? Erm...

    ATI did something similar with the R5xx memory controller. The (asymmetric) ring bus introduced at that time evolved into the symmetric ring bus we have now. That was an "experiment" too, I reckon: R580 seems to have far more bandwidth than it can usefully use (compare it with 8800GTS with exactly the same bandwidth...).

    Jawed
     
  18. wingless

    Newcomer

    Joined:
    Aug 5, 2007
    Messages:
    79
    Likes Received:
    0
    Location:
    Houston, Texas
    When you overclock with cascade liquid nitrogen to 1200+MHz for 3DMark06 world-record-breaking sessions, the bandwidth will be used. :)
    A 512-bit bus with over 100GB/s of bandwidth sure isn't considered a bottleneck on this card, is it? Why the hell are we having this conversation? Personally, I love blatant overkill and brute-force approaches. I wish more of the R600 had been designed with that kind of thought process. I bet current Nvidia cards would love these huge buses. BTW, I read that Nvidia's G9x performance cards will have 512-bit buses, so Nvidia thinks that too.
     
  19. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
    Saving on R&D? ;)
     
  20. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,889
    Likes Received:
    2,304
    Because 511-bit would just be silly, wouldn't it ;)
     