R600 has 512-bit external memory bus, Orton promises it takes no prisoners

Discussion in 'Beyond3D News' started by Geo, Dec 15, 2006.

  1. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Exactly!!!!! But I think it is too early.
Still, if ATI made that choice, it's OK. It will be useful for future workloads.
     
  2. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Very true. Nvidia worked hard on drivers, even trying to cheat the 3DMark03 score. But it was more the design of the chip than a lack of memory bandwidth; even NV35's move to a 256-bit bus helped only a little.
     
  3. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
On what date, do you think, did ATI change their mind and decide to go for 512-bit instead of something else?

November 8th? Early September?
     
  4. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
It would be in a similar time frame to when ATI found out that NV40 would have 16 pipelines. (Just a guess, because ATI has better resources and knows ahead of time better than anybody else besides Nvidia themselves.)
     
    #104 Shtal, Dec 17, 2006
    Last edited by a moderator: Dec 17, 2006
  5. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
Well, if ATi did it like they did with the X800 Pro to XTX, they would have planned for the 512-bit bus ahead of time and only now unlocked it fully. Highly unlikely :wink: . As others have mentioned, that's a lot of resources to just have lying around :smile: .
     
  6. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
I understand....
It's the same way people thought the 256-bit memory on R300 was unnecessary extra, but in the end it turned out to be very useful.

We just have to wait and see how much this 512-bit idea will pay off for ATI/AMD. To me, 512-bit alone does not make sense, because it would be a waste for what it costs. But it may well pay off for future DX10 titles.
     
  7. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Sorry for mentioning my stupid poll again, but I think it is kind of cool. :)

Sorry, but this is the last time I'll mention it, I promise!
Here are the results so far for ATI R600: http://snappoll.com/poll/147813.php

ATI R600 Engineers
What do you think: will R600 be faster than or equal to the G80 (8800 GTX)?

    15-20% faster: 31% (42 votes)
    About the same: 27% (37 votes)
    35% or more faster: 24% (32 votes)
    Slightly faster: 18% (25 votes)

    Total votes: 136
     
  8. poopypoo

    Veteran

    Joined:
    Jun 14, 2005
    Messages:
    1,026
    Likes Received:
    13
    Location:
    Hong Kong
    BEST QUOTE EVAR!!

    Can we never have this mentioned in another thread please!? >_<;

    Finally, Shtal, nothing you're saying makes any sense -- and that poll... lol... :wink:
     
  9. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
An example, please....
     
  10. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Hi :)

Well, there are a few things that might hold R600 back.
A. CPU limitations. I have seen cases where a slower CPU holds the G80 (8800 GTX) back from its true abilities.
B. ATI's crossbar memory controller should also be split into 32-bit channels to make up the 512 bits, which should improve the efficiency of memory usage (see the sketch below).
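(Purely as an illustration of point B: a minimal Python sketch of how a 512-bit bus split into sixteen 32-bit channels might interleave addresses. The channel count matches 16 x 32 = 512, but the burst size and mapping policy here are assumptions, not ATI's actual design.)

```python
# Hypothetical sketch of a 16 x 32-bit crossbar address interleave.
# Burst size and mapping policy are assumptions, not ATI's design.

CHANNELS = 16        # 16 channels x 32 bits = 512-bit aggregate bus
BURST_BYTES = 8      # assumed bytes per channel burst

def channel_for_address(addr: int) -> int:
    """Map a byte address to a channel by interleaving bursts."""
    return (addr // BURST_BYTES) % CHANNELS

# Consecutive bursts land on different channels, so many small,
# scattered accesses can proceed in parallel instead of serializing
# behind one monolithic 512-bit controller.
for addr in (0, 8, 16, 120, 128):
    print(f"address {addr:3d} -> channel {channel_for_address(addr)}")
```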
     
  11. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,435
    Likes Received:
    263
Parhelia's 256-bit interface was 64x4; they just didn't publicize it much. So that wasn't why Parhelia was slow.
     
  12. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    If I remember correctly, Parhelia had basically no bw-saving features to stretch that 256-bit out even further, while its competitors most certainly did.
     
  13. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,435
    Likes Received:
    263
    You sir are correct.
     
  14. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
    http://www.digit-life.com/articles2/p10/index.html

It says there that the Matrox Parhelia has a single 256-bit x1 interface.

I know for a fact it is a single 256-bit x1, because when the Matrox Parhelia-512 came out I saw it with my own eyes on the net. I just don't have a more reliable source right now that actually says so.

---------------------------------------------------------
I could be convinced that if R600 uses a crossbar split into sixteen 32-bit channels (16 x 32-bit) for a total of 512 bits, it will utilize the actual memory significantly better (see the toy model below). On the other hand, the example I pointed out earlier (like SATA150 with a 10K WD Raptor drive that only sustains around 86 MB/s despite SATA's 150 MB/s capability) was a bad one. Still, 32 ROPs, 32 texture units, and a higher fill rate will definitely help. I'm just not sure how ATI will scale bandwidth without waste.
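(And a toy model of the waste I mean. Again, just a sketch: the burst sizes are made-up illustration numbers, not real R600 parameters.)

```python
# Toy model of access-granularity waste. Burst sizes are assumptions
# chosen for illustration, not real R600 parameters.

def wasted_fraction(request_bytes: int, burst_bytes: int) -> float:
    """Fraction of fetched bytes a single request leaves unused."""
    fetched = -(-request_bytes // burst_bytes) * burst_bytes  # ceil to bursts
    return 1 - request_bytes / fetched

# A 16-byte request served by one monolithic 512-bit (64-byte) burst
# wastes 75% of the transfer; a 32-bit channel with 8-byte bursts wastes 0%.
print(wasted_fraction(16, 64))  # 0.75
print(wasted_fraction(16, 8))   # 0.0
```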
     
    #114 Shtal, Dec 18, 2006
    Last edited by a moderator: Dec 18, 2006
  15. Pete

    Pete Moderate Nuisance
    Moderator Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,936
    Likes Received:
    333
    Shtal, "CPU limitations" won't hold R600 or any other GPU back in any real sense. If you're seeing a CPU holding a GPU back, you're not playing the game at the right settings. You can see that with G80 and any other brand new high-end GPU.

    And, no, default 3DMark scores don't count, b/c gaming at 12x10 with a current $600 card just isn't relevant to exploring 3D performance.

    It's simple. Raise the resolution, AA, AF, and texture quality level until you're maximizing the GPU.

    As for memory channel bitness, 6x64 doesn't seem to hurt G80 much. I doubt R600 will succeed or fail on that detail.

    Finally, as for D3D/DX 10, JHoxley has said that MS has given in to using DX10 as well as D3D10. If you're going to nitpick, better go after people like me who still say pixel, rather than fragment, shader. :wink:
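(For scale, a quick back-of-the-envelope on what those bus widths mean in raw bandwidth. The G80 numbers are the shipping 8800 GTX specs; the R600 memory clock is purely a placeholder assumption.)

```python
# Peak theoretical bandwidth = (bus width in bytes) * (effective data rate).

def peak_gb_per_s(bus_bits: int, data_rate_mt_s: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * data_rate_mt_s * 1e6 / 1e9

# 8800 GTX: 6 x 64-bit = 384-bit GDDR3 at 900 MHz (1800 MT/s) -> 86.4 GB/s
print(peak_gb_per_s(384, 1800))

# Hypothetical R600: 512-bit at the same (assumed) 1800 MT/s -> 115.2 GB/s
print(peak_gb_per_s(512, 1800))
```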
     
  16. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Thank you, my friend :)

I knew that if you crank everything to maximum, it will rely just on the GPU. And I agree with you about G80's 6x64 setup.

What I noticed with the R520 (X1800 XT) and its eight 32-bit memory channels is that it scaled very well in OpenGL games once ATI's driver optimizations for R520's new memory controller came into play.
------------------------------------------------
And I will not nitpick about what you say or call things. I just don't care.
     
    #116 Shtal, Dec 18, 2006
    Last edited by a moderator: Dec 18, 2006
  17. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,435
    Likes Received:
    263
That it was multi-channel was only ever said in an interview; however, I used to work there, and it was not a monolithic memory controller. There were four channels. Enough history, though. Back to R600.
     
    #117 3dcgi, Dec 18, 2006
    Last edited by a moderator: Dec 19, 2006
  18. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    2,238
    Likes Received:
    33
    nitpick, nitpick! :D

    Fanks Pete ... didn't know that.

    US
     
  19. poopypoo

    Veteran

    Joined:
    Jun 14, 2005
    Messages:
    1,026
    Likes Received:
    13
    Location:
    Hong Kong
Well, I'm not one to debate it with you, for I'm truly bandwagon-hopping in this scenario. I'm thoroughly a layman here, and I generally respect that your knowledge of this subject is far deeper than mine. However, I was primarily referring to your idea that R600's 512-bit bus was a reaction to nV's G80 bandwidth, when it seems unlikely that such a design choice could have been made so late, and such a choice would be an obvious blunder if it were nothing more than a marketing bullet. ;) Well, that, and that poll... XD That poll is a joke, right? ;)

    PS: if that poll's not a joke, then, oops, sorry, please disregard my vote! :blush:

PPS: Human error is certainly not to be ruled out. ATi has been floundering quite a bit, it seems. But I think the question is less whether R600 is a heavyweight, championship-quality chip, and more whether it will come out before nV comes up with something better (or almost as good and cheaper). 9_9;
     
  20. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
Logic says the G80 (8800 GTX) is not memory starved, and the same would go for an R600 with 512-bit memory.

If R600 has 32 ROPs, 32 texture units, a higher fill rate, and a more complicated chip overall, it might in some cases be nice to have more bandwidth available than the 86 GB/s on G80.

And the poll is just for fun, that is all :)

-------------------------------------------------------------
One last thing I wanted to add.
I was never against R600 having 512-bit memory with massive bandwidth available. All I'm saying is: if my car can go up to 200 MPH and the freeway speed limit is 65 MPH, then in some cases, when there is an emergency and the law gives you the green light, sure, it will be fun to test your car.
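(A rough sanity check on whether those hypothetical specs would want the extra bandwidth. Every number here, the 32 ROPs and the core clock included, is an assumption for illustration, not a confirmed R600 spec.)

```python
# Back-of-the-envelope: bandwidth demanded by color writes alone.
# 32 ROPs and a 750 MHz core clock are assumptions, not confirmed specs.

ROPS = 32
CORE_CLOCK_HZ = 750e6
BYTES_PER_PIXEL = 4   # 32-bit color write, ignoring Z traffic and blending

pixel_rate = ROPS * CORE_CLOCK_HZ             # 24 Gpixels/s peak fill rate
demand = pixel_rate * BYTES_PER_PIXEL / 1e9   # GB/s of write traffic
print(f"color-write demand: {demand:.1f} GB/s")  # 96.0 GB/s, already > 86.4
```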
     
    #120 Shtal, Dec 19, 2006
    Last edited by a moderator: Dec 19, 2006