Huddy says "R600"

Discussion in 'Pre-release GPU Speculation' started by Geo, May 25, 2006.

Thread Status:
Not open for further replies.
  1. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Oh, that's a piker effort. VR-Zone had actual yield numbers for 32/24 pipe R520! :grin:
     
  2. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    I've seen developers note on quite a few occasions that it's very easy to code for the Xbox 360; I'd figure, though, that that's not directed exclusively at the console's graphics chip.
     
  3. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    I'd love the conversation to concentrate a tad more on the possibility of a 512-bit bus, even if purely from a theoretical technical POV. Take my layman's perspective as a given, but my primary concern with such a wide bus is how they'd handle read efficiency, which, if not handled properly, might even reduce the overall available bandwidth. Granted, 384 bits is not going to be easy for G80 either, but a hypothetical 512 bits is another notch higher.
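The bandwidth stakes in question can be put in back-of-the-envelope terms. A quick sketch (the memory data rates below are illustrative guesses for GDDR3/GDDR4 of the era, not leaked specs):

```python
# Peak theoretical bandwidth for candidate bus widths.
# Data rates are hypothetical examples, not confirmed G80/R600 specs.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s.

    data_rate_mhz is the effective (double-data-rate) rate,
    e.g. 2000 MHz effective for 1 GHz GDDR3/GDDR4.
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * data_rate_mhz * 1e6 / 1e9

for bus, rate in [(256, 2000), (384, 1800), (512, 2000)]:
    print(f"{bus}-bit @ {rate} MHz effective: "
          f"{peak_bandwidth_gbs(bus, rate):.1f} GB/s")
```

Of course, that's only the peak number; the read-efficiency concern above is precisely about how much of that theoretical figure survives real access patterns.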
     
  4. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Well, it seems to me that 384/320 for NV and Sireric's comments on dedicating clients might suggest the answer there. That was the theory behind the crossbar in the first place, wasn't it? So I don't see these companies giving up that kind of granularity, but rather adding a couple more units to it.
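The granularity point can be sketched numerically. All widths and burst lengths below are assumptions for illustration, not R600 or G80 specs:

```python
# Sketch: why a wide bus is split into narrower independent channels
# behind a crossbar. A small read only occupies one channel's burst,
# so the remaining channels stay free for other memory clients,
# instead of a single monolithic fetch wasting the full bus width.

BUS_WIDTH_BITS = 512   # hypothetical total interface width
CHANNEL_BITS = 64      # assumed per-channel width
BURST_LENGTH = 8       # beats per access, typical for GDDR3/GDDR4

channels = BUS_WIDTH_BITS // CHANNEL_BITS               # independent channels
min_access_bytes = CHANNEL_BITS // 8 * BURST_LENGTH     # per-channel fetch size
monolithic_bytes = BUS_WIDTH_BITS // 8 * BURST_LENGTH   # fetch if unpartitioned

print(f"{channels} channels; {min_access_bytes} B granular fetch "
      f"vs {monolithic_bytes} B monolithic fetch")
```

With these assumed numbers, partitioning turns one 512-byte minimum transaction into eight independent 64-byte ones, which is the kind of granularity the crossbar was meant to preserve.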
     
  5. trumphsiao

    Regular

    Joined:
    Jan 31, 2006
    Messages:
    285
    Likes Received:
    11


    Maximum power consumption of R600, based on extreme clock speeds, is less than 200W~250W.

    But I hope this time ATI will prevail over Nvidia from a performance/mm² standpoint.
     
    BRiT likes this.
  6. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    I'm going to ask a question which still bothers me.

    What is the main reason that R600 is delayed until roughly 3 months after G80 is released?

    A.) Is it because ATI is behind on the R600 project and Nvidia is ahead with G80?

    B.) Is it because R600 is going to be more advanced than G80, so ATI needs more time?

    C.) Or does ATI have a hard time solving problems with R600, just like with R520, which was delayed many times over:
    1.) GPU frequency (MHz)
    2.) Manufacturing process
    3.) Low yields

    If the answer is C, then Nvidia did it for sure; they finally became the technology leader again.
     
    #786 Shtal, Oct 15, 2006
    Last edited by a moderator: Oct 15, 2006
  7. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    Somehow I doubt you're going to get an answer anytime soon. So unfortunately you're going to be bothered for a bit longer :)
     
  8. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    That kind of conclusion is fairly facile anyway. What would you want to say about G71 being pretty much a straight die shrink of G70? 1) NV's tech was so uber they could stand pat with a process bump. 2) They mailed it in after the G70 launch to focus on G80 instead.

    I wouldn't be comfortable saying either, but YMMV.

    It does sound pretty lucky for ATI that NV didn't hit their original dates for G80 tho, or they'd really be screwed.

    Edit: It would be interesting to know when X1950 showed up as definitely on the schedule, as that seems to be a key inflection point.
     
  9. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
    As Geo pointed out, within the last two years ATI went

    R420 -> R520/580 -> R600

    While we don't know much about R600, I'll venture a guess that it's a fairly major revision ;).

    Nvidia, on the other hand, has been milking NV40 for the last two years. Nothing wrong with that (god knows ATI did the same with R300), but it does mean they had more time to get G80 ready, since it will be their first major core revision in the last two years.
     
  10. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism
    Didn't somebody say on a conference call around the beginning of this year that they were on schedule for a Q4 launch? It's so vague anyway that I guess it doesn't matter, especially when the product, a very complex one at that, isn't finalized yet. Is there a delay? Is it an expected delay due to a schedule change? Or did they crap out on yields at target clocks? Blah blah... it's actually quite useless speculating.



    Well, current and past products aren't exactly a map for what the future holds in store. I think R300 and even the failure of the FX series have proven that quite well. We don't exactly know how much lineage a core shares with its predecessor, so concluding that Nvidia has had more time isn't exactly that simple. Even Xenos R&D started years ago, before NV40 launched.

    ATI has technically been milking it longer, though, without a truly major architecture revision. R580 was interesting for sure, but things like AA and even AF (barring the addition of HQ, of course) are essentially the same as what R300 did, so R300 -> present > NV40 -> NV50 ;)
     
    #790 SugarCoat, Oct 16, 2006
    Last edited by a moderator: Oct 16, 2006
  11. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    I wouldn't be so sure about that...

    ATI has had a new architecture basically every 2-2.5 years:
    -R200->R250->R280.
    -R300->R350->R360->R420->R480.
    -R520->R580.

    With the exception of the GeForce FX/NV3x (because of its poor performance and delays, it lasted little more than a year), so has NV:
    -NV10->NV15.
    -NV20->NV25/NV28.
    -NV40/NV45->G70->G71.


    So, if R600 really is such a leap forward compared to the previous generation (on the architecture level, at least), it will be a tight fit on the schedule, just over a year after R520's debut.
    Unified shaders and DX10 aside, I believe they'll just keep investing in performance using the current, proven memory controller, add ALUs and/or clock speed, and that will be it until the DX10.1 refresh (R680?), with the true new design being R700.

    Of course, either architecture may still turn out to be totally "new" (so to speak), and it will still come down to driver support and how well each scales into the refresh parts 6 to 8 months later, and beyond.
     
  12. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    So going from a non-unified architecture to a unified one doesn't constitute a major architecture change?
     
  13. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    Well, since G80 was being perceived as a minor change aimed at speed and DX10 (non-unified, 48 pipes, etc.), and then took a quick turn to being unified and scalar (or so the rumour goes), I'm not ruling anything out yet.

    But there has been one unified chip on the market for a year, right?
    [MINDGAMES] What's so different about R600? [/MINDGAMES]

    Some nice tasty bone about it would be in order, at least to stir up the discussion between G80 and R600 a bit. :wink:
     
  14. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    I've been reading a lot of comments (of sorts) about R600.

    What about the name, since it's no longer ATI but AMD now?

    It might be like this:

    [Top Performer]
    AMD Radeon X2800XT+ (since AMD loves a plus+ at the end)

    [Performer]
    AMD Radeon X2800XL+

    Also it might come in a green+red box package, welcoming the newcomer AMD video product.
     
  15. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    AMD adopted the + because consumers thought Athlon processors were inferior based on clock speed alone. It was an ordinal comparison with Pentium processors.

    Sorry, I don't see that or the green+red package happening.
     
  16. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    So what's it going to be then; your name instead? :)
     
  17. trumphsiao

    Regular

    Joined:
    Jan 31, 2006
    Messages:
    285
    Likes Received:
    11

    RV570/RV560 are not overclocking-friendly.

    If R600 wants to be as fast as the 8800 GTX, ATI needs to launch R600 at 600MHz~750MHz.
     
  18. trumphsiao

    Regular

    Joined:
    Jan 31, 2006
    Messages:
    285
    Likes Received:
    11

    Nvidia's 80nm parts are overclockable, compared to ATI's mediocre clock increases at 80nm.
     
  19. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    None of us are sure how R600 will turn out, so we cannot conclude what frequency R600 needs to run at in order to match or outperform G80.
     
  20. trumphsiao

    Regular

    Joined:
    Jan 31, 2006
    Messages:
    285
    Likes Received:
    11


    Actually, both of them perform equally at the same clock speed, but the problem is that if ATI launches R600 in Q1 2007, Nvidia would have a rehashed, higher-clocked G80 (GDDR4) out within 3 months of that.
     
    #800 trumphsiao, Oct 16, 2006
    Last edited by a moderator: Oct 16, 2006

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.