The ATI R600 Rumours & Speculation Centrum

Discussion in 'Pre-release GPU Speculation' started by Arun, Oct 16, 2006.

Thread Status:
Not open for further replies.
  1. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,115
    Location:
    Uffda-land
    Maybe, but your PR people will hate you. They hate getting everyone all ZOMG about something and then explaining later why the next part is still ZOMG when they take it away.

    Not impossible, but generally not a happy prospect. And GDDR4 is the big bump, so I dunno where you'd suddenly get that extra bw post-GDDR4...
     
  2. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,248
    Location:
    Chania
    Sounds like a huge waste of R&D resources if I didn't misunderstand you. I'm quite confident that IHVs know the limits for each future architecture right from the start.
     
  3. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,115
    Location:
    Uffda-land
    The harder I look at what TR reported Orton saying, the more I notice that he gave current bw, current flops, future flops, and... oh, wait, no future bw. But if you extrapolate from what he did give, you'd be led to believe a number in the 120GB/s range. It's almost arranged in classic "what's the missing number" fashion.

    Ha.
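
    For what it's worth, here is a minimal Python sketch of that "missing number" extrapolation; the input figures are placeholder assumptions chosen only to show how a ~120GB/s ballpark falls out, not Orton's quoted numbers.

    current_bw_gbs = 64.0     # assumed current-generation bandwidth
    current_gflops = 250.0    # assumed current shader throughput
    future_gflops  = 500.0    # assumed future shader throughput

    # scale bandwidth in proportion to the FLOPS increase
    future_bw_gbs = current_bw_gbs * (future_gflops / current_gflops)
    print(f"~{future_bw_gbs:.0f} GB/s")   # ~128 GB/s, i.e. roughly the 120GB/s range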
     
  4. psurge

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    896
    Location:
    LA, California
    Hmm, I was thinking something like XDR2. There's also some talk of GDDR5 already, so who knows (maybe that'll reduce the pin requirements versus GDDR4?). I guess it's more likely for the ultra high-end chips to just keep pushing the die-size envelope...
     
  5. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    If you were ATI and wanted to blindside an earlier-launching G80 by leaking a low number that would have Nvidia thinking they'd got you cold, only to come back with R300-style insanity...?
     
  6. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,115
    Location:
    Uffda-land
    Well, I'm careful to describe stuff as "conventional wisdom", not "received truth". :lol: Anyone who lived thru "extreme pipes" is not going to forget that experience. :cool:
     
  7. PeterAce

    Regular

    Joined:
    Sep 15, 2003
    Messages:
    489
    Location:
    UK, Bedfordshire
    And with regard to the memory controller, I hope it's a 384-bit external memory bus: 12 memory channels (6 primary ring stops and two channels per ring stop).

    Which fits nicely with a primary ring stop per 'shader processor' (each one being 4 TMUs, 16 ALUs, 4 ROPs). In other words, a primary ring stop per 'heavy client'.
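
    A quick sanity check of that layout, as a minimal Python sketch; the 32-bit channel width is an assumption (typical for GDDR3/GDDR4 channels), not something stated above.

    ring_stops         = 6
    channels_per_stop  = 2
    channel_width_bits = 32   # assumed per-channel width

    channels  = ring_stops * channels_per_stop    # 12 memory channels
    bus_width = channels * channel_width_bits     # 384-bit external bus
    print(channels, bus_width)                    # -> 12 384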
     
    #27 PeterAce, Oct 17, 2006
    Last edited by a moderator: Oct 17, 2006
  8. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,661
    Location:
    London
    Definitely worth holding onto as a possible configuration.

    Jawed
     
  9. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,248
    Location:
    Chania
    Frankly, gentlemen, I couldn't give a rat's behind whether a GPU has a 256-bit bus with 1.4GHz GDDR4, or a 384-bit or even 512-bit bus with whatever GDDR3, as long as it's delivering the performance I'd expect. The last scenario sounds costly and complicated, but if the total price/performance ratio works out in the end, I don't see a problem there, nor would I judge a GPU by its bus width.
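
    To put rough numbers on that trade-off, here is a minimal Python sketch; the 256-bit / 1.4GHz GDDR4 case comes from the post (1.4GHz memory clock giving 2.8Gbps effective per pin), while the GDDR3 data rates are assumptions for illustration only.

    def bandwidth_gbs(bus_bits, data_rate_gbps):
        # peak bandwidth in GB/s = (bus width in bytes) * (per-pin data rate in Gbps)
        return bus_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(256, 2.8))   # 256-bit, 1.4GHz GDDR4        -> 89.6 GB/s
    print(bandwidth_gbs(384, 2.0))   # 384-bit, assumed 1.0GHz GDDR3 -> 96.0 GB/s
    print(bandwidth_gbs(512, 1.8))   # 512-bit, assumed 0.9GHz GDDR3 -> 115.2 GB/s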
     
  10. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Location:
    Io, lava pit number 12
    I don't think NV is the least bit concerned with R600 being "the new R300".
    They know G80 is a very promising design for the future, the DX10 software spec has been well known for a long time now, and they built G8x with scalability in mind, like its SM 3.0 predecessors. And there's always that well-oiled marketing machine, to which ATI has consistently fallen victim over the last two years, much to the detriment of market competition (and to consumers' anger, due to persistently high prices and longer cycles).

    Let us not forget why the NV3x was 10 months late and underperforming.
    If it had come out in early 2002, with the NV35 refresh in the fall, history could have been different, no?
    You can't blame it all on NV's decision to throw everything and the kitchen sink into it (design, DDR2 memory, and manufacturing process all at once), compromising DX9 performance alone...
    Microsoft was pretty mad at NV because of the costly Xbox contract re-negotiations, and decided to punish them by keeping the DX9 API from them until it was too late to change the GPU design ;)
    And the fact that ATI was signing contracts at the time for the X360 only made it even more suspicious (the whole "FP24 vs FP16/FP32" DX9 thingy).
     
  11. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    3,909
    Location:
    NY, NY

    Well, we don't know if that was a power move by nV or by Microsoft :wink: ; it could have come from nV or from both sides.
     
  12. ants

    Newcomer

    Joined:
    Feb 10, 2006
    Messages:
    44
    IIRC, Nvidia tried to push their own architecture for DX9 during the design process; with the Xbox deal, they thought Microsoft would support them, but instead it was voted down by everybody. Nvidia ended up leaving, later citing reasons of not wanting to expose some code or IP. They never had the final spec when developing the NV30. Wasn't the partial-precision flag added later, in the DX9a spec, for them?

    Sorry, no sources, this was a long time ago... Someone correct me if I'm wrong.
     
  13. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    961
    Location:
    Canada
    Take 96 pipes and use dynamic logic for the ALUs and you have a 600MHz chip with 2-3GHz ALUs. You end up with a smaller die and more performance than G80.
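
    Rough arithmetic on that idea, as a minimal Python sketch; the 2 flops per ALU per clock (one multiply-add) and the 2.5GHz midpoint are assumptions, only the 96 ALUs and the 2-3GHz range come from the post.

    alus            = 96
    alu_clock_ghz   = 2.5   # midpoint of the 2-3GHz range mentioned
    flops_per_clock = 2     # assumed: one MADD per ALU per cycle

    print(alus * alu_clock_ghz * flops_per_clock, "GFLOPS")   # 480.0 GFLOPS at these assumed numbers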
     
  14. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    961
    Location:
    Canada
    Perhaps Nvidia just screwed up. Just like ATI dropped the ball with R520. Sometimes things happen without conspiracy.

    I believe that Microsoft went with the unified architecture because it just made sense to them.
     
  15. Sobek

    Sobek Locally Operating
    Veteran

    Joined:
    Dec 17, 2004
    Messages:
    1,774
    Location:
    QLD, Australia
    And just when I was starting to be able to sleep again, you go and bring something like that up.

    Xtreme, no?
     
  16. Farhan

    Newcomer

    Joined:
    May 19, 2005
    Messages:
    152
    Location:
    in the shade
    I think you'd also burn up a whole lot more power that way...
     
  17. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,340
    "the whole "FP24 vs FP16/FP32" DX9 thingy" was decided long before Microsoft even started taking bids on Xbox 2.
     
  18. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Location:
    Io, lava pit number 12
    Then why didn't they inform Nvidia of that fact in due time?
    Would Nvidia really have made that huge mistake of having FP16 and FP32, but not FP24, if they had known this last mode was to become not only part of the spec, but the de facto standard used by most developers?
    And how come R300 didn't support FP32 as well, since it was part of the spec?

    The only reasonable explanation is that MS gave the upper hand to ATI early, just like it had given it to Nvidia on DX8.1, for instance.

    And this is unfair competitive interference by MS according to most standards, I think.

    That is the reason why I think R600 will not become another R300, just like G80 won't be another NV30.
    Microsoft needs compelling reasons for the masses to adopt DX10, since it's part of the "exclusivity" given by Windows Vista. That can't be accomplished when half of the market can't compete with the other half, leading to high hardware prices and a slower DX10 adoption rate ("if DX9 is good enough"...).
     
  19. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    What's up with DX9L for WinXP and DX10 for Vista? What a confusion it is going to be.

    ATI and Nvidia need a completely new driver for DX9L on XP :( and for DX10 on Vista.

    Why not just make them the same? :)
     
    #39 Shtal, Oct 17, 2006
    Last edited by a moderator: Oct 17, 2006
  20. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    What is also amazing is that as the technology becomes more advanced, the price goes up too.

    R300 was originally $399.99
    R420 was originally $449.99
    R520/R580 was originally $499.99/$549.99

    R600 could launch at $599.99, or $649.99 for an extreme version.

    Getting very close to the $1000 mark of Intel/AMD extreme-edition CPUs...

    But ATI/Nvidia probably will...
     