The LAST R600 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Geo, Jan 2, 2007.

Thread Status:
Not open for further replies.
  1. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
Doesn't sound like you've actually worked in the semiconductor industry. The negative economics of launching a non-competitive product are huge.

    - inventory risk
    - margin pressure
    - marketing launch costs (PR, evaluation boards, support costs)
    - product engineering
    - negative image
    - lost engineering time on next generation

    I anticipate your response might be, well it hasn't stopped ATI and Nvidia from launching bad products in the past.

    To which my response would be that things (for AMD/ATI) are different now. AMD's goal is Fusion. I submit that one possible explanation for the delay, is that the management has come to realize that R600 may only be a financial burden and it might be better to cut the losses on this money-loser and double down on their Fusion strategy.
     
  2. psurge

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    955
    Likes Received:
    52
    Location:
    LA, California
Well, that could mean one of two things: Geo is under an NDA that specifies that any acknowledgement of the NDA, or anything less than silence or flat-out denial of being under NDA, is a violation of the NDA. Or it could mean that Geo owes me a PM with a detailed explanation of the nursery rhyme :wink:
     
  3. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
1). Professional responsibilities extend beyond legal ones. The legal responsibilities of an NDA are minimums, not "best practices", and certainly not maximums.
    2). Other feedback suggests maybe I put a little too much emphasis on that particular tip. Tho I still have reasonable confidence that R600's superior BW will tell in specific instances.
     
  4. PeterAce

    Regular

    Joined:
    Sep 15, 2003
    Messages:
    490
    Likes Received:
    10
    Location:
    UK, Bedfordshire
But 'Fusion' is really targeted towards the very low-end/low-power segment, i.e. for sub-$300 PCs, ultra-light laptops, and ultra-mobile PCs.
     
  5. Russell

    Newcomer

    Joined:
    Feb 14, 2007
    Messages:
    80
    Likes Received:
    2
    Or, as I think is obvious, he's not under NDA himself (as he insists is the case) but knows 'certain things' about R600. To reveal what he knows, however, would invite further questions and would wind up leading back to his informant/source, who probably is under NDA and would catch shit.

    Plus it's more fun to mock us who do not know.
     
  6. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    How specific are those instances, on a scale of importance from 1 to 10 ? :wink:
     
  7. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
    Yes, good point.

    I should have said 1st tier OEM launch partner. Board partners are good distributors, but it is the 1st tier OEM launch partner that validates the competitiveness of the part and creates economic vitality for both ATI/AMD and the board partners.

    I'll say again what I just said in the previous post. Things (for AMD/ATI) are different now. AMD's goal is Fusion. I submit that one possible explanation for the delay, is that the management has come to realize that R600 may only be a financial burden and it might be better to cut the losses on this money-loser and double down on their Fusion strategy.

    Launches are very big and expensive deals. One doesn't just launch for the sake of launching or because a flawed product is ready to go.

    ATI is in the really big leagues now, you've really got to think everything through.
     
  8. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1
    Yes. AMD/ATI see exactly that as their long term strategic plan. Standalone high-end graphics is not.
     
  9. Creig

    Newcomer

    Joined:
    Nov 20, 2006
    Messages:
    57
    Likes Received:
    1
    Only time will tell. But ATI has been neck and neck with (or ahead of) Nvidia ever since the 9700 came out. I don't think AMD would have any reason to close up shop in the high end graphics sector, especially now that they have the combined resources, patents and licensing of both companies to draw from. Not only does the performance of flagship cards drive sales for the entire lineup, but the high end technology of today becomes incorporated in the mainstream card of tomorrow.
     
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
Most bugs are really, really stupid. Like forgetting a term in a boolean equation. But some of those take very arcane combinations to trigger. When they finally do on silicon, it can take days or weeks to find out what exactly went wrong. And then the fix is usually obvious.

    So, yes, most of the time, it's possible to know exactly which designer made the mistake.

    That said, unless there's a repeated pattern of the same guys screwing up time and again, I've never seen anybody fired or even reprimanded for a silicon bug and it can happen to the best of the best.

    The problem is that there are a million trivial things that have to be covered, and a bulletproof guarantee that everything works is usually impossible, or would double the schedule. These days, a major part of testing is done with directed randoms: you create random inputs that conform to the required protocol and you apply them at random cycles to the block under test, but you control the distribution of the inputs to guide the design into certain expected corner cases.

    E.g. when verifying an ethernet switch, you'd create a test with 5% medium-size packets and 95% very short packets. Another test would do the opposite. Yet another test would make sure that packets at different ports are applied at roughly the same cycle to check memory conflict handling. Etc. One chip can have tens of thousands of different tests.

    Anyway, once your random tests are implemented, you (automatically) kick them off each night or each weekend and see what comes out. Initially, bugs will trigger very quickly, but some don't. I once had a block that was completed months earlier, and that had been passing random tests all along, trigger a bug just a few days before tape-out. That's just being very lucky that the 'right' combination was finally triggered. :wink:
    But it might just as well only have triggered when the chip was already in the fab... or later.
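    The "directed randoms" idea above can be sketched in a few lines of Python: stimulus is random, but the distribution is weighted per test to steer the design into specific corner cases, and a fixed seed makes any failing run replayable. This is an illustrative toy, not a real verification framework; the packet classes, size ranges, and function names are all invented for the example.

    ```python
    import random

    # Size ranges (in bytes) per packet class -- illustrative values only.
    SIZE_RANGES = {"short": (64, 128), "medium": (129, 1024), "long": (1025, 1518)}

    def make_packet(size_weights, rng):
        """Pick a packet size class by weighted choice, then a random length in it."""
        size_class = rng.choices(
            population=list(size_weights.keys()),
            weights=list(size_weights.values()),
        )[0]
        lo, hi = SIZE_RANGES[size_class]
        return {"class": size_class, "length": rng.randint(lo, hi)}

    def run_test(size_weights, n_packets, seed):
        """One randomized test: a packet stream drawn from the given distribution."""
        rng = random.Random(seed)  # fixed seed so a failing test can be replayed
        return [make_packet(size_weights, rng) for _ in range(n_packets)]

    # Two directed-random tests mirroring the ethernet-switch example above:
    # one biased toward very short packets, one toward medium-size packets.
    mostly_short = run_test({"short": 95, "medium": 5}, n_packets=1000, seed=1)
    mostly_medium = run_test({"short": 5, "medium": 95}, n_packets=1000, seed=2)
    ```

    A nightly regression would simply loop over many such weight profiles and seeds; coverage collection then tells you which corner cases the biased distributions actually reached.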
     
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    In which case, R600 would have to be delayed by a year or so. This RAM was just presented at ISSCC. Expected production is not for this year...

    You're not suggesting that the latency and yield of a 4GHz GDDR4 will be better, right?
     
  12. R300King!

    Newcomer

    Joined:
    Aug 4, 2002
    Messages:
    231
    Likes Received:
    5
Well, maybe AMD should concentrate on ATI's high-end graphics department and get out of the CPU business altogether. Intel is raping them bigtime. They may never catch Intel now. Intel is at 45nm already and eyeing even smaller processes. Maybe AMD should cut their CPU losses and take on the new GPU strategy. ;-)
     
  13. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
Bad idea. AMD can only catch up to NV on the GPU side when NV makes a mistake (or mistakes).
Same on the CPU side: if Intel makes a mistake, then AMD can catch up. Without such mistakes, I don't see anything that helps AMD catch up on either the CPU or GPU side, not even the R600 or the K8L, but I could be wrong, time will tell :smile:
     
  14. nicolasb

    Regular

    Joined:
    Oct 21, 2006
    Messages:
    421
    Likes Received:
    4
    In which case it can't possibly do any harm to tell us what it was, can it? :twisted:
     
  15. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
What do processes have to do with anything?

    By that logic Intel would own Nvidia in graphics anytime they wanted..so would ATI.

    And I love all this talk of "Intel raping". I never heard this before.. AMD had the fastest CPU for a long time, and Intel only leaped ahead in the last few months.. AMD has survived worse. Hell, by that logic, a quarter ago AMD was "raping" Intel, given that Intel's market share had been falling steadily for many months?

    They just compete on price now like in the old days (at least until the next CPU, which is slated soon, right?). Last I saw, AMD continued to gain CPU share... I don't think there's a big enough performance difference for "most people" to care which CPU they get when they go to Best Buy to buy a PC, if one is much cheaper. The people who care are the few.. the hardcore.

    Hell, I'm still thinking of going AMD.. I probably won't, but Intel starts at $170 where an X2 3800 is $102.. plus AMD CPU mobos are so much cheaper..

    Plus AMD can bundle CPU+GPU now, which gained them some share last quarter alone I think..imagine when that gets rolling.

    I guess Intel got "raped"

    http://www.crn.com.au/story.aspx?CIID=72675&src=site-marq
     
    #2155 Rangers, Feb 24, 2007
    Last edited by a moderator: Feb 24, 2007
  16. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
Well, AMD isn't in the same position as they were before. They are a much larger company now than they were before their A64s; more employees mean more expenditure, so they have to make a lot more money now to reach the same net income levels. That's why they made a loss last quarter. Intel, even if it makes losses, is a much bigger company: it can take a sizeable hit and still be fine, while the same hit would probably put AMD in the poor house. Intel has a lot more flexibility in this regard. And yeah, Intel is "raping" AMD in this regard right now. Even though Intel itself is taking a hit by lowering prices, AMD is going to have a tough time staying in a price war if their technology isn't as good as or better than Intel's. The 3800+ X2 example you give: is there an Intel Core 2 Duo processor with similar performance to that chip? No, there isn't. The price you gave for the Intel chip is for an E4300, which performs around a 4600+ X2.
     
    #2156 Razor1, Feb 24, 2007
    Last edited by a moderator: Feb 24, 2007
  17. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    There's an "AMD needs money?" thread over in Industry that would be much more suitable for this threadlet. Please continue it over there.
     
  18. R300King!

    Newcomer

    Joined:
    Aug 4, 2002
    Messages:
    231
    Likes Received:
    5
    It starts with the new tech. If AMD can't keep up they will fall even more behind.

    Depends on what Intel concentrates on. No, Nvidia is not screwing up in the chipset market either, they are making good chipsets.

    Intel was outselling AMD even when Intel didn't have the fastest chips. Now that Intel has the better chips, it's just going to grow further, IMHO.

    Only the people who are not up to date on who's got the better processors now(higher perf, lower power, lower heat, higher OC), yeah.

    But you won't. LOL. Sure, both will wiggle around their prices to compete but the fact is Intel has a better price/perf value than AMD.

    Yeah, AMD can postpone either product until the other one is ready, great strategy. In the meantime Intel can concentrate on delivering the best CPUs possible.

    LOL Nice try. :D
     
    Razor1 likes this.
  19. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Of course, there's another side of the coin to consider, and that is that given the state of nVidia's Vista G80 drivers thus far, and the tantalizing hints about their likely progress in the near term as revealed by the DX10 demo nVidia's just released, coupled with nVidia's recent remarks about some limitations relative to its current DX10 driver foundation ( http://www.nvidia.com/object/vista_driver_news_022207.html ), it could well be that ATi has decided it doesn't need to rush things in particular with the R600 family launch. It seems reasonable to me that ATi is probably as knowledgeable about these things as the rest of us are, and feels that it simply has more time to generate a more effective, inclusive R600 product launch than perhaps the company formerly believed at an earlier point. That's an alternative point of view which I think has merit.

    I think it's a bit of a mistake to talk about "G80's refresh" as though it's going to somehow leave the G80 behind along with current G80 owners. But if by saying the R600 "might be trumped by the G80's refresh" you mean that you think R600 will trump the G80, but not the hypothetical, unknown G80 refresh, then you're as good as saying that no one should buy the G80 but should instead wait on the G80 "refresh"...;) I really doubt that this is what nVidia intends, and furthermore, until R600 is released I don't see how nVidia would even know what the "G80 refresh" would have to "trump" in order to win that particular contest. OTOH, Ati at the moment knows quite a bit more about both G80 and the state of current G80 DX10 driver development, doesn't it?

    I also cannot figure how you think a 30-day or so delay means that ATi has "lost the chance to compete with the G80." As Vista adoption is still in its infancy (although I have Ultimate now and am using it to good effect with my x1950 Pro AGP), and as the first DX10-required game is a long way off, it seems to me that ATi has plenty of time to both compete with G80 and any near-term (this year) product refresh of the G80. What's the compelling reason to buy either a G80 or an R600 at present? DX9 gaming performance? I rather doubt it, myself. It's Vista and DX10, isn't it? So I think that time is rather plentiful for ATi in that respect. Conversely, it could well wind up that nVidia regrets having marketed the G80 as the Vista-ready product it obviously isn't (at least with respect to DX10 and some other aspects nVidia covers in the link above.) Perhaps ATi feels it imprudent to rush out R600 only to make similar mistakes...? I think this is certainly just as reasonable a proposition as anything else.
     
  20. Thorburn

    Regular

    Joined:
    Oct 8, 2006
    Messages:
    323
    Likes Received:
    19
    Location:
    UK
    I'm a rather strange gamer, I play basically two games, World Of Warcraft and GTR2.
    Now WoW runs pretty damn well on the 6800GT in my third system, but to run GTR2 with all the pretty (and I really like the pretty, 'cos it's my favourite game ever) I need my 8800GTX.

    So for me DX9 performance is pretty compelling...
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.