The LAST R600 Rumours & Speculation Thread

Status
Not open for further replies.
3dilettante said:
All AMD would have to do would be to adjust what it charges for each chip. If yields were acceptable, this wouldn't be an issue.

If an optimal version of R600 places an undue burden on board partners, what's a few months going to change? The design would be flawed, or AMD would be stupidly overcharging.

Doesn't sound like you've actually worked in the semiconductor industry. The negative economics of launching a non-competitive product are huge:

- inventory risk
- margin pressure
- marketing launch costs (PR, evaluation boards, support costs)
- product engineering
- negative image
- lost engineering time on next generation

I anticipate your response might be, well it hasn't stopped ATI and Nvidia from launching bad products in the past.

To which my response would be that things (for AMD/ATI) are different now. AMD's goal is Fusion. I submit that one possible explanation for the delay, is that the management has come to realize that R600 may only be a financial burden and it might be better to cut the losses on this money-loser and double down on their Fusion strategy.
 
Well, that could mean one of two things: Geo is under an NDA that specifies that any acknowledgement of the NDA, or anything less than silence or flat-out denial of being under NDA, is a violation of the NDA. Or it could mean that Geo owes me a PM with a detailed explanation of the nursery rhyme ;)
 
Huh..?
Maybe I missed something despite my daily checks.. but..:



So tell us unenlightened ones..!!

1). Professional responsibilities extend beyond legal ones. The legal responsibilities of an NDA are minimums, not "best practices", and certainly not maximums.
2). Other feedback suggests maybe I put a little too much emphasis on that particular tip. Though I still have reasonable confidence that R600's superior BW will tell in specific instances.
 
To which my response would be that things (for AMD/ATI) are different now. AMD's goal is Fusion. I submit that one possible explanation for the delay, is that the management has come to realize that R600 may only be a financial burden and it might be better to cut the losses on this money-loser and double down on their Fusion strategy.

But 'Fusion' is really targeted towards the very low-end/low-power segment, i.e. sub-$300 PCs, ultra-light laptops, and ultra-mobile PCs.
 
Well, that could mean one of two things: Geo is under an NDA that specifies that any acknowledgement of the NDA, or anything less than silence or flat-out denial of being under NDA, is a violation of the NDA. Or it could mean that Geo owes me a PM with a detailed explanation of the nursery rhyme ;)

Or, as I think is obvious, he's not under NDA himself (as he insists is the case) but knows 'certain things' about R600. To reveal what he knows, however, would invite further questions and would wind up leading back to his informant/source, who probably is under NDA and would catch shit.

Plus it's more fun to mock us who do not know.
 
1). Professional responsibilities extend beyond legal ones. The legal responsibilities of an NDA are minimums, not "best practices", and certainly not maximums.
2). Other feedback suggests maybe I put a little too much emphasis on that particular tip. Though I still have reasonable confidence that R600's superior BW will tell in specific instances.

How specific are those instances, on a scale of importance from 1 to 10 ? ;)
 
INKster said:
Sapphire, Powercolor, GeCube, etc.
The usual suspects are always there, and they must be anxious by now to have something to sell against the Geforce 8xxx, since they can't claim to be *fully* Windows Vista ready unless they have DX10 hardware, right ?

Yes, good point.

I should have said 1st tier OEM launch partner. Board partners are good distributors, but it is the 1st tier OEM launch partner that validates the competitiveness of the part and creates an economic vitality for both ATI/AMD and the board partners.

I'll say again what I just said in the previous post. Things (for AMD/ATI) are different now. AMD's goal is Fusion. I submit that one possible explanation for the delay, is that the management has come to realize that R600 may only be a financial burden and it might be better to cut the losses on this money-loser and double down on their Fusion strategy.

Launches are very big and expensive deals. One doesn't just launch for the sake of launching or because a flawed product is ready to go.

ATI is in the really big leagues now, you've really got to think everything through.
 
Yes. AMD/ATI see exactly that as their long term strategic plan. Standalone high-end graphics is not.

Only time will tell. But ATI has been neck and neck with (or ahead of) Nvidia ever since the 9700 came out. I don't think AMD would have any reason to close up shop in the high end graphics sector, especially now that they have the combined resources, patents and licensing of both companies to draw from. Not only does the performance of flagship cards drive sales for the entire lineup, but the high end technology of today becomes incorporated in the mainstream card of tomorrow.
 
In this hypothetical situation, would it really be the fault of one engineer?
Most bugs are really, really stupid, like forgetting a term in a boolean equation. But some of those take very arcane combinations to trigger. When they finally do on silicon, it can take days or weeks to find out what exactly went wrong. And then the fix is usually obvious.

So, yes, most of the time, it's possible to know exactly which designer made the mistake.

That said, unless there's a repeated pattern of the same guys screwing up time and again, I've never seen anybody fired or even reprimanded for a silicon bug and it can happen to the best of the best.
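To make that concrete, here is a toy sketch (pure illustration, not any real design) of the kind of missing-term boolean bug described above. The `grant` signal and its inputs are invented for the example; the point is that the intended and buggy versions disagree on exactly one of the eight input combinations, which is why such bugs can hide for so long.

```python
from itertools import product

# Hypothetical bus-arbiter signal. Intended logic: grant when a request
# is asserted AND (the bus is idle OR the requester already owns it).
def grant_intended(req: bool, idle: bool, owner: bool) -> bool:
    return req and (idle or owner)

def grant_buggy(req: bool, idle: bool, owner: bool) -> bool:
    # The "or owner" term was forgotten -- wrong only in the one case
    # where a requester already owns a busy bus.
    return req and idle

# Exhaustively compare the two over all 8 input combinations.
mismatches = [c for c in product([False, True], repeat=3)
              if grant_intended(*c) != grant_buggy(*c)]
print(mismatches)  # only (True, False, True): req=1, idle=0, owner=1
```

One bad case out of eight is easy here, but with dozens of inputs an exhaustive comparison is impossible, and random stimulus may take a very long time to hit the one combination that exposes the bug.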

The problem is that there are a million trivial things that have to be covered, and a bulletproof guarantee that everything works is usually impossible, or would double the schedule. These days, a major part of testing is done with directed randoms: you create random inputs that conform to the required protocol and you apply them at random cycles to the block under test, but you control the distribution of the inputs to guide the design into certain expected corner cases.

E.g. when verifying an ethernet switch, you'd create a test with 5% medium-size packets and 95% very short packets. Another test would do the opposite. Yet another test would make sure that packets at different ports are applied at roughly the same cycle, to check memory conflict handling. Etc. One chip can have tens of thousands of different tests.

Anyway, once your random tests are implemented, you (automatically) kick them off each night or each weekend and see what comes out. Initially, bugs will trigger very quickly, but some don't. I've had it happen once that a block that was completed months earlier, and that had been passing random tests successfully all the time, triggered a bug just a few days before tape-out. That's just being very lucky that the 'right' combination was triggered. ;)
But it might just as well only have triggered when the chip was already in the fab... or later.
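As a toy sketch of the directed-random idea described above (illustrative names only, not real verification tooling): the stimulus is random but protocol-legal, the seed makes any failing run reproducible, and the size distribution is deliberately skewed to match the "95% very short packets" test.

```python
import random

def make_packet(rng):
    """Protocol-legal random packet: 95% very short, 5% medium size."""
    if rng.random() < 0.05:
        size = rng.randint(64, 256)   # medium packet, the rarer case
    else:
        size = rng.randint(1, 8)      # very short packet, the targeted corner
    return bytes(rng.getrandbits(8) for _ in range(size))

def generate_stimulus(seed, n=10_000):
    # Seeding the generator means a nightly run that trips a bug can be
    # replayed exactly for debugging.
    rng = random.Random(seed)
    return [make_packet(rng) for _ in range(n)]

pkts = generate_stimulus(seed=1)
short_frac = sum(len(p) <= 8 for p in pkts) / len(pkts)
print(f"{short_frac:.0%} short packets")
```

A companion test would simply flip the distribution, and a real harness would feed each packet to the block under test at a random cycle and compare its output against a reference model.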
 

In which case, R600 would have to be delayed by a year or so. This RAM was just presented at ISSCC. Expected production is not for this year...

This is just speculation, but it could have something to do with costs and yields of GDDR4 memory or it could have something to do with latency.

You're not suggesting that the latency and yield of a 4GHz GDDR4 will be better, right?
 
Yes. AMD/ATI see exactly that as their long term strategic plan. Standalone high-end graphics is not.

Well, maybe AMD should concentrate on ATI's high-end graphics department and get out of the CPU business altogether. Intel is raping them bigtime. They may never catch Intel now. Intel is at 45nm already and eyeing even smaller processes. Maybe AMD should cut their CPU losses and take on the new GPU strategy. ;-)
 
Well, maybe AMD should concentrate on ATI's high-end graphics department and get out of the CPU business altogether. Intel is raping them bigtime. They may never catch Intel now. Intel is at 45nm already and eyeing even smaller processes. Maybe AMD should cut their CPU losses and take on the new GPU strategy. ;-)

Bad idea. AMD can only catch up with NV on the GPU side when NV makes a mistake.
Same on the CPU side: if Intel makes a mistake, then AMD can catch up. Without mistakes, I haven't seen anything that can help AMD catch up on either the CPU or GPU side, not even the R600 or the K8L. But I can be wrong, time will tell :smile:
 
Well, maybe AMD should concentrate on ATI's high-end graphics department and get out of the CPU business altogether. Intel is raping them bigtime. They may never catch Intel now. Intel is at 45nm already and eyeing even smaller processes. Maybe AMD should cut their CPU losses and take on the new GPU strategy. ;-)

What does processes have to do with anything?

By that logic Intel would own Nvidia in graphics anytime they wanted..so would ATI.

And I love all this talk of "Intel raping". I never heard this before.. AMD had the fastest CPU for a long time; now Intel leaped ahead in the last few months.. AMD has survived worse. Hell, one quarter ago I guess AMD was "raping" Intel, considering Intel's market share had been steadily falling, uninterrupted, for many months?

They just compete on price now, like the old days (at least until the next CPU, which is slated soon, right?). Last I saw, AMD continued to gain CPU share... I don't think there is a big enough performance difference for "most people" to care which CPU they get when they go to Best Buy to buy a PC, if one is much cheaper. The people who care are the few.. the hardcore.

Hell, I'm still thinking of going AMD.. I probably won't, but Intel starts at $170 where an X2 3800 is $102.. plus AMD CPU mobos are so much cheaper..

Plus AMD can bundle CPU+GPU now, which gained them some share last quarter alone I think..imagine when that gets rolling.

I guess Intel got "raped"

http://www.crn.com.au/story.aspx?CIID=72675&src=site-marq
 
Well, AMD isn't in the same position as they were before; they are a much larger company now than they were before their A64s. More employees need more expenditure, so they have to make a lot more money now to get the same net income levels; that's why they made a loss last quarter. Intel, even if it makes losses, is a much bigger company. It can take a sizeable hit and still be fine. The same hit, if AMD were to take it, would probably put AMD in the poor house. Intel has a lot more flexibility in this regard. And yeah, Intel is "raping" AMD in this regard right now. Even though Intel itself is taking a hit by lowering prices, AMD is going to have a tough time staying in a price war if their technology isn't as good as or better than Intel's. The 3800+ X2 example you give: is there an Intel Core 2 Duo processor with similar performance to that chip? No, there isn't. The price you gave for the Intel chip is for an E4300, which performs around a 4600+ X2.
 
There's an "AMD needs money?" thread over in Industry that would be much more suitable for this threadlet. Please continue it over there.
 
What does processes have to do with anything?

It starts with the new tech. If AMD can't keep up they will fall even more behind.

By that logic Intel would own Nvidia in graphics anytime they wanted..so would ATI.

Depends on what Intel concentrates on. No, Nvidia is not screwing up in the chipset market either, they are making good chipsets.

And I love all this talk of "Intel raping". I never heard this before..AMD had the fastest CPU for a long time, now Intel leaped ahead the last few months..AMD has survived worse.. Hell, one quarter ago I guess AMD was "raping" Intel considering Intel's steadily falling market share uninterrupted for many months?

Intel was outselling AMD even when Intel didn't have the fastest chips. Now that Intel has the better chips, it's just going to grow further, IMHO.

They just compete on price now like the old days (at least until the next CPU which is slated soon right?). Last I saw AMD continued to gain CPU share...I dont think there is a big enough performance difference for "most people" to care, when they go to Best Buy to buy a PC, which CPU they get, if one is much cheaper. The people who care are the few..the hardcore.

Only the people who are not up to date on who's got the better processors now(higher perf, lower power, lower heat, higher OC), yeah.

Hell I'm still thinking of going AMD..I probably wont but Intel starts at $170 where a X2 3800 is $102..plus AMD CPU mobos are so much cheaper..

But you won't. LOL. Sure, both will wiggle around their prices to compete but the fact is Intel has a better price/perf value than AMD.

Plus AMD can bundle CPU+GPU now, which gained them some share last quarter alone I think..imagine when that gets rolling.

Yeah, AMD can postpone either product until the other one is ready, great strategy. In the meantime Intel can concentrate on delivering the best CPUs possible.

I guess Intel got "raped"

LOL Nice try. :D
 
...
Still, it's pretty sucky for R600 to be delayed again, and the only way it will be forgiven by the early adopters is if it's a pretty darn fantastic product. If it was some last-minute internal politics that decided they wanted to do a "family launch", I don't see why they had to cancel Press Day at the last minute, unless it was only there to mislead everyone who was expecting the launch.

I'm leaning more towards something happening at the last minute that pushed everything back a few weeks, maybe the logistics of getting enough cards built and in the shops for availability, and then someone decided they might as well delay a bit more and launch the whole family to get a bigger bang for their marketing buck.

Boring for us, but there you go - they missed the chance to compete with G80, so whether it's by five months or six probably makes little difference now. The only other major pitfall is if AMD do launch R600 and get trumped by G80's refresh at the same time. That would be very bad indeed.

Of course, there's another side of the coin to consider. Given the state of nVidia's Vista G80 drivers thus far, the tantalizing hints about their likely progress in the near term as revealed by the DX10 demo nVidia just released, and nVidia's recent remarks about some limitations of its current DX10 driver foundation ( http://www.nvidia.com/object/vista_driver_news_022207.html ), it could well be that ATi has decided it doesn't particularly need to rush the R600 family launch. It seems reasonable to me that ATi is probably as knowledgeable about these things as the rest of us are, and feels that it simply has more time to put together a more effective, inclusive R600 product launch than the company believed at an earlier point. That's an alternative point of view which I think has merit.

I think it's a bit of a mistake to talk about "G80's refresh" as though it's going to somehow leave the G80 behind along with current G80 owners. But if by saying the R600 "might be trumped by the G80's refresh" you mean that you think R600 will trump the G80, but not the hypothetical, unknown G80 refresh, then you're as good as saying that no one should buy the G80 but should instead wait on the G80 "refresh"...;) I really doubt that this is what nVidia intends, and furthermore, until R600 is released I don't see how nVidia would even know what the "G80 refresh" would have to "trump" in order to win that particular contest. OTOH, Ati at the moment knows quite a bit more about both G80 and the state of current G80 DX10 driver development, doesn't it?

I also cannot figure how you think a 30-day or so delay means that ATi has "lost the chance to compete with the G80." As Vista adoption is still in its infancy (although I have Ultimate now and am using it to good effect with my x1950 Pro AGP), and as the first DX10-required game is a long way off, it seems to me that ATi has plenty of time to both compete with G80 and any near-term (this year) product refresh of the G80. What's the compelling reason to buy either a G80 or an R600 at present? DX9 gaming performance? I rather doubt it, myself. It's Vista and DX10, isn't it? So I think that time is rather plentiful for ATi in that respect. Conversely, it could well wind up that nVidia regrets having marketed the G80 as the Vista-ready product it obviously isn't (at least with respect to DX10 and some other aspects nVidia covers in the link above.) Perhaps ATi feels it imprudent to rush out R600 only to make similar mistakes...? I think this is certainly just as reasonable a proposition as anything else.
 
What's the compelling reason to buy either a G80 or an R600 at present? DX9 gaming performance? I rather doubt it, myself. It's Vista and DX10, isn't it?

I'm a rather strange gamer, I play basically two games, World Of Warcraft and GTR2.
Now WoW runs pretty damn well on the 6800GT in my third system, but to run GTR2 with all the pretty (and I really like the pretty, cos it's my favourite game ever) I need my 8800GTX.

So for me DX9 performance is pretty compelling...
 