geo said:
Tho, generally, I agree with you on the larger point and have said so many times... NV30 wasn't so much bad as R300 was really just exceptional; "discontinuity" exceptional even, IMHO. Tho that statement is liable to summon Uttar to explain why NV30 really was a disappointment on its own terms too.
Can I, can I?
Seriously though, you gotta be kidding yourself if you think the R300 was so good that the NV30 could be considered okay. The sheer number of last-second changes speaks for itself, even disregarding the bad long-term design decisions. There were also failed respins and risk production runs, but that's another story entirely. None of which is to say the R300 isn't a wonderful design, though.
If anything, the NV30 is what made NVIDIA realize that "6 month product cycles" were a thing of the past. As for the memory bus: may I remind you all that the NV30 was supposed to have such efficient compression that its huge bandwidth disadvantage wouldn't really matter. If you still believe today that it had better compression (and I actually believe the contrary, because ATI had a few very nice Z-related tricks that NVIDIA didn't have in that timeframe, afaik), then you'd have to argue that GDDR2 was the cause of all their problems, because their memory controller wasn't very good at handling it yet. Which would be even more ironic considering how much they bragged about how good it was for about six months, criticizing ATI for making a "cheap" implementation (compatibility mode), only for that to be proved untrue.
Anyhow, none of this is the core problem here. ATI thought they were avoiding the mistake NVIDIA made with the NV2A. From my point of view, they completely misunderstood the problems NVIDIA had and created even bigger ones. They are lucky to be in a better position than NVIDIA was, to have many more engineers than NVIDIA did, and to have a more "extendable" architecture than NVIDIA had in the NV2x->NV3x timeframe. Otherwise, the RISK of this becoming as big a debacle for ATI would have been huge, and the R520 could indeed have become a new NV3x. Because of those factors, however, I'm confident in ATI's ability to deliver a perfectly good chip at "acceptable" yields at least, one that would most likely beat the G70U in several ways. Three months later, though.
So, what is it that I'm saying ATI didn't understand properly? Fundamentally, the kind of agreement they got with Microsoft seems to require fewer resources, because they "only" design the chip. So, from an expense point of view, it's a lot better than the deal NVIDIA had. And the "licensing" system means price disagreements are very unlikely.
The problem is that what hurt NVIDIA in the NV2x->NV3x timeframe wasn't expenses or price disagreements. It was chip design. Simply put, they put too much effort into NV2x/NV2A + chipset design, and thus had few engineers on the NV3x in its early stages, while still wanting to deliver on a very aggressive schedule (Spring 2002 part, anyone?). This, among other factors, created a lot of problems which I have neither the motivation nor the time to get into here. Suffice it to say, it wasn't pretty.
The reason I'm saying it's worse for the R500/R520 is simple: they aren't the same chips at all, and there's a Nintendo project going on at the same time. On the plus side, this means that the R600 will definitely benefit from all this, but I'd estimate the "best engineers" aren't currently working on the R520. And even though quantity can sometimes be a quality of its own, in engineering it tends not to be.
Uttar