So we had the NV1 - a complete failure, although an interesting one.
Riva 128 - Better, but still a failure when compared to the competition.
TNT - Nvidia finally gets it right... still not one of my favorites...
GeForce 256 through 4 - Improves even more... image quality still not what I'd like, but it was fast.
FX series - Again a relative failure compared to the competition. It probably would have been good on its own.
6xxx series - Nvidia hits its stride again but remains "generally" slower than ATI, with lower image quality.
7xxx series - Still "generally" slower than ATI and still lower IQ.
8xxx series - Absolutely blows away ATI in speed, with marginally better IQ, although I prefer R600's AA quality (speed is another matter entirely).
If I were looking for major "milestones" in GPU history, I'd personally vote for G80 and R300 in the recent past, mostly because both came with a healthy increase in performance and IQ at the same time.
However, that list above is still a tad too lopsided for my taste; IMHO, ATI started to take the mainstream gaming market more seriously with the dawn of the initial Radeon (R100), on the hardware as well as the software level. If you compared R100 to NV1x, I don't see any IQ winner between the two. Both had OGSS (whereby R100's AA performance wasn't exactly breathtaking at the time), while AF was either limited to an unoptimized 2xAF, or something with supposedly more samples that caused far more side effects than I could have tolerated.
R200 vs. NV2x was supersampling vs. multisampling; the fact that NV2x was restricted to 2xRGMS/4xOGMS wasn't much of a problem, since R200 was capable of some RGSS only under specific preconditions. AF on R200 wasn't superior quality-wise either; apart from the weird angle dependency, it was limited to bilinear AF only, like R100.
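Since the supersampling vs. multisampling distinction carries most of this comparison, here's a minimal sketch of the conceptual difference (my own toy illustration, not any IHV's actual pipeline; all names are made up). SSAA shades every sub-sample, MSAA shades once and only replicates by coverage, which is exactly why it's so much cheaper and why it only helps polygon edges:

```python
# Toy model of one pixel. 'samples' are sub-sample positions in [0,1]^2,
# 'covered' tests triangle coverage, 'shade' is the per-sample shader.

def ssaa_pixel(samples, covered, shade, background):
    # Supersampling: run the (expensive) shader at every sub-sample.
    colors = [shade(p) if covered(p) else background for p in samples]
    return tuple(sum(ch) / len(colors) for ch in zip(*colors))

def msaa_pixel(samples, covered, shade, background):
    # Multisampling: shade once at the pixel center, replicate that color
    # to all covered sub-samples. Edges smooth out via coverage; aliasing
    # from textures/shaders inside the triangle is left untouched.
    color = shade((0.5, 0.5))
    colors = [color if covered(p) else background for p in samples]
    return tuple(sum(ch) / len(colors) for ch in zip(*colors))

# Rough sample layouts as debated for NV2x: a 2x rotated grid spreads its
# samples across both axes, so it resolves near-horizontal and
# near-vertical edges about as well as a 4x ordered grid does.
RG_2X = [(0.25, 0.25), (0.75, 0.75)]
OG_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
```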
R300 took the market by storm because NV3x came with a huge delay, was vastly inefficient in the arithmetic department, and was still limited to 2x samples for RGMS. AF still wasn't better on R300, yet considering the much longer list of advantages it had over its competitor, I'd consider it nitpicking to even mention it. ATI's 6x sparse MSAA was a complete gem for CPU-bound scenarios back then.
R420 vs. NV40: NV finally went to 4xRGMS; ATI still had the 6x sparse MSAA advantage in terms of IQ. No major advantages in the AF realm for either side, with the difference that NV introduced an angle dependency similar to R3x0's. However, NV had offered their hybrid MSAA/SSAA modes since NV2x, which might have been (and still are) of limited usability, but for cases where someone is either resolution- or CPU-bound they can have their uses too. Remember we didn't have transparency AA back then, and for scenes overloaded with alpha textures the only other solution was unfortunately full-scene supersampling (see the sketch below for why plain MSAA can't touch those edges).
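To make the alpha-texture point concrete: the "edge" of an alpha-tested texture (foliage, fences) is produced by a per-pixel alpha test inside the triangle, not by geometry, so MSAA's single shading sample sees it as all-or-nothing. A minimal sketch under the same toy assumptions as above (sample_alpha and ALPHA_REF are made-up names):

```python
ALPHA_REF = 0.5  # a typical alpha-test threshold

def msaa_alpha_coverage(samples, sample_alpha):
    # Plain MSAA: the alpha test runs once at the pixel center and every
    # covered sub-sample inherits the same pass/fail verdict -> hard,
    # aliased edges no matter how many MSAA samples you throw at it.
    return 1.0 if sample_alpha((0.5, 0.5)) >= ALPHA_REF else 0.0

def supersampled_alpha_coverage(samples, sample_alpha):
    # Full-scene SSAA (or, later, transparency AA): the alpha test runs
    # per sub-sample, so fractional coverage emerges and the edge smooths.
    hits = sum(1 for p in samples if sample_alpha(p) >= ALPHA_REF)
    return hits / len(samples)
```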
G7x vs. R5x0: ATI implemented a far less angle-dependent algorithm for AF, which was a significant advantage back then; AA remained nearly the same on both sides, with the exception that we saw the first signs of transparency AA (adaptive supersampling for alpha textures).
G8x vs. R6x0: NV bounced back to lower angle dependency (as found up to NV3x), added a shitload of TFs/TAs in order to supply the chip with insane bilerp fillrates, and introduced coverage mask AA alongside MSAA. ATI kept roughly the same less angle-dependent AF as in R5x0 and replaced 6x sparse with 8x sparse MSAA. They also introduced custom filter AA, which, with the exception of the edge detect mode (which looks outstanding IMHO, yet costs quite a bit of performance), isn't something that knocked my socks off either.
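For what it's worth, the appeal of the coverage mask idea is easy to show with a back-of-envelope model (again my own toy simplification, not NVIDIA's actual G8x implementation): store full color/Z for only a few samples, but track triangle coverage at a finer granularity, so edge gradients improve for very little extra storage:

```python
def coverage_aa_resolve(color_samples, coverage_bits):
    # Toy resolve: e.g. 4 stored colors plus 16 coverage bits, with each
    # stored color weighted by the fraction of fine-grained coverage
    # samples bucketed to it. That buys ~16 gradient steps on edges for
    # roughly the storage cost of 4x MSAA plus a handful of bits.
    per_color = len(coverage_bits) // len(color_samples)
    out = [0.0, 0.0, 0.0]
    for i, color in enumerate(color_samples):
        w = sum(coverage_bits[i*per_color:(i+1)*per_color]) / len(coverage_bits)
        for ch in range(3):
            out[ch] += w * color[ch]
    return tuple(out)
```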
Pardon the rather long list, but "huge" differences in IQ are debatable at all times. For one, IQ is far more a subjective matter, and second, analysis in that department has left a lot to be desired over all those years. To be honest it's the most difficult area, because a reviewer would have to do a lot more than run a series of benchmarks and write down the results, and since it's a rather subjective matter there's always some risk involved.
The bare truth is that both IHVs have a long track record of transistor-saving implementations for AF (higher angle dependency) and various performance optimizations. If one were to sit down and compare the percentage of a scene that multisampling affects vs. the percentage of data AF touches, the differences are like night and day. Logically, to me personally there's more weight on AF quality than on AA quality; it's the first thing I notice anyway. Both of them are at least equally guilty of fooling around with that quality for one reason or another. It was simply a rather tragic irony for NV that when ATI finally removed their high angle dependency, NV went back to mimic R3x0's angle dependency; a toy illustration of where that dependency comes from follows below.
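Why does saving transistors on AF show up as angle dependency? The true degree of anisotropy is the major/minor axis ratio of the pixel's footprint in texture space, which needs the singular values of the texture-coordinate Jacobian; a cheap shortcut probes only the two screen axes, and that estimate collapses for footprints stretched along a diagonal. This is my own simplified model, not any IHV's real heuristic (actual hardware probed more axes, hence the flower-shaped test patterns), but it shows the mechanism:

```python
import math

def true_aniso(dudx, dudy, dvdx, dvdy):
    # Exact anisotropy: ratio of the singular values of the Jacobian
    # [[du/dx, du/dy], [dv/dx, dv/dy]] (major/minor footprint axes).
    frob2 = dudx**2 + dudy**2 + dvdx**2 + dvdy**2
    det = abs(dudx * dvdy - dudy * dvdx)
    root = math.sqrt(max(frob2 * frob2 - 4 * det * det, 0.0))
    s_major = math.sqrt((frob2 + root) / 2)
    s_minor = math.sqrt(max((frob2 - root) / 2, 1e-12))
    return s_major / s_minor

def cheap_aniso(dudx, dudy, dvdx, dvdy):
    # Transistor-saving shortcut: measure the footprint only along the
    # two screen axes and take their ratio.
    lx = math.hypot(dudx, dvdx)  # footprint length along screen x
    ly = math.hypot(dudy, dvdy)  # footprint length along screen y
    return max(lx, ly) / max(min(lx, ly), 1e-9)

# A 4:1 footprint stretched along the 45-degree screen diagonal: both
# probe axes see the same length, so the shortcut asks for ~1x AF
# (i.e. blur) while the true anisotropy is still 4:1.
grads = (2.5, 1.5, 1.5, 2.5)
print(true_aniso(*grads))   # -> 4.0
print(cheap_aniso(*grads))  # -> 1.0
```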
That said, I wouldn't be in the least surprised if either or even both of them end up sacrificing quality again in the future, should their transistor budgets for target X get tight again. While it might be understandable, I say let the innocent cast the first stone.
So, yeah, they've certainly been successful, but I count at least two failures prior to the FX failure. And while successful, other than having better marketing they still trailed ATI in speed and IQ until the 8xxx series.
Did you count the failures and successes of ATI prior to the R300 too?
Basically, when it comes right down to it, Nvidia isn't all that different from any other tech company. They have hits and misses with regard to their hardware.
Nvidia, however, has certainly excelled when it comes to marketing. I don't think anyone would argue that point.
That's most certainly true. The point where I disagree is that any of their past successes were mostly due to marketing. It's not like anything ATI touched turned into gold and anything NV touched was a steaming pile of poo (or vice versa); both did the very best possible within their own capabilities for each timeframe. If NV's successes were due to marketing alone, then I wonder where terms like execution, developer support, and many others would fit in such a picture.
I don't recall who said in the past that one should let ATI design the hardware and NV execute; while it's of course a wild exaggeration for both, there's still some truth hidden within it.
And by the way, these types of discussions have been recycling for years now. Whatever anyone's opinion on issue A or B, it remains an indisputable truth that it's to everyone's benefit that the desktop graphics market has had, and will continue to have, at least two contenders. Without any significant competition it'll get utterly boring, and I'm afraid that in somewhat monopolistic scenarios we'd see far less evolution than we have up to now.