Hi,
I've read most of the thread, and while the performance speculation is interesting, I haven't seen anyone really take a stand on what kind of performance would be *acceptable* and what would be *good*. The way I see it, acceptable would be to come out above the Radeon performance curve; good would be something that retains the advantage they had in the earlier generation. Of course, I'm not talking about raw performance here, but performance relative to release date (since even a 5670 would have blown away the competition a few years ago).
I'm going to look at 4870 and 280 since 4890 and 285 were minor updates.
So, looking at the Radeons first, ATI got something like 1.8 times the performance of the 4870 out of the 5870 (is that about right?) in 15 months (4870 in June 08, 5870 in September 09). For NV to retain the advantage the 280 (out in June 08) had, GF100 coming out 21 months later (this March) would have to be ~2.28 times as fast as the 280 (1.8^(21 months / 15 months)). That would be *good* performance: keeping up with ATI's development speed and retaining their advantage from 2008.
For *acceptable*, they would only have to improve upon the 5870 by what those six extra months of development are worth. That is, GF100 should have at least ~1.27 times the performance of the 5870 (1.8^(6 months / 15 months)). That would keep them competitive, landing between ATI's current generation and its projected next one, but they would have lost the advantage they had in 2008.
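In case anyone wants to play with the numbers, here's the extrapolation I'm doing as a quick sketch. The 1.8x-over-15-months rate and the month counts are just my estimates from above, so swap in your own if you think they're off:

```python
def projected_factor(months, base_factor=1.8, base_months=15):
    """Compound the assumed 4870 -> 5870 improvement rate (1.8x in 15 months)
    over an arbitrary number of months."""
    return base_factor ** (months / base_months)

# "Good": keep the 2008 advantage, 21 months after the 280 (June 08 -> March 10)
print(f"good:       ~{projected_factor(21):.2f}x the GTX 280")

# "Acceptable": beat the 5870 by 6 months' worth of development (Sep 09 -> March 10)
print(f"acceptable: ~{projected_factor(6):.2f}x the HD 5870")
```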
Did I get some of my facts wrong? I took the dates from Wikipedia, so I'm not 100% sure of them. Also, I couldn't find a very good chart of average performance between the 4870 and 5870, so the 1.8 is an estimate based on what I could find.
Your thoughts: which of these (or neither) sounds more probable, and is this a reasonable way to look at it?