But the original GDDR5 timing already made clear that 4Q07/1Q08 would be too early for GDDR5 (or at least not at sane pricing), while 3Q08 would be just fine.
My main point is simply that NVidia would tend to be conservative in adopting it - 1 or 2 quarters behind ATI. If they're already pushing other things, do they want to increase the risk by pushing GDDR5 too?
Though it's notable that NVidia was so aggressive with GDDR3 that it appeared on a low-end part first (5700U wasn't it?).
It is very important to remember that it's not just GT200 that was delayed; GDDR5 was too!
And I expect that the architectural features of GDDR5, such as the longer bursts, are a non-trivial change for the GPU. A "b" suffix on NVidia GPUs has, so far, never encompassed the kind of radical changes you're proposing - it's been nothing more than shorthand for a cost-cutting 55nm refresh.
So something doesn't add up. A refresh part doesn't change the memory architecture significantly - unless the original part was planned to have that memory system too.
So I'd only be willing to accept the idea of GT200b being GDDR5 if GT200 was originally planned to be GDDR5 too.
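To put a rough number on the "longer bursts" point (this is just my back-of-envelope reading of the published prefetch lengths, not anything NVidia has said): a 64-bit GDDR3 channel with its 4n prefetch fetches 64 bits x 4 = 32 bytes per access, while GDDR5's fixed 8n prefetch would turn that same 64-bit channel into 64-byte accesses - unless the memory controller is reworked to drive each 32-bit device as its own channel. Either way the controller, and the crossbar behind it, sees a different access granularity, which is exactly the sort of change a plain "b" refresh has never carried.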
Uhhh, well, either they're planning it on GT200b or they'll have to wait until 40nm in late 1Q09/early 2Q09. There is nothing between GT200b and 40nm; GT200b, along with GT206 (an ultra-low-end chip), is the last NVIDIA chip on 55nm AFAIK.
40nm is late too. Now I have to admit the idea that NVidia would go for both 40nm and GDDR5 on their winter 2008 enthusiast part seems pretty unlikely.
Also, under what sane roadmap would you have had GT200 in 4Q07, GT200b in 2Q08 and GT20x with GDDR5 in 4Q08, all replacing one another and on very similar or identical processes?
GT200 and GT200b are no different from G92 and G92b in this regard - just a refresh, and on the conference call they put quite some weight on the money they can save by going to 55nm. I said "summer 2008" deliberately, to imply a gap that's more like 9 months than 6, by the way.
GT20x with GDDR5 would be following their schedule to introduce an enthusiast part in Q4 of each year. That was their promise back in 2006, wasn't it? Tick-tock, enthusiast-refresh?
One thing that is suspicious is the timing of some 55nm parts - G96 appeared in July as 65nm. Now G96b in 55nm form has just turned up? What the hell's going on there?
Overall, as you say, NVidia seems to be very wobbly these days.
There is no way in hell to justify such a fast schedule in the ultra-high-end given the low volumes and the design+mask costs.
Hmm, when GT200 is "yielding badly" and there are only about 80 of them per wafer, I expect there was a strong incentive to go to 55nm if it was at all practicable. Repeated re-spins of GT200b plus a large inventory of GT200 could well have put the kibosh on 55nm, though.
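Just to sanity-check that per-wafer figure (my own back-of-envelope, using the commonly quoted ~576mm2 die size): a 300mm wafer is ~70,700mm2, so 70,700 / 576 = ~123 dies if you could tile them perfectly; knock off ~28 for the edge-loss correction (the usual pi*d/sqrt(2*A) term) and you're at roughly 95 gross candidates. Call it ~80 once you account for the worst of the edge and defect losses, which lines up with the number being bandied about - and at that kind of count per wafer, every few percent of yield or shrink is real money.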
Also, don't forget that the 3/4 part, GTX260, is not meant to be ultra-high-end in price. And as Bob explained it, you shouldn't think of the 3/4 part as the "salvage" but instead think of the fully functional GPU as "bonus" to sell at even higher margin.
GTX260 was originally only going to be $50 more than 8800GTS-640's launch price. Then it launched at $400.
BTW, as I pointed out, I don't think it's realistic to expect NV to go from 512-bit GDDR3 to 256-bit GDDR5 for GT200b, even excluding the memory/PCB cost factors. That would mean either 16 ROPs (too few) or 32 (too many), and given the lower effective bandwidth GDDR5 delivers at a given bit rate, you couldn't achieve the same effective bandwidth, despite core clocks likely increasing slightly.
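For illustration, using GTX 280's shipping memory clock and the GDDR5 speed ATI actually launched with (treat the GDDR5 figure as a stand-in for what NVidia could realistically buy in volume):

512-bit x 2.214Gbps GDDR3 / 8 = ~141.7GB/s (GTX 280 as shipped)
256-bit x 3.6Gbps GDDR5 / 8 = ~115.2GB/s
384-bit x 3.6Gbps GDDR5 / 8 = ~172.8GB/s

So 256-bit GDDR5 would need ~4.4Gbps chips just to match GT200's raw number, before any efficiency loss from the longer bursts, while 384-bit clears it comfortably at launch-era speeds.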
For what it's worth I'd tend to agree with the 384-bit configuration.
Jawed