I just wonder how long NV will stand on the GDDR3 node for their SKUs? If this G200 line is likely to repeat the G80 path, I would expect the next die shrink (55/40 nm?) replacement of the bulky GTX-280 to scale the memory bus down by half, wiring in some uber-megahertz GDDR5 chippery.

They will use it as long as it works for them, that's the way I see it. The only reason they would use GDDR5 is if ATI took the performance crown from them, which I'm hoping happens; after all, competition inspires innovation. As for the die shrink, I have no idea why they stuck with 65 nm.
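To make the bus-halving point concrete, here's a minimal back-of-the-envelope sketch (Python; the GDDR5 data rate is my own illustrative assumption, not a figure from the thread): GDDR5's roughly doubled per-pin data rate is what would let a 256-bit bus keep up with a 512-bit GDDR3 bus.

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gts: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gts

# GTX 280-style config: 512-bit bus, GDDR3 at ~2.2 GT/s effective.
gddr3_wide = peak_bandwidth_gbs(512, 2.214)   # ~141.7 GB/s

# Hypothetical shrunk part: 256-bit bus, GDDR5 at an assumed ~4.4 GT/s effective.
gddr5_narrow = peak_bandwidth_gbs(256, 4.4)   # ~140.8 GB/s

print(f"512-bit GDDR3: {gddr3_wide:.1f} GB/s")
print(f"256-bit GDDR5: {gddr5_narrow:.1f} GB/s")
```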
Unified shaders with G80; what new innovation will they pull out of their hats for this one that's going to separate it from the rest and make it a base for future revisions?

Now, if GT200 turns out to be a long-lived Tesla architecture (two years or better), then the margins will be very good. If it turns out to be a short-lived turkey and we see GT300 in six months, then it is probably a very bad margin. These are extreme scenarios.
I really don't think ATI will pull out an X3; the power requirements would be insane unless they did some major die shrinkage. Even then, the X2 variant sounds like a true contender, and we haven't even seen 4xxx-series benchmarks. I'm not saying an X3 variant would be rejected; I'd be first in line for it. :smile:

I am guessing at $650 for Nvidia's GT200 top performer. AMD will counter with price [and X2/X3] imo.
What did that link I quoted say, about $110 for a GT200 chip? That IS a decent margin!
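For what it's worth, a quick sketch using only the two numbers mentioned in the thread (the ~$650 speculated card price and the ~$110 chip cost from the linked article); the real margin depends on the full BOM and on what NV actually charges partners, neither of which is given here.

```python
# Back-of-the-envelope: a ~$110 GT200 chip inside a ~$650 card.
card_price = 650.0   # speculated retail price from the thread
chip_cost = 110.0    # per-chip cost figure from the linked article

print(f"Chip cost as a share of retail price: {chip_cost / card_price:.0%}")
print(f"Left over for everything else plus margin: ${card_price - chip_cost:.0f}")
```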