In April last year I bought a GTX 260 SP216 for €150. Now, 16 months later, nVidia's next-generation mid-range card is quite a bit more expensive and, in most cases, barely any faster. What's the point?
Thanks, corrected.

GigaByte, you mean?
http://www.adrenaline.com.br/tecnol...specificacoes-precos-e-estreia-no-brasil.html
The site is in Portuguese (Brazilian), but the slides are in English. So, has nVIDIA learned the ATI lesson, and will it start building less monolithic dies from now on?
Thanks! That finally brings the GF104 TMU count out into the open.
http://www.adrenaline.com.br/files/...ei/2010-07-07_nvidia_comparativogtx460480.jpg
Now, were the 64 TMUs on GF100 by design or not?
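For context, the TMU count matters because theoretical texel fillrate is just TMUs × core clock. A quick sketch of the arithmetic; the clock and TMU figures below are the commonly cited launch specs for these cards, not numbers taken from the linked slides:

```python
# Theoretical bilinear texel fillrate: TMU count * core clock.
def texel_fillrate_gtexels(tmus: int, core_mhz: float) -> float:
    # MTexels/s converted to GTexels/s
    return tmus * core_mhz / 1000.0

# Commonly cited launch specs (assumption: not from the slides in this thread).
cards = {
    "GTX 460 (GF104, 56 TMUs @ 675 MHz)": (56, 675.0),
    "GTX 480 (GF100, 60 TMUs @ 700 MHz)": (60, 700.0),
}
for name, (tmus, mhz) in cards.items():
    print(f"{name}: {texel_fillrate_gtexels(tmus, mhz):.1f} GTexels/s")
```

On those figures the cut-down GF104 already lands within ~10% of GF100's texturing throughput, which is why the per-SM TMU count is interesting.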
In some ways very similar to G80->G92.
And it seems likely that GF104 was supposed to be on 32nm, which was supposed to be ready by now.
Though I think there's a general expectation that GF100 will be "replaced" by GF102, which is presumably not going to happen until 28nm is working for NVidia.
So GF104 is sort of a preview of GF102 in that sense, presumably with the double-precision stuff deleted and the reclaimed space used for an extra SIMD.
Why not G80 vs. G84?
The G84 was a preview of the G92.
Yeah, why not?