Well, CPUs have been stuck at 3GHz for years now, and no one seems to mind.
Oh, we mind. Intel promised 10GHz chips out by this time...
But I agree that the "1 year from now" timetable is a little too tight for a G90, even with Jen-Hsun's comments about a load of codenames to come.
Yeah, well, I think basic physics tells us that was an impossible goal.
That hardly matters, however, if you don't even know what the codename stands for. Heck, for all we know, no such codename might even exist!
I'll throw my prediction in that we won't see G90 until 2008.
[...], the sources pointed out that Nvidia's forthcoming G84 and G86 GPUs for the entry-level and mid-range segments are likely to play a key role in the GPU market, with the two new GPUs expected to debut in the first quarter of 2007.
Although details of the G84 and G86 are still not available, Nvidia has completed the roadmap for the two chips, with local graphics card makers likely to receive product samples in January or February of next year, the sources stated.
The following is speculative, and should not be taken as being based on any non-public information, because it quite simply is not!
It's looking pretty certain, now, that R600 will have ~50% more GFLOPs (345 versus >500). Am I meant to believe that GFLOPs will not be important in the D3D10 generation?
Jawed
Since when is Nvidia just content to have the biggest bang for the buck? I'm sure they want to have the biggest bang, period.
Surely that's the biggest buck for the bang!?
Yeah. Jawed is probably discounting the MUL until it's found.
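For what it's worth, here's a quick back-of-the-envelope on where those two figures likely come from. This is just a sketch using G80's commonly cited shader specs (128 scalar SPs at a 1.35GHz shader clock, a MADD counted as 2 flops, plus the contested extra MUL); none of these numbers come from the roadmap talk above.

```python
# Back-of-the-envelope GFLOPs arithmetic, assuming G80's commonly cited specs:
# 128 scalar stream processors at a 1.35GHz shader clock.
sps = 128            # scalar stream processors
shader_clock = 1.35  # GHz

madd_only = sps * 2 * shader_clock   # MADD = 2 flops/clock  -> ~345.6 GFLOPs
with_mul = sps * 3 * shader_clock    # counting the "missing" MUL too -> ~518.4 GFLOPs

print(f"G80, MADD only:  {madd_only:.1f} GFLOPs")
print(f"G80, MADD + MUL: {with_mul:.1f} GFLOPs")
```

So whether R600's rumoured >500 GFLOPs ends up ~50% ahead of G80 or roughly level with it hinges almost entirely on whether that second MUL ever becomes generally usable.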
Heh, I bet it'll be extremely amusing to see how wrong we all are once the actual roadmaps and/or products are out there. Or maybe not, and we'll be mostly right, but I seriously doubt that, TBH. As for overclocking, you'd expect them to want to be able to release a small incremental bump with the 80nm shrink, and they wouldn't want to repeat the G71 overclocking fiasco.
What's funny is that if NVIDIA decides to be very aggressive, they could manage what I described as G95 performance not in H2 2008, but rather in H1 2007! SLI is an interesting technology to have at your disposal indeed, and G80 scaling seems to be excellent, so we'll see what happens!
Uttar
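To put a rough number on that argument: the sketch below assumes a hypothetical "G95-class" target of roughly twice a single G80, and SLI scaling of 85-95% in GPU-limited cases. Both figures are assumptions for illustration only, not anything from NVIDIA or the posts above.

```python
# Rough illustration: how close could two G80s in SLI get to a hypothetical
# "2x G80" part? All scaling factors below are assumed, not measured.
single_g80 = 1.0                 # normalised single-card performance
g95_target = 2.0 * single_g80    # assumed target: roughly double a G80

for scaling in (0.85, 0.90, 0.95):
    sli_pair = 2 * single_g80 * scaling
    print(f"SLI scaling {scaling:.0%}: {sli_pair:.2f}x a single G80 "
          f"({sli_pair / g95_target:.0%} of the assumed G95-class target)")
```

In other words, if G80 SLI really scales as well as early results suggest, a pair of cards lands within striking distance of that doubled-up target well before any new chip ships.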
How would I know? Better/faster support for 8x/16x MSAA maybe? Faster INT8 blending rates?
Uttar