I think graphics technology has simply soared straight over the heads of most journalists in the past few years. Back in the Voodoo days, everyone still knew how things worked... all accelerators were very simple and could only draw textured triangles, so whichever card was fastest was the best.
But these days, I don't think anyone but the people who actually design the hardware, or develop software for it, really understands what these massively complicated devices are all about.
So I suppose the journalists have no choice but to parrot everything that NV, ATi or whoever else tells them.
It's not just 3D cards either... I've seen the same thing happen with CPUs. I recall that when the P4 was introduced, nearly every website mentioned something like "Because of the improved branch predictor, the floating-point performance is increased".
Since nearly all of them said it, I suppose Intel marketing told them this. But of course, stated that way it's complete nonsense: a branch predictor doesn't make the FPU any faster, it just reduces the pipeline stalls caused by mispredicted branches, and that applies to all code, not floating-point code in particular. Either Intel or the websites pulled it out of context, but apparently nobody realized what they were actually saying.
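Just to illustrate what I mean (a rough C sketch of my own, not anything from Intel's material): branch prediction only comes into play when the code actually branches. The straight-line loop below is limited purely by the FPU and memory, so a smarter predictor does nothing for it; the branchy loop is where a predictor can matter, and that has nothing to do with floating-point as such.

```c
#include <stdio.h>
#include <stdlib.h>

#define N 1000000

/* Straight-line FP loop: no data-dependent branches, so the quality of
   the branch predictor is essentially irrelevant. Throughput is limited
   by the FPU and memory, not by prediction. */
static double sum_all(const double *a, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Branchy FP loop: the data-dependent 'if' is hard to predict on random
   input. On a long pipeline like the P4's, every misprediction costs many
   cycles, so here a better predictor really can help -- but that's a
   property of the branchy code, not of floating-point math itself. */
static double sum_positive(const double *a, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        if (a[i] > 0.0)     /* unpredictable on random data */
            s += a[i];
    return s;
}

int main(void)
{
    double *a = malloc(N * sizeof *a);
    for (int i = 0; i < N; i++)
        a[i] = (rand() / (double)RAND_MAX) - 0.5;  /* random +/- values */

    printf("sum_all:      %f\n", sum_all(a, N));
    printf("sum_positive: %f\n", sum_positive(a, N));
    free(a);
    return 0;
}
```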