Kirk on Cg

Gollum said:
When trying to see it only mathematically, of course you've got a point. You almost sound to me like one of those reviewers who only care about their fps in Q3A benchmarks.

Absolutely not.

If you want to limit yourself to taking such statements from a company literally, adding up numbers that don't tell the whole story to check the validity of said statement in order to make a fuss about it, then that's your problem. If others don't agree, try taking a little more open-minded approach to the situation and never forget it's only PR, it's meant to exaggerate! I don't hear you complaining that your toothpaste isn't providing you with the freshest white smile in the world as it claims to ...

No, I'm well aware it's a PR-based statement and therefore nothing to get too upset over. I was just a little tired of seeing the continuous "we double performance, we double our transistor count every six months" rhetoric. Besides, my teeth are white, so I can't spew a rant in that direction. ;)
 
Gollum,


we can nitpick all we want. I doubt that future games will show us anything different. First of all, those 1st-gen GPUs won't support pixel shaders, so any comparison would be meaningless as time goes on (i.e. less and less support for the older cards). Second of all, by the time these games get here we will have NV30+ or R300+, and those might not double the performance of the GF4 Ti4600 or the 8500. And Dave never said "with all IQ maxed, taking advantage of every feature that the card has" :) So we can twist it a hundred different ways if we want to.

I agree with JR that we keep hearing the same PR over and over and over. I wish Dave would stick to talking about technical stuff instead of PR. nVidia has a big enough PR team; they don't need his help ;)
 