The Register covers Nvidia's debacle

The Register said:
Benchmark firm Futuremark has uncovered extensive cheating by NVidia in its 3DMark03 suite.

After an initial report at ExtremeTech, Futuremark revisited the tests and discovered eight instances of cheating, which improved the performance of NVidia's Detonator FX and WHQL drivers by as much as 24.1 per cent.

But NVidia achieved this not through brilliant optimization, but by alternative means which omit graphic details: so the output, while at times similar, does not resemble what it should.

Read the rest here...
 
They still accuse ATi of cheating with the Quack thing, all the while forgetting to mention that nVidia provided the tools. (Do you all really think Kyle and Steve at HardOCP are smart enough to actually write the tools and find the messed-up textures themselves?)

The bug in the drivers wasn't discovered until the R200 because the R100 ran the code just fine.
 
Sure, hold on a sec.


This was brought up at Rage3D:

Originally posted by dnoyeB
Quake gives you full control over the video settings, so why ATi felt the need to reduce quality through the back door in an unchangeable fashion is beyond me.


This was the reply made by CATALYST maker regarding that:

Two points.
1) We did not TOUCH image quality in the least bit

I think he knows more about that issue than you do, since he deals with the drivers.
 
Uhhh,

1) Reply in that thread, please.
2) Proofread your posts. I can only begin to guess what you're trying to convey.
 
I think the biggest problem by far with the Quack-Detgate contrast is chronological. The Detonator cheats are current news--current events--and the Quack/Quake issue hasn't been a current event for a long time. Basically, the Quack thing is usually referenced by people wanting to deflect attention away from the current Detonator scandal. Sorry, but "Quack" is no defense for the current Detgate, not even close. The two are so far removed from each other in time that the fact that they aren't even remotely similar isn't even relevant, IMO.
 
The issue is, I believe we showed evidence that Nvidia has been cheating at least as far back as the GF2, and ultimately as far back as their initial introduction of S3TC in the TNT2, IMO.

Quack is an insignificant drop in the bucket compared to the near-continuous infractions by Nvidia. Sooner or later everyone is going to realize that. Nvidia has a LONG history of reducing quality for speed. Who knows what else they have gotten away with.

I have a feeling that the Wormhole is going to go deeper than the 3dmark03 issue. I have seen some interesting indications that the lid may be completely ready to blow.
 
Hellbinder[CE] said:
...
I have a feeling that the Wormhole is going to go deeper than the 3dmark03 issue. I have seen some interesting indications that the lid may be completely ready to blow.

You mean there might be more....*gasp*...Don't know if the old ticker can handle all of this excitement....;)

As far as what nVidia was doing years ago--yep, that's what I recall, too. Detonator releases advertising "up to 50% performance improvements" (but only in selected benchmarks near you) have been routine for nVidia ever since it was battling 3dfx... The 3D crowd wasn't nearly as savvy, then, though (many of them were doing orgasmic flips over such things as "agp texturing" and the like...*chuckle*...so it's no wonder things like that got by pretty easily.) I imagine nVidia's feeling pretty confused right now and thinking "Why, now?"....;) It's really true that bad habits are hard to break, I suppose. But really what surprises me is how entrenched and institutionalized this behavior has become within nVidia, apparently. Sometimes when companies grow real fast--too fast--they get brittle before they break. I guess we'll see in the coming months if nVidia has any flexibility left, right?
 
Hellbinder[CE] said:
The issue is, I believe we showed evidence that Nvidia has been cheating at least as far back as the GF2, and ultimately as far back as their initial introduction of S3TC in the TNT2, IMO.
Not to belabor a point, but the TNT2 did not support S3TC. The GF256-SDR was the first chip that did.

And whether it interpolated in 32/24 bit or 16 bit (both 'correct', though 32/24 obviously has better output) would have had little to no impact on speed, so I'm not sure what you're getting at by associating cheating with S3TC.
 
RussSchultz said:
And whether it interpolated in 32/24 bit or 16 bit (both 'correct', though 32/24 obviously has better output) would have had little to no impact on speed, so I'm not sure what you're getting at by associating cheating with S3TC.
Not quite.

Firstly the S3TC spec states it should be 32-bit interpolation, so technically anything that only used 16-bit interpolation was outside the spec.

Secondly, there are architectural reasons why you might prefer to use 16-bit intermediates than 32-bit ones (if you store uncompressed texels in the texture cache, for example) and so it could be faster. I doubt if it did actually make any difference, but it is possible.
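
To illustrate what the two paths amount to, here is a rough sketch in C (not anyone's actual decoder; the endpoint values and the exact rounding are made up purely for illustration). It decodes the two interpolated colours of a DXT1 block once with 8-bit-per-channel intermediates ("32-bit") and once by interpolating the packed 5/6/5 channels directly ("16-bit"):

/* Rough sketch only: illustrates 32-bit vs 16-bit intermediates when decoding
 * the two interpolated colours of a DXT1 block. Endpoints are made up. */
#include <stdio.h>
#include <stdint.h>

/* Expand one RGB565 texel to 8 bits per channel. */
static void rgb565_to_888(uint16_t c, int rgb[3])
{
    rgb[0] = ((c >> 11) & 0x1F) * 255 / 31;
    rgb[1] = ((c >>  5) & 0x3F) * 255 / 63;
    rgb[2] = ( c        & 0x1F) * 255 / 31;
}

int main(void)
{
    uint16_t c0 = 0xF800, c1 = 0x001F;   /* arbitrary endpoints: red and blue */
    int e0[3], e1[3];
    rgb565_to_888(c0, e0);
    rgb565_to_888(c1, e1);

    /* "32-bit" path: expand first, then interpolate at 8 bits per channel.
     * DXT1 (c0 > c1 case): colour2 = 2/3*c0 + 1/3*c1, colour3 = 1/3*c0 + 2/3*c1. */
    for (int i = 0; i < 3; i++)
        printf("32-bit  c2[%d]=%3d  c3[%d]=%3d\n", i,
               (2 * e0[i] + e1[i]) / 3, i, (e0[i] + 2 * e1[i]) / 3);

    /* "16-bit" path: interpolate the packed 5/6/5 channels directly, then expand. */
    const int shift[3] = { 11, 5, 0 };
    const int mask[3]  = { 0x1F, 0x3F, 0x1F };
    const int maxv[3]  = { 31, 63, 31 };
    for (int i = 0; i < 3; i++) {
        int a = (c0 >> shift[i]) & mask[i];
        int b = (c1 >> shift[i]) & mask[i];
        printf("16-bit  c2[%d]=%3d  c3[%d]=%3d\n", i,
               ((2 * a + b) / 3) * 255 / maxv[i], i,
               ((a + 2 * b) / 3) * 255 / maxv[i]);
    }
    return 0;
}

With the red/blue endpoints used here, the 32-bit path gives (170, 0, 85) for colour 2 while the 16-bit path gives roughly (164, 0, 82); small differences like that per texel are the kind of precision loss being discussed.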
 
Dio said:
Firstly the S3TC spec states it should be 32-bit interpolation, so technically anything that only used 16-bit interpolation was outside the spec.

IIRC there was never a really clear spec, and MS also did not specify it very clearly when describing the DXT format. BUT the main thing is that doing the interpolation at 32-bit accuracy is the obvious thing to do given that you interpolate between two 16-bit values; not going to 32-bit accuracy is quite silly... unless, like NVIDIA, you have a cache that stores "uncompressed" texture data, at which point storing 16-bit versus 32-bit uncompressed colors has a huge impact on how many texels you can actually store inside your cache.

K-
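
To put a rough number on the cache point: if a chip keeps decompressed texels in an on-chip texture cache, halving the per-texel size doubles how many fit. A trivial sketch, with the cache size picked purely for illustration (it is not a figure from any actual chip):

/* Illustration only: the cache size below is an assumed figure, not taken from
 * any real NVIDIA part. Keeping decompressed texels at 16 bits doubles how
 * many fit compared to 32 bits. */
#include <stdio.h>

int main(void)
{
    const int cache_bytes = 8 * 1024;   /* hypothetical on-chip texture cache */
    printf("16-bit texels cached: %d\n", cache_bytes / 2);
    printf("32-bit texels cached: %d\n", cache_bytes / 4);
    return 0;
}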
 
There was a clear spec, but it was not made public except to S3's licensees, AFAIK. Everyone else had to use the Microsoft spec, which was, like their compressor, not as good as the originals.
 
Dio said:
Firstly the S3TC spec states it should be 32-bit interpolation, so technically anything that only used 16-bit interpolation was outside the spec.

Secondly, there are architectural reasons why you might prefer to use 16-bit intermediates than 32-bit ones (if you store uncompressed texels in the texture cache, for example) and so it could be faster. I doubt if it did actually make any difference, but it is possible.

IMHO, as far as I remember, the hack to enable 32-bit interpolation on GFx cards (forcing DXT3 or DXT5 instead of DXT1) was slower than the 16-bit DXT1 path the drivers used by default. So you could very well say Nvidia inflated all benchmarks which used DXTC (like Quake3). I'm not sure whether this was corrected in the newer chips like NV3x or not, so these inflated benchmarks may not only be a thing of the past.

But overall the speed difference seems to be really small:

http://www.digit-life.com/articles/gf3/

Only on older cards was the difference greater (6-7%):

http://www.3dcenter.de/artikel/2000/10-26b.php
(German article, GeForce SDR @ 170/200 MHz)
 
Dio said:
There was a clear spec, but it was not made public except to S3's licencees AFAIK. Everyone else had to use the Microsoft spec which was, like their compressor, not as good as the originals.

That's interesting information--thanks!
 
Russ, you asked the question in this thread, so I answered it here.

As for proofreading, if I changed the wording in the least bit then I could no longer say that the people I quoted actually wrote it.


Do you understand?

Good.
 
Now another site, 3dchipset.com, is reporting on the fiasco:

Even more to this story, courtesy of NVidia's incredible, continuing recalcitrant attitude: now NVidia-sponsored TV shows are calling ATi cheats while not mentioning NVidia's actions. And get this: they use NVidia's own cheat screenshots to show how ATi was "cheating", implying it was ATi that produced NVidia's output!!! Now that definitely is libel!!!

http://www.3dchipset.com/comments.php?id=1534&category=1

TSS supporting their Sponsor? - 30-05-2003 - Solomon


Now, I know most of you are sick of hearing about this. A little part of me is tired of it too, but when stuff like this happens, well, it's interesting to talk about. The show The Screen Savers, which airs on TechTV, reported on the cheating drivers debacle we have heard about over the past two or so weeks. What makes this one different? Let's just say it's bad enough that it could possibly be grounds for a lawsuit. Here is the skinny:


During Tuesday night's showing of The Screen Savers, Leo and Pat were talking about cheating drivers. Both only mentioned that ATi had code in their drivers optimized for 3DMark, with no real mention of anything regarding nVidia. They then went on to show screenshots of the improperly rendered sections of the benchmark. The screenshots they showed on air, mind you, were not from an ATi video card, but rather from their sponsor nVidia's card. Now, I'm not sure whether that was a mistake or intentional, but no correction of any sort was made on Wednesday's or even Thursday's show.

The recent findings by Futuremark (owner of the benchmark), Beyond3D, and ExtremeTech show a marginal 1.5-2.5% increase for ATi due to the pixel shader coding, which ATi officially acknowledged, and a 25% to 30% increase for nVidia, which Futuremark determined was the result of cheating the benchmark. What is scary is that no mention of the 25% to 30% was made on the show, just that ATi was cheating. I would have expected maybe one or two web sites to make this kind of error, but not a nationally televised TV show.

So you be the judge. Did TSS use false info? Or was this a clever tactic to try to put their sponsor in a better limelight? There is a thread over at TechTV, under the TSS Producers Feedback section, regarding the incident, which people have asked them to correct. If you have watched the show before, you know they have a "LAN Party - Powered by nVidia". If I had to pick one, I would have to say it's the second one. The information about ATi and nVidia has been out in the open for quite some time now. So needless to say, I'm wondering if ATi will take any action. If any company was wrongfully represented like that, I would surely be talking to my lawyers, or asking for an official apology from the TSS people on air.
 