NV: and they do it again, now - and forever?

T2k

Veteran
Is this true?

http://www.digit-life.com/articles2/gffx/gffx-13.html

They simply don't give a sh@t about me/you - it really makes me angry... From now on I will definitely recommend ATI or Matrox to everybody, in any case, instead of ANY Nvidia card.

This is just f%$#^n' disgusting. Isn't this against some law or rule?

DL wrote it: Again, NVIDIA's drivers use some cheats. This time it's connected not with shaders but with forced compression of the semitransparent textures used to form puffs of smoke. This explains the speed boost seen above. The pictures from both companies differ from the reference one and from each other. Where is Futuremark now, which a while ago blamed cheaters so angrily and released the 330 patch where all things allegedly fell back into place? We don't know whether the reference pictures are considered correct. Or maybe these are not cheats but useful optimizations? But a picture must be rendered as the developers planned, mustn't it? Where's the truth?

Nota bene, Markus or Worm: the questions are up and running... I'm looking forward to FM's reply on this.
 
PS: If you, Futuremark (Marki or Worm) don't comment this... just close the door. Nobody will give a flying frog about your benchmark.

Just one little thing here. There is a possibility that Nvidia doesn't use any application detection in these drivers (as in changing stuff just for 3DMark specifically), and thus they don't cheat in the way that the 3.30 patch is supposed to stop. Besides, these drivers aren't even an official release (which is supposed to come soon), so I'd rather reserve my judgement until then.
 
Looks like a duck, walks like a duck and smells like a duck! In view of recent findings, Nvidia do not deserve any benefit of the doubt; they have not earned any trust - quite the opposite. No review of Nvidia cards can be trusted, for the simple reason that Nvidia's driver team keep finding new ways to cheat.
 
Firstly, I was under the distinct impression that all the transparent textures were compressed to DXT5 in the first place - so what "forced compression" are NVIDIA doing?

Secondly, nice of T2k to go off on a rant...from one source of information. Digit-Life proclaim that the 5900 performed better in 3DMark03 with later drivers, even in the 330 revision. Funny, but my own tests show the opposite:

3dmark03build320.gif

3dmark03build330.gif


I didn't include the final scores with those tables, but they work out to be:

320 revision
44.03 = 4817
44.10 = 4817
44.24 = 4771
44.65 = 4724
44.67 = 4729
44.71 = 4740
44.90 = 4693

330 revision
44.03 = 4003
44.10 = 3995
44.24 = 3994
44.65 = 4679
44.67 = 4708
44.71 = 4706
44.90 = 4693

So - not a single 330 final score is higher than a 320 one, although the 44.90s are dead-level.
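
For anyone wondering how totals like those relate to the individual game-test frame rates: the 3DMark03 final score is a weighted sum of the four game tests, so a gain in a heavily weighted test moves the total far more than the same gain elsewhere. Here's a minimal C++ sketch of that idea - the weights are made-up placeholders for illustration only, not Futuremark's actual coefficients:

#include <cstdio>

// Placeholder weights purely for illustration -- NOT Futuremark's real
// coefficients. The point is only that the final score is a weighted sum
// of the four game-test frame rates.
const double W_GT1 = 10.0;   // GT1 - Wings of Fury
const double W_GT2 = 35.0;   // GT2 - Battle of Proxycon
const double W_GT3 = 45.0;   // GT3 - Troll's Lair
const double W_GT4 = 40.0;   // GT4 - Mother Nature

double finalScore(double gt1, double gt2, double gt3, double gt4)
{
    return gt1 * W_GT1 + gt2 * W_GT2 + gt3 * W_GT3 + gt4 * W_GT4;
}

int main()
{
    // With these placeholder weights, +2 fps in GT3 is worth more marks
    // than +2 fps in GT4 -- the kind of asymmetry discussed later on.
    std::printf("baseline:   %.0f\n", finalScore(150.0, 30.0, 25.0, 20.0));
    std::printf("+2 fps GT3: %.0f\n", finalScore(150.0, 30.0, 27.0, 20.0));
    std::printf("+2 fps GT4: %.0f\n", finalScore(150.0, 30.0, 25.0, 22.0));
    return 0;
}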
 
Neeyik said:
Secondly, nice of T2k to go off on a rant...from one source of information. Digit-Life proclaim that the 5900 performed better in 3DMark03 with later drivers, even in the 330 revision. Funny, but my own tests show the opposite:

Wait: it wouldn't be the first time for NV, huh? ;)

On the other hand I asked: is this true?

For FM: I don't like that silent treatment. I wanna see some FM comments on it.
 
I don't see why nVidia doesn't take an official stance on this. In games, you can make shaders work the way you want for performance... why not have it open to optimize (but with no loss of IQ)? It seems they will continue to do it over and over until nobody gives a shit.

They could easily satisfy me if they (FM) started an optimization standard... but then again, nVidia is not going to openly admit their FX cards have 8500-class holes in them.

They said themselves that there is no IQ degradation. As I recall, that was a huge point to many people... and now there is no loss of IQ. (UT2003 is a different story though :) )

I'll bet we will never hear a peep from nVidia about this BS. They have already gone too far with it to come back and say something... until a year or so maybe (off the record). :rolleyes:
 
T2k: Why should Futuremark make a comment? The point I was trying to make was that Digit-Life's report is not 100% conclusive evidence that they are "cheating" (ie. deliberate manipulation of the benchmark and/or drivers to produce a better-than-before 3DMark score)...that is unless they are "cheating" in Digit-Life's own RightMark too!

rightmarkps20.gif

ps20procedural.gif


To paraphrase Digit-Life themselves... "What is the truth? Why don't DL make a comment about this?" How indeed can we "trust" RightMark's results any more than 3DMark's?

What about other testers, such as MDolenc's Fill Rate tester?

fillrates.gif


Well, that's odd - PS1.4 goes up, PS1.1 stays the same, and everything else comes down. Now, don't GT2, 3 and 4 in 3DMark03 use plenty of PS1.4 shaders? Ah yes, they do, so this must prove that NVIDIA are "cheating" in... well, what? But hang on, the masses cry! All the PS2.0 fill rates have dropped, but the PS2.0 test in 3DMark03 has massively increased. So they must be cheating in that - but wait a minute, it doesn't add anything to the final score, so it's not going to improve the card's standing in the Futuremark databases. Yes, GT4 uses PS2.0 shaders, but an improvement in GT3 is worth more marks than the same improvement in GT4.
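
As an aside, for anyone unsure what a figure in a chart like that actually measures: a pixel-shader fill-rate tester essentially times how many pixels per second it can push through a given shader version. I don't have MDolenc's source to hand, so this sketch uses hypothetical stand-in functions (selectPixelShader, drawFullscreenQuads) in place of the real Direct3D calls - it only illustrates the method:

#include <chrono>
#include <cstdio>

// Hypothetical stand-ins for the real Direct3D work a tester does (create a
// device, bind a ps_1_1 / ps_1_4 / ps_2_0 shader, issue the draws). They are
// empty stubs here purely so the sketch compiles, so the numbers printed
// below are meaningless until real draw calls are plugged in.
void selectPixelShader(const char* /*profile*/) {}
void drawFullscreenQuads(int /*count*/, int /*w*/, int /*h*/) {}

// Fill rate = pixels shaded per second: draw N full-screen quads, time it,
// and divide the total pixel count by the elapsed time.
double measureFillRate(const char* profile, int w, int h, int quads)
{
    selectPixelShader(profile);
    auto t0 = std::chrono::steady_clock::now();
    drawFullscreenQuads(quads, w, h);
    auto t1 = std::chrono::steady_clock::now();
    double seconds = std::chrono::duration<double>(t1 - t0).count();
    return (double(quads) * w * h) / seconds / 1.0e6;   // Mpixels/s
}

int main()
{
    // The pattern to watch for: a new driver changing these figures only for
    // the shader versions a particular benchmark happens to use.
    std::printf("ps_1_4: %.0f Mpix/s\n", measureFillRate("ps_1_4", 1024, 768, 64));
    std::printf("ps_2_0: %.0f Mpix/s\n", measureFillRate("ps_2_0", 1024, 768, 64));
    return 0;
}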

Now I am not suggesting, for a single moment, that no form of driver shenanigans is going on with the current crop of leaked drivers. However, I have serious reservations about Digit-Life analysing every single IQ difference and minutia and proclaiming that virtually every one of them is some form of cheat to improve the 3DMark03 score (and, somewhat oddly, that this is all Futuremark's fault as well...). Hands up those people who can remember a similar argument going on between Radeon 8500s and GeForce3s over the IQ tests in 3DMark2001? I can remember some very heated "discussions" about how the NVIDIA card was closer to the refrast in image detail than the 8500, therefore NVIDIA was not only rendering more correctly but ATI were also "cheating". I seem to recall that once all the dust had settled, the general agreement was that it is acceptable to have rendering differences, especially when compared to the refrast.
 
I tried to read that article, but it has got to be the poorest-written English I've ever seen in a review. Please tell me that was translated.
 
Neeyik said:
I can remember some very heated "discussions" about how the NVIDIA card was closer to the refrast in image detail than the 8500, therefore NVIDIA was not only rendering more correctly but ATI were also "cheating". I seem to recall that once all the dust had settled, the general agreement was that it is acceptable to have rendering differences, especially when compared to the refrast.

Yeah, they were taken on the second set of drivers for the 8500 (or maybe the third; the first SmoothVision set, anyway), and in those drivers the mipmap LOD was set a bit blurry in D3D. The differences shown against the refrast were real and visible even without it.

The next drivers had the lod back at normal but everyone was too busy flaming each other to bother retesting. :)
 
Neeyik said:
T2k: Why should Futuremark make a comment? The point I was trying to make was that Digit-Life's report is not 100% conclusive evidence that they are "cheating" (ie. deliberate manipulation of the benchmark and/or drivers to produce a better-than-before 3DMark score)...that is unless they are "cheating" in Digit-Life's own RightMark too!
To paraphrase Digit-Life themselves... "What is the truth? Why don't DL make a comment about this?" How indeed can we "trust" RightMark's results any more than 3DMark's?
What card/config was used for these results?
AFAIK, recently DL have used only the game test from RightMark 0.4, which shows identical results (within 1-2 fps) on all drivers since 44.03 (i.e. ~63 fps at 1024x768 on a P4 3.2).
 
I'm not talking about their (very) beta RightMark Video Analyzer - I'm talking about D3D RightMark 1016; the one where you can run various synthetic tests such as HSR or state changes. Those particular charts were done using a 5900 non-Ultra, XP1800+, 1GB PC2100, WinXP.
 
That's what one of the RightMark developers said to me (I asked what he thinks of such results - and I gave him the link, so he'll probably answer himself):
Filip Gerasimov:
Pixel shaders (on NV3x) become a bit faster with every new Detonator, IMHO (by 2-3%, say). At last FP16/FP32 work correctly. Still, I think they could work 30-50% faster. Until 44.65, writing pixel shaders for NV3x was a nightmare.

In 1016, in the Pixel Shading test, our shaders were suboptimal - there were many possibilities for optimisation. They have now been drastically changed/tuned. I would like to show the new beta only after the 50.x drivers become available.

The new Pixel Shading test could be made available at any moment if there is interest in it.
 
Here you can find the latest Pixel Shading test.

http://www.rightmark3d.org/d3drmsyn/PixelShading.zip

Just replace old files ([D3DRM Path]\Modules\D3D RightMark\Pixel Shading\)

Changes:

- all shaders are now written in pure HLSL,
- new "vector normalization" option,
- fixed some bugs...

PS: Do not compare the new results with results from old versions.

PPS: In any case, you can modify our shaders to check drivers for "cheats".
Edit the "Pixel Shader_HLSL.fx" file and compare the results with those from the original file (from the latest version).
 
3DMark03 uses the best available precision for Pixel Shader 2.0, which is FP32 on NVIDIA and FP24 on ATI, right?

NV30 can only use half of its pipelines (or whatever you call them) when FP32 precision is used. I have only one question: does NV35 have the same problem with FP32 precision? If the answer is yes, then I think you can't optimize PS 2.0 with FP32, because it's a hardware limitation - and we can conclude that NVIDIA's drivers use an FP32/FP16 mix rather than pure FP32, which is what they are supposed to use.
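
For readers not steeped in the precision debate: in D3D9 HLSL the shader author picks the precision, so a driver quietly dropping to FP16 is overriding what the application asked for. A tiny sketch of the difference, using a made-up shader rather than anything from 3DMark: `float` means full precision (FP32 on NV3x, FP24 on R3xx), while `half` emits the `_pp` modifiers that NV3x is allowed to run at FP16 (and, if I recall correctly, the D3DX compiler also has a flag to force partial precision globally).

// Full precision: 'float' compiles without _pp modifiers, so NV3x must use
// FP32 (ATI's R3xx runs everything at FP24 regardless).
const char* psFull = R"(
float4 main(float4 c : COLOR0) : COLOR
{
    float4 r = c * c + c;   // full precision requested
    return r;
}
)";

// Partial precision: 'half' emits _pp modifiers, which NV3x may execute at
// FP16 (faster on that hardware). This is a legitimate, application-chosen
// trade-off; the complaint in this thread is about a driver making that
// substitution uninvited.
const char* psPartial = R"(
float4 main(half4 c : COLOR0) : COLOR
{
    half4 r = c * c + c;    // _pp -> FP16 allowed on NV3x
    return r;
}
)";

int main() { return 0; }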

Also, these two things make me wonder:

- NVIDIA has only released 44.03 as an official driver, even though the latest beta drivers have much better PS 2.0 performance.

- NVIDIA has encrypted the Detonator drivers which have the much faster PS 2.0.
 
Neeyik said:
320 revision
44.03 = 4817
44.10 = 4817
44.24 = 4771
44.65 = 4724
44.67 = 4729
44.71 = 4740
44.90 = 4693

330 revision
44.03 = 4003
44.10 = 3995
44.24 = 3994
44.65 = 4679
44.67 = 4708
44.71 = 4706
44.90 = 4693

So - not a single 330 final score is higher than a 320 one, although the 44.90s are dead-level.

You are misreading your results. Look at them: the 320 ones are far too high, and this is seen when 330 is applied... notice that around 44.65 there is a huge performance jump, followed by more steady figures. Hah! This in itself suggests something is amiss. 3DMark03 tests what 3D cards can do; if nVidia were really improving their drivers, we'd be seeing steady rises, not huge leaps followed by no progress.

Not to mention that 44.65 was released not that long after the 330 build and the joint statement, IIRC. Gosh, I wonder what that driver did. Ho hum.

Frankly, it no longer matters. nVidia have cheated so many times that I don't trust a single benchmarked figure involving them. I don't think the results of the next generation would matter; I'd almost certainly buy ATI.
 