Egg on Nvidia's face: 6800U caught cheating?

DemoCoder said:
FUDie said:
Sure it's a bug. Yet the same LOD bug is shown in all screenshots. Maybe there's some weird bug with the drivers causing that one quad to be not transparent, but that doesn't explain the LOD shift elsewhere.
-FUDie
There is no LOD shift elsewhere in that Max Payne shot. Only the quad is significantly different. Should I say BFD like you said to the issue of differing frames? You seem perfectly willing to dismiss the different pipeline state and geometry as BFD, but when every other LOD level is basically identical except for a single anomalous quad, we're supposed to see it as evidence of cheating.
How would the state of the character affect the state of anything else in the scene? Completely ridiculous. A developer should know better.

And if you can't see a problem in the Max Payne shot, then that's even more damning for NVIDIA as that implies the LOD shift only applies to 3D Mark 2003. Oops.

But anyway, maybe you should adjust your gamma. I can see a small LOD shift in the walls on Max Payne.

-FUDie
 
P.S. On my flat panel, I can see a lens flare around the second light in the ATI shot. That is probably where the green quad is coming from on the NVIDIA shot. The ref rast is missing all the lens flares.

-FUDie
 
Some clarification regarding cheating with AF and AA.

1. There are no AA or AF tests.
2. AA can be done by simply blurring; it removes the aliasing. It's a crude method, but it works (see the rough sketch after this list).
3. You cannot set AA or AF in the standard 3.1.3 version of 3DMark2003, so by forcing AA or AF in the control panel you are rendering it incorrectly.
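A minimal, self-contained sketch of point 2 (my own illustration in Python, not code from any driver): a post-blur merely smears the jaggies, while real supersampling resolves edge pixels in proportion to actual coverage.

[code]
import numpy as np

W = H = 64

def coverage(x, y):
    # 1.0 inside a tilted half-plane, 0.0 outside (a hard diagonal edge).
    return 1.0 if y > 0.37 * x + 10 else 0.0

# 1) Aliased render: one sample per pixel -> stair-stepped edge.
aliased = np.array([[coverage(x + 0.5, y + 0.5) for x in range(W)]
                    for y in range(H)])

# 2) Real AA (4x4 supersampling): average many samples per pixel, so edge
#    pixels end up proportional to how much of the pixel the edge covers.
ss = 4
supersampled = np.zeros((H, W))
for y in range(H):
    for x in range(W):
        samples = [coverage(x + (i + 0.5) / ss, y + (j + 0.5) / ss)
                   for i in range(ss) for j in range(ss)]
        supersampled[y, x] = sum(samples) / len(samples)

# 3) The crude method: blur the already-aliased image with a 3x3 box filter.
#    The stair-steps get smeared, but in a real frame every texture and
#    detail gets smeared along with them.
blurred = aliased.copy()
for y in range(1, H - 1):
    for x in range(1, W - 1):
        blurred[y, x] = aliased[y - 1:y + 2, x - 1:x + 2].mean()

print("aliased edge row:     ", aliased[32, 56:64])
print("supersampled edge row:", supersampled[32, 56:64])
print("blurred edge row:     ", blurred[32, 56:64])
[/code]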
 
Isn't the whole issue getting what you paid for, and being in control of features and settings and expecting them to do what they are supposed to? If you dial up triple buffering in the drivers and it gives you double buffering, then you are being lied to, are you not? I don't like the questionable choices nVidia keeps making when it comes to drivers, but it is up to me to decide if I like the image quality or not. If I don't like it, too bad for nVidia. BUT if I see features in drivers and want to use them, they should do as I ask.

I hate to draw a parallel with other consumer hardware, but if you purchase a product that has a knob on it that says triple, double, single, and off, yet the triple setting does exactly the same thing as the double setting, you are being ripped off. Funny thing is, I have a blow dryer that does this: the highest setting is exactly the same as the other two. I know because I repaired the thing and found out. Maybe nVidia was inspired by a blow dryer in more ways than one.
 
FUDie said:
How would the state of the character affect the state of anything else in the scene? Completely ridiculous. A developer should know better.

How do you know that the state of the character is the *only* thing that's different? There could also be small differences in vertex positions or attributes. You don't know.

And if you can't see a problem in the Max Payne shot, then that's even more damning for NVIDIA as that implies the LOD shift only applies to 3D Mark 2003. Oops.

Why would it be more damning? Different API state will execute different paths in the driver, and therefore trip up different bugs.

But anyway, maybe you should adjust your gamma. I can see a small LOD shift in the walls on Max Payne.

Yes, I can see a small shift of a few pixels, but it's well within allowable limits. Absolute equality with the refrast isn't required. The refrast image is even missing the glow around the ceiling lights! Are we to believe that a small shift in the LOD (look at the proxycon images, for gawd sakes) is the work of some uber cheat? You're grasping at straws, FUDie.
 
DemoCoder said:
FUDie said:
How would the state of the character affect the state of anything else in the scene? Completely ridiculous. A developer should know better.
How do you know that the state of the character is the *only* thing that's different? There could also be small differences in vertex positions or attributes. You don't know.
Because I can see lens flares on the ATI shot that correspond to the mess on the NVIDIA shot.
And if you can't see a problem in the Max Payne shot, then that's even more damning for NVIDIA as that implies the LOD shift only applies to 3D Mark 2003. Oops.
Why would it be more damning? Different API state will execute different paths in the driver, and therefore trip up different bugs.
Very convenient bug: One that only modifies LOD. It's amazing how all of NVIDIA's bugs seem to boost performance. Clip planes, compressed textures, LOD shift...
But anyway, maybe you should adjust your gamma. I can see a small LOD shift in the walls on Max Payne.
Yes, I can see a small shift of a few pixels, but it's well within allowable limits. Absolute equality with the refrast isn't required. The refrast image is even missing the glow around the ceiling lights! Are we to believe that a small shift in the LOD (look at the proxycon images, for gawd sakes) is the work of some uber cheat? You're grasping at straws, FUDie.
I'm grasping at straws? That's funny. Maybe you should take a look in the mirror sometime.

-FUDie
 
Good morning all...

...to answer a few of the questions.

All rasterizer images were taken on both cards (3DMark and Max Payne) and the outputs compared to ensure they were exactly the same (which they were). Then one image was chosen from the two to compare to the cards.
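For what it's worth, the kind of bit-exact comparison Stu describes takes only a few lines; here's a hypothetical sketch (the filenames are made up; Pillow and numpy assumed):

[code]
import numpy as np
from PIL import Image

# Hypothetical dumps of the reference rasterizer output captured on each card.
a = np.asarray(Image.open("refrast_dump_card1.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("refrast_dump_card2.png").convert("RGB"), dtype=np.int16)

if a.shape != b.shape:
    print("Different dimensions - not comparable")
else:
    diff = np.abs(a - b)
    if diff.max() == 0:
        print("Bit-identical: safe to use either dump as the baseline")
    else:
        bad = int((diff.sum(axis=2) > 0).sum())
        print(f"{bad} differing pixels, max per-channel delta {int(diff.max())}")
[/code]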

The Max Payne square occurs in several long corridors; we used that one as an example as it's near the beginning of the game, so you can get there and try your own tests easily (if you have access to the card). This green square is not specific to that scene.

I'll try to pop back in later and respond to more enquiries, however I am busy with other things today, so priority goes to getting a response from NVIDIA/FM and getting it up on the site.

Stu
 
FUDie, here's a little hint: highest mip level != highest quality.

Moreover, you're trying to call a 5-10 pixel shift a performance cheat? What do you call angle-dependent AF then? What do you call ATI's 1 pixel shift vs refrast on the trollcave image? Hey, they're cheating. Ding! Ding! Ding! Call the police, they're switching LOD 1 pixel sooner than they should have vs refrast!

Did you actually compare the IQ of the resulting images? Did you ever consider refrast doesn't offer the highest quality?

I hope NVidia releases a driver that allows you to switch between the two LOD algorithms so you can see there isn't a big performance difference. I'll expect a retraction from you then.
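For anyone who wants to put an actual number on "the transition moved N pixels", here's a rough sketch of one way to measure it from colored-mipmap screenshots (the filenames and the scanline are made up; Pillow and numpy assumed):

[code]
import numpy as np
from PIL import Image

def transition_columns(path, row):
    # Columns along one scanline where the mip-band color changes sharply.
    line = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)[row]
    jumps = np.abs(np.diff(line, axis=0)).sum(axis=1)
    return np.flatnonzero(jumps > 60)

# Hypothetical screenshots taken with colored mipmaps enabled.
ref = transition_columns("refrast_mipcolors.png", row=300)
hw  = transition_columns("card_mipcolors.png", row=300)

for i, (r, h) in enumerate(zip(ref, hw)):
    print(f"boundary {i}: refrast x={int(r)}, card x={int(h)}, shift {int(h - r)} px")
[/code]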
 
GT1 seems a strange test to cheat in, as it already has the highest frame rates and its frame rate has some CPU influence.

Nvidia drivers have had worse bugs before, e.g. 42.68, which didn't even draw some details (bullets, flames, explosions) and which got corrected in later drivers.

The first statement needed is from Futuremark. Nvidia will say "bug" whether it is a bug or not; then the real test is what drivers the card is released with and how those render.

This is probably all a storm in a teacup.
 
Before we begin it should be stated that these tests were performed using the Forceware 60.72 drivers, the version provided by Nvidia to all press reviewing their new card.
My first thought was: so have you done the same tests using other drivers, e.g. the 56.72, for comparison? Are there any noticeable differences in performance or IQ between versions?

My other thought was: are people putting ATI drivers under the same level of scrutiny that Nvidia ones are? It seems (and Nvidia only has itself to blame for this) that all Nvidia drivers are put under the microscope. Finding a few frames in a benchmark that don't quite match a reference image, in ways that wouldn't be spotted with the human eye, doesn't necessarily equate to cheating. Is it improbable that, if you went through enough games and benchmarks, you would find instances where reference images don't match up with those rendered by ATI drivers? Would that prove ATI are cheating?
 
jolle said:
With all this witch-hunting in that department these days, you could stop and ask yourself: does it matter?

If any chipmaker decides to sacrifice IQ that CAN'T be seen for performance, does it matter to the end user?
Where is the line for IQ drawn: at what you can FIND by deep digging, or at what you SEE on screen when you play?

Not calling any shots on this case, just speaking in general.
Optimizing is a good thing; when you sacrifice IQ while doing so, it's instead a cheat.
But are you sacrificing IQ when the end user can't see the difference, or is it enough that you can find a difference with these mipmap tools and blowups of screenshots?

I think a missed point is that if Nvidia has app-specific detection and is switching to hand-tuned optimisations, it gives the impression that games/benchmarks are faster than they really are.

However, what happens when you try to run a game or app that hasn't been hand-tuned by Nvidia programmers? You've seen high scores in 3DMark2003, but then when you run a game, you find the frame rates are a lot lower than you were led to believe. What happens when a developer issues a patch, Nvidia's hand-tuned optimisations get broken, and your performance drops back down?

Such optimisations (even if they are not very noticeable) are in general a bad thing, because you get a misleading benchmark result (which Nvidia can crow about, such as the big deal they made at the NV40 launch about its 14,000 3DMark score), which then causes you to buy a product that doesn't live up to that level of performance, because Nvidia may not have spent time and money optimising for your particular game.
 
This is all crap, IMO. LOD determination has changed on NV40 vs previous NVIDIA chips. Everything up to and including NV38, all ATI chips, and the refrast were in perfect agreement about how this needs to be done for isotropic filtering. It's now slightly different, that's all.

Imagery:
NV40
NV38
R360

Source of all three
Ignore the brilinear stuff on NV38 and ignore the lower precision on R360. Just look at the shape. It's obvious that there are differences. If you do the colored mipmap thing with bilinear filtering, slight differences in mipmap selection will be more pronounced, obviously.
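For illustration (a sketch of my own; this is not NV40's actual algorithm), two perfectly reasonable isotropic LOD formulas can select a different mipmap right around the transition points, which is exactly what colored mipmaps plus bilinear filtering make visible:

[code]
import math

# Refrast-style isotropic LOD: log2 of the longer texel-space derivative vector.
def lod_exact(dudx, dvdx, dudy, dvdy):
    return math.log2(max(math.hypot(dudx, dvdx), math.hypot(dudy, dvdy)))

# A cheaper (hypothetical) approximation: just the largest absolute component.
def lod_approx(dudx, dvdx, dudy, dvdy):
    return math.log2(max(abs(dudx), abs(dvdx), abs(dudy), abs(dvdy)))

def mip(lod):
    # With plain bilinear, only one mip level is actually sampled.
    return max(0, math.floor(lod))

# Walk a receding floor: the texel footprint grows with distance.
for step in range(1, 24):
    scale = step * 0.35
    d = (scale, 0.4 * scale, 0.0, scale)   # made-up derivative pattern
    e, a = lod_exact(*d), lod_approx(*d)
    if mip(e) != mip(a):
        print(f"step {step:2d}: exact lod {e:.2f} -> mip {mip(e)}, "
              f"approx lod {a:.2f} -> mip {mip(a)}")
[/code]

The two formulas agree almost everywhere; they only disagree in a narrow band around each transition, which is why the mip boundaries shift by a handful of pixels rather than the whole image changing.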
I personally don't give a damn. This is no cheat.
 
Bouncing Zabaglione Bros. said:
(which Nvidia can crow about, such as the big deal they made on launch of NV40's 14,000 3DMark score)
A little OT: I'm beginning to wonder if the people talking about this particular score keep in mind how NV obtained it. I would say no :rolleyes:.

All I can say is: go back to the presentation and listen to what the speaker says.

As for this particular thread, I would say: go back to your homework.
 
Ho-hum.

Oh hey, on a much more important note: anyone know when the hell we can buy an NV40 card? Coz Albatron sure as hell told me it's not gonna be a mass-production product, at least not until end-May.
 
Evildeus said:
Just to make some more drama :devilish: It seems that ATI is directing websites to this article (for what purpose? :rolleyes:)
In an email received a few minutes ago, an official ATI representative pointed us to an article published on the DriverHeaven site implying that NVIDIA is cheating in its drivers.
http://www.clubic.com/n/n12418.html 8)

I'd be extremely interested in seeing the content of the e-mail, but even without seeing it, my initial reaction is surprise and some disappointment. As far as I can tell, nobody has come up with an explanation for why the "errors" are occurring that is accepted by everybody.

It just shouldn't happen! :(
 