Egg on Nvidia's face: 6800U caught cheating?

Malfunction said:
If ATi is aiding in the distribution of these unconfirmed results without explanations from either party just to sell video cards, then it tells me that the X800Pro and possibly the X800XT are in trouble as far as competition is concerned.

Following this logic, does this mean that nVidia were in trouble when they tipped off websites about Quake/Quack III?


As an aside, I'm interested to see Driver Heaven using 3DMark03 again, after they had so publicly damned and dismissed it not so long ago. I wonder if there was a particular reason for them to suddenly fire it up again on the 6800?
 
Joe DeFuria said:
Oh...that's when the first DX 9.1 Cards appear? ;)
No 9.1 jokes today, ok? I had a twinge in my chest last night that scared the living bejesus out of me and I don't think I should get myself too worked up today.

Hanners said:
Following this logic, does this mean that nVidia were in trouble when they tipped off websites about Quake/Quack III?


As an aside, I'm interested to see Driver Heaven using 3DMark03 again, after they had so publicly damned and dismissed it not so long ago. I wonder if there was a particular reason for them to suddenly fire it up again on the 6800?
No, hold on just a second here! I thought I was the mean one from EB who has an axe to grind with DH...we might have a copyright problem with you going off on 'em! :|
 
digitalwanderer said:
No, hold on just a second here! I thought I was the mean one from EB who has an axe to grind with DH...we might have a copyright problem with you going off on 'em! :|

No axe grindage here. ;) I'm happy to see them going back to using 3DMark03 in all their reviews, I'm just surprised by the complete 180 they've done so soon.
 
digitalwanderer said:
No 9.1 jokes today, ok? I had a twinge in my chest last night that scared the living bejesus out of me and I don't think I should get myself too worked up today.

Apologies, dig! Didn't mean to put your life in jeopardy.

That being said, maybe you should consider a less stressful way to spend your time in general...rather than posting on these boards during a graphics-card dual launch...perhaps consider something like...like maybe taking up roller derby, or how about the Running of the Bulls?

:)
 
Joe DeFuria said:
maybe you should consider a less stressful way to spend your time in general...rather than posting on these boards during a graphics-card dual launch
This IS relaxing compared to my day job! :rolleyes:

On the "DH is using 3dm2k3 again" thingy, they ain't exactly very nice to FM in their piece. I caught a tone of them accusing FM of some funny bidness with their special approval of the 60.72 drivers for the nV40.

Mebbe they only use it when they get to knock it? :|
 
Hanners said:
Malfunction said:
If ATi is aiding in the distribution of these unconfirmed results without explanations from either party just to sell video cards, then it tells me that the X800Pro and possibly the X800XT are in trouble as far as competition is concerned.

Following this logic, does this mean that nVidia were in trouble when they tipped off websites about Quake/Quack III?


As an aside, I'm interested to see Driver Heaven using 3DMark03 again, after they had so publicly damned and dismissed it not so long ago. I wonder if there was a particular reason for them to suddenly fire it up again on the 6800?

We won't be using it in reviews, and we won't be publishing detailed performance results/comparisons using it. It served a useful purpose for this particular article and has some features that made testing easier than AM3 or any other synthetic test. My opinion on 3DMark remains the same...

...just because we don't use it in reviews doesn't mean we don't use it at all...
 
Fair enough. Oh well, I was hopeful there for a second. :cry:

I have to ask in that case, what made you decide to fire up 3DMark03 to find this anomaly in the first place?
 
Hanners said:
I have to ask in that case, what made you decide to fire up 3DMark03 to find this anomaly in the first place?
Good question. If 3dm2k3 ain't on your normal list of utilities to check cards with, why did you choose to use it? :|
 
Every time I get some hardware in, there are a few tests which I always run on the drivers to ensure things are working normally....obviously some differ from product type to product type. In the case of video cards, 3DMark03 is one of the quick tests I do; it's useful to make sure the hardware is working.
 
Joe DeFuria said:
Sigh....no, ATI were NOT definitely cheating wrt the Quake3 debacle.
MOST OF YOU ARE probably familiar by now with the controversy surrounding the current drivers for ATI's Radeon 8500 card. It's become quite clear, thanks to this article at the HardOCP, that ATI is "optimizing" for better performance in Quake III Arena, and, most importantly, for the Quake III timedemo benchmarks that hardware review sites like us use to evaluate 3D cards. Kyle Bennett at the HardOCP found that replacing every instance of "quake" with "quack" in the Quake III executable changed the Radeon 8500's performance in the game substantially.

The folks at 3DCenter.de followed Kyle's trail and discovered that, on the Radeon 8500, "Quack 3" produces much better image quality (texture quality in particular) than Quake III. The FiringSquad observed the same behavior, only they did so in English.

With the publication of these articles, it became a matter of public record that ATI was intentionally sacrificing image quality in Quake III for better benchmark scores. The issue, as far as I was concerned, was settled: ATI was busted.
- http://www.tech-report.com/etc/2001q4/radeon-q3/index.x?pg=1

And you don't call that cheating?!?!?!?!?!?! Lol, it's funny how people are loyal to a video card company that tries to screw them over for money.
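For anyone who never saw it at the time, the "Quack" test the quoted article describes boils down to a same-length string swap in the game executable, so that any name-based detection in the driver stops matching while the program itself still runs. A rough sketch of that kind of patch (the filenames are just placeholders, not the real install paths):

```python
# Make a copy of the Quake III executable with every case variant of
# "quake" swapped for "quack". The replacement is the same length, so
# file offsets are untouched and the game still runs, but a driver that
# keys off the executable's name/strings no longer recognises it.
data = open("quake3.exe", "rb").read()

for old, new in ((b"quake", b"quack"), (b"Quake", b"Quack"), (b"QUAKE", b"QUACK")):
    data = data.replace(old, new)

with open("quack3.exe", "wb") as out:
    out.write(data)
```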
 
Veridian3 said:
Every time I get some hardware in, there are a few tests which I always run on the drivers to ensure things are working normally....obviously some differ from product type to product type. In the case of video cards, 3DMark03 is one of the quick tests I do; it's useful to make sure the hardware is working.

Fair enough. I was just surprised you spotted it without explicitly looking for it, that's all. :)
 
Hanners said:
Fair enough. I was just surprised you spotted it without explicitly looking for it, that's all. :)
Yes, he must have amazing eyesight to have noticed this without explicitly looking for it... ;)
 
Diplo said:
And you don't call that cheating?!?!?!?!?!?! Lol, it's funny how people are loyal to a video card company that tries to screw them over for money.

Were you actually around at the time of this stuff going on? Or are you just pulling up stories? Like perhaps when two weeks later, a patch was issued that fixed the issue without performance loss?

Yeah...that's an impressive cheat there....
 
Bouncing Zabaglione Bros. said:
jolle said:
With all this witch-hunting in that department these days, you could stop and ask yourself: does it matter?

If any chipmaker decides to sacrifice IQ that CAN'T be seen for performance, does it matter to the end user?
The line for IQ, is it drawn at what you can FIND by deep digging, or at what you SEE onscreen when you play..

Not calling any shots on this case, but just in general..
Optimizing is a good thing; when you sacrifice IQ while doing so, it's instead a cheat..
But are you sacrificing IQ when the end user can't see the difference, or is it enough that you can find a difference with these mipmap tools and blowups of screenshots?

I think a missed point is that if Nvidia have specific app detection and are changing to hand-tuned optimisations, it gives the impression that games/benchmarks are faster.

However, what happens when you try to run a game or app that hasn't been hand-tuned by Nvidia programmers? You've seen high scores in 3DMark2003, but then when you run a game, you find the frame rates are a lot lower than you were led to believe. What happens when a developer issues a patch, Nvidia's hand-tuned optimisations get broken, and your performance drops back down?

Such optimisations (even if they are not very noticeable) are in general a bad thing, because you get a misleading benchmark result (which Nvidia can crow about, such as the big deal they made of NV40's 14,000 3DMark score at launch), which causes you to buy a product that then doesn't live up to that level of performance, because Nvidia may not have spent time and money optimising for your particular game.

Yeah, when it comes to synthetic tests it's another animal in one way, and yet they are (I assume) meant to relay hardware performance for games..
If it doesn't match up with gaming results, it's not really interesting anymore. That's why the "application specific" cheats directed at 3DMark and such are especially nasty; they devalue the whole idea of benchmarks..
And it's part of the reason I rely on real game benchmarks over synthetics, even though there has been cheating in popular benchmarked games too..
And there was also some discussion on whether 3DMark03 in general translates well into game performance, but that is a different discussion altogether, and not really related to this one...

I feel like a criminal in a room full of lawyers now, so I'm just going to muse a bit with some thoughts. It's some things I've been rolling around in my head and would like to see others' takes on..

1. How much image quality difference can there be before it goes from optimization to cheating? At what level can it be spotted?
2. If the same settings are applied in both synthetics and games, will it still ruin the "same conditions" synthetics rely on, seeing how they are meant to relay performance in games, or is it only the "application specific" tricks that are cheating? Granted that there isn't any tangible IQ difference (depending on the definition question of how much IQ difference there can be, as posed in 1).
3. Can it generally be said that, if exposed as a choice for the user, a "cheat" will become an optional optimization?
Or rather, is an optimization without an option for the user always a cheat?

And again, this is in general when discussing cheats vs. optimizations, not this NV40 case in particular..
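To make the "application specific" point above a bit more concrete, the mechanism being discussed is usually imagined as nothing fancier than the driver keying off the application's name and swapping in a hand-tuned path for the titles it recognises. This is a toy illustration only; every name below is made up and nothing here comes from any real driver:

```python
# Toy model of application-specific "optimization": recognised titles get a
# hand-tuned path, everything else gets the generic one. This is why an
# inflated benchmark score need not say anything about the game you
# actually play. All names are invented for illustration.
HAND_TUNED = {
    "3dmark03.exe": "shaders hand-tuned for this benchmark",
    "quake3.exe": "render path hand-tuned for this game",
}

def pick_path(process_name: str) -> str:
    # A patched or renamed executable (or a brand-new game) falls through
    # to the generic path and the ordinary frame rate.
    return HAND_TUNED.get(process_name.lower(), "generic path")

print(pick_path("3DMark03.exe"))     # hand-tuned -> the number on the box
print(pick_path("SomeNewGame.exe"))  # generic -> what you actually get
```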
 
Oh great, here comes another "was Quack cheating?" argument.

Come on, it's not like it actually matters anymore regardless of what happened. We aren't talking about the 8500, and we aren't talking about Quake 3, so I really don't see it having much relevance to the topic at hand except as a footnote that it happened.

Let's not get into a pissing match about "but ATI cheats, so it's OK!". Neither company should be doing these kinds of things, and in the current topic we are talking about nVidia and the 6800 Ultra. Go start a new thread about ATI drivers cheating if that's your intention.

Nite_Hawk
 
We test all drivers thoroughly (ATI/NV/XGI - whoever). I always look at mipmaps etc.; I found a similar issue in another application before I went looking in 3DMark. The Volari review I wrote recently followed a similar theme, though it wasn't necessary on that one to delve into mipmaps when there were far more serious issues with that product/software.

The examples used in the review are not the only screenshots/examples for these and other applications. They are, however, two of the easiest for the end user to recreate: Max Payne because it happens early on in the game, and 3DMark because it's free.

We also had to draw the line at how many examples were enough...the ones published were enough IMO, and it's always useful to hold some back.
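For anyone who wants to try recreating this kind of comparison themselves, the low-tech version is simply taking two screenshots of the identical scene on the two driver sets, diffing them, and amplifying the result so subtle filtering/mipmap changes show up. A rough sketch only, assuming Pillow is available; this is not DriverHeaven's actual method and the filenames are made up:

```python
# Diff two screenshots of the same scene captured under two driver versions
# and brighten the difference so small filtering changes become visible.
from PIL import Image, ImageChops

a = Image.open("shot_driver_a.png").convert("RGB")
b = Image.open("shot_driver_b.png").convert("RGB")

diff = ImageChops.difference(a, b)
diff.point(lambda v: min(255, v * 8)).save("diff_amplified.png")

# Largest per-channel difference; 0 means the screenshots are identical.
print("max difference:", max(hi for _, hi in diff.getextrema()))
```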
 
Veridian3 said:
We also had to draw the line at how many examples were enough...the ones published were enough IMO, and it's always useful to hold some back.

I personally think you should have gone to FutureMark, nVidia and even ATI privately with your findings first, given them a little time to respond, and then published.
 