Reply from nVidia about the 3DMark anomaly

From my perspective, I think it is ludicrous to accuse ATI of cheating over the Quake optimization while overlooking nvidia's "optimizations" for the benchmark 3DMark. The reasoning is simple, and one doesn't need a major in Ethics to see who should be accused of "cheating". Quake 3 is used as a benchmark to some extent, but mainly IT IS A GAME.

On the other hand, 3DMark is a BENCHMARK which people use to measure their computer's 3D graphics performance. It isn't a game in any way.

To put it simply, ATI was optimizing for a game; Nvidia was cheating on benchmark software. This isn't rocket science, guys. My question is: why isn't nvidia being eaten alive by sites on the internet? It seems ATI received its scathing reviews during the launch of the Radeon 8500, which nvidia was/is very much scared of, as it was/is a threat to their market share.

Nvidia should receive an even bigger piece of humble pie for cheating on a benchmark people use to determine the performance of their 3D graphics cards, IMO. That isn't real difficult, guys, come on...


Geek
 
ATI didn't optimize Quack for gameplay; they lowered image quality solely to improve benchmark framerates and hoped the user wouldn't notice in-game.

How can you call secretly lowering IQ when the driver detects Quake running (and not Quack) a non-cheat?

Moreover, it's the worst kind of cheat, because IQ in benchmarks is one thing, but I don't play benchmarks, I play GAMES, and this "optimization" actually affected the look of the game when used as A GAME.
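For illustration, here is roughly how executable-name detection works in practice. This is a purely hypothetical sketch, not ATI's actual driver code, but it shows why simply renaming quake3.exe to quack.exe changes the driver's behavior:

[code]
// Hypothetical sketch of name-based app detection (not ATI's real code).
// A driver can ask Windows which executable it has been loaded into and
// special-case its behavior; renaming the exe defeats exactly this check.
#include <windows.h>
#include <string.h>

static int RunningUnderQuake3(void)
{
    char path[MAX_PATH];
    if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
        return 0;

    // Isolate the file name from the full path.
    const char *exe = strrchr(path, '\\');
    exe = exe ? exe + 1 : path;

    // Case-insensitive compare: "quake3.exe" matches, "quack.exe" does not.
    return _stricmp(exe, "quake3.exe") == 0;
}
[/code]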
 
DemoCoder said:
ATI didn't optimize Quack for gameplay; they lowered image quality solely to improve benchmark framerates and hoped the user wouldn't notice in-game.

How can you call secretly lowering IQ when the driver detects Quake running (and not Quack) a non-cheat?

Moreover, it's the worst kind of cheat, because IQ in benchmarks is one thing, but I don't play benchmarks, I play GAMES, and this "optimization" actually affected the look of the game when used as A GAME.

Again, for the 100th time: the Quake reference had been in the drivers for a year prior to the 8500 with no image issues. ONE driver release later (2 weeks after the 8500 launch), image quality was restored, the texture slider in Quake 3 worked again, and frames per second were within 1-2 fps of the supposed CHEAT drivers.

Yes, I can see how you figure this was a specific 8500 optimization... :rolleyes:
 
DemoCoder said:
ATI didn't optimize Quack for gameplay; they lowered image quality solely to improve benchmark framerates and hoped the user wouldn't notice in-game.

How can you call secretly lowering IQ when the driver detects Quake running (and not Quack) a non-cheat?

I know one way to not call it a cheat... replace "cheat" with "bug" and there you go...
ATi stated it was a bug, and since they managed to get the FPS back without lowering IQ, it certainly seems like a bug to me... (new core, new drivers = unforeseen things... and that happens to ATi just like any other company...)
This thread isn't about bashing nVidia, btw... it's asking the question "why don't we see this 3DMark issue all over the online sites like we did with ATi's Quack issue?"...
Quack was blown out of proportion, while this issue is being kept too quiet; some true reporting without screaming is what should've been done on both issues...
The public should be informed of the facts, and the verdict should come when you have the facts on how things look after the company has done something about it... (which clearly hasn't happened with the Quack issue, as we see in this thread, since it's still claimed to be a cheat...)
 
DemoCoder said:
ATI didn't optimize Quack for gameplay; they lowered image quality solely to improve benchmark framerates and hoped the user wouldn't notice in-game.

How can you call secretly lowering IQ when the driver detects Quake running (and not Quack) a non-cheat?

Moreover, it's the worst kind of cheat, because IQ in benchmarks is one thing, but I don't play benchmarks, I play GAMES, and this "optimization" actually affected the look of the game when used as A GAME.

How can it be a CHEAT if it was fixed without a speed loss?
Didn't do much cheating then, huh?
Looks more like all it did was detect Quake 3 (so what) and have a bug that was fixed.
 
DemoCoder said:
ATI didn't optimize Quack for gameplay; they lowered image quality solely to improve benchmark framerates and hoped the user wouldn't notice in-game.

How can you call secretly lowering IQ when the driver detects Quake running (and not Quack) a non-cheat?

Moreover, it's the worst kind of cheat, because IQ in benchmarks is one thing, but I don't play benchmarks, I play GAMES, and this "optimization" actually affected the look of the game when used as A GAME.

Remarkable. How can you maintain any sort of integrity in your opinion given your obvious bias for nvidia? Granted, your graphics understanding is likely significantly greater than my own, but you're in a place full of your peers, or people even more educated than yourself. Even I can see that you are biased, and I am only a novice here. It is so apparent.
 
I don't get too hung up on whether occurrences like that should be called a cheat or a bug. What puzzles me is why, on more than one occasion, issues like that can get completely overlooked by reviewers.

Why do I have the feeling that the wide majority out there just installs cards, runs timedemos, creates a couple of multicoloured happy performance bars, and calls it a day, while apparently hardly ever gaming with the cards tested? Either that, or they lack the very basic knowledge needed to see the differences.

Thank God there are exceptions to rely on.
 
The distinction between 'optimizing' and 'cheating' can be subtle, but there is a difference. In this case, optimizing means trying to get the most out of your hardware in a certain situation for the benefit of the user.

This is not cheating, although cheating will usually involve some sort of optimization as well. Where cheating truly differs from optimizing is in the ethical perspective (although ethics are somewhat subjective...). That is, if optimizing for Q3 produces higher FPS and no IQ degradation, then it benefits the person who plays that game and should not be considered cheating.

Consider that Carmack has indicated that D3 will have different rendering codepaths for the GF2, GF3/4, and R8500, with more codepaths possibly being developed for the VP10 and Parhelia. This is clearly optimizing for best performance (otherwise your conclusion would be that Carmack is a 'cheater', which is clearly ludicrous). The ethics of this case are clear-cut, as it is for the benefit of the gamer.
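To make the idea concrete, here is a hypothetical sketch of that kind of codepath selection. The OpenGL extension names are real extensions of the era, but the structure is illustrative only, not Carmack's actual code:

[code]
// Hypothetical per-hardware renderer selection (illustrative, not D3's code).
// The idea: probe what the card supports and pick the fastest path,
// falling back to a generic path that runs everywhere.
#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_GENERIC, PATH_GF2, PATH_GF3_4, PATH_R8500 };

static bool HasExtension(const char *name)
{
    const char *exts =
        reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return exts && std::strstr(exts, name) != nullptr;
}

static RenderPath ChooseRenderPath()
{
    if (HasExtension("GL_ATI_fragment_shader"))    // Radeon 8500 class
        return PATH_R8500;
    if (HasExtension("GL_NV_texture_shader"))      // GeForce 3/4 class
        return PATH_GF3_4;
    if (HasExtension("GL_NV_register_combiners"))  // GeForce 2 class
        return PATH_GF2;
    return PATH_GENERIC;                           // slower, but universal
}
[/code]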

The same goes for any ATI or NV optimizations for Q3, as such optimizations will clearly benefit those who play that game (excepting bugs, such as the ATI Quake/Quack texture slider bug).

All indications are that NV is optimizing for 3DMark. Ethics are a bit murkier here, since no one actually plays 3DMark. The only result is a higher 3DMark score. Therefore, the question becomes whether or not a higher 3DMark score can be considered a benefit to the user. If not, then it's a cheat.

My opinion is that there is no benefit to the user; therefore it's a cheat, not an optimization, whether or not the scores stabilize in the next drivers.

Just my two cents.
 
As for the claim that Q3 is a benchmark and not a game, just take a look at the stats at gamespy.com. At this moment, they register Q3 as the 3rd most popular online game as ranked by # of players.

Let's see if 3DMark ever makes the chart. :rolleyes:
 
tamattack said:
All indications are that NV is optimizing for 3DMark.

Which indications would those be? The only information to come out of MadOnion or NVIDIA has been that it is a bug in the driver. The only fact regarding the situation is that the scores differ depending on whether the title screens are enabled or disabled. There is still no proof that it is a bug or an optimization, just proof that there is a difference. Even the amount of the difference has varied greatly among the people who report having witnessed it.
 
There is still no proof that it is a bug or an optimization, just proof that there is a difference. Even the amount of the difference has varied greatly among the people who report having witnessed it.

And yet there was no concrete proof that ATI intended for the image quality to change in order for their FPS to increase. Please provide absolute facts, or source code, that show this is what they intended. All the test of renaming the Q3 executable proves is that the driver had a different code path for it. That's all it proves. Once again, we have the same situation when you look at ALL the facts in both cases. And yet one is labeled a cheat and the other gets by as a bug :rolleyes:

Truth is, we will never really know...
 
I don't really understand what some people expect to see now. Especially those who consider labelling the Quack thing a cheat to have been wrong.

Do they want some sites to repeat their mistake by labelling this case a cheat too, just to have equal treatment of both ATI and NVidia?
Or do they expect them to learn from their previous mistakes and not call it a cheat as long as there is no hard evidence?

And AFAICS there is absolutely NO hard evidence.

Of course they should investigate. But what else can they do besides looking for image quality differences (which no one reported in this case) and superficially checking the drivers?
 
jb said:
There is still no proof that it is a bug or an optimization, just proof that there is a difference. Even the amount of the difference has varied greatly among the people who report having witnessed it.

And yet there was no concrete proof that ATI intended for the image quality to change in order for their FPS to increase. Please provide absolute facts, or source code, that show this is what they intended. All the test of renaming the Q3 executable proves is that the driver had a different code path for it. That's all it proves. Once again, we have the same situation when you look at ALL the facts in both cases. And yet one is labeled a cheat and the other gets by as a bug :rolleyes:

Truth is, we will never really know...

I didn't label the ATI issue a cheat. In fact, I don't even recall mentioning it. If you're upset about what some websites did months ago, go bitch to them about it. Someone once said two wrongs don't make a right, yet I see many people here who want NVIDIA to get the same kind of undeserved bad press they feel ATI got. That's not fairness, that's spite. You complain when someone blows something out of proportion, then when they don't do it a second time, you complain that they're giving preferential treatment. Make up your mind.

And if you want to talk about double standards, why not look at your own comments? Apparently nobody thinks it's cheating to optimize 3D rendering drivers if it doesn't reduce image quality. I don't recall anyone complaining that 3DMark2001SE looked worse with the title screens enabled than it did without them. Oh, but it's a benchmark, and nobody plays it, so it doesn't deserve to be optimized for, right? Who cares if the same engine was used in one of the most popular games of 2001.

This has nothing to do with ethics, and nobody outside of NVIDIA will ever know the real cause of the discrepancy in the scores. All this is, is a bunch of disgruntled ATI fans whining that their favorite company got the shaft, and the evil superpower isn't getting shafted worse when there's less (read: zero) evidence they even should be. At least someone found a passing reference to Quake 3 in the ATI drivers to give *some* basis for an accusation, right or wrong.
 
Crusher,

This has nothing to do with ethics, and nobody outside of NVIDIA will ever know the real cause of the discrepancy in the scores. All this is, is a bunch of disgruntled ATI fans whining that their favorite company got the shaft, and the evil superpower isn't getting shafted worse when there's less (read: zero) evidence they even should be. At least someone found a passing reference to Quake 3 in the ATI drivers to give *some* basis for an accusation, right or wrong.

My point is that there is a huge double standard out there that seems to favor nVidia at every turn. Also, these review sites did a half-a$$ed job and did not bother to tell people that those same Q3 references had been in the drivers for many months, and yet no issues or "cheats" were ever found on the original Radeons running those drivers. Fan sites are fine, as I would expect that from them. Sorry, but I think professional review sites should be fair and honest to everyone. Today they are not.

Who cares if the same engine was used in one of the most popular games of 2001.

Yeah, and we both know that 3DMark was 100% accurate at predicting Max Payne performance. One only needs to look at how well the K2 did in Max Payne, and yet how low it scores in 3DMark, to see that is not the case.
 
Who said 3DMark would predict the performance of Max Payne? The scores from the lobby portion might be somewhat indicative of a worst-case scenario in Max Payne (I don't know, I've never actually played it), but the purpose of 3DMark2001 is not to tell people how well their card will run a particular game. It's to compare the performance and features of different video cards. I certainly hope you aren't going to argue that the Kyro 2 should be getting higher 3DMarks when it can't run half the tests. The only area where you could possibly have a gripe is the fillrate tests, due to the lower fillrate requirements of the Kyro 2, and I think it's pretty safe to say there's no accurate and fair way to compare the fillrate of the Kyro 2 to other video cards, so you're pretty much stuck with it getting the shaft in that test. Besides, the overdraw in the Dragothic test should make up for it.

Anyway, my point was that IF nvidia is optimizing for 3DMark, then it's also possible those same optimizations are being used in Max Payne, thus improving the performance of an actual game, not just a benchmark. But it's all pointless anyway, because as I said, we have no clue as to what is actually causing the difference in the scores. Any respectable news site should refrain from posting conjecture as to the cause, since there is no evidence to support any argument. If by doing this you claim they're biased or holding double standards, I say you're placing the blame on the wrong people at the wrong time. You can blame them for the ATI issue all you want, but you can't honestly think there's anything wrong with not posting rumors and guesses about this issue.

And let's not forget that the Inquirer did jump all over this, quickly accused NVIDIA of cheating, and claimed that NVIDIA sent them special drivers to fix the issue. If you're the kind of person who would flame some news sites over the ATI issue but let the Inquirer's bullshit stories go without criticism, then I say you are a hypocrite, and just as biased as any news site I've seen.
 
Xmas said:
I don't really understand what some people expect to see now. Especially those who consider labelling the Quack thing a cheat to have been wrong.

Do they want some sites to repeat their mistake by labelling this case a cheat too, just to have equal treatment of both ATI and NVidia?
Or do they expect them to learn from their previous mistakes and not call it a cheat as long as there is no hard evidence?

And AFAICS there is absolutely NO hard evidence.

Of course they should investigate. But what else can they do besides looking for image quality differences (which no one reported in this case) and superficially checking the drivers?

Speaking for myself:

I observe that in the case of the Quack phenomenon, image comparison shots were provided, and articles were posted.

I observe that in the case of the 3DMark phenomenon, the same sites that posted the prior articles did not post a similar front-page article, even if only to discuss how the image quality was NOT compromised.

To me, this is a clear indication of bias. As to whether this is vendor bias or a bias towards sensationalism and scandal... you may say that sensationalism and scandal get you more hits, and that is true, but so would an article clearing nVidia of image-quality concerns in this issue.

Therefore, I don't see a motivation to ignore this that is based on objectivity, since even clearing nVidia would generate hits and popularity, and would be profitable, even if the "objectivity" in this case were profit-seeking.

It does fit, however, with shying away from associating a favored vendor with a possible case of cheating. This, to me, undermines the credibility of the investigative reporting in the one case, and makes it look like ingrained bias is causing the difference in behavior between the two issues.

Did I miss or forget a thorough article comparing image quality rendering at a site that covered the Quack issue in detail?
 
Who said 3DMark would predict the performance of Max Payne? The scores from the lobby portion might be somewhat indicative of a worst-case scenario in Max Payne (I don't know, I've never actually played it), but the purpose of 3DMark2001 is not to tell people how well their card will run a particular game. It's to compare the performance and features of different video cards. I certainly hope you aren't going to argue that the Kyro 2 should be getting higher 3DMarks when it can't run half the tests.

Once again, I was talking about how well a K2 does vs. a GF2 or MX in Max Payne, yet it loses by 1000 or 2000 points to them in 3DMark. Funny how those "optimizations" did such a good job in 3DMark, yet the K2 beats the GF2-class cards in Max Payne, where those same "optimizations" are used. Does not make much sense to me. Also, FYI, the K2 runs all but the shader tests, the same as the GF2-class cards, yet the GF2 usually scores 1000+ higher. 3DMark is useless for comparing two cards. It's only useful for tweaking your system.

Also, it seems fine to you to have websites report major issues such as "cheating" while either not reporting all of the facts or carefully omitting key points, yet not take the same amount of time to investigate when other vendors do something similar?

And I don't really listen to the Inqwell, as they are usually not right about anything.
 
Xmas said:
Especially those who consider labelling the Quack thing a cheat to have been wrong.

It was wrong to call it a cheat, but what was worse was how it was presented...

Do they want some sites to repeat their mistake by labelling this case a cheat too, just to have equal treatment of both ATI and NVidia?
Or do they expect them to learn from their previous mistakes and not call it a cheat as long as there is no hard evidence?

I personally hope they learn from it, yes... this doesn't mean that they should keep quiet about an issue, IMHO, it means reporting on it... (like they should have with the Quack issue)

Of course they should investigate. But what else can they do besides looking for image quality differences (which no one reported in this case) and superficially checking the drivers?

Hmmm... and how did they ever happen to find the Quack issue? Before that program "mysteriously" appeared, I saw no complaints about the IQ in Q3A... (as far as I remember, they actually claimed it was very good... if memory serves right, they even said it was better than nVidia's...)
Then "someone" made a nifty little prog that showed the difference, and all of a sudden the IQ sucked, ATi sucked, and so on...
Had they done proper reporting, stating "look at this, the IQ decreases when you use Quake.exe" with pics and all, and then waited to call it a bug or a cheat until they had clear evidence of either... then I'm quite sure you wouldn't see anyone in here complaining about how it was handled...
That's what I want to see now; if they have learned the lesson, they should be reporting on this in a calm and clear fashion...
 
Does not make much sense to me.

That has become painfully apparent. I'll try a little harder to explain to you the difference between a benchmark and a game.

3DMark is a benchmark. It tests the FEATURES of a card, and its maximum capabilities when using those features.

Max Payne is a game. It is not going to use 1024x1024x32 textures with 100,000 polygons, 8 hardware lights, and multiple vertex shaders all the time, because the developers don't want people to have to buy the latest video card just to be able to play it. It limits its use of a video card's features in order to be playable on cards like the Kyro II. If Max Payne were identical to 3DMark2001, you couldn't even run it on a Kyro II, because the Kyro II doesn't have pixel shading capabilities.
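As a purely hypothetical sketch of the kind of capability check a game of that era could do (using the real Direct3D 8 caps mechanism, but not Max Payne's actual code):

[code]
// Hypothetical feature-level check via Direct3D 8 caps (not Max Payne's code).
// A Kyro II reports no pixel shader support, so a game would take the
// fixed-function fallback instead of refusing to run.
#include <d3d8.h>

static bool g_usePixelShaderEffects = false;  // illustrative global toggle

void SelectEffectLevel(IDirect3DDevice8 *device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    // ps.1.1 or better: GeForce 3/4 and Radeon 8500 pass, Kyro II does not.
    g_usePixelShaderEffects =
        caps.PixelShaderVersion >= D3DPS_VERSION(1, 1);
}
[/code]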

Benchmarks measure what a card is capable of. Games try to make every card capable of running them. Just because two cars can cruise down the highway at 60 MPH without a problem doesn't mean they should be rated at the same speed on a race track.

Now, if you're quite finished arguing about your 2-year-old gripe with the Kyro II's 3DMark score, perhaps you can focus on the actual topic at hand, which is the fact that you and numerous other people here want to give NVIDIA the exact same treatment that you bitched and moaned to high heaven about when it happened to ATI. And again I say: two wrongs do not make a right; it's hard to sound virtuous when you're screaming for an eye for an eye; and so on and so forth. There's no reason for people to be printing stories about NVIDIA's drivers in this case, because there's no proof. Find some proof, and I'm sure they'll be glad to post some news about it.
 