Reply from nVidia about the 3DMark anomaly

Why is an obvious cheat by ATI that reduced visual quality treated as less important than a possible one by nVidia that does not?

Are you forgetting the Detonator bug back when the GF3 came out that caused horrible banding in D3D? Back then you could have said the same thing: lowering the IQ (highly visible banding in fog, as well as other artifacts) to get a better 3DMark score. Of course, it was fixed with a driver patch with no loss to the scores. Hmm, throw in the fact that it was detecting splash screens and it sounds a lot like what happened with Q3/ATI.
 
Joe DeFuria said:
2) nVidia "detectes" 3DMark using the splash screen, and performs some type of specific optimization.

IMO, the two situations are pretty much identical (other than the method of detection), and should be treated identically.

Well, considering nobody has ever proven that the drivers are consciously looking for the splash screen, that any extra work is being done, that the score is supposed to be the lower of the two, or that it is in fact an optimization and not a bug, I would have to say the two situations are not identical at all. And no, they have not released drivers yet that fix this problem. Their latest driver set was just approved and released during the time this news was spreading around the net, so I assume it will take them a few weeks to put out another update.
 
Joe DeFuria said:
How is what ATI did an obvious cheat again?

ATI's drivers detected Quake3.

2) nVidia "detectes" 3DMark using the splash screen, and performs some type of specific optimization.

We don't know this for certain. But I will admit that it is a very good possibility (in the range of 70%, in my mind). I think the most obvious explanation is that the drivers detect a splash screen (a large texture displayed in a certain way for a certain length of time), and then go on to flush all the caches and whatnot on the video card.

There is another possibility, of course: that the splash screens simply use a somewhat different rendering method than the games do, which causes the GeForces to automatically flush their caches.
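Just to illustrate the kind of heuristic I mean, here is a purely hypothetical sketch. Nobody outside nVidia knows what, if anything, the drivers actually look for, so every name and threshold below is made up.

Code:
/* Purely speculative: if one large texture dominates the frame for many
 * consecutive frames with almost nothing else being drawn, treat it as a
 * "splash screen" and flush cached resources.  None of these names or
 * numbers correspond to anything known to exist in any real driver. */
#include <stdbool.h>

#define SPLASH_MIN_TEXELS (512 * 512)   /* "a large texture"          */
#define SPLASH_MIN_FRAMES 60            /* "a certain length of time" */

static int consecutive_splash_frames = 0;

/* Called once per presented frame by the hypothetical driver. */
bool looks_like_splash_screen(int dominant_tex_w, int dominant_tex_h,
                              int draw_calls_this_frame)
{
    bool big_static_frame =
        (dominant_tex_w * dominant_tex_h >= SPLASH_MIN_TEXELS) &&
        (draw_calls_this_frame <= 2);   /* little else on screen */

    consecutive_splash_frames = big_static_frame
                              ? consecutive_splash_frames + 1
                              : 0;

    return consecutive_splash_frames >= SPLASH_MIN_FRAMES;
}

/* Hypothetical usage: if (looks_like_splash_screen(w, h, n)) flush_caches(); */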

In the end, I see three possibilities:

1. Issue fixed, but performance stays at lower level, or never quite reaches current level: Obvious cheat.

2. Issue fixed, performance is almost identical to current: Probable bug, possible cheat.

3. Issue fixed, performance higher: Obvious bug.

I say this because we have no way of knowing for absolute certain that nVidia is detecting 3DMark2k1, or any specific game benchmarks. With ATI's cheat, there was no doubt that the drivers were detecting Quake3.
 
Why is detecting Quake 3 a cheat??? What if a video card company saw a way to make it work better with their hardware? If application detection is 'cheating', then SYSmark is ALSO CHEATING.

http://www.anandtech.com/cpu/showdoc.html?i=1543&p=5

All SYSmark does is look for an Intel CPU, not for the SSE flag. Application detection is done by lots of applications so they can be optimized for particular hardware. As long as the optimization doesn't lower IQ, and it didn't for a YEAR on the original Radeon, then what is the issue here... making the game run/look better is not a crime.
 
Well, technically, BAPCO followed the Intel white papers on using SSE to the letter. The Intel white papers say specifically to look for an Intel CPU.

I'd say this was, therefore, sort of half-cheating on Intel's part. From Intel's perspective, it would be possible for another manufacturer to use that same SSE bit for something else entirely, and it would also be possible for another manufacturer to claim support for SSE when it wasn't really there. So, this is explainable, and not entirely despicable.
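To make the distinction concrete, here is roughly the difference between the two checks; a rough sketch using GCC's <cpuid.h>, not the actual code BAPCO or Intel shipped.

Code:
#include <cpuid.h>
#include <stdbool.h>
#include <string.h>

/* "Intel white paper" style: only enable the SSE path if the vendor
 * string says GenuineIntel, regardless of what the CPU can actually do. */
static bool vendor_is_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return false;
    memcpy(vendor + 0, &ebx, 4);   /* vendor string lives in EBX, EDX, ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

/* Vendor-neutral style: enable the SSE path if the CPU reports the SSE
 * feature bit (CPUID leaf 1, EDX bit 25), whoever made it. */
static bool cpu_has_sse(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return false;
    return (edx & (1u << 25)) != 0;
}

An Athlon XP reports the SSE bit but fails the vendor check, which is exactly the complaint about how SYSmark gated its SSE code.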

Detecting Quake3, however, is wrong. Quake3 is one of the most popular 3D benchmarks. Any driver that detects it is cheating (and yes, I am saying this with the full knowledge that it is still very possible that both nVidia and ATI detect Quake3 in their current drivers). ATI got caught with their pants down in that particular respect, with a cheat that was blatant and obvious. nVidia has yet to be caught to the same degree.

This doesn't necessarily mean that ATI is any worse, just that if nVidia is indeed cheating, they are being smarter about it.

However, what we do know is that nVidia's video cards, in recent history, have performed very well across a very large range of software. The problem with cheating is that it shows a video card as being better than it actually is, leaving a person complaining, "Card X did so well in Quake3! Why does it do so terribly in Super Duper Smack Town 3?" While it is true that such events are inevitable, as not all game engine designers are very good at optimizing for a wide variety of hardware, cheats only serve to worsen the issue.

It is for this reason that it is important to look at a very large variety of benchmarks before deciding that one video card has superior performance over another. Personally, I think 3DMark2001 scores should not be considered at all, and Quake3 scores should be at least accompanied by a few other benchmarks (four, minimum, would make for a halfway-decent decision).
 
Detecting Quake3 is not a CHEAT :rolleyes: ... like I said before, if a video card company saw a way of improving the game's performance on their hardware without lowering IQ, how the hell do you assume that is cheating?
Nobody really knows what the application detection of Quake 3 was doing in the Radeon drivers, but it was in there almost from the beginning of the Radeon cards; this was not something ATI put in specifically for the 8500 debut, like some website tried to say.
I still own a Radeon card, and there are countless reviews on the net; if you can find any of them complaining about image quality... then you can make your claim.
 
It's a cheat because Quake3 is a benchmark.

Besides, good drivers shouldn't need any game detection, except for, perhaps, those situations where games were developed on poor drivers from a different manufacturer, and the game developer refused to patch to fix the problem.
 
It's a cheat because Quake3 is a benchmark.


Quake 3 IS NOT A BENCHMARK, it is an online multiplayer game that was marketed as an online multiplayer game... so it's a GAME
 
Doomtrooper said:
Quake 3 IS NOT A BENCHMARK, it is an online multiplayer game that was marketed as an online multiplayer game... so it's a GAME

Not to ATI. To ATI, Quake3 is one of the most widely-used 3D benchmarks.

It is also a game, yes, that is true, which does mean that a relatively small number of people will get to reap the benefits of ATI's cheating. But, there are far more popular multiplayer games.
 
Anyone who tries to say Nvidia doesn't do game-specific optimizations has never looked at the late Det2s and early Det3s in a hex editor or disassembler.

Nvidia has just gotten a lot smarter about how they do their game-specific detection, is all. Old Nvidia drivers have game names, executable names, and various other references in them as well.
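For anyone who wants to see what that amounts to in practice, here's a bare-bones sketch: scan a driver binary for printable ASCII runs and flag any that contain known game executable names. The file name and the list of executables below are only examples, not claims about what any particular driver version contains.

Code:
#include <ctype.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Example names only; real drivers may store such strings in mixed
     * case or as wide characters, which this naive scan would miss. */
    const char *suspects[] = { "quake3.exe", "3dmark", "wolf.exe" };
    char run[256];
    size_t len = 0;
    int c;

    FILE *f = fopen("nv4_disp.dll", "rb");   /* example path only */
    if (!f) { perror("fopen"); return 1; }

    while ((c = fgetc(f)) != EOF) {
        if (isprint(c) && len < sizeof(run) - 1) {
            run[len++] = (char)c;
            continue;
        }
        run[len] = '\0';
        if (len >= 5) {                        /* skip short noise */
            for (size_t i = 0; i < sizeof(suspects) / sizeof(*suspects); i++)
                if (strstr(run, suspects[i]))
                    printf("found string: %s\n", run);
        }
        len = 0;
    }
    fclose(f);
    return 0;
}

A disassembler gives a much better view, of course, since it shows where those strings are referenced from, not just that they exist.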
 
Chalnoth said:
I think the most obvious is that the drivers detect a splash screen (A large texture displayed in a certain way for a certain length of time), and then go on to flush all the caches and whatnot on the video card.
???
There is another possibility, of course. That is simply that the splash screens use a somewhat different rendering method than the games use, which causes the GF's to automatically flush their caches.
???

Someone makes a post on nvnews about how nvidia must be "flushing caches" and now everyone takes this as gospel? Very interesting...

What sort of hardware cache wouldn't flush automatically after the data hasn't been used for a period of time? If the cache were in software, then I wouldn't consider it very good code if it relied on particular characteristics of a benchmark in order to work well.

Ok, I'll humor you. Let's suppose that you wanted to flush a hardware cache, but didn't: Any data in there would be thrown out after less than a frame of rendering anyway (remember, 1024x768x32 = 3 MB of data for the color buffer, and that's not counting texture or Z data), so it shouldn't cause a large impact on performance.
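The arithmetic on that figure, for anyone checking:

Code:
#include <stdio.h>

int main(void)
{
    /* 1024 x 768 pixels at 32 bpp = 4 bytes per pixel */
    long bytes = 1024L * 768L * 4L;
    printf("%ld bytes = %ld MB\n", bytes, bytes / (1024L * 1024L));
    /* prints: 3145728 bytes = 3 MB */
    return 0;
}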

P.S. Why do people come to such ridiculous conclusions over things they have no data about?
 
OpenGL guy said:
P.S. Why do people come to such ridiculous conclusions over things they have no data about?

lol! because this is B3D... and when people don't have any data to base their information on here, they speculate wildly based on any collection of possibilities.

I mean that only in the best of ways though... it keeps B3D a place that's always fun to come back to. Sometimes there's really brilliant analysis and speculation here... and sometimes it's outright off the wall. Either way it's usually fun. :D
 
OpenGL guy said:
Ok, I'll humor you. Let's suppose that you wanted to flush a hardware cache, but didn't: Any data in there would be thrown out after less than a frame of rendering anyway (remember, 1024x768x32 = 3 MB of data for the color buffer, and that's not counting texture or Z data), so it shouldn't cause a large impact on performance.

P.S. Why do people come to such ridiculous conclusions over things they have no data about?

I was thinking more along the lines of things temporarily stored in video memory, such as textures, that perhaps the drivers don't always clear out when told to.

Anyway, the answer should become obvious in time.

And no, I'm not planning on disassembling nVidia's drivers and hunting through huge amounts of nigh-unreadable code in order to find game-specific optimizations. If somebody else wants to, you're welcome to it... I'm sure nVidia won't like it, but if you're that obsessed...
 
Chalnoth said:
I was thinking more along the lines of things temporarily stored in video memory, such as textures, that perhaps the drivers don't always clear out when told to.
Uh, you mean that nvidia is not freeing textures when told to? I.e. on context destroy/mode switch? I seriously doubt it.
Anyway, the answer should become obvious in time.
I doubt it.
 
Chalnoth said:
And no, I'm not planning on disassembling nVidia's drivers and hunting through huge amounts of nigh-unreadable code in order to find game-specific optimizations. If somebody else wants to, you're welcome to it... I'm sure nVidia won't like it, but if you're that obsessed...


I didn't expect you would. I won't bother again, as I looked through them back in the Quake/Quack ATI mess... wish I had pics right about now, but I don't. However, as happens rather frequently with you, since you won't look at it, you don't believe it ever happened.

However, they did do it in the past, and they still do it today regardless of what their PR says... they just don't use things as blatantly obvious as game executables that show up as string refs in their driver .dlls in a disassembler.
 
Ichneumon said:
I didn't expect you would. I won't bother again, as I looked through them back in the Quake/Quack ATI mess... wish I had pics right about now, but I don't. However, as happens rather frequently with you, since you won't look at it, you don't believe it ever happened.

I generally expect others to try to present evidence to support their own claims. I'd personally rather not take the time and effort to refute my own claims...if I liked doing that, there'd be little reason for me to come here.

However, they did do it in the past, and they still do it today regardless of what their PR says... they just don't use things as blatantly obvious as game executables that show up as string refs in their driver .dlls in a disassembler.

Doesn't really surprise me, but it doesn't really bother me, either. I think my biggest beef with this whole scenario is that people are saying that the Quake/Quack issue isn't cheating.

If nVidia does have a specific optimization for 3DMark2k1, then it is definitely cheating. If nVidia has a specific optimization for Quake3, then it is definitely cheating.

If ATI or nVidia has a specific optimization for Freespace 2, it is not cheating, because that game is not benchmarked by anybody.

I just don't like seeing people call it cheating before it's obvious what's happening. Regardless, it doesn't matter that much, as I don't believe anybody who is buying these cards really cares, nor do I feel the people at ATI and nVidia really care whether or not we call them cheats (while, by contrast, it'd be a huge deal for a somewhat substantial number of people if either company admitted wrongdoing... so I doubt they will).

I personally just hope it makes some people think twice about looking at the 'standard' benchmarks only before buying.
 
If ATI or nVidia has a specific optimization for Freespace 2, it is not cheating, because that game is not benchmarked by anybody.

Tell me: are driver optimizations for Morrowind (or SOF2 or GTA3 or any other new game) cheating or not? ;) I mean, the game is not a benchmark yet, but as Mike stated, he considers using Morrowind as a benchmark. Now, if I understood you correctly, it is OK to have optimizations NOW, but they have to be removed ASAP if TomsHardware or AnandTech or somebody that is considered "anybody" bothers to use the game as a benchmark? :eek: :eek: :eek:

Edit: are separate code paths in game engines for different cards also cheating to you? :rolleyes:
 
aah well the problem with blanket statements and moral stances in the commercial arena is you end up in a tangle of contradictions :)
 
SvP said:
Tell me: are driver optimizations for Morrowind (or SOF2 or GTA3 or any other new game) cheating or not? ;) I mean, the game is not a benchmark yet, but as Mike stated, he considers using Morrowind as a benchmark. Now, if I understood you correctly, it is OK to have optimizations NOW, but they have to be removed ASAP if TomsHardware or AnandTech or somebody that is considered "anybody" bothers to use the game as a benchmark? :eek: :eek: :eek:

No, because Morrowind is currently not a widely-used benchmark. But, I am hopeful that there are no game-specific optimizations needed by 3D video card companies for this game, or most any other.

Additionally, separate codepaths for different hardware is necessary today to take advantage of what the different hardware can offer.

Besides that, every single program automatically finds itself "optimizing" for one graphics architecture over another, just because the game developers make choices in how the rendering is set up. This is why a single benchmark that hugely favors the Radeon 8500 cannot prove that the 8500 is superior. The reverse is also true.
 
Doesn't really surprise me, but it doesn't really bother me, either. I think my biggest beef with this whole scenario is that people are saying that the Quake/Quack issue isn't cheating.

And my biggest beef with this whole issue is people not being consistent in the way they treat the Quake and 3DMark issues, ending up contradicting themselves without acknowledging it.

Another beef is people saying factually that Quack was a cheat, when the evidence suggests otherwise.

Chalnoth, you INSIST that because ATI detected Quake3 in their drivers, it therefore MUST be a cheat. I simply disagree with that notion, and if you would at least accept that your position is in fact an OPINION, and not some factual truth, then at least we could agree to disagree. (And then you could continue to try and sort out our inconsistent arguments.)
 