New 3DMark03 Patch 330

Nite_Hawk said:
Brent:

I've liked your reviews, and I respect your opinion. So far I've seen nothing that you've written which I'd say is false or misinformed. Having said that, I can't understand how Steve and Kyle could continue to ignore what nvidia is doing and attack futuremark instead. These things could just as easily be done in a time demo for any videogame out there, and yet it's not even mentioned. I'm sorry to say it, but the respect and reputation that *your* articles brought to HardOCP are lost by their words. Brent, I will certainly continue to read what *you* write, but HardOCP as a whole is a tarnished name.

Nite_Hawk

Nicely put Nite_Hawk, I don't think too many ppl around here will be taking anything seriously on HardOCP whilst Steve and Kyle remain so blatantly Nvidia-biased. I can understand the opinion of dropping 3DMark as a game-indication benchmark, but not reporting and denouncing driver cheats?

Well, that's a little too much for me to stomach!
 
Yes Nite_Hawk, nicely put. You and WaltC beat me to it in fact, doh.

Deflection said:
They don't in fact say that. They say the overall score is useless. This has been a distinction for them in the past.

Sure, but their stance is strongly reminiscent of that of so many people who have jumped to nVidia's defence in this situation (and in turn have referenced Kyle's supposedly "informed" comments in order to back up their own argument). They are concentrating on whether or not 3DM is a useful benchmark instead of whether nVidia are cheating to deceive the card-buying public. It just seems like avoiding the real issue IMO, and in doing so they are going to appear very pro-nV and anti-ATi. It's a shame because the site is largely impartial and has demonstrated this on many an occasion in the past.

MuFu.
 
Tim said:
John Reynolds said:
And what evidence leads you to assume that? I'm not saying I know something but I think it's way, way too early to make such assumptions.

There is no image quality degradation with ATi's drivers and they have no problems with the free look mode. Other than the performance improvement there seems to be no difference, which indicates that these are optimizations, not cheats. (ATi could of course be using some kind of free mode detection.)

After reading over the whole situation, if ATI is indeed replacing the shader with a version more suited to its architecture, even if it produces the exact same output, I gotta cry foul. If this were a game, the benchmark run would have used content that existed in the game, and thus optimizations like this would enhance the overall game performance as well. Therefore, one could argue that the "optimization" did not unfairly skew the benchmark result. In an ideal world, ATI would be able to work with the game developer to get the optimized shader into the game as an ATI-friendly rendering path.

The problem is, this isn't a game we're dealing with. It's a synthetic benchmark. Coming up with fine-tuned "optimized" shader code like that isn't useful here, because the performance boost from it can't be translated to real-world performance gains the way it can from a game's timedemo to the actual game. Plus, the only way a synthetic benchmark can truly be effective is if all cards are given the exact same workload to handle. If you want to showcase how to take advantage of your particular architecture, why not talk to some game developers, get them to write shader code as they normally would, then re-write it to better suit your architecture and show us how much faster it'll be. Maybe that'll even persuade the game devs to code the "ATI way" more often.
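To make that concrete, here's a purely hypothetical sketch (plain C++, not any vendor's actual driver code, and every name in it is made up) of how a driver could fingerprint a shader submitted by an application and silently swap in a hand-tuned, output-identical replacement:

```cpp
// Hypothetical sketch only: a driver-side shader substitution table.
// None of these names correspond to a real driver; they just illustrate
// the technique being discussed.
#include <cstddef>
#include <cstdio>
#include <functional>
#include <string>
#include <unordered_map>

// Toy fingerprint of the shader source the application submits.
static std::size_t FingerprintShader(const std::string& source) {
    return std::hash<std::string>{}(source);
}

// Table mapping fingerprints of known benchmark shaders to hand-tuned
// replacements that compute the same image with a different instruction mix.
static const std::unordered_map<std::size_t, std::string>& ReplacementTable() {
    static const std::unordered_map<std::size_t, std::string> table = {
        // {FingerprintShader(original_benchmark_shader), tuned_replacement},
    };
    return table;
}

// Called when the application creates a shader: return the tuned version if
// the submitted code matches a known benchmark shader, otherwise pass it through.
std::string SelectShader(const std::string& submitted) {
    const auto& table = ReplacementTable();
    const auto it = table.find(FingerprintShader(submitted));
    return (it != table.end()) ? it->second : submitted;
}

int main() {
    // An unrecognized shader passes through untouched; one in the table
    // would be silently replaced without the benchmark ever knowing.
    const std::string generic = "// some shader source";
    std::printf("%s\n", SelectShader(generic).c_str());
    return 0;
}
```

The point is that nothing in the benchmark itself changes; code it never asked for gets run in its place, which is exactly why the resulting score gain tells you nothing about other workloads.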

So ATI, I still love ya, but ditch the 3DMark-specific code. Based on the percentages FM quoted in their audit it isn't gaining you much anyway, and it'll shut up all the nvidiots who are now crying "look! ATI does it too!" :)
 
Personally I think that FutureMark dealt with this thing in just the right way. The FAQ is well written and covers most of the questions we've seen floating around after the first Nvidia 3DMark optimization news.

As an owner of an ATi card, it's a little sad to see that they also seem to have something going on. Now we just have to wait for some more info from FM about this one. Still, it's going to be interesting to see which company addresses these issues first (I would guess that it's the one with the shorter name :rolleyes: )
 
Wow, this thread is increasing in size almost as fast as I can read it! ;)

I wonder what the 'important' people think about this proven cheating? By 'important', I mean the large companies such as Dell and Gateway who make up one hell of a large part of the business for graphics chip companies. Both Gateway and Dell are members of the Beta programme for 3DMark, so they obviously see some value in the benchmark and must use it as part of their decision making process as regards the OEM deals they sign.

If I were NVidia management, this is what I would be worrying about.
 
MuFu, I guess [H] will probably get back on the issue only if Futuremark confirms ATi's "optimizations" as cheats, so they can say that both cheated and/or say that 3DMark is crap (again).
 
Brent said:
It hasn't flown over MY head. I understand it.

If I were in your position I would just leave [H]. You don't belong there; you belong at a site like Beyond3D.
 
Brent said:
WaltC said:
[H front page] said:
3DMark Patch:
Futuremark has released a patch for 3DMark 2003 that eliminates “artificially high scores” for people using NVIDIA Detonator FX drivers. This is in response to the news item we posted last week ( as did several sites ). To us, things like this just solidify our belief that 3DMark03 overall score is useless as a real world benchmark. Thanks to everyone who sent this one in.

This is really sad.... :(

You know, apparently this issue has simply flown right over the heads of the people staffing [H], to the degree that even when it is explained in an itemized fashion, they cannot understand what has occurred. Their position is that the fact that nVidia cheated 3DMark means the benchmark itself is "useless," and that the fact that the patch forces the Detonators to render normally without cheating, lowering the aggregate score to well below that of a similarly clocked R9800P, means absolutely nothing.

Wow. Talk about gluttons for punishment! *chuckle* Who in his right mind would want to buy a product whose real performance is much lower than what the manufacturer falsely advertises??? Whew! Not me....;) These guys have really put on the blinders...this is taking CYA to an entirely new level.

It hasn't flown over MY head. I understand it.


Leave [H] and make your own site or join a more reputable one, Brent (hint: Beyond3D, hint).


Edit: Doh! Humus beat me to it. :) I think if you polled the people in this forum, you'd find that many share Humus' and my sentiments.
 
Evildeus said:
martrox said:
optimisations/cheats...... what's that look like to you, ED? Looks like the word "cheats" to me..... And, at this point, can you admit that nVidia is cheating?......
"/" = "or" for me, so it means what I mean ;)
Yes, NV is cheating according to FM.

[Attached image IMG0006295.gif: chart of 3DMark03 test scores before and after patch 330]

I just noticed something rather interesting in this chart. Notice that, as expected, ATI's game test 4 score decreases, yet its pixel shader and vertex shader scores remain the same. Now, it was speculated that some of the same shaders are being used in game test 4 and the pixel shader test. If ATI is optimizing one of the shaders in GT4, then it's not doing it for any of the same ones in the pixel shader test. This in and of itself might not mean anything, but I thought it was somewhat interesting nonetheless.

Nite_Hawk
 
What about cheats in SPECViewperf, another well-known synthetic benchmark?

Any info on that subject?
 
Marc said:
What about cheats in SPECViewperf, another well-known synthetic benchmark?

Any info on that subject?
I don't think this would be worth it for the consumer boards, as not many people look at such boards if they want workstation-class 3D graphics. That said, it might be possible that the professional versions (such as FireGL, Quadro) get their boost over the normal versions not so much because of unlocked hardware features or drivers more optimized for professional applications, but just because of more cheating drivers :(. I think, however, that it is more difficult to cheat there, because SPECViewperf is all about geometry processing, not really pixel shading / texturing / memory bandwidth. There are no shaders (vertex or pixel) to detect and exchange either.
That's not to say it's impossible to cheat, of course...
 
WoW! The decrease in overall score is pretty sad. What a lot of nVidia fans keep missing is that these drivers were used in the review of the GeForce FX 5900 Ultra. Wouldn't you feel cheated? It makes the 5900 Ultra's results seem dirty.

I seriously don't know why anyone would be defending nVidia in this matter. ATi's scores dropped, yes, but the drop wasn't as dramatic as nVidia's. I would expect all cards to drop in score. But it's comical to see people defending nVidia on this issue. I don't know what I would be more pissed about: the 44.03 drivers themselves, or the fact that nVidia issued the 44.03 drivers for the 5900 Ultra review. :?
 
Mummy said:
Tim your position is bull...

No it is not, you just don't understand it.

Even with NV drivers there is no degradation, unless you change the Camera's "rail" path,

Not true, they change shaders, so they look different. If you make changes so that the program looks different from what the developer intended, is that not degradation?

I don't know if ATi is cheating (leaning more towards "they are" than "they aren't" though, just my impression), but *ANYTHING* made to speed up a *SYNTHETIC BENCHMARK* is a *CHEAT*, doesn't matter if every pixel is the same.
This is complete nonsense. As long as you do all the work the developer intended, it is not cheating (unless you are using a predetermined camera path to place clipping planes etc.).

Unless you think that application-specific optimizations are always cheats, but I think you'll find yourself pretty much alone with that position.
 
MuFu said:
Sure, but their stance is strongly reminiscent of that of so many people who have jumped to nVidia's defence in this situation (and in turn have referenced Kyle's supposedly "informed" comments in order to back up their own argument). They are concentrating on whether or not 3DM is a useful benchmark instead of whether nVidia are cheating to deceive the card-buying public. It just seems like avoiding the real issue IMO, and in doing so they are going to appear very pro-nV and anti-ATi. It's a shame because the site is largely impartial and has demonstrated this on many an occasion in the past.

MuFu.

I agree with you on the impartiality factor up to this point--I was very surprised and disappointed to see Kyle ranting about this as he did. In fact, his entire rant seems so bizarre that it has got me wondering whether something else, not immediately apparent, is going on here.

To my knowledge, Kyle is the only individual who divined some linkage between the nVidia DoomIII nv35 promotional demo (closed to the public and all but a few web sites) and ET's original exposé of nVidia's 3DMark 03 driver cheats. I think everybody saw Kyle's seemingly wild and baseless accusations in as bizarre a light as I did. After all, there was nothing in the ET article which even remotely referenced Kyle or [H], let alone the nv35 demo. Hence, why such a personal reaction to an article not directed at him?

On the other hand, if Kyle had some knowledge pertinent to a similar driver behavior in the closed, dongled, restricted Doom III nv35 demo he had already featured, it would certainly explain the defensiveness and the vitriol of his reaction. If he feared some kind of backlash on [H]'s D3 demo scores based on knowledge he already had about the driver behavior running the D3 demo, it would explain a lot. It would certainly explain his eagerness to justify such driver behavior with his "If you can't SEE the cheating it's not cheating..." remarks, and his eagerness to attempt to shift the focus from nVidia's drivers to the 3D Mark benchmark. I am not suggesting that this is what he did--just that this avenue of thought does more to explain his behavior to me than any other I have contemplated. I know of no one else who saw a link of any sort between the ET article and the [H] nv35 D3 demo--this was unique to Kyle, I believe.

What is puzzling is the link he apparently saw between the D3 demo and the ET exposé. If he had restricted his remarks to merely a discussion of the 3D Mark 03 benchmark--and not linked ET's actions to [H]'s presentation of the D3 demo in any way--I could believe that this was merely a continuation of his distaste for 3D Mark, which became most apparent after nVidia resigned from the program in December '02. But his defensiveness and his linking of the two disparate things raises a question or two at the least, I should think.

Heh-Heh...;) Again this is all purely speculation, and is mainly me grasping at straws to try and understand the intemperate outburst of Kyle's which seems to defy other explanations. But perhaps, as someone else suggested, linking the ET article with his nv35 D3 demo may just have been a pretext enabling him to interject himself into a topic which he completely missed...;) I'll leave that to those who know him better than I...;) But I do think it is odd behavior for someone who in the past few months has at least worn a patina of objectivity when discussing such issues.
 
After reading over the whole situation, if ATI is indeed replacing the shader with a version more suited to its architecture, even if it produces the exact same output, I gotta cry foul. If this were a game, the benchmark run would have used content that existed in the game, and thus optimizations like this would enhance the overall game performance as well. Therefore, one could argue that the "optimization" did not unfairly skew the benchmark result. In an ideal world, ATI would be able to work with the game developer to get the optimized shader into the game as an ATI-friendly rendering path.
I have to agree. Replacing a shader routine IMO is exactly the same thing as inserting clip planes.

Further, IMO this does provide more evidence that 3DMark03 is not a well-written, sound benchmark. If even ATi are secretly rewriting code that makes better sense, then IMO ATi + Nvidia = Futuremark makes NO SENSE.

This is a complete turnaround for me, but I think the evidence is pretty compelling.
 
Tim said:
This is complete nonsense. As long as you do all the work the developer intended, it is not cheating (unless you are using a predetermined camera path to place clipping planes etc.).

This has been addressed in this thread already.

For example:

The problem is, this isn't a game we're dealing with. It's a synthetic benchmark. Coming up with fine-tuned "optimized" shader code like that isn't useful here, because the performance boost from it can't be translated to real-world performance gains the way it can from a game's timedemo to the actual game.
 
Hellbinder[CE] said:
Further, IMO this does provide more evidence that 3DMark03 is not a well-written, sound benchmark. If even ATi are secretly rewriting code that makes better sense, then IMO ATi + Nvidia = Futuremark makes NO SENSE.

Ehm. You can't write a shader that's 100% optimized for all architectures. It's just not possible. What's optimal for ATI is not optimal for nVidia and vice versa. This is one of the reasons many developers dislike Cg, for instance: despite being able to run on ATi cards, it just won't perform anywhere near where it could had the driver had access to the high-level code and made the appropriate optimisations itself, like in OpenGL 2.0.
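As a toy illustration of that point (plain C++ on the CPU with a made-up specular exponent, not real shader code from any benchmark), here are two ways to evaluate the same term that give identical results but a different instruction mix; which one a given GPU prefers depends entirely on its architecture:

```cpp
// Toy illustration, not real shader code: two equivalent ways to raise a
// lighting term to the 8th power. One leans on a transcendental instruction,
// the other on raw multiply throughput; which is faster depends on the hardware.
#include <cmath>
#include <cstdio>

// Version A: a single pow() call, cheap on hardware with a fast power unit.
float SpecularPow(float n_dot_h) {
    return std::pow(n_dot_h, 8.0f);
}

// Version B: three multiplies, cheap on hardware with spare multiply slots.
float SpecularMul(float n_dot_h) {
    const float x2 = n_dot_h * n_dot_h;
    const float x4 = x2 * x2;
    return x4 * x4;
}

int main() {
    const float n_dot_h = 0.9f;
    // Same answer either way; only the instruction mix differs.
    std::printf("pow: %f  mul: %f\n", SpecularPow(n_dot_h), SpecularMul(n_dot_h));
    return 0;
}
```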
 
Brent said:
WaltC said:
[H front page] said:
3DMark Patch:
Futuremark has released a patch for 3DMark 2003 that eliminates “artificially high scores” for people using NVIDIA Detonator FX drivers. This is in response to the news item we posted last week ( as did several sites ). To us, things like this just solidify our belief that 3DMark03 overall score is useless as a real world benchmark. Thanks to everyone who sent this one in.

This is really sad.... :(

You know, apparently this issue has simply flown right over the heads of the people staffing [H], to the degree that even when it is explained in an itemized fashion, they cannot understand what has occurred. Their position is that the fact that nVidia cheated 3DMark means the benchmark itself is "useless," and that the fact that the patch forces the Detonators to render normally without cheating, lowering the aggregate score to well below that of a similarly clocked R9800P, means absolutely nothing.

Wow. Talk about gluttons for punishment! *chuckle* Who in his right mind would want to buy a product whose real performance is much lower than what the manufacturer falsely advertises??? Whew! Not me....;) These guys have really put on the blinders...this is taking CYA to an entirely new level.

It hasn't flown over MY head. I understand it.

Again, you do have my respect and admiration, sir...but there really seems to be something goofy with Kyle's stance on this issue to me. I just can't agree with/understand it! :(

Humus said:
Brent said:
It hasn't flown over MY head. I understand it.

If I were in your position I would just leave [H]. You don't belong there; you belong at a site like Beyond3D.

If not B3D, I think you could easily either find a new home that fits ya better or have great success starting your own place. Either way, I'll still be reading ya. :)
 
Humus said:
Hellbinder[CE] said:
Further, IMO this does provide more evidence that 3DMark03 is not a well-written, sound benchmark. If even ATi are secretly rewriting code that makes better sense, then IMO ATi + Nvidia = Futuremark makes NO SENSE.

Ehm. You can't write a shader that's 100% optimized for all architectures. It's just not possible. What's optimal for ATI is not optimal for nVidia and vice versa. This is one of the reasons many developers dislike Cg, for instance: despite being able to run on ATi cards, it just won't perform anywhere near where it could had the driver had access to the high-level code and made the appropriate optimisations itself, like in OpenGL 2.0.

So, basically, this screams for a benchmark using an HLSL. Then we can be absolutely sure that two different cards won't do exactly the same work :), but at least we know that it will correspond to actual gaming performance. Of course we will have to analyze the rendering quality rather extensively then, but we're already doing that now, so I don't see any problem with it.
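On the "analyze the rendering quality" part, a minimal sketch of what that could look like is below, assuming you can already dump both cards' frames as raw 8-bit RGBA buffers (the struct and function names are made up for illustration): just walk the two frames and report the worst per-channel difference.

```cpp
// Minimal sketch under the stated assumptions: compare two captured frames
// pixel by pixel and report the largest per-channel difference, as a crude
// first pass at checking whether two cards rendered "the same" image.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <vector>

struct Frame {
    int width = 0;
    int height = 0;
    std::vector<std::uint8_t> rgba;  // width * height * 4 bytes
};

// Returns the largest per-channel difference between the two frames,
// or 255 if they are not even the same size.
int MaxChannelDifference(const Frame& a, const Frame& b) {
    if (a.width != b.width || a.height != b.height) return 255;
    int worst = 0;
    for (std::size_t i = 0; i < a.rgba.size() && i < b.rgba.size(); ++i) {
        const int diff = std::abs(static_cast<int>(a.rgba[i]) - static_cast<int>(b.rgba[i]));
        if (diff > worst) worst = diff;
    }
    return worst;
}

int main() {
    Frame a;
    a.width = 2; a.height = 1;
    a.rgba = {255, 0, 0, 255, 0, 255, 0, 255};
    Frame b = a;
    b.rgba[0] = 250;  // perturb one channel by 5
    std::printf("max channel difference: %d\n", MaxChannelDifference(a, b));
    return 0;
}
```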
 
Hellbinder[CE] said:
I have to agree. Replacing a shader routine IMO is exactly the same thing as inserting clip planes.

Further, IMO this does provide more evidence that 3DMark03 is not a well-written, sound benchmark. If even ATi are secretly rewriting code that makes better sense, then IMO ATi + Nvidia = Futuremark makes NO SENSE.

This is a complete turnaround for me, but I think the evidence is pretty compelling.

But I haven't seen any evidence that any code changes that "make more sense" have been initiated--merely code changes that "increase the benchmark score" when the benchmark is run.

As I said in an earlier post, there is almost no difference whatever in the aggregate score on my 9800P between builds 320 and 330 of the benchmark--a difference of 86 points--which is well within the normal range of expected deviation when running the benchmark (+ or - 3%, as indicated by FutureMark, is within the range of normal deviation).

Any benchmark for any purpose ever made can be cheated--that is no reflection on the benchmark since it is impossible to write one which cannot be cheated...;) Rather, it is a reflection on those who cheat it...right?
 