New 3DMark03 Patch 330

Evildeus said:
Why haven't ET/B3D talked of Ati "optimisation" before the PR?

Ed, as I said before, if it isn't visible, how are we supposed to know it's there? We highlighted some severely obvious visual issues when using the developer edition, which is not going to be the case with shader code that produces the exact same output even though it's optimised; only upon investigating the issues that we brought up did Futuremark find that some of the optimisations were being triggered by shader code, which we were not in a position to investigate.

Tell us Ed, just how were we supposed to find it?
 
Pete said:
As for ATi's performance boost: How far back does it occur? 3.4, 3.3, 3.2, ... ?

I still use the 3.2s with a 9700 Pro. I had a 1-point drop, from a 0.4 fps lower score in game test 1. The rest was exactly equal between the 320 and 330 patches.


Can we get over the grasping at straws yet? Nvidia cheated, period.
ATI is under the microscope now; LET'S WAIT AND SEE (shocking concept, eh?).
 
Hardware.fr:

[Hardware.fr benchmark chart: IMG0006295.gif]


Please take note that the heftiest performance differences are in the shader-heavy tests (GT4, VS 2.0 and PS 2.0).

It has been almost a year since we were introduced to the glory of CineFX, but today I feel like I just woke up sucking a lemon.
 
If ATI is really replacing some shaders with ones better suited to their architecture, then it's a bit of a mixed bag for me.

You can consider this a cheat in 3DMark03, because ATI's drivers are not doing exactly what the application tells them to do (cheat). But if the same thing were done in a game, for instance replacing unoptimized Cg shader code with their own better code without decreasing IQ, then it would be very much appreciated (optimization).
 
DaveBaumann said:
Evildeus said:
Why haven't ET/B3D talked of Ati "optimisation" before the PR?

Ed, as I said before, if it isn't visible, how are we supposed to know it's there? We highlighted some severely obvious visual issues when using the developer edition, which is not going to be the case with shader code that produces the exact same output even though it's optimised; only upon investigating the issues that we brought up did Futuremark find that some of the optimisations were being triggered by shader code, which we were not in a position to investigate.

Tell us Ed, just how were we supposed to find it?
Well, I'm not asking B3D/ET to see/find the thing. I'm asking why they didn't report it. I suppose there's some discussion in the beta program between members, or are those issues not discussed? If that's so, well, sure, you can't tell.

Are you saying that if the visual issues of Det FX weren't found by ET/B3D, you wouldn't have talked about it?
 
tEd said:
You can consider this a cheat in 3DMark03, because ATI's drivers are not doing exactly what the application tells them to do (cheat). But if the same thing were done in a game, for instance replacing unoptimized Cg shader code with their own better code without decreasing IQ, then it would be very much appreciated (optimization).

I understand the internal turmoil in this. ;)

However, for a synthetic benchmark, the drivers / hardware should just do exactly what they are told.

There's no guarantee that every shader can be "optimized" for every card, or that the driver writers WILL optimize shader code for every game in existence. Highly unlikely, in fact.

So a synthetic benchmark should not be tampered with, at all, IMO.
 
tEd said:
If ATI is really replacing some shaders with ones better suited to their architecture, then it's a bit of a mixed bag for me.

You can consider this a cheat in 3DMark03, because ATI's drivers are not doing exactly what the application tells them to do (cheat). But if the same thing were done in a game, for instance replacing unoptimized Cg shader code with their own better code without decreasing IQ, then it would be very much appreciated (optimization).

I was torn on that issue as well, but I think in the end it's only fair if every card runs the exact same shader code. If nothing else, it will avoid even the "appearance" of cheating. IHVs can't guarantee that they'll be able to "optimize" the shader code of every single game to the same degree as they can for 3DMark, so in my opinion they shouldn't even try, and if they've got a serious enough issue with the efficiency of the default shader code, they can address that in the beta process. Of course, they have to be PART of the beta process to do that :)
 
Joe DeFuria said:
I understand the internal turmoil in this. ;)

However, for a synthetic benchmark, the drivers / hardware should just do exactly what they are told.

There's no guarantee that every shader can be "optimized" for every card, or that the driver writers WILL optimize shader code for every game in existence. Highly unlikely, in fact.

So a synthetic benchmark should not be tampered with, at all, IMO.

Damn Joe, I gotta learn to type faster :)
 
Evildeus said:
Well, I'm not asking B3D/ET to see/find the thing. I'm asking why they didn't report it.

WTF are you talking about? How can they report on something they don't know exists?

I suppose there's some discussion in the beta program between members, or are those issues not discussed? If that's so, well, sure, you can't tell.

Which beta members "discovered" the ATI optimizations? THEY DIDN'T. FutureMark did.

Are you saying that if the visual issues of Det FX weren't found by ET/B3D, you wouldn't have talked about it?

EXACTLY. There would be nothing to talk about!
 
I tested my Radeon 9800 Pro, if anyone's interested. Athlon 1400, Abit KG7, 512 MB PC2100, XP SP1, Catalyst 3.4, Audigy. On the left, build 320; on the right, build 330.

Overall - 5019, 4894

GT1 - 97.7, 94.9
GT2 - 37.7, 37.7
GT3 - 32.6, 32.5
GT4 - 35.7, 32.9

No sounds - 18.8, 18.5
24 sounds - 15.6, 15.5
60 sounds - N/A

Everything else stayed the same. I can't access the FM PDF at the moment, so I don't remember if it said anything about test 1. I assume that if this also included a replaced vertex shader, that would be the reason for the sound test differences.
 
Pete said:
Two reasons, ED:

1. There were no visibly obvious cheats on ATi cards (no clip planes / lack of buffer clears).
2. No one had FM's 330 patch.

These reasons are so obvious, what am I left to conclude about your motivations or capabilities? You seriously couldn't figure this out yourself? It seems most forum trolls enjoy attention, no matter how it's attracted.
1. Agree.
2. Disagree. Are you saying that there wasn't any testing of the new revision by beta members before it was released? No discussion of the issue before the PR? Perhaps that's the case, I don't know. Do you?
Thanks for the troll part 8)
 
Evildeus said:
Well, I'm not asking B3D/ET to see/find the thing. I'm asking why they didn't report it.

Huh? Are you asking why they didn't report something they didn't detect/see? You do realize that some of the detect/replace shader code may not result in any visual differences?
 
On ATI's Optimization

It is very likely that ATI's (and other manufacturers') drivers include a list of "well-behaved" applications where certain efficiencies can be gained by choosing rendering strategies that would break less-well-behaved applications.

It may be that the reduction in ATI's 3DMark performance is simply the result of the application no longer being identified as on that list, rather than of actual, 3DMark-specific cheating being disabled.

Certainly at least a portion of Nvidia's decline would be due to the same thing, although from the extent of the decline, as well as the off-the-rails visual anomalies, it's clear that there are 3DMark-specific cheats in the Nvidia drivers.
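
To make that concrete, here's a rough C++ sketch of the kind of per-application profile table I'm describing. Everything in it (the struct, the names, the flags) is invented for illustration; I obviously haven't seen any actual driver source.

Code:
#include <string>

// Hypothetical per-application profile; all names are invented.
struct AppProfile {
    const char* exeName;        // executable name the driver watches for
    bool useFastBufferClears;   // skip clears this app never depends on
    bool useReplacementShaders; // substitute hand-tuned shader variants
};

// A short list of "well-behaved" applications with known-safe shortcuts.
static const AppProfile kProfiles[] = {
    { "3dmark03.exe", true, true  },
    { "quake3.exe",   true, false },
};

// Look up the running application; unknown apps fall back to the
// conservative default path with no shortcuts at all.
const AppProfile* FindProfile(const std::string& exeName) {
    for (const AppProfile& p : kProfiles) {
        if (exeName == p.exeName)
            return &p;
    }
    return nullptr;
}

Under a scheme like this, anything that stops the application from being recognised (a renamed executable, or shader text that no longer matches a stored fingerprint) silently drops it back onto the default path, which would fit the score drops we're seeing.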
 
I agree with some of the other sentiments around here. Neither ATI nor Nvidia should be replacing code via the drivers. If they have an optimization for 3DMark03, then it should be submitted to Futuremark, and *they* can decide if it should go in or not.

For the people who are on the fence, think about it this way. If ATI has an optimization that could benefit both Nvidia and themselves, and only they use it, it's not really that their drivers or their cards are any better; it's that they are running a different test. Now, it can be argued that as long as the output is the same it doesn't really matter how they got there, but none of us know the specifics. What if Futuremark's algorithm is great for being general purpose, but ATI's is very limited? Perhaps Futuremark had a good reason for using it. It's not ATI's or Nvidia's place to change things without letting Futuremark or the consumer know what's going on.

Nite_Hawk
 
Nite_Hawk said:
Bjorn: Well, that begs the question. If it's the *exact* same output, you should submit it to futuremark as an enhancement to the code.

I think that benchmarks are meant to be contrived bits of code that provide work for the hardware. It's not the result that matters, or how it looks, or whether the effect can be done faster; it's the workload.

Having said that, drivers have the responsibility of making the hardware go. If they can optimize a shader program on its way to the card, great; if they have to detect specific games to do it because there is no generic opcode conversion, no problem. Detecting benchmarks, though, is kinda stupid, whether you are justified or not.

It gets kinda grey when games are used as benchmarks, but a game benchmark only tells you how that game will benchmark anyway; there are often separate bits of code per card, so optimizations don't matter, since you are not altering the fact that it's not a fair comparison to begin with. You could look at a Kyro2 or something and say that just about everything it did was cheating, because it wasn't doing the same job as the other cards. And on it goes.. :)

Basically, all benchmarks are useless if you use any drivers that come out after the benchmark is created/updated, all game benchmarks are useless and everybody on the net that follows any of it has been wasting their time all these years. :)
 
I'm sure I'll get roasted for this, but...

I haven't looked at the PDF, but I wonder what they did to avoid 'detection'? Going beyond that, I wonder what, if anything, might have upset the apple cart--changing the scores in ways beyond simply turning off the detection stuff.

If it's something that, even though it generates the same results, changes the method of going about it (like reordering calls, or load order), that could upset the apple cart and contribute to the depressed scores.

Now, you can't compare this build with the previous build scores, and next week when the new drivers come out with new detection algorithms, we won't be able to rely on them either. Perhaps I'm jaded, but I'm suspicious of any high profile app being used as a benchmark.

My take on optimization: if the card can detect it in a way that doesn't target a particular application (i.e. a benchmark), I'm OK with the card optimizing shaders, reordering rendering, etc., as long as the output is what is expected. I'm OK with this even if the optimization is happening within the benchmark, AS LONG AS IT'S NOT SPECIFICALLY TARGETED TOWARD THE BENCHMARK.

If one dufus stuffs their shaders with nops, then somebody else will too. If the driver can fill open execution slots by combining instructions or re-ordering them--I'm all for it.
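
To illustrate what I mean, here's a toy C++ sketch of that kind of application-agnostic pass: it strips NOPs out of a made-up shader instruction stream without knowing or caring who submitted it. The opcode enum and instruction struct are invented for the example.

Code:
#include <vector>

// Toy shader ISA, invented for this example.
enum class Op { NOP, ADD, MUL, MAD, MOV };

struct Instruction {
    Op  op;
    int dst, src0, src1; // register indices
};

// Application-agnostic peephole pass: drop dead NOP slots. A real pass
// might also fuse a MUL feeding an ADD into a single MAD when the
// intermediate register has no other readers.
std::vector<Instruction> Peephole(const std::vector<Instruction>& in) {
    std::vector<Instruction> out;
    out.reserve(in.size());
    for (const Instruction& inst : in) {
        if (inst.op == Op::NOP)
            continue; // safe to drop no matter which app sent the shader
        out.push_back(inst);
    }
    return out;
}

A pass like that runs on every shader from every application, so nobody can accuse you of targeting a benchmark.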
 
Evildeus said:
2. Disagree. Are you saying that there wasn't any testing of the new revision by beta members before it was released? No discussion of the issue before the PR? Perhaps that's the case, I don't know. Do you?
Thanks for the troll part 8)

Who's to say that FM -has- to submit all builds of 3DMark to the beta members before release? The shader code wasn't changed so much as relabelled. It's not altering the work that's being done, just making sure all cards are actually -doing- it. Besides, it would defeat the whole purpose of the patch. The idea is to defeat the identification process the IHVs are using to cheat. When you lock your door on the way out of your house, do you put up a sign pointing to the hidden key under the rock in your yard and tell crooks not to use it, or you'll sue?

To be fair, breaking code detection in this manner is only a band-aid solution; IHVs can simply rewrite their drivers to detect the new code. But hopefully it'll make the point to the IHVs that their actions can and will be found out, and they'll stop trying.
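
For the curious, here's a small C++ sketch of why relabelling works as a countermeasure, on the assumption that the drivers fingerprint shaders by hashing their text (that's my assumption; FNV-1a and the ps_2_0 snippets are purely illustrative):

Code:
#include <cstdint>
#include <cstdio>
#include <string>

// FNV-1a, standing in for whatever hash a driver might actually use.
uint64_t Fnv1a(const std::string& s) {
    uint64_t h = 14695981039346656037ull;
    for (unsigned char c : s) {
        h ^= c;
        h *= 1099511628211ull;
    }
    return h;
}

int main() {
    // Same arithmetic, different temp register: r0 vs r2.
    std::string build320 = "ps_2_0\nmul r0, t0, c0\nmov oC0, r0\n";
    std::string build330 = "ps_2_0\nmul r2, t0, c0\nmov oC0, r2\n";

    // The hashes differ, so a lookup keyed on the old text stops firing
    // even though the two shaders do exactly the same work.
    std::printf("%016llx\n%016llx\n",
                (unsigned long long)Fnv1a(build320),
                (unsigned long long)Fnv1a(build330));
    return 0;
}

Of course, the IHV only has to hash the new text to start cheating again, which is exactly the band-aid problem.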
 
Evildeus said:
2. Disagree. Are you saying that there wasn't any testing of the new revision by beta members before it was released? No discussion of the issue before the PR? Perhaps that's the case, I don't know. Do you?
I guess the better questions to ask would be not "Why wasn't it reported?", but "Did Beyond3D have the patch prior to its official release? If so, how long before the release did they have it?"
 
Himself said:
snip
all game benchmarks are useless and everybody on the net that follows any of it has been wasting their time all these years. :)
Absolutely wasting our time. But where would humanity be without all this tabloid-style writing? Hamburgers and tabloids differentiate us from the animals.
Some people would argue that it would be the thumb or brains... but that surely must be incorrect.
 