Futuremark Announces Patch for 3DMark03

mczak said:
Neeyik said:
Here are those test results I was on about in full:

Drops all round on the FX with the drivers.
Though I find it VERY strange that the 44.03 driver drops performance so much (since 330 already disables optimizations this driver has for 3dmark03). But the FX5900Ultra loses ANOTHER 50% (in GT 2 and 3 at least) with the 44.03 driver going from 330 to 340???

The simple answer is that the 330 patch did not disable all of NVIDIA's optimizations in 3dmark03 with driver 44.03. But the 340 patch has defeated all the detections.

You can find results I posted from 3dmark03 a few weeks ago here: http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=20526 Build 330 was used in conjunction with several different driver revisions. Anti-detect was used on drivers it works with.

You can see that the drops with 44.03 going from 330 to 340 jibe perfectly with 44.03 + Anti-Detect + 330, suggesting that Anti-Detect was reliable and that not all cheats in 3DMark03 were disabled by the 330 patch.
 
Neeyik said:
"Few weeks ago we released our 52.16 driver that includes our brand new unified compiler technology. With the new patch the benchmark, our unified compiler gets not used by the app so it goes to CPU and we are definitely slower," Luciano Alibrandi, NVIDIA's European Product PR Manager, added.

The benchmarks earlier in this thread ( http://www.beyond3d.com/forum/viewtopic.php?p=191490&highlight=test+results+i+full#191490 )
show a very similar performance drop on the 45.23 drivers and a much more severe performance drop on the 44.03 drivers with the new 340 patch.

Interesting that two driver sets that do not contain their unified compiler technology are affected by disabling it :oops:
 
Tridam said:
I've tried v330 and v340 shaders in another application. They showed exactly the same execution speed. So NVIDIA is doing replacement of these shaders ONLY in 3Dmark03. If a game uses the same shaders it won't have the enhancement.
Thanks for the info. Can you try the same test with 3DMark open (but sitting idle in a menu, not running the benchmark)?
Maybe the shader substitution kicks in when 3DMark is detected as open, even if it's not 3DMark that is using the targeted shaders...

Thanks,
Bye!
 
Mark0 said:
Tridam said:
I've tried v330 and v340 shaders in another application. They showed exactly the same execution speed. So NVIDIA is doing replacement of these shaders ONLY in 3Dmark03. If a game uses the same shaders it won't have the enhancement.
Thanks for the info. Can you try the same test with 3DMark open (but sitting idle in a menu, not running the benchmark)?
Maybe the shader substitution kicks in when 3DMark is detected as open, even if it's not 3DMark that is using the targeted shaders...
What one application is doing is unlikely to affect another application. Each has their own graphics context, etc.
 
OpenGL guy said:
What one application is doing is unlikely to affect another application. Each has their own graphics context, etc.
I agree. It was just for curiosity. Strange things happen in strange times... :)

Thx,
Bye!
 
I wonder if ATI is working on a shader optimizer as well, since their R3xx benefits from hand-optimized shaders too (granted, only by a few percent).
Or is it simply not worth it?
 
nyt said:
I wonder if ATI is working on a shader optimizer as well, since their R3xx benefits from hand-optimized shaders too (granted, only by a few percent).
Or is it simply not worth it?
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.

We're always working to improve the optimizer. Stay tuned!
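
For anyone wondering what a "general optimizer" looks like in practice versus a lookup table of hand-coded replacements, here is a rough sketch of the kind of pattern-based pass such an optimizer might run on any shader it is handed. The tiny IR, the opcode names and the single fusion rule are all invented for this illustration; this is not ATI's (or anyone's) driver code, and a real optimizer would also have to deal with write masks, swizzles and scheduling.

```cpp
// Illustrative sketch only: a tiny peephole pass of the kind a *general*
// shader optimizer could run on any incoming shader. The IR, the opcode
// names and the fusion rule are invented for this example.
#include <cstddef>
#include <cstdio>
#include <vector>

enum Op { MUL, ADD, MAD };

struct Instr {
    Op  op;
    int dst;      // destination register index
    int src[3];   // source register indices (-1 = unused)
};

// Does instruction i read register reg?
static bool readsReg(const Instr& i, int reg) {
    for (int s : i.src) if (s == reg) return true;
    return false;
}

// Fuse "MUL t, a, b" immediately followed by "ADD d, t, c" into
// "MAD d, a, b, c", provided the temporary t is not read again later.
std::vector<Instr> fuseMulAdd(const std::vector<Instr>& in) {
    std::vector<Instr> out;
    for (std::size_t i = 0; i < in.size(); ++i) {
        if (i + 1 < in.size() && in[i].op == MUL && in[i + 1].op == ADD &&
            readsReg(in[i + 1], in[i].dst)) {
            bool liveLater = false;   // is the MUL result needed after the ADD?
            for (std::size_t j = i + 2; j < in.size(); ++j)
                if (readsReg(in[j], in[i].dst)) { liveLater = true; break; }
            if (!liveLater) {
                const Instr& add = in[i + 1];
                int other = (add.src[0] == in[i].dst) ? add.src[1] : add.src[0];
                out.push_back({MAD, add.dst, {in[i].src[0], in[i].src[1], other}});
                ++i;                  // the ADD has been absorbed into the MAD
                continue;
            }
        }
        out.push_back(in[i]);
    }
    return out;
}

int main() {
    // r2 = r0 * r10; r3 = r2 + r11  -->  r3 = mad(r0, r10, r11)
    std::vector<Instr> prog = {
        {MUL, 2, {0, 10, -1}},
        {ADD, 3, {2, 11, -1}},
    };
    std::vector<Instr> opt = fuseMulAdd(prog);
    std::printf("instructions: %zu -> %zu\n", prog.size(), opt.size());
    return 0;
}
```

The key point is that a pass like this never asks which application submitted the shader; it simply rewrites whatever patterns it recognises, which is exactly what distinguishes it from detecting a specific benchmark shader and swapping in a hand-tuned replacement.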
 
OpenGL guy said:
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.

In general terms, what kind of percentage improvements does that bring? Kind of interested in how good/bad the architecture is without the optimiser.
 
PaulS said:
OpenGL guy said:
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.

In general terms, what kind of percentage improvements does that bring? Kind of interested in how good/bad the architecture is without the optimiser.

Probably depends on how much improvement you can get on a given shader. A shader that happens not to be well suited to the architecture will get more benefit from being optimised/reorganised than a shader which already suits the ATI architecture.

IIRC there have been several hints from ATI staff that they are well on their way to writing the next version of the optimiser that will give even more improvements in a generic fashion.
 
PaulS said:
OpenGL guy said:
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.
In general terms, what kind of percentage improvements does that bring? Kind of interested in how good/bad the architecture is without the optimiser.
It's very shader dependent. For example, there's generally more room for optimization in longer shaders (it's tough to optimize single instruction shaders ;)).
 
LOL, the Nvidia PR machine said that:

"So, we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.

Derek Perez
Director of Nvidia PR"

Link

oh, dave, the test u did with the 340 patch is invalid according to Nvidia ;)
 
Tridam said:
I've tried v330 and v340 shaders in another application. They showed exactly the same execution speed. So NVIDIA is doing replacement of these shaders ONLY in 3Dmark03. If a game uses the same shaders it won't have the enhancement.
There may be additional triggers for the replacement, which may or may not include the application name. I really don't think that's still being done; it could easily be defeated by renaming 3DMark03 ... anyone?
-
E.g.:
1) currently bound textures (size, format, not necessarily image data)
2) shader constants (the code references c0 and c1) - particularly interesting for instruction count reduction
3) current vertex shader - you may only be able to optimize the fragment shader by moving work to the vertex shader, or by reducing the data passed between the two, so you need a matching pair
4) basically any other render state - say, a render target without destination alpha and blending disabled => lets you discard any computation of final alpha

Preferably (*ugh* did I just say that?) you'd build a hash value from whatever things you need to know for your replacement to work, so you can make a quick first comparison to a set of known rendering state vectors. If you 'hit', you proceed with a real comparison (because hash comparisons aren't exact, they merely provide quick elimination of non-matches). This can be made very fast and wouldn't impact 'uncheated' applications much, which is, of course, quite important.
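
To make the hashing idea a little more concrete, here is a minimal sketch of that kind of state-vector lookup. Every structure, field and function name below is invented for the example, and a real driver would obviously track far more state than this.

```cpp
// Minimal sketch of the hashing idea above. Every name here is invented
// for illustration; a real driver would track far more state and would
// zero any struct padding before hashing/comparing.
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <unordered_map>
#include <vector>

// The state a replacement depends on, boiled down to a few fields.
struct RenderStateKey {
    std::uint64_t pixelShaderHash;     // hash of the bound pixel shader tokens
    std::uint64_t vertexShaderHash;    // hash of the bound vertex shader tokens
    std::uint32_t boundTexFormats[4];  // formats of the first few bound textures
    std::uint32_t alphaBlendEnabled;   // example of "any other render state"
    std::uint32_t pad;                 // keep the struct free of hidden padding
};

// FNV-1a over the raw bytes of the key: a fast first-pass filter.
static std::uint64_t hashKey(const RenderStateKey& k) {
    const std::uint8_t* p = reinterpret_cast<const std::uint8_t*>(&k);
    std::uint64_t h = 1469598103934665603ull;
    for (std::size_t i = 0; i < sizeof(k); ++i) {
        h ^= p[i];
        h *= 1099511628211ull;
    }
    return h;
}

struct Replacement {
    RenderStateKey             expected;   // full state vector to verify against
    std::vector<std::uint32_t> newShader;  // hand-tuned substitute token stream
};

// hash -> candidate replacements. Hash hits can be false positives, hence
// the exact comparison below before anything is actually swapped in.
static std::unordered_map<std::uint64_t, std::vector<Replacement>> g_table;

const std::vector<std::uint32_t>* lookupReplacement(const RenderStateKey& current) {
    auto it = g_table.find(hashKey(current));     // cheap rejection...
    if (it == g_table.end()) return nullptr;      // ...which is the common case
    for (const Replacement& r : it->second)       // rare: verify exactly
        if (std::memcmp(&r.expected, &current, sizeof(current)) == 0)
            return &r.newShader;
    return nullptr;
}
```

Because the cheap hash lookup misses for essentially every draw call in a normal application, a scheme like this costs next to nothing when no replacement applies - which is presumably the point.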
 
Rookie said:
LOL, the Nvidia PR machine said that:

"So, we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.

Derek Perez
Director of Nvidia PR"

Link

oh, dave, the test u did with the 340 patch is invalid according to Nvidia ;)
ROFLMFAO~~~~

The man either has no shame or no clue, I still ain't sure which! :LOL:
 
OpenGL guy said:
nyt said:
I wonder if ATI is working on a shader optimizer as well, since their R3xx benefits from hand-optimized shaders too (granted, only a few percents).
Or it's simply not worth it?
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.

We're always working to improve the optimizer. Stay tuned!

I truly admire ATI for the fact that they didn't use this to do marketing. They just let the compiler do its job.

I have just one question: does this compiler do more for hand-written shaders? Or does it work the same for hand-written and HLSL-compiled shaders? (Generally speaking, I know hand-written shaders can be much better than automatically and generically compiled ones.)

And just another thing: I'm just more impressed by the fact that my R9700 Pro is more than a year old and is still smocking contenders! :oops:

And, btw, has the GF4 Ti boost that nick spotted using the 340 patch + FX 52.16 (and earlier) drivers been confirmed?
 
Just talking to Derek Perez on IM at the moment. He has said there are shader replacements in the drivers (fine), and when asked if the compiler was still a work in progress he said that it was finished. Now, if the runtime shader compiler is "optimal" then it should be achieving performance as good as, if not better than, a hand-coded shader that is mathematically equivalent to the application shader - this being the case, why would you need hand-coded shaders...?
 
Magic-Sim said:
And just another thing: I'm just more impressed by the fact that my R9700 Pro is more than a year old and is still smocking contenders! :oops:

It is just dashed nice that 9700 cards take that special moment out to attend to their competitors' sartorial needs, especially at work. :)
 
Magic-Sim said:
OpenGL guy said:
We've had a shader optimizer in the driver for over a year ;) As I mentioned elsewhere, you'd be sad if we disabled it. Note that this is a general optimizer, not one that replaces detected shaders with hand-coded ones.

We're always working to improve the optimizer. Stay tuned!

I truly admire ATI for the fact that they didn't use this to do marketing. They just let the compiler do its job.
It's just decent business practice. After all, it's ATI's job (and vested interest) to exploit the hardware's potential.
I don't intend to downplay ATI's achievements; on the contrary. It's just that IMO these kinds of things are best expressed through benchmarks and don't particularly require or warrant press releases stating the obvious - i.e. "We are proud to announce that we have now managed to write a driver that finally does what it should have done eighteen months ago." :D

Magic-Sim said:
I have just one question: does this compiler do more for hand-written shaders? Or does it work the same for hand-written and HLSL-compiled shaders? (Generally speaking, I know hand-written shaders can be much better than automatically and generically compiled ones.)
As has been stated, HLSL is layered on top of DirectX Graphics; its results are passed to the driver through the plain-Jane PS 2.0 interface. So while I obviously can't speak for ATI, it's rational to assume that the driver treats both types of shaders equally - it doesn't even know the difference, unless I'm hopelessly wrong ;)
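
For what it's worth, here is a minimal sketch of that path using the D3DX9 utility library (function names and signatures are from memory, and the "main" entry point is just an assumed name, so check the SDK headers). Whether the source was HLSL or hand-written ps_2_0 assembly run through D3DXAssembleShader, the driver only ever receives the same tokenized byte code via CreatePixelShader, so it has no way of telling the two apart.

```cpp
// Sketch under the assumption of the D3DX9 utility library; signatures are
// from memory, so verify against the SDK headers. Whether the shader starts
// life as HLSL or as hand-written ps_2_0 assembly, the driver only ever
// receives the tokenized byte code handed to CreatePixelShader().
#include <d3d9.h>
#include <d3dx9.h>

IDirect3DPixelShader9* createFromHLSL(IDirect3DDevice9* dev,
                                      const char* src, UINT srcLen) {
    ID3DXBuffer* code   = NULL;
    ID3DXBuffer* errors = NULL;

    // HLSL -> ps_2_0 token stream; this step happens entirely above the driver.
    if (FAILED(D3DXCompileShader(src, srcLen, NULL, NULL, "main", "ps_2_0",
                                 0, &code, &errors, NULL))) {
        if (errors) errors->Release();
        return NULL;
    }

    // The driver sees only this DWORD stream; it cannot tell whether it came
    // from the HLSL compiler or from D3DXAssembleShader run on hand-written asm.
    IDirect3DPixelShader9* ps = NULL;
    dev->CreatePixelShader((const DWORD*)code->GetBufferPointer(), &ps);

    code->Release();
    if (errors) errors->Release();
    return ps;
}
```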
 
DaveBaumann said:
Just talking to Derek Perez on IM at the moment. He has said there are shader replacements in the drivers (fine), and when asked if the compiler was still a work in progress he said that it was finished. Now, if the runtime shader compiler is "optimal" then it should be achieving performance as good as, if not better than, a hand-coded shader that is mathematically equivalent to the application shader - this being the case, why would you need hand-coded shaders...?
What does he say to that? And how does he figure what he said is any different than application specific detection by a different name? (Oh, and you can ask him why he isn't returning my e-mails if'n you want a laugh/piss him off. ;) )
 
madshi said:
PaulS said:
On the contrary, that's exactly what's happening with some people on the forums (not just here). I've said it before, and i'll say it again: This constant black & white, nVidia = evil, ATi = good, has got to stop.
You know what has got to stop? Take a look at what kind of misleading PR nonsense nVidia has released just in the previous 48 hours. :!:

Actually, it's more like the previous 48 weeks (well, more or less since 3DMark03 came out, which is when all this crap started anyway).

There's something wrong with the equations.
It's NOT nVidia = evil, ATi = good...
Actually it's more like nVidia PR = evil, ATi = good... ATi PR remains to be seen, but it has been cooler so far.
 
Is it all really the fault of Nvidia's PR department?

The more I see of this nonsense, the more I wonder if there's a serious systemic fault that extends well beyond marketing.

The reshuffling of design teams after the acquisition of 3dfx seemed to throw a wrench in the design pipeline. The fact that they had the old engineers defend the design ideas of the new additions, and vice versa, indicates that the teams had very divergent design priorities and operating procedures. The indications that there were multiple failed tapeouts are strong evidence that this supposed sign of dynamic leadership simply screwed up the system by breaking up working team dynamics, without any thought toward keeping what worked functioning.

That the top-class driver department now has the black mark of being caught churning out pathetic benchmark shader replacements is a sign that things are not right. Who knows how many capabilities, like floating-point render targets, could have been exposed in the six-plus months since the launch of the FX line if the driver team had actually been working on the hardware instead of trying to develop the most stealthy way to bias a bunch of benchmarks and timedemos (and only now really doing it in a relatively non-obvious way). I wonder if anyone has left or transferred to a different area instead of becoming a digital hack rewriting somebody else's code.

This isn't all just because of marketing, or if it is, then whoever should be in charge (but isn't) is a liability and a threat to any further development of both the personnel and the products. There's no way that some rogue band of driver coders is responsible for all this, and there's no way they'd do it willingly. In other words, someone has made them do this, and judging from the signs of poor morale, there are some seriously negative consequences to not supporting this untenable state of affairs.

It seems that marketing is wearing the management hat for the company, which means that somebody else who should be taking charge is walking around with a dunce cap. There had better be a shake-up soon in the corporate hierarchy, and there's not a doubt in my mind that everyone at Nvidia knows it. In management, a leader is supposed to know what is going on, and somehow being kept out of the loop, being too stupid to see how things have gotten out of hand, or being too ineffectual to correct things does not make one blameless--it makes him dead weight.

Perhaps it is necessary to start firing from the top down, either hoping to find some still-vital leadership or grafting in a team capable enough to then start pruning the rest of this tangle.
 