Futuremark Announces Patch for 3DMark03

Hanners said:
Bouncing Zabaglione Bros. said:
Reading this thread, I see there are still doubts that this "clean slate" has been achieved, because the performance drop in 340 does not mirror the drops we see in other benchmarks.

That remains to be seen, I guess, but it's a 'cleanish' slate at least - if nothing else it gives us a starting point for nVidia's almost guaranteed future deceptions in this application.

My point is that Nvidia is still cheating but is not being penalised for doing so, regardless of how successful this current patch is at defeating those cheats.

It's like a thief who is caught stealing from your house merely being told to return the goods today, free to try to burgle the house again tomorrow. What should happen is that the thief is sanctioned for the attempted burglary, regardless of whether you got your possessions back or not.
 
Anthony,
that's what I thought, thanks for clearing it up.

For what it's worth, your policy has my "seal of approval". :)

Keep up the excellent work.
Cheers,
Mac
 
http://www.xbitlabs.com/news/video/display/20031112031947.html
Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”
 
madshi said:
http://www.xbitlabs.com/news/video/display/20031112031947.html
Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”
Haha, that's funny. That's typical PR FUD. The fact is, an application cannot disable the shader compiler.

Edit: What is a "GPU compiler"?
 
Exxtreme said:
Edit: What is a "GPU compiler"?

They mean the new compiler technology in the 52.16 drivers, which is obviously a general optimisation and not application-specific, so there is no way it could be disabled by shuffling a few shader instructions around in 3DMark.
 
Hanners said:
Exxtreme said:
Edit: What is a "GPU compiler"?

They mean the new compiler technology in the 52.16 drivers, which is obviously a general optimisation and not application-specific, so there is no way it could be disabled by shuffling a few shader instructions around in 3DMark.
They mean the shader compiler? I know that an application cannot disable it.
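
To illustrate the distinction being drawn here: an application-specific "optimisation" typically keys off a fingerprint of the incoming shader and swaps in a hand-tuned replacement, so reordering two independent instructions breaks the match, while a general-purpose compiler simply optimises whatever it receives. The C++ sketch below is purely illustrative; it is not actual driver or benchmark code, and the shader text, hash scheme, and names are all invented.

#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for a driver that fingerprints incoming
// shaders (a real driver would hash compiled bytecode, but the
// principle is the same). FNV-1a, 64-bit.
uint64_t fingerprint(const std::string& shader) {
    uint64_t h = 14695981039346656037ull;
    for (unsigned char c : shader) {
        h ^= c;
        h *= 1099511628211ull;
    }
    return h;
}

int main() {
    // A shader as it might ship in build 330 (contents invented).
    const std::string original =
        "mul r0, v0, c0\n"
        "add r1, v1, c1\n"
        "mad r2, r0, r1, c2\n";

    // What a patch-340-style change does: the first two
    // instructions are independent, so swapping them leaves the
    // maths identical but the byte pattern different.
    const std::string reordered =
        "add r1, v1, c1\n"
        "mul r0, v0, c0\n"
        "mad r2, r0, r1, c2\n";

    // An application-specific "optimisation": swap a recognised
    // shader for a cheaper hand-written replacement.
    std::unordered_map<uint64_t, std::string> replacements = {
        { fingerprint(original), "hand-tuned replacement" }
    };

    for (const std::string* s : { &original, &reordered }) {
        bool detected = replacements.count(fingerprint(*s)) != 0;
        std::cout << (detected ? "recognised -> substituted\n"
                               : "not recognised -> compiled as-is\n");
    }
    // A general-purpose compiler, by contrast, optimises whatever
    // it is handed; shuffling instructions cannot "disable" it.
    return 0;
}

Run as-is, the first shader is recognised and substituted while the reordered one falls through to normal compilation, which is exactly the behaviour a patch-340-style shuffle is meant to expose.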
 
madshi said:
http://www.xbitlabs.com/news/video/display/20031112031947.html
Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”

"According to my information" . That wouldn't be "information" from Brian Burke and his Nvidia Damage Control Team would it?
 
madshi said:
http://www.xbitlabs.com/news/video/display/20031112031947.html
Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”

Damn, my "For immediate release" are completely beaten on this one...

I'm pretty sure that Nvidia being part of the Beta program, they would have done something earlier, no ? Get ready for another round of FM smearing, this time probably through proxies (the guys with webpages) and board makers (like Gainward in this case).
 
Exxtreme said:
Hanners said:
Exxtreme said:
Edit: What is a "GPU compiler"?

They mean the new compiler technology in the 52.16 drivers, which is obviously a general optimisation and not application-specific, so there is no way it could be disabled by shuffling a few shader instructions around in 3DMark.
They mean the shader compiler? I know that an application cannot disable it.
Exactly. Also, were that the reason for the performance drop, you wouldn't see the IQ differences to the extent shown in Wavey's article. Slight precision differences? Why not. Lowering the amount of hair rendered on that chick's head? No.

The same applies to theories claiming the "re-ordering of the code to disable application detection" is the culprit--there's no logical reason why such a re-ordering, even if it leads to sub-optimal code for the GFFX architecture, should produce such IQ differences.

93,
-Sascha.rb
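
To make the reordering argument concrete: when two operations do not depend on each other, executing them in either order gives bit-identical results, so a pure reorder can at worst cost the GFFX some scheduling efficiency; it cannot remove hair from a character's head. A minimal C++ sketch, with invented values standing in for shader inputs:

#include <cassert>
#include <iostream>

// The same toy "shader" evaluated with two instruction orders.
// The mul and the add are independent, so their order cannot
// affect the result; that is why a reorder alone cannot change
// what ends up on screen.
float shade_330(float n_dot_l, float albedo, float ambient) {
    float diffuse = n_dot_l * albedo;   // mul first
    float base    = ambient + 0.1f;     // then add
    return diffuse + base;
}

float shade_340(float n_dot_l, float albedo, float ambient) {
    float base    = ambient + 0.1f;     // add first
    float diffuse = n_dot_l * albedo;   // then mul
    return diffuse + base;
}

int main() {
    for (float n = 0.0f; n <= 1.0f; n += 0.25f)
        assert(shade_330(n, 0.8f, 0.05f) == shade_340(n, 0.8f, 0.05f));
    std::cout << "reordered version is bit-identical\n";
    return 0;
}

The assert never fires: both orderings perform the same two independent operations before the final sum, so any visible image difference has to come from something other than the reorder.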
 
So first we had guidelines that warned the IHVs. Then we had clarifications that warned the IHVs. Now we have a patch that warns the IHVs.

When will some *real* action be taken? If you were nVidia, would you care about this? All you have to do is rewrite the cheats, forcing FutureMark to do the work. What kind of system is that? Just ban the offending drivers until it can be verified that they don't cheat; that saves FutureMark an awful lot of work and puts the burden on nVidia instead (unlike now, where there is little to no incentive for nVidia to stop cheating).

I appreciate this half-step in the right direction, but it really feels like a half-hearted move that shifts far too much of the burden onto FutureMark.
 
Bouncing Zabaglione Bros. said:
madshi said:
http://www.xbitlabs.com/news/video/display/20031112031947.html
Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”

"According to my information" . That wouldn't be "information" from Brian Burke and his Nvidia Damage Control Team would it?
The nVidia Damage Control Team is a myth, remember? ;)

Now I just wish I could say the same about Brian Burke & Derek Perez.... :rolleyes:

Does this mean we're seeing how nVidia plans to play this one already?
 
So obviously FutureMark are trying to make nVidia look bad (for reasons we can't understand) by disabling their GPU compiler but not ATI's ;)
 
Quitch said:
So obviously FutureMark are trying to make nVidia look bad (for reasons we can't understand) by disabling their GPU compiler but not ATI's ;)

Actually, according to a call I had from one journo this morning, it's all Dell's fault... You know, Dell, who are currently carrying all of ATI's high end, asked Futuremark for a patch that purposefully disables NVIDIA's shader optimiser... :!:
 
DaveBaumann said:
Quitch said:
So obviously FutureMark are trying to make nVidia look bad (for reasons we can't understand) by disabling their GPU compiler but not ATI's ;)

Actually, according to a call I had from one journo this morning, it's all Dell's fault... You know, Dell, who are currently carrying all of ATI's high end, asked Futuremark for a patch that purposefully disables NVIDIA's shader optimiser... :!:
Are you being serious or sarcastic with this one, Dave? :|
 
Dave, you really must provide me with a list of names for these gullible people. I have some swamp land, oh I mean beach front property I would love to sell them. ;)
 
The problem with this claim: if the compiler were disabled, wouldn't the PS2.0 scores have dropped significantly, too?
 
CorwinB said:
Damn, my "For immediate release" are completely beaten on this one...

Ahhh, but they are a breath of fresh air that makes it just a bit easier to handle this stuff. Thanks for your work on them. :)
 
Ostsol said:
The problem with this claim: if the compiler were disabled, wouldn't the PS2.0 scores have dropped significantly, too?
Hehe, you're right.

The truth is, you cannot disable the shader compiler. An application has no way to do that.
 
Take a look at this PR released today. Talk about damage control.

New NVIDIA GeForce FX Graphics Processors Achieve Top Marks From Industry Pundits
Wednesday November 12, 9:01 am ET
GeForce FX 5950 Ultra, GeForce FX 5700 Ultra Receive Awards and Glowing Reviews From Top Technology Media; GeForce FX 5700 Ultra Proclaimed World's Fastest Mainstream GPU (i)


SANTA CLARA, Calif., Nov. 12 /PRNewswire-FirstCall/ -- NVIDIA Corporation (Nasdaq: NVDA - News), the worldwide leader in visual processing solutions, today announced that leading technology publications and technical Web sites have declared the Company's new fall lineup, the GeForce(TM) FX 5950 Ultra and the GeForce FX 5700 Ultra, the best in its class(ii). Leading technical Web sites such as AnandTech, CNET, FiringSquad, Hot Hardware, Tom's Hardware Guide and more, have all praised the new GeForce FX family for its unprecedented performance and unmatched features.
(Logo: http://www.newscom.com/cgi-bin/prnh/20020613/NVDALOGO )
"Our new GeForce FX 5950 Ultra and 5700 Ultra GPUs have been met with tremendous enthusiasm from the media, our partners, and customers," said Dan Vivoli, executive vice president of marketing at NVIDIA. "These reviews confirm that NVIDIA has delivered the most-powerful top-to-bottom family of DirectX 9.0-based GPUs for the holiday season."

In addition to the landslide of positive news and reviews, the new GeForce FX products have been broadly adopted by leading retail add-in-card partners, system builders, and PC OEMs including: ABS PC, Alienware, BFG Technologies, eVGA.com Corporation, Falcon Northwest, MicronPC, PNY Technologies, Inc., Polywell Computers, Velocity Micro, Voodoo PC, and more. A complete list of suppliers of GeForce FX-based graphics cards, which are available on retail shelves worldwide, can be found at http://www.nvidia.com/content/wheretobuy/consumer.asp .

What Critics are Saying

The GeForce FX 5950 Ultra reinforces NVIDIA's leadership in the high-end gaming segment, which demands the best performance and image quality available. PC OEMs are already reaping the benefits as the GeForce FX 5950 Ultra-based Velocity Micro Raptor Extreme Edition system has won the CNET Editor's Choice Award ( http://reviews.cnet.com/Velocity_Micro_Raptor_Extreme_Edition/4505-3119_7-30588550.html?tag=promo2 ), and the Falcon Northwest Mach V was awarded the Maximum PC Kick Ass! Product Award (November Issue).

"As NVIDIA's new flagship product, it goes without saying that the GeForce FX 5950 Ultra is fast," wrote Brandon Bell, hardware editor in chief for FiringSquad (www.firingsquad.com) in his GeForce FX 5950 review. "This is the card for the guy who wants uncompromising performance."

In the all important mainstream market, NVIDIA GeForce FX 5700 Ultra also has critics raving.

"NVIDIA has flipped the tables in the midrange segment and takes the performance crown with a late round TKO," wrote Derek Wilson, graphics card editor for Anandtech (www.anandtech.com).

"It's almost a shame to label the 5700 Ultra as a mainstream unit, since its performance was top notch in all of our benchmarks," added Robert Maloney, hardware editor for Hot Hardware (www.hothardware.com), in awarding the GeForce FX 5700 Ultra the Hot Hardware Editor's Choice Award.

The NVIDIA Forceware(TM) release 50 graphics driver, featuring the revolutionary NVIDIA unified compiler technology, has also received praise.

"With the new ForceWare driver, NVIDIA has done much to improve performance, especially in the newer DirectX 9 games -- and without having to sacrifice image quality," wrote Lars Wienand, graphics card editor for Tom's Hardware Guide (www.tomshardware.com).

"It looks like NVIDIA has done an excellent job with their new ForceWare release 50 drivers," wrote Jason Cross, technical editor for Extremetech (www.extremetech.com).

For details on where to buy GeForce FX-based graphics cards, visit the NVIDIA Web site at http://www.nvidia.com/content/wheretobuy/consumer.asp .

http://biz.yahoo.com/prnews/031112/sfw028_1.html
 