NVIDIA Hawx AA quality issue


neliz

http://www.kitguru.net/components/g...-gtx570-be-caught-in-new-app-detection-cheat/

Probably the most famous case was revealed by Beyond3D, when genius uber-geeks discovered that changing the name of the 3DMark 2003 executable made a huge difference to the performance of nVidia cards.



They are still waiting for an official PR response from NV, but this might be another case of pride before the fall, as they proudly claimed they didn't do executable detection on their blog just a couple of days ago.
 
Oh really? Too bad, there's a response in the comments already.
Hi Everybody,

What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

Nick Stam, NVIDIA
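To make the mechanism Stam describes concrete: in Direct3D 10 (the API HawX uses), an application asks the driver how many quality levels exist at a given sample count and then picks one of them. Below is a minimal sketch, assuming a hypothetical app that, like HawX reportedly does, requests the highest quality level at 4 samples; the helper function is illustrative, not HawX's actual code.

// Illustrative sketch (not HawX code). The "Quality" field of
// DXGI_SAMPLE_DESC is driver-defined; on GeForce 8800 and later,
// the extra quality levels at Count = 4 map to CSAA modes, so the
// highest level yields 16xCSAA instead of standard 4xMSAA.
#include <d3d10.h>

DXGI_SAMPLE_DESC PickHighestQuality4xAA(ID3D10Device* device, DXGI_FORMAT format)
{
    UINT levels = 0;
    device->CheckMultisampleQualityLevels(format, 4, &levels);

    DXGI_SAMPLE_DESC desc = {};
    desc.Count = 4;                               // 4 color/Z samples
    desc.Quality = (levels > 0) ? levels - 1 : 0; // top level = CSAA hint;
                                                  // Quality = 0 would be plain 4xAA
    return desc;
}

Renaming the executable only changes which driver profile is matched; the request above stays the same, which is why the renamed HawX.exe falls back to 16xCSAA.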
 
maybe I was too late: (still goes to show that they do exe detection for optimizations though)

A few days ago, a forum member posted information regarding image quality concerns with the popular combat flight game Tom Clancy's H.A.W.X. There has been much debate about this across the net over the weekend.
Nvidia’s Nick Stam has addressed these concerns, responding to KitGuru. Nick Stam is nVidia’s Technical Marketing Director. His team provides technical support to Web and print tech media, industry analysts, and business partners. Nick’s team also generates reviewer guides and technical whitepapers.
We felt it was important to highlight his reply via KitGuru to our readers so he can clear up any concerns.

I know you don't miss any opportunity to take a stab at NVIDIA, but don't you think this is stretching it a bit?

If the game calls for 4xAA but 16xCSAA is applied instead, and there is no patch incoming, should NVIDIA just let it be? And if they fix it through profiles, it's suddenly an exe-detection optimization? Oh, wow :LOL:
 
Of course they do exe detection. It's how their awesome game profiles work.
 
neliz said:
maybe I was too late: (still goes to show that they do exe detection for optimizations though)
What do you suggest GPU vendors should use instead?

CRC checks on the shaders for shader replacement? Specific optimizations in the shader compiler that only happen to trigger in very specific cases? Dynamic shader recompilation based on real-time statistics gathering? (Invent your own detection method...)

Why go through the trouble when name detection is very reliable? Heck, it even allows a rare insight into the kind of optimizations that are possible when you know your target is one specific game.
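For a sense of how cheap name detection is, here is a hypothetical sketch of the matching step: one Win32 call fetches the host process's executable name, which is then looked up in a profile table. The table entries and function names are invented for illustration; real driver profile systems are far more elaborate.

// Hypothetical sketch of executable-name profile matching.
#include <windows.h>
#include <wchar.h>
#include <string>

static bool HasProfile(const std::wstring& exeName)
{
    // Invented example entries; a real driver ships a large database.
    const wchar_t* profiled[] = { L"HawX.exe", L"3DMark03.exe" };
    for (const wchar_t* name : profiled)
        if (_wcsicmp(exeName.c_str(), name) == 0)
            return true;
    return false;
}

bool CurrentProcessHasProfile()
{
    wchar_t path[MAX_PATH] = {};
    // NULL module handle = the executable that launched this process.
    GetModuleFileNameW(nullptr, path, MAX_PATH);

    std::wstring full(path);
    size_t slash = full.find_last_of(L"\\/");
    std::wstring exe = (slash == std::wstring::npos) ? full : full.substr(slash + 1);

    // Rename the .exe and this lookup misses, which is exactly the
    // effect seen in the HawX experiment.
    return HasProfile(exe);
}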

What exactly is wrong with any detection technique whatsoever, as long as the final image is identical (say, at most 1 LSB of difference)?
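As a sketch of the acceptance test proposed above, assuming two RGBA8 framebuffer captures of the same frame: compare channel by channel and allow at most one least significant bit of difference.

#include <cstdint>
#include <cstdlib>
#include <cstddef>

// Returns true if every 8-bit channel differs by at most 1 LSB.
bool WithinOneLsb(const std::uint8_t* a, const std::uint8_t* b, std::size_t bytes)
{
    for (std::size_t i = 0; i < bytes; ++i)
        if (std::abs(int(a[i]) - int(b[i])) > 1)
            return false;
    return true;
}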

As a user I couldn't care less how a game is made to run faster.
 
Anyway, I take it this whole topic is a non-event; or is there something worth making of it?
 
It seems like things developed quickly. Nothing to see here, move along...
 