HL2 perf. from Anand

tEd

http://www.anandtech.com/showdoc.html?i=1862


You'll see my own numbers tomorrow night at midnight, but we've been given the go-ahead to reveal a bit of information about Half-Life 2. I'll keep it brief and to the point and will explain it in greater detail tomorrow night:

- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 10x7. The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor to the 5900 Ultra;
- ATI didn't need these special optimizations to perform well, and Valve insists that they have not optimized the game specifically for any vendor.

There you have it folks, there's much more coming tomorrow.
 
Hmm. I wonder if "tomorrow night at midnight" means midnight on September 11th or midnight on September 12th. If he meant the former, we should be seeing the numbers in four hours or so, though I know people sometimes mix this up and he could mean the transition from September 11th into September 12th.

Anyways, I hope this doesn't turn into the flipside of the Doom3 debacle. Save us, jeebus, the fanbois could have a field day with this, so I hope everyone keeps themselves in check. ;)
 
9800s and 5900s and such are all well and good, but what about the old former champ now relegated to workhorse duty, i.e. the GF3? Sure, it's not such a sexy card anymore, but since I own one I'd like to see how it runs. And not in a 3+ GHz system either. Say... 1.7, like my box. :LOL:


*G*
 
John Reynolds said:
tEd said:
http://www.anandtech.com/showdoc.html?i=1862


- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community

That line makes me wonder how far Valve will go (thinking future patches) if any IHVs start detecting their game and replacing shaders regardless of user settings?
I'm hoping as far as they have to. 8)

I wonder how nVidia is gonna spin this one...
 
digitalwanderer said:
I wonder how nVidia is gonna spin this one...

Yeah...I feel some special drivers coming on. I can see it now:
"Now from Nvidia...Gordonator 1.0! The new driver just for HL2 that is necessary to play HL2...but in no way alters the way the game is rendered!"

:p

Jack
 
Patches to stop optimisations are a step backwards and unnecessary for a gaming company IMHO. They will simply drag resources away from bug fixing and adding new features.
 
Tahir said:
Patches to stop optimisations are a step backwards and unnecessary for a gaming company IMHO. They will simply drag resources away from bug fixing and adding new features.

That depends on what the patches stop. If they stop shader replacement, that's a GOOD thing, because drivers replacing shaders leads to UNPREDICTABLE results for developers when they tweak features.
 
digitalwanderer said:
I wonder how nVidia is gonna spin this one...

"Since NVIDIA cards were not used to develop Half-Life 2(a game that pales to Doom 3 anyways) we do not get a chance to work with Valve on writing the shaders we like. We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom3 shows that The GeForce FX 5900 is by far the fastest graphics on the market today."


Nvidia Pr ;)
 
Tahir said:
Patches to stop optimisations are a step backwards and unnecessary for a gaming company IMHO. They will simply drag resources away from bug fixing and adding new features.

I agree, though as a game developer I personally wouldn't like watching my art being changed without permission.
 
Doomtrooper said:
"Since NVIDIA cards were not used to develop Half-Life 2..."

Nvidia PR ;)

:devilish:
 
John Reynolds said:
Tahir said:
Patches to stop optimisations are a step backwards and unnecessary for a gaming company IMHO. They will simply drag resources away from bug fixing and adding new features.
I agree, though as a game developer I personally wouldn't like watching my art being changed without permission.
Shader replacement doesn't necessarily mean a different result is obtained. For end-users, shader replacement is a Good Thing (TM) as long as the result is the same as the original because end-users care about performance. For developers, shader replacement can be a little more tricky (why does Shader X run faster than Shader Y?).

C compilers, for example, optimize code to obtain the same result in fewer cycles; 3D drivers should strive to do the same.
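
To make the compiler analogy concrete, here's a trivial sketch (my own illustration, not Valve code or anything out of an actual driver): both functions below produce bit-identical results, but the second gets there with a cheaper multiply instead of a divide, which is exactly the kind of substitution a compiler, or a driver's shader optimizer, is free to make.

Code:
// Illustrative C++ only: same result, fewer cycles.
// Dividing by 4.0f and multiplying by 0.25f give bit-identical answers in
// IEEE floating point, because 0.25 is an exact power of two.
void scale_naive(float* v, int n)
{
    for (int i = 0; i < n; ++i)
        v[i] = v[i] / 4.0f;      // division: relatively expensive
}

void scale_optimized(float* v, int n)
{
    const float inv = 0.25f;     // reciprocal folded once, up front
    for (int i = 0; i < n; ++i)
        v[i] = v[i] * inv;       // multiply: same answer, cheaper
}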
 
Tahir said:
Patches to stop optimisations are a step backwards and unnecessary for a gaming company IMHO. They will simply drag resources away from bug fixing and adding new features.
Yeah, but I bet it feels damned good if you're pissed off at an IHV. :devilish:
 
OpenGL Guy who is really a D3D Guy:

Shader replacement, in your opinion, is a Good Thing. What counts as a bad optimisation, then? I don't mean just anything that harms IQ, as we know about that, but what specifically can harm IQ? We have already seen filtering techniques, and lowering precision is another thing. Anything else?

I suppose ATI is going to take a look at Valve's code and see where it can improve shaders without impacting IQ. It has most likely already happened within HL2, with further improvements down the line.

I expect NVIDIA to release new drivers fairly soon to improve HL2 performance, and we will most likely see IQ comparisons made between drivers and vendors to see who is 'optimising' the most, or in other words optimising to an unacceptable level.

I cannot see Valve devoting a team (or even a single member of staff) to looking at drivers from IHVs specifically for 'optimisations' that harm the overall experience. If some 'optimisations' get turned off by patches, it probably means Valve found a better way to render or do AI etc. (thus improving performance) and replaced their own code, making any driver-level 'optimisation' redundant in a later version of the game.

What do you think will happen? I am making some pretty large assumptions, but I have seen this in the past and I am 90% sure I will see it in the future.
 
Tahir said:
I cannot see Valve devoting a team (or even a single member of staff) to looking at drivers from IHVs specifically for 'optimisations' that harm the overall experience.
I don't know about that; Gabe seemed pretty adamantly pissed about optimizations. 8)
 
Does this mean that the HL2 benchmark will be running the same codepath for both Nvidia and ATI cards then? Seems like the fairest way to do it.
 
Tahir said:
OpenGL Guy who is really a D3D Guy:

Shader replacement, in your opinion, is a Good Thing. What counts as a bad optimisation, then? I don't mean just anything that harms IQ, as we know about that, but what specifically can harm IQ? We have already seen filtering techniques, and lowering precision is another thing. Anything else?
Sure. Optimizations that only work for specific camera paths. Optimizations that don't translate to real-world performance (such as benchmark-detected load balancing).
I suppose ATI is going to take a look at Valve's code and see where it can improve shaders without impacting IQ. It has most likely already happened within HL2, with further improvements down the line.
We do this, but we don't need to. We have a general optimizer in our driver now that we count on to give us good performance. As it gets more intelligent, we see further performance improvements. Of course, shaders aren't the only place where we look for performance improvements.
I cannot see Valve devoting a team (or even a single member of staff) to looking at drivers from IHVs specifically for 'optimisations' that harm the overall experience. If some 'optimisations' get turned off by patches, it probably means Valve found a better way to render or do AI etc. (thus improving performance) and replaced their own code, making any driver-level 'optimisation' redundant in a later version of the game.

What do you think will happen? I am making some pretty large assumptions, but I have seen this in the past and I am 90% sure I will see it in the future.
The IHVs can make suggestions to Valve about where things can be improved, just like we (IHVs) do with other software developers. Ideal driver-level optimizations should be general optimizations, meaning that other applications doing the same thing will also benefit. If an application stops doing something, it's not wasted effort, because another app may come along that does it. (Sorry if that's confusing logic, but I'm deliberately trying to avoid specifics.)
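
A rough sketch of that distinction, in purely hypothetical C++ (real drivers work on compiled shader IR, and none of these names are real): a general optimization rewrites a wasteful pattern for whichever application submits it, while an application-specific "optimization" keys off the title being run.

Code:
#include <string>

struct Shader { std::string source; };   // stand-in for a real shader object

// General optimization: fold a redundant pattern no matter which app sent it.
Shader optimize_general(Shader s)
{
    const std::string pattern = "pow(color, 1.0)";        // x^1 == x
    for (size_t p = s.source.find(pattern); p != std::string::npos;
         p = s.source.find(pattern))
        s.source.replace(p, pattern.size(), "color");
    return s;
}

// App-specific "optimization": detect one game and swap in a hand-written
// replacement, regardless of what the developer shipped.
Shader replace_for_one_game(const Shader& s, const std::string& exeName)
{
    if (exeName == "some_game.exe")                        // hypothetical detection
        return Shader{ "/* hand-tuned shader no other app benefits from */" };
    return s;
}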
 
I think OpenGL Guy, who is really a D3D Guy, doesn't actually like D3D, and that's one of the reasons his name remains OpenGL Guy. Maybe it's also because "D3D Guy" just doesn't sound as cool, who knows. :D
 
Doomtrooper said:
I think OpenGL Guy, who is really a D3D Guy, doesn't actually like D3D, and that's one of the reasons his name remains OpenGL Guy. Maybe it's also because "D3D Guy" just doesn't sound as cool, who knows. :D
OpenGL is still my preferred API for coding, but I have to admit that D3D has come a long way since I chose the name "OpenGL guy". :)

The biggest thing for me when writing test cases in D3D is that I don't have GLUT :p
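
For anyone who hasn't used it, this is roughly why GLUT is so handy for throwaway test cases (a bare-bones sketch of my own, not any actual test code): a window and a triangle in about twenty lines, where the D3D equivalent needs Win32 window registration and device creation before you can draw anything.

Code:
// Minimal GLUT test case: one window, one triangle.
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);                     // immediate mode is fine for a test
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(512, 512);
    glutCreateWindow("quick test case");
    glutDisplayFunc(display);
    glutMainLoop();                            // GLUT owns the loop from here on
    return 0;
}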
 