Half Life 2 Benchmarks (From Valve)

Here it comes already:

http://www.extremetech.com/article2/0,3973,1261770,00.asp

nVidia has been circulating its Det50 driver to analysts in hopes that we would use it for our Half-Life 2 benchmarking. The driver contains application-specific optimizations that will likely improve nVidia's overall performance picture; however, Valve's Gabe Newell expressed concerns that some of nVidia's optimizations may go too far. Doug Lombardi of Valve has explicitly asked that beta versions of the Det50 drivers not be used for benchmarking.

It'll be most interesting to see which sites HONOR the request of the software developers, and which ones don't....
 
NVIDIA damaged its own reputation, while ATI let its product speak for the company instead, because ATI could afford to.

Before we all start pissing on NVIDIA PR et al., we should ask: if ATI were in the position NVIDIA is in now (the underdog), would they have acted with chivalry or malice? It isn't over until the fat lady sings.

One generation down, another one to come. The worm could turn! ;)


Gah, I feel dirty using all those stupid cliches.
 
Joe DeFuria said:
Here it comes already:

http://www.extremetech.com/article2/0,3973,1261770,00.asp

nVidia has been circulating its Det50 driver to analysts in hopes that we would use it for our Half-Life 2 benchmarking. The driver contains application-specific optimizations that will likely improve nVidia's overall performance picture; however, Valve's Gabe Newell expressed concerns that some of nVidia's optimizations may go too far. Doug Lombardi of Valve has explicitly asked that beta versions of the Det50 drivers not be used for benchmarking.

It'll be most interesting to see which sites HONOR the request of the software developers, and which ones don't....

NVIDIA moved faster than I imagined. You can view it as pathetic or as a last stand. Maybe, just maybe, it will end here.
 


Gabe Newell should work with Futuremark ;)
 
Tahir said:
Before we all start pissing on NVIDIA PR et al., we should ask: if ATI were in the position NVIDIA is in now (the underdog), would they have acted with chivalry or malice?

How did ATI act from the time of the TNT all the way through the Radeon 8500? Anything remotely like nVidia? All I generally witnessed was a relatively low profile.

I certainly agree that when you're behind, you resort to more PR to try and make up for it. That's natural and expected. However, I have never witnessed:

1) A company doing this while they are AHEAD (see the nVidia PowerVR whitepaper, for example)

2) A company doing it to the extent that nVidia is doing it when behind.

One generation down, another one to come. The worm could turn! ;)

Absolutely. And I'll be happy as long as, if it turns, it turns because of a better product, not better PR.
 
Windfire said:
Seeing the different sites benchmarking HL2 and their interpretations of things will be interesting indeed.

On a side note (kind of humorous, kind of what-if?), back when 3dfx was struggling, near the end, they moved heavily into marketing via slogans and name recognition--everyone remember the "3dfx" stickers that were being put on game boxes? It's like 3dfx recognized they were behind and knew it would be a while before that was corrected, so they put lots of publicity and energy into marketing.

Hmm. What is Nvidia doing these days: "The way it's meant to be played?"

It appears that with the 5800 they had a big misstep. Now it seems like their entire DX9 line-up may be a major screw-up--if that turns out to be the case, I think I'll look at marketing blitzes in a different way (first 3dfx, now NVidia, etc.).

Anyone else remember..."So powerful...it's a little ridiculous"
:)

Jack
 
Lol....

http://firingsquad.gamers.com/hardware/hl2_performance_preview_part1/page2.asp

The FX5600 Ultra is the best value while the Radeon 9600 is the most expensive per FPS.

Um...methinks they got it backwards. :rolleyes:

Though in FiringSquad's defense, the FPS/$ graph does have the x axis mislabeled as $/FPS.
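For what it's worth, the mix-up is easy to illustrate with a quick sketch. The prices and frame rates below are purely hypothetical placeholders (not FiringSquad's actual figures); the only point is that dollars-per-FPS and FPS-per-dollar rank cards in opposite directions, so a mislabeled axis flips which card looks like the "best value":

```python
# Illustrative sketch only: the prices and frame rates are hypothetical
# placeholders, NOT FiringSquad's measured numbers.
cards = {
    "FX 5600 Ultra (hypothetical)": {"price_usd": 200.0, "fps": 20.0},
    "Radeon 9600 (hypothetical)":   {"price_usd": 170.0, "fps": 35.0},
}

for name, d in cards.items():
    dollars_per_fps = d["price_usd"] / d["fps"]  # lower  = better value
    fps_per_dollar = d["fps"] / d["price_usd"]   # higher = better value
    print(f"{name}: {dollars_per_fps:.2f} $/FPS, {fps_per_dollar:.3f} FPS/$")
```

With numbers like these, the cheaper, faster card wins on both metrics; calling the slower one the "best value" only works if you read an FPS/$ chart as if it were $/FPS.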

Oh...and nVidia is pretty busy with damage control:

This is NVIDIA's official statement: "The Optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".

So...nVidia says only beta dets are valid, and Valve says they are not. Web sites....start aligning yourselves....
 
Jack_Tripper said:
Anyone else remember..."So powerful...it's a little ridiculous"
:)
I recall their response to comments about lack of 32-bit color, limit of 256x256 textures, etc. "At 60 frames per second, you can't tell the difference." (I may have paraphrased that and please don't ask me for a source.)
 
OpenGL guy said:
I recall their response to comments about lack of 32-bit color, limit of 256x256 textures, etc. "At 60 frames per second, you can't tell the difference." (I may have paraphrased that and please don't ask me for a source.)

Don't be dissin' 3dfx. ;)

Seriously, I don't recall 3dfx making such a statement at all. They did, however, say things along the lines of "what good is 32 bit color, or large textures if that means sub 60 FPS?" That is at least perfectly defensible. (Same argument they used against AGP texturing, BTW).
 
http://anandtech.com/showdoc.html?i=1862
- Valve is pissed at all of the benchmarking "optimizations" they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 1024x768. The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath.

Thread over at NVnews: http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=17785
 
Joe DeFuria said:
Lol....

http://firingsquad.gamers.com/hardware/hl2_performance_preview_part1/page2.asp

The FX5600 Ultra is the best value while the Radeon 9600 is the most expensive per FPS.

Um...methinks they got it backwards. :rolleyes:

Though in FiringSquad's defense, the FPS/$ graph does have the x axis mislabeled as $/FPS.

Oh...and nVidia is pretty busy with damage control:

This is NVIDIA's official statement: "The Optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".

So...nVidia says only beta dets are valid, and Valve says they are not. Web sites....start aligning yourselves....

Sounds like not a Valve-approved driver... ;)

Why did Valve make an NV3x codepath if NVIDIA will replace their shaders anyway? They could have saved a lot of time.

http://firingsquad.gamers.com/hardware/hl2_performance_preview_part1/page3.asp

Optimization Investment

• 5X as much time optimizing NV3X path as we’ve spent optimizing generic DX9 path
• Our customers have a lot of NVIDIA hardware
• We were surprised by the discrepancy
• ATI hardware didn’t need it
 
Joe DeFuria said:
(I'd be lying if I didn't admit to getting some disturbing enjoyment out of seeing bullshit PR coming back to haunt the bullshitter...."Use real games!" heh....)
Yeah, it is fun to see the hubris-filled humbled so... I can't wait to see tomorrow's news!
 
Joe DeFuria said:
OpenGL guy said:
I recall their response to comments about lack of 32-bit color, limit of 256x256 textures, etc. "At 60 frames per second, you can't tell the difference." (I may have paraphrased that and please don't ask me for a source.)

Don't be dissin' 3dfx. ;)

Seriously, I don't recall 3dfx making such a statement at all. They did, however, say things along the lines of "what good is 32 bit color, or large textures if that means sub 60 FPS?" That is at least perfectly defensible. (Same argument they used against AGP texturing, BTW).
But they were just wrong on both counts :) AGP texturing doesn't have to mean sub-60 fps performance; it just means the HW has to hide enough latency to make the AGP bus work well. And you're telling me that if you got 59 fps with 32-bit color and large textures that it's too slow? You can't make blanket statements like this. Sometimes eye candy is worth the performance hit, as different games have different requirements.
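To put a toy number on "hide enough latency": by Little's law, the hardware just needs bandwidth × latency worth of texture requests in flight to keep the bus busy. The figures in this sketch are made-up placeholders, not real AGP or 3dfx specs:

```python
# Back-of-envelope sketch of latency hiding with hypothetical numbers
# (NOT real AGP figures). Little's law: data in flight = bandwidth * latency.
agp_bandwidth_bytes_per_s = 1.0e9   # hypothetical sustained AGP texture bandwidth
agp_read_latency_s        = 500e-9  # hypothetical round-trip read latency
request_size_bytes        = 64      # hypothetical texture fetch granularity

bytes_in_flight = agp_bandwidth_bytes_per_s * agp_read_latency_s
outstanding_requests = bytes_in_flight / request_size_bytes

print(f"Need ~{bytes_in_flight:.0f} bytes in flight "
      f"(~{outstanding_requests:.0f} outstanding {request_size_bytes}-byte requests) "
      "to keep the bus busy despite the latency.")
```

If the hardware can keep that many requests outstanding, the latency itself doesn't cap the frame rate; if it can't, the bus stalls, which is the usual argument against AGP texturing.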

P.S. I'll diss 3dfx all I want... I bought their damn stock at $2.25 ("It can't go any lower.")... look where it is now :p
 
OpenGL guy said:
But they were just wrong on both counts :)

Sort of...they always stressed that features don't exist in a vacuum...they have to be usable. That is true, and I daresay ATI can make the same arguments against their competitor wrt FP32 shaders, for example. ;)

Now, whether or not those features were usable or not is the real question. But let's not get into that. ;)

Sometimes eye candy is worth the performance hit as different games have different requirements.

Agreed.

P.S. I'll diss 3dfx all I want... I bought their damn stock at $2.25 ("It can't go any lower.")... look where it is now :p

Lol...I own ATI stock at the moment, so er...GET BACK TO WORK! :devilish:
 
digitalwanderer said:
Joe DeFuria said:
Lol...I own ATI stock at the moment, so er...GET BACK TO WORK! :devilish:
Ouch, I didn't think of that...think nVidia's stock is going to take a hit in the morning? :devilish:

Actually, I don't know. Sometimes, "the street" totally ignores stuff like this...and sometimes the market as a whole reacts to other news.

I'm more worried at the moment that the street will get "nervous" about another bin Laden tape being aired, which sometimes precedes terrorist attacks...
 