Half Life 2 Benchmarks (From Valve)

Joe DeFuria said:
Reverend said:
Don't get all upset now... I was thinking in terms of a possible article detailing why the beta Det50s are like what Gabe described ("optimizations that may go too far"), i.e. what NVIDIA is doing with the beta dets. Not in reviews.

How can you do an article like that without testing the drivers? Me confused. ;)
Why not? I can just label the article "A study on NVIDIA's Beta Detonator 50 drivers" and it can be about certain NVIDIA GFFX cards, with some image output comparisons of HL2 with ATI cards, to show any possible image output differences between ATI DX9 and GFFX cards.

I mean, it can be equally unfair to nVidia if you start picking out things as quality compromises which actually turn out to be bugs and are fixed in the official release. In short, I'd just as soon you stay away from unreleased drivers in general.
And if something odd is happening, then the report is made in the article. It would be hard, for instance, to mistake lack of precision for a bug.

Surely we have all seen stuff described by NVIDIA as "bugs" in their official drivers... do we know if they are indeed bugs, do we take their word for it, even if the drivers are official?

Once they are released, then they can be praised (or bashed) for their performance / quality improvements (or degradations).
If NVIDIA hands reviewers drivers that cannot be made available to the public, perhaps it would be a good idea for a reviewer to find out exactly why these drivers are not public, and reveal their findings. I mean, you're not the slightest bit interested in knowing why NVIDIA gives these beta drivers to reviewers just for HL2 testing? You're not interested in an article that attempts to corroborate Gabe Newell's words about "optimizations gone too far"?

The point is, such articles are fine as long as the point is made about what exactly is the nature of the article.
 
Re: ATI is Canadian

Cpt.Canuck said:
All you guys keep ragging on ATI saying they would act the same way if they were in NVidia's shoes. You guys have to realize ATI is CANADIAN. Up here in the great white north we lead by example, we don't shoot our mouth off about what we do. We let our products or services speak for themselves, and just look at our economy. It obviously works.

Please do not turn this into an us (Canadian) vs. them situation. We have enough bad companies here also (cough....banks.....cough).

With regards to the Det 50s, why do I get the feeling that HL2 will look like PONG with these? ;)
 
Reverend said:
Why not? I can just label the article "A study on NVIDIA's Beta Detonator 50 drivers" and it can be about certain NVIDIA GFFX cards, with some image output comparisons of HL2 with ATI cards, to show any possible image output differences between ATI DX9 and GFFX cards.
Well, except for the fact that you'd be using a cheesy tactic to do an end-run around the game developer's request.....absolutely nothing I guess. :(

EDITED BITS: Spelling error, I'm typing too fast tonight. :rolleyes:
 
Re: ATI is Canadian

Cpt.Canuck said:
All you guys keep ragging on ATI saying they would act the same way if they were in NVidia's shoes. You guys have to realize ATI is CANADIAN. Up here in the great white north we lead by example, we don't shoot our mouth off about what we do. We let our products or services speak for themselves, and just look at our economy. It obviously works. So you guys who keep saying ATI would act the same way NVIDIA did should rethink your reasons for stating that. NVIDIA thought it could play the market to upgrade their products, and ended up hurting their name instead. And also, I think deep down everyone knew ATI was starting to creep onto NVIDIA's turf at the top. Just look at their clock speeds and cooling solutions for their cards. All are lower than NVIDIA's, but look at the benchmarks. NVIDIA needs leaf blowers for fans, while some rather new ATI cards, such as the 9000, 9200, and so on, only need heatsinks but still crank out the numbers. So it's only fitting that one of the biggest games in gaming history helps bring the "underdog" to the throne it deserves.

ATi is a company that spans multiple countries and continents, never forget. ATi has sizable operations in the US, not to mention Asia. A great deal of ATi's revenue, as well, comes from sales in the US. I think trying to paint ATi with a nationalistic brush is not only wrong, it's tacky...;)

BTW, although you might not be aware of the fact, other nations on Earth have economies that "obviously work," too...;) (Which is how ATi derives so much of its income from them.) Just a point to ponder...Heh...;)
 
digitalwanderer said:
Joe DeFuria said:
If Kyle actually admits he was wrong, I might actually visit his site again.
Ditto, but I doubt it'll happen.

But if nothing else, [H]'s reaction to this should be interesting to say the least... :LOL:

(I'm DEFINITELY enjoying this too much!)

Same thoughts here.

But Kyle, we wanna see one sentence clearly: you saying 'Yes, I admit it, I was wrong.'

8)
 
digitalwanderer said:
Reverend said:
Don't get all upset now... I was thinking in terms of a possible article detailing why the beta Det50s are like what Gabe described ("optimizations that may go too far"), i.e. what NVIDIA is doing with the beta dets. Not in reviews.
I'm upset over the simpleton comment (which you didn't call me, you just implied); on this I'm just confused.
Wait a minute now... I really have no idea how you interpreted whatever I posted as implying you're a "simpleton". You called yourself a simpleton... nothing I posted even remotely implies that I agree with your own assumption of yourself.
 
WaltC said:
OpenGL guy said:
But they were just wrong on both counts :) AGP texturing doesn't have to mean sub-60 fps performance, it just means the HW has to hide enough latency to make the AGP bus work well. And are you telling me that if you got 59 fps with 32-bit color and large textures, it's too slow? You can't make blanket statements like this. Sometimes eye candy is worth the performance hit, as different games have different requirements.

P.S. I'll diss 3dfx all I want... I bought their damn stock at $2.25 ("It can't go any lower.")... look where it is now :p

Ever hear of the VSA-100? It's a 32-bit integer chip by 3dfx--I owned one, and can tell you without a doubt that whatever they said in 1999 about "32-bits" was constrained to 1999--which they made abundantly clear over and over, and also convincingly proved by releasing a product that fully supported 32-bits in June of 2000 (which had been originally slated for a December '99 release.) Additionally, the V5's FSAA put everybody to shame, as it was the first 3d card to offer it---that's a trend they started in this market that has a lot of legs under it yet. Sort of strange how 3dfx revisionists talking about their mistaken ideas that 3dfx "opposed" 32-bit integer rendering *always* manage to forget that at the same time 3dfx introduced FSAA they did it with a 32-bit product.

When they said what you thought they said in 1999 about "32-bits" they were responding to questions about why the V3 (released in early '99) didn't support 24-bits like nVidia's TNT. I had a TNT, too, and can tell you that 24-bits on it (let alone adding in the alpha channel) was quite unplayable. When 3dfx shipped the V3 the only thing nVidia was shipping was the TNT. Not to mention the fact that 99.9999% of all 3d game engines in '99 were (if not GLIDE engines) then surely 16-bit engines internally (some were even 8-bit, still). The great "debate" in '99 about "32-bits" consisted entirely of magnified screen shots of alpha-channel blending errors. Wow, what an enlightening argument.
That's odd. I recall playing many games in 32-bit color on my TNT and TNT2 Ultra.
As to 512x512 textures and >, gee, didn't quake2 have like, SIX of them (with the rest--several dozens of the rest-- being 256x256 and <)???...*chuckle* As I recall, this was also a very intelligent debate--with the 1% of the textures in a 3d game > 256x256 receiving 99% of the publicity and attention--also with many magnified screen shots...;) It was pretty funny at the time (but not, I imagine, for 3dfx, being the butt of all of that nonsense.)
But if the #1 selling graphics card had supported higher res textures, maybe Quake 2 (or other games) would have used them?
Last, I'm not sure what your point about AGP texturing was. IMO, it sucked then and sucks now...;) Just ask Intel about the i74x and i75x it tried unsuccessfully to foist on the market (I had at least one of those that I recall)--the cards maxed out with 8-16MB of onboard ram and relied on AGP texturing to fill in the gaps--and they were all dog slow--I mean, abysmally slow--3dfx, nVidia, practically everybody, walked all over i7xx in performance. And they all carried substantially more onboard ram than i7xx, too. It was such a flop that Intel took its marbles, closed shop, and went home. And "AGP texturing" was the star of the i7xx show...whew....;) OpenGL guy, it's surprising to see you err so conspicuously in your recollection...!...:D
You're off base again. When you run 1600x1200 w/ 6x AA on a 9700/9800, where do you think most textures reside? ;)

Just because AGP can be slow, doesn't mean it has to be slow. When I was at S3, I deliberately put things in AGP because it helped performance. It all depends on how efficient your AGP interface is.
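To put some very rough numbers behind that 1600x1200 / 6x AA point, here is a back-of-the-envelope sketch. The buffer layout, byte counts and the 128 MB card size are illustrative assumptions on my part, not figures from anyone in this thread:

```python
# Back-of-the-envelope: how much local video memory do the framebuffers eat
# at 1600x1200 with 6x AA?  All sizes are rough illustrative assumptions
# (no compression, 32-bit colour, 32-bit Z/stencil, 128 MB card).
WIDTH, HEIGHT = 1600, 1200
BYTES_PER_PIXEL = 4            # 32-bit colour
SAMPLES = 6                    # 6x multisample AA
CARD_MEMORY_MB = 128           # hypothetical local video memory

pixels = WIDTH * HEIGHT
to_mb = lambda b: b / (1024 * 1024)

front_back = 2 * pixels * BYTES_PER_PIXEL        # resolved front + back buffers
ms_colour  = pixels * SAMPLES * BYTES_PER_PIXEL  # multisampled colour buffer
ms_depth   = pixels * SAMPLES * BYTES_PER_PIXEL  # multisampled Z/stencil buffer

framebuffers_mb = to_mb(front_back + ms_colour + ms_depth)
print(f"Framebuffers:      ~{framebuffers_mb:.0f} MB")
print(f"Left for textures: ~{CARD_MEMORY_MB - framebuffers_mb:.0f} MB of {CARD_MEMORY_MB} MB")
```

With roughly a hundred megabytes gone to framebuffers alone under those assumptions, a good chunk of the working set of textures has little choice but to live in AGP memory, which is exactly why an efficient AGP interface matters.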
 
nVidia has been circulating its Det50 driver to analysts in hopes that we would use it for our Half-Life 2 benchmarking. The driver contains application-specific optimizations that will likely improve nVidia's overall performance picture; however, Valve's Gabe Newell expressed concerns that some of nVidia's optimizations may go too far. Doug Lombardi of Valve has explicitly asked that beta versions of the Det50 drivers not be used for benchmarking.

Maybe it's more like Valve pushing up the release of these benchmarks to keep nVidia's improper (in Valve's opinion?) optimizations in the new Dets from distorting the truth. /speculate
 
Whoa, you guys are quick on the draw, eh? Yes, I am new to this site and its posts, so I must say I wasn't expecting such a retort so quickly. But yeah, of course ATI is an international company, but its head office is in Ontario, Canada. But enough about Canada, I always have trouble keeping my patriotism out of my rants. heh. But it's funny how you guys could only draw attention to my patriotism. How about some of my other points? But as far as I'm concerned, the fact that ATI is a Canadian company is a big turn-on for me. heh
 
digitalwanderer said:
Reverend said:
Why not? I can just label the article "A study on NVIDIA's Beta Detonator 50 drivers" and it can be about certain NVIDIA GFFX cards, with some image output comparisons of HL2 with ATI cards, to show any possible image output differences between ATI DX9 and GFFX cards.
Well, except for the fact that you'd be using a cheesy tactic to do an end-run around the game developer's request.....absolutely nothing I guess. :(
No, all I'm interested in (and is the point of my hypothetical article) is to find out what Gabe meant by "optimizations gone too far". I want to know what sort of "optimizations" NVIDIA have included in their beta drivers, that's all. And then, when the official drivers are eventually released, this hypothetical article may be useful for comparing the two sets of drivers, if the official drivers have differences in terms of performance and/or IQ compared to the beta ones. That way, we have at least one reason why reviewers were given these beta drivers just for testing HL2.
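For what a study like that might actually look like in practice, here is a minimal sketch of the kind of image-output comparison Reverend describes: a straight pixel diff between two screenshots. The file names are hypothetical and it assumes the Python Imaging Library (Pillow) is installed:

```python
# Diff a screenshot taken with the beta Det50s against one taken with the
# official release (or with an ATI card) and report where they differ.
# File names are hypothetical placeholders.
from PIL import Image, ImageChops

beta     = Image.open("hl2_det50_beta.png").convert("RGB")
official = Image.open("hl2_det50_official.png").convert("RGB")

diff = ImageChops.difference(beta, official)
bbox = diff.getbbox()                 # bounding box of all non-identical pixels

if bbox is None:
    print("Images are pixel-identical.")
else:
    # Count pixels that differ by more than a small threshold so 1-LSB
    # dithering noise doesn't drown out real quality differences.
    THRESHOLD = 8
    changed = sum(1 for px in diff.getdata() if max(px) > THRESHOLD)
    total = diff.width * diff.height
    print(f"Differing region: {bbox}")
    print(f"{changed} of {total} pixels differ by more than {THRESHOLD}/255")
    diff.save("hl2_driver_diff.png")  # look at this to see *where* they differ
```

The same script run later on beta-versus-official captures would show whether the shipping Det50s render the scene any differently from the ones handed out just for HL2 testing.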
 
digitalwanderer said:
Reverend said:
Why not? I can just label the article "A study on NVIDIA's Beta Detonator 50 drivers" and it can be about certain NVIDIA GFFX cards, with some image output comparisons of HL2 with ATI cards, to show any possible image output differences between ATI DX9 and GFFX cards.
Well, except for the fact that you'd be using a cheesy tactic to do an end-run around the game developer's request.....absolutely nothing I guess. :(

Good Lord. Quit being so polar. I don't even own an NVIDIA card and I still want to see what improvements the Det50s are going to bring to HL2. Sheesh.
 
I hope when the game/benchmark does come out that you guys will check out and see if the R3xx line has any IQ-degrading optimisations.

No offense to Ati employees, I just believe that EVERYTHING should be checked out before passing judgement.

These numbers from Valve are absolutely meaningless for me to make a judgment on ATI's IQ in the test.
 
As with previous tests on NV3x hardware, what really makes them improve is... precision drops, FX12 to be exact.

That is what I'm taking from that comment, and only image quality tests would probably show it.
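To make the precision angle concrete, here is a rough sketch of what quantizing to FX12 does to a smooth dark gradient. It assumes FX12 is a 12-bit signed fixed-point format covering roughly [-2, 2), which is the common description of the NV3x fixed-point format rather than anything stated in this thread:

```python
# Rough illustration of why dropping from FP32/FP16 to FX12 can show up as
# image-quality loss.  The format below (12 bits over [-2, 2), steps of
# about 1/1024) is an assumption, not something from the thread.
def to_fx12(x: float) -> float:
    """Quantize x to a hypothetical 12-bit fixed-point value in [-2, 2)."""
    step = 4.0 / 4096                  # 4096 codes spread over a range of 4.0
    return max(-2.0, min(2.0 - step, round(x / step) * step))

# A smooth dark gradient (think fog or dim lighting): 256 distinct shades
# between 0.0 and 0.02 at full precision collapse to a couple of dozen
# levels after quantization -- the kind of banding an IQ comparison catches.
gradient      = [i * 0.02 / 255 for i in range(256)]
fx12_gradient = [to_fx12(v) for v in gradient]
print("distinct shades, full precision:", len(set(gradient)))       # 256
print("distinct shades, after FX12:    ", len(set(fx12_gradient)))  # ~21
```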
 
K.I.L.E.R said:
I hope when the game/benchmark does come out that you guys will check out and see if the R3xx line has any IQ-degrading optimisations.

No offense to Ati employees, I just believe that EVERYTHING should be checked out before passing judgement.

These numbers from Valve are absolutely meaningless for me to make a judgment on ATI's IQ in the test.
That's fine, but do you think Valve would have mentioned performance if there were image quality issues? Also, please don't forget that ATI's products with numbers 9500 and above don't support lower precision modes.

But feel free to investigate all you want.
 
I'm not talking about lower levels of precision.
How do I know that the shadows are rendered correctly for example?

I simply don't trust businesses/corporations.
I'm not implying that you slipped in some cheats or whatnot, I'm simply stating that I know nothing of the conditions of the tests.

The guys simply threw in some numbers and voilà. Not much information for me to work with.

 
Reverend said:
digitalwanderer said:
Reverend said:
Don't get all upset now... I was thinking in terms of a possible article detailing why the beta Det50s are like what Gabe described ("optimizations that may go too far"), i.e. what NVIDIA is doing with the beta dets. Not in reviews.
I'm upset over the simpleton comment (which you didn't call me, you just implied); on this I'm just confused.
Wait a minute now... I really have no idea how you interpreted whatever I posted as implying you're a "simpleton". You called yourself a simpleton... nothing I posted even remotely implies that I agree with your own assumption of yourself.
http://www.beyond3d.com/forum/viewtopic.php?t=7748&start=100
Reverend said:
It amazes me just how simple-minded the public can be with regards to the many considerations involved in the game and 3D industry.
About half-way down the page, as a response to a quote of mine. You are correct that you didn't call me a simpleton (which I mentioned), but I do believe that "simpleton" is short for "a simple-minded individual" and it was shorter to type. :)
 
Valve's comment about the NV3X path taking 5 times longer to write for is stunning. With the NV40 coming next year, how much do you want to bet that the NV3X path will not show up in future games? Why write an NV3X path anyway, when Nvidia will do that for you with their drivers?
 
http://www.techreport.com/etc/2003q3/valve/index.x?pg=1

As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.


Now there is a nifty optimization, something it has been speculated in this very forum was being done. :!:
 
rwolf said:
Valve's comment about the NV3X path taking 5 times longer to write for is stunning.

? Where did you read that? Link, please! :)

With the NV40 coming next year, how much do you want to bet that the NV3X path will not show up in future games? Why write an NV3X path anyway, when Nvidia will do that for you with their drivers?

LOL
 
It gets even better.
As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.
http://www.techreport.com/etc/2003q3/valve/index.x?pg=1
 