Anand's HL2 benchmarks

WaltC said:
Speaking of lighting, which scenes in the D3 alpha are outside on a rooftop in broad daylight with the sun peering down through the clouds?....;)..

I agree. And I seem to remember that Carmack has said that Doom3 won't be anything special when it comes to outdoor scenes.

and I wouldn't be surprised if upon seeing HL2 ID doesn't do a bit more work on the game prior to release (which is why I think D3 was put back in the first place--just MO...)

I kinda doubt that, since imo, the 1 year old D3 alpha looks way better than HL2.
 
I am not that impressed with Half-Life 2 either, actually. I like how Doom looks much more. I will buy HL2 because I enjoyed playing HL1. I have heard that if you download stuff for Half-Life 2, you will have to connect to the internet even for single-player; if true, this is bad, since most of my SP playing happens when the internet is broken.


I personally believe that the Half-Life engine is kind of mediocre, like Battlefield 1942 with too many coders. Well, that's a poor way to put it; it's more that it looks nice but runs slowly for what it actually accomplishes. That is, from what I can tell, the poor performance of all cards, including the 9800, is not justified by the leap in visual quality, but we will have to wait and see, I suppose.
 
I'm not sure we can really compare the engines until we see both rendering similar styled scenes. Just about all of what we've seen of Doom 3 is dark corridors with lots of shadows and some smoke. In HL2 we've seen large outdoor areas and a bit of indoor scenes, both of which were quite well lit. As such, we cannot determine just from that if the difference is a matter of technology or art.
 
I was very disappointed with Anand's opening commentary. Specifically in the way he just brushed off the things nVidia's done in the last year by saying "they've all" done these kinds of things in the past. That certainly is true on its face, but the degree and kind of things nVidia has done in the last 10 or so months are unprecedented in my experience. The two things Anand glosses over that are the most egregious to me are:

(1) The factual misrepresentation of nV30/35's pipelines. nVidia published in its official spec sheets that the architecture was 8x1 when the truth has always been it's 4x2. This misinformation was put on its website, on its product boxes, and everywhere else you can think of--before it was disproved. (Is it still advertised as 8x1? I haven't checked lately.) I cannot *ever* remember a manufacturer deliberately misrepresenting something as fundamental as this in marketing a 3d chip at any time. The confusion it caused left people guessing about why nV30 was so slow at 500MHz compared to the 325MHz R300 for a much longer time than would have happened had this aspect of the architecture been accurately disclosed from the beginning.
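The practical stakes of the 8x1 vs 4x2 distinction can be sketched with back-of-the-envelope fillrate arithmetic. This is only an illustration using the clock speeds quoted above and the simplifying assumption that single-textured peak fillrate is just pipelines times clock; real-world performance depends on much more than this.

```python
# Hypothetical sketch: why "8x1 at 325MHz" can beat "4x2 at 500MHz"
# for single-textured rendering. Peak single-texture fillrate is
# approximated here as pipelines * core clock (in Mpixels/s).
def single_texture_fillrate(pipelines: int, clock_mhz: int) -> int:
    """Approximate peak single-texture fillrate in Mpixels/s."""
    return pipelines * clock_mhz

nv30 = single_texture_fillrate(4, 500)  # 4 pipes x 2 TMUs, 500 MHz
r300 = single_texture_fillrate(8, 325)  # 8 pipes x 1 TMU, 325 MHz

# Despite the much higher clock, the 4x2 part laying down one
# pixel per pipe per clock trails the 8x1 part on single texturing.
print(nv30, r300)
```

Under this simplified model the 500MHz 4x2 part peaks at 2000 Mpixels/s single-textured against 2600 Mpixels/s for the 325MHz 8x1 part, which is exactly why accurate disclosure of the pipeline layout mattered.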

Of course, the reason for nVidia doing this is obvious--R300 was a genuine 8x1 chip, and nVidia was loath to advertise anything less for nV30, so it lied about it. The stupidity of something like that is amazing--that they actually thought no one would be the wiser. But the entire sad saga of nV3x all year has been one of unmitigated stupidity, in my view. "zixels" indeed...*chuckle--sneer*...

I think this one story alone is possibly the biggest journalistic coup imaginable for some enterprising Internet journalist--it's the stuff that investigative reporting is made of--but---whoosh--right over Anand's head, it seems...

(2) nVidia quitting the FM program last year and issuing public statements through convenient lackeys, proxies, and stooges, as well as its own white papers, explaining why FM was so woefully misguided to believe that DX9 and especially PS2.0 would ever conceivably be in the cards for the future of 3d gaming.

Considering that nVidia was an active partner with M$ through both the xBox contract and through its participation in the formulation of the DX9 API, what on earth could possibly make them reach such an unmitigatedly stupid conclusion as to the "future" of 3d gaming? I mean, we're talking about a company which had direct input to and direct knowledge of the DX9 API long before it shipped its first nV3x product.

This was truly remarkable behavior in and of itself, and quite distinct from the specific cheating instances that have been so thoroughly documented over the time period. It basically spells out that nVidia had become so full of itself, and so enamoured of its own imagined influence and power in the 3d-chip markets, that it believed it could literally sneer at everyone else and set its own irresistible path which the world would be compelled to follow.

But honestly, as well, I think there is a simpler explanation: nV3x is the product of nVidia's architecture milking policy, which it was well into at the time ATi dark-horsed the R300 last year. nVidia had become so comfortable in its temporary position of being "out front" in the 3d-performance sector of the market (which sets the pace for all other areas of that market) that it was far less concerned with *any* future of 3d gaming and was primarily concerned with milking its existing architectures as long as possible with maximum profit, under the misguided assumption it had eliminated most of its competition and would soon eliminate the rest. So it isn't surprising that nV3x is entirely lacking as DX9 hardware--from the beginning it was always meant to be merely an incremental improvement over nv25 offering incomplete DX9 hardware support.

Never have I seen a 3d-chip company be so wrong in reading the tea leaves (not only about the "future" of 3d gaming--which it should have easily been able to see, but also about the .13 micron manufacturing process--the emphasis nVidia placed on an UNPROVEN process for nV30 is unprecedented, in my view.) That's an interesting story in and of itself. But I guess Anand thinks that all 3d-chip companies have done this before. Problem is--I can't recall it...

Finally, the last criticism I have for Anand is while he makes several veiled references to the "multi-million dollar" marketing agreement between Valve and ATi, he--it seems to me--deliberately omits recounting that nVidia has been engaged in doing the same exact thing for most of the year, under the auspices of its TWIMTB marketing campaign. Hello, Anand? Are software bundling deals with various hardware IHVs a new and unprecedented event? Hardly--they've been going on ever since I can remember. I didn't mind him talking about the bundling deal.

Big deal, I say (Who cares?) But to omit the fact that nVidia has been engaging in the very same "multi-million dollar" marketing deals all year long with various companies (most notably with Epic/Atari last year), while making the thinly veiled accusation that, although Valve stated it chose ATi on the basis of its technology, that *might not* really be the case since the deal exists....well, that's just poor Internet journalism, in my view.
 
Bjorn said:
WaltC said:
Speaking of lighting, which scenes in the D3 alpha are outside on a rooftop in broad daylight with the sun peering down through the clouds?....;)..

I agree. And I seem to remember that Carmack has said that Doom3 won't be anything special when it comes to outdoor scenes.

and I wouldn't be surprised if upon seeing HL2 ID doesn't do a bit more work on the game prior to release (which is why I think D3 was put back in the first place--just MO...)

I kinda doubt that, since imo, the 1 year old D3 alpha looks way better than HL2.

I guess my remarks that the reason the lighting is much different in D3 is because there aren't any outdoor scenes in bright lighting--in which you don't see a lot of shadows--didn't quite register?....:) Or, to put it another way--if we moved the D3 scenes to the outdoors under the noon-day sun--how do you imagine they'd look "way better"...?
 
WaltC said:
I was very disappointed with Anand's opening commentary.

I was a bit too....though my biggest "duh!" came when I read this:

Anand said:
Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews were focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates.

Um, no. What you COULD have done is what B3Ders have been doing: investigating shader performance with "synthetic pixel shader" tests, in addition to running old games at insane settings. But no....Anand, like others, dismissed such tests as largely irrelevant.

And then Anand acts like HL2 performance is some sort of revelation? And the "exciting times" are finally upon us?

The only revelation is that DX9 shaders are in a high-profile game. We all knew it was coming, just a matter of when. If you wanted to show the "difference" between the video cards other than meaningless "250 vs. 275 FPS", you could've started last year with 3DMark.....
 
Joe DeFuria said:
Anand said:
Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews were focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates.

Um, no. What you COULD have done is what B3Ders have been doing: investigating shader performance with "synthetic pixel shader" tests, in addition to running old games at insane settings. But no....Anand, like others, dismissed such tests as largely irrelevant.

Dude - you missed the point. Anand and his readers couldn't care less about investigating shader performance with "synthetic pixel shader" tests.

His point is valid - there's no games out atm (afaik) that actually use DX9 level hardware. And if any tests B3D comes up with don't involve firing guns at scary monsters or girls with massive tits then Anand and his readers probably aren't all that interested.
 
UberLord said:
Dude - you missed the point. Anand and his readers couldn't care less about investigating shader performance with "synthetic pixel shader" tests.

That's interesting.

I'm an Anand reader, and I'M interested in such performance.

Why, I thought the purpose of reviews was to offer buying guidance for video cards, wasn't it? Surely, there is a large portion of Anand's readership that feels the same way?

One would think it would be in the best interests of Anand's readers to be informed of the shader characteristics of products that, quite bluntly, advertise shaders as a key selling point.

His point is valid - there's no games out atm (afaik) that actually use DX9 level hardware.

Yes, and there is some reason to get excited about that.

And if any tests B3D comes up with don't involve firing guns at scary monsters or girls with massive tits then Anand and his readers probably aren't all that interested.

(Futuremark...take note for 3DMark04 design..."big tits"...if you survive that long.)

So you're saying that Anand and his readers are juvenile, and just don't care about the actual technical truth...but are far more interested in irrelevant fluff.

Yeah, well, that's pretty much my point. ;)
 
It basically spells out that nVidia had become so full of itself, and so enamoured of its own imagined influence and power in the 3d-chip markets, that it believed it could literally sneer at everyone else and set its own irresistible path which the world would be compelled to follow.

As shown in their very statement on this issue:

Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Yeah, we know nvidia.. and it was obvious Gabe knows as well. Why else would they spend 5x the amount of time on special paths for nVidia products? :?


The coolest thing I've been reading about is the dynamic implementation of AA and AF. There's nothing I hate more than remembering to go into my settings for each game.. let alone remembering my favorite settings for each. And on top of that.. it sounds as if it will only be turned on when needed. Maybe this is already in other games.. this is probably no surprise to anyone. I just find it damn cool! :D
 
His point is valid - there's no games out atm (afaik) that actually use DX9 level hardware.

The new Tomb Raider, which Beyond3D did do tests on. It also showed the nv30 being blown away by the r300. And seeing as it features a girl with massive tits, it's perfect.
 
Joe DeFuria said:
I'm an Anand reader, and I'M interested in such performance.
Joe, you read EVERYTHING...you're not just an Anand reader! ;)

(I've never met anyone who frequents more boards than me 'til I met you! ;) )
 
I kinda doubt that, since imo, the 1 year old D3 alpha looks way better than HL2.

I can only presume you've seen far more of Doom3 than I have... what I've seen looks cool and all, but it's nowhere near the level of graphics shown in the HL2 demo videos; which shouldn't be surprising when it's using DX8-ish OpenGL 1.x shaders rather than DX9 or OpenGL2.0 shaders.

Personally I'm _far_ more interested in getting HL2 than Doom3 for precisely that reason: I'm just hoping that the gameplay doesn't get as tedious and repetitive as HL1 did by half-way through.
 
Joe DeFuria said:
UberLord said:
Dude - you missed the point. Anand and his readers couldn't care less about investigating shader performance with "synthetic pixel shader" tests.

That's interesting.

I'm an Anand reader, and I'M interested in such performance.

I don't want to topple you from your high horse, but I was referring to the bulk of his readers, not you as a person :rolleyes:

Why, I thought the purpose of reviews was to offer buying guidance for video cards, wasn't it? Surely, there is a large portion of Anand's readership that feels the same way?


One would think it would be in the best interests of Anand's readers to be informed of the shader characteristics of products that, quite bluntly, advertise shaders as a key selling point.

You're correct, but I feel that a large portion of Anand's readership wants to know real world game performance. I don't really think they care about shader performance testing.

HL2 makes good use of shaders (so I'm led to believe) so he reviews HL2 performance. If they want a kick ass HL2 platform then get an ATI card - which he pretty much says.

Gamers care most about how games work overall, not "shader performance" which they may know little about. So what's the point of doing these tests when they mean little or nothing to gamers? Gamers know games - and they like knowing that their supa dupa $$$$ DX9 card can run Q3A at fairytale fps!
 
jjayb said:
His point is valid - there's no games out atm (afaik) that actually use DX9 level hardware.

The new Tomb Raider, which Beyond3D did do tests on. It also showed the nv30 being blown away by the r300. And seeing as it features a girl with massive tits, it's perfect.

Ah well, I stand corrected. Goes to show how much you miss being out of the computer gaming loop for a few months :oops:
 
UberLord said:
Dude - you missed the point. Anand and his readers couldn't care less about investigating shader performance with "synthetic pixel shader" tests.

His point is valid - there's no games out atm (afaik) that actually use DX9 level hardware. And if any tests B3D comes up with don't involve firing guns at scary monsters or girls with massive tits then Anand and his readers probably aren't all that interested.

No, you missed the point--burying your head in the sand is not wise.

Surely you can speak for yourself, but I doubt you can divine Anand's personal motives, nor the desire of his readership. First of all, no site on the Internet has a captive readership--anybody who chooses can visit any web site he chooses for the information he chooses to seek out. If the information isn't presented at Anand's, then Anand's readership will simply seek it out somewhere else--in sites like B3d, for instance.

But back to the point you've missed entirely. Based on the currently shipping DX9 Tomb Raider, and now the revelations concerning the DX9 HL2 shipping Real Soon Now, it has become glaringly evident that benchmarks with DX9 feature support like 3dMK03 have been right on the money all along. In fact, I can't really recall a time in which synthetic benches have been as accurate about "real 3d game performance" as they are today concerning shipping and immediately upcoming DX9 games. So, far from the synthetic benchmark story being "irrelevant," it has proven entirely relevant and reliable in forecasting the trend of 3d games. Those who have maintained otherwise over the past months have been proven wrong beyond a shadow of a doubt. So when you say you "couldn't care less" about these things, you are simply saying that you just don't care whether the 3d card you buy has any longevity. It's pretty much that simple.
 
UberLord said:
If they want a kick ass HL2 platform then get an ATI card - which he pretty much says.
Don't you think with all the mounting evidence against the FX that "pretty much says" is a bit weak?

I do.
 
UberLord said:
I don't want to topple you from your high horse, but I was referring to the bulk of his readers, not you as a person :rolleyes:

Good to know you speak for the bulk of his readers.

And don't worry, no one's been able to knock me off my high-horse to date...I don't see you as a threat. ;)

You're correct, but I feel that a large portion of Anand's readership wants to know real world game performance. I don't really think they care about shader performance testing.

You're just not understanding my point.

If Anand and his readers WOULD care about shader performance testing, they would have had a lot more to be "excited about" over the past year. It's my opinion that they don't care, because they don't understand the implications.

Anand was complaining about basically how "boring" it's been. And all I'm saying is: he's not looking in the right places.

Gamers care most about how games work overall, not "shader performance" which they may know little about.

Again, my point is, "shader performance" has a direct impact on how games work overall. So if they care about how games work overall, they should care about shader performance enough to be able to use it to make judgements concerning it.

So whats the point of doing these tests when they mean little or nothing to gamers?

Lol...check out some of those threads about GeForceFX owners VENTING about being cheated by nVidia (or by Valve, doesn't matter.) I wager these same people complaining about why their card "isn't doing so well in this GAME" make up a good portion of the "bulk" of readers who don't care about "synthetic shader performance."

Gamers know games - and they like knowing that their supa dupa $$$$ DX9 card can run Q3A at fairytale fps!

Again....I AGREE with that.

I'm stating how I'm disappointed that's apparently the case....Anand included.

Understand?
 
nvhl2.jpg

(Found on DH. ;) )
 