Anand's HL2 benchmarks

Joe DeFuria said:
I was a bit too....though my biggest "duh!" came when I read this:

Anand said:
Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews were focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates.

Um, no. What you COULD have done is what B3Ders have been doing: investigating shader performance with "synthetic pixel shader" tests, in addition to running old games at insane settings. But no....Anand, like others, dismissed such tests as largely irrelevant.

And then Anand acts like HL2 performance is some sort of revelation? And the "exciting times" are finally upon us?

The only revelation is that DX9 shaders are in a high-profile game. We all knew it was coming, just a matter of when. If you wanted to show the "difference" between the video cards other than meaningless "250 vs. 275 FPS", you could've started last year with 3DMark.....

Too true...I mentioned this in passing a couple of times a few months back, but if you contrast Anand's 5800U review with practically everything he's done (and failed to do) since, you see something really strange. In the 5800U review he was able to do far more than run games at high resolutions with FSAA. In fact, his 5800U review centered on a minute examination of IQ, AF, and some other things of interest to people looking at this product at the time. Ever since then, however, what he's done (and failed to do) has been a distinct departure from the detail-oriented, investigatory approach--a healthy critical approach--he had no problem in applying to the 5800U. I've wondered ever since about what happened between his 5800U review and his 5900U review and thereafter. A definite sea change there, no doubt about it.
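For concreteness, the kind of synthetic pixel-shader test Joe describes above is just a timed loop over a fullscreen quad running a long arithmetic shader. Here's a minimal sketch in Python with moderngl as a modern stand-in for the DX9/HLSL tools of the era; the shader body, resolution, and frame count are arbitrary illustrative choices, not anyone's actual benchmark:

```python
# Minimal synthetic pixel-shader test: draw a fullscreen quad whose fragment
# shader does a long chain of dependent arithmetic, and time the frame rate.
import time

import moderngl
import numpy as np

ctx = moderngl.create_standalone_context()
fbo = ctx.simple_framebuffer((1024, 1024))
fbo.use()

prog = ctx.program(
    vertex_shader="""
        #version 330
        in vec2 in_pos;
        void main() { gl_Position = vec4(in_pos, 0.0, 1.0); }
    """,
    fragment_shader="""
        #version 330
        out vec4 f_color;
        void main() {
            vec4 c = vec4(gl_FragCoord.xy / 1024.0, 0.5, 1.0);
            // Dependent ops, so the driver can't trivially collapse the loop.
            for (int i = 0; i < 64; i++)
                c = fract(c * 1.0001 + c.yzwx);
            f_color = c;
        }
    """,
)

quad = np.array([-1, -1, 1, -1, -1, 1, 1, 1], dtype="f4")
vao = ctx.simple_vertex_array(prog, ctx.buffer(quad.tobytes()), "in_pos")

frames = 200
start = time.perf_counter()
for _ in range(frames):
    vao.render(moderngl.TRIANGLE_STRIP)
ctx.finish()  # wait for the GPU before stopping the clock
print(f"{frames / (time.perf_counter() - start):.1f} fps of pure shader work")
```

Geometry and CPU overhead are negligible here, so the score tracks pixel-shader arithmetic throughput--exactly the quantity the UT2003/Quake3 numbers never exposed.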
 
WaltC said:
Ever since then, however, what he's done (and failed to do) has been a distinct departure from the detail-oriented, investigatory approach--a healthy critical approach--he had no problem in applying to the 5800U. I've wondered ever since about what happened between his 5800U review and his 5900U review and thereafter. A definite sea change there, no doubt about it.

I totally agree. If you look at Anand's 5900U review, his Quake3, Splinter Cell, and Jedi Knight II benchmarks all showed extremely high scores with FSAA enabled. He didn't even bother to check whether FSAA was actually on! If you check current benchmarks, the scores aren't nearly as high. Granted, some other reviewers did this too at the time.
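Checking whether FSAA is really being applied doesn't take much: antialiasing softens hard polygon edges, so screenshots of the same frame with and without AA should differ measurably. A rough sketch of such a check in Python with Pillow and NumPy; the filenames are hypothetical, and the threshold is a guess that would need tuning per game:

```python
# Rough sanity check that FSAA is actually on: antialiasing softens hard
# edges, so the count of abrupt luminance steps should drop noticeably.
import numpy as np
from PIL import Image

def hard_edges(path, threshold=96):
    # int16 so neighbour differences of 8-bit pixels don't wrap around
    img = np.asarray(Image.open(path).convert("L"), dtype=np.int16)
    dx = np.abs(np.diff(img, axis=1))      # horizontal neighbour deltas
    return int((dx > threshold).sum())     # count of hard steps

print("no AA :", hard_edges("quake3_noaa.png"))
print("4x AA :", hard_edges("quake3_4xaa.png"))
# Near-identical counts would mean the driver silently ignored the setting.
```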
 
digitalwanderer said:
UberLord said:
If they want a kick ass HL2 platform then get an ATI card - which he pretty much says.
Don't you think with all the mounting evidence against the FX that "pretty much says" is a bit weak?

I do.

Anand said:
Half-Life 2 seems to be best paired with ATI hardware and as you've seen through our benchmarks...

I don't think so at all. DW, what would you like him to say? Something childish? If you want him to say they suck, or that nVidia users are screwed, or something like that, just realize it is not a 10-year-old writing the reviews. However, GamersDepot likes to use much stronger language, so you can just read their content if other sites are not provocative enough in their use of the English language for you.
 
Sxotty said:
DW, what would you like him to say? Something childish?
Something like...
[image: 5900.jpg]


Seriously, a bit of a stronger condemnation of nVidia or a stronger recommendation of ATi would be nice. He doesn't have to be childish, he just needs to be clearer. :)
 
Natoma said:
You know though, those faulty D3 benchmarks from Anand were up for a good 3-4 weeks. I wonder how long these numbers stay at the top of the list?

:LOL:

Didn't even last a day. Can't say I'm too surprised though...
 
Hoping not to see a Valve-nVidia joint statement like the one with Futuremark in the coming days...

[image: nv-fm.jpg]


:D :D

Bye!
 
WaltC said:
I was very disappointed with Anand's opening commentary. Specifically, in the way he just brushed off the things nVidia's done in the last year by saying "they've all" done these kinds of things in the past. That is certainly true on its face, but the degree and kind of things nVidia has done in the last 10 or so months are unprecedented in my experience.

I can agree with him to a point, depending how far back you wanna go. I can remember some pretty rampant cheating in the days of GDI accelerators competing like mad against each other, misrepresenting featuresets, and trying to discredit various benchmarks when they weren't represented well in them.

The difference between now and then, though, is that the audience is a lot larger and more mainstream than it was 10-15 years ago. There's a lot more coverage, and the web provides a simple medium for mass exchange of information and debate. With the scale of market and capital it entails now, people and companies are more likely to get all bent out of shape when somebody cries foul or somebody does something a little shady. Back then the audience just wasn't big enough for anybody to make that big of a fuss. That and the annoying, extreme-sports-ish, "suck-it-down" attitude that's pervading some segments of the industry doesn't help much either...

WaltC said:
Big deal, I say (who cares?). But to omit the fact that nVidia has been engaging in the very same "multi-million dollar" marketing deals all year long with various companies (most notably with Epic/Atari last year), while making the thinly veiled accusation that although Valve stated it chose ATi on the basis of its technology that *might not* really be the case since the deal exists...well, that's just poor Internet journalism, in my view.

I'll definitely agree with you on the poor Internet journalism part. Another thing that bugs me in general about the journalistic aspects of these enthusiast sites (and perhaps the community as a whole) with regard to the tech industry is the annoyingly shallow memory of anything before '95...
 
WaltC said:
I guess my remark--that the lighting is much different in D3 because there aren't any outdoor scenes in bright lighting, in which you don't see a lot of shadows--didn't quite register?....:) Or, to put it another way: if we moved the D3 scenes outdoors under the noon-day sun, how do you imagine they'd look "way better"...?

My response was in reply to

"Waiting for D3 until next year doesn't bother me at all...and I wouldn't be surprised if upon seeing HL2 ID doesn't do a bit more work on the game prior to release (which is why I think D3 was put back in the first place--just MO...)"

As you yourself said, "Speaking of lighting, which scenes in the D3 alpha are outside on a rooftop in broad daylight with the sun peering down through the clouds?.."
Since this is probably the case, why would Id bother with beefing up their outdoor scenes when they're not going to have that many of them anyway?
Because the HL2 indoor scenes that I've looked at (video, screenshots...) imo don't come close to D3 quality.
 
archie4oz said:
I can agree with him to a point, depending how far back you wanna go. I can remember some pretty rampant cheating in the days of GDI accelerators competing like mad against each other, misrepresenting featuresets, and trying to discredit various benchmarks when they weren't represented well in them.

The difference between now and then, though, is that the audience is a lot larger and more mainstream than it was 10-15 years ago. There's a lot more coverage, and the web provides a simple medium for mass exchange of information and debate. With the scale of market and capital it entails now, people and companies are more likely to get all bent out of shape when somebody cries foul or somebody does something a little shady. Back then the audience just wasn't big enough for anybody to make that big of a fuss. That and the annoying, extreme-sports-ish, "suck-it-down" attitude that's pervading some segments of the industry doesn't help much either...

I'll definitely agree with you on the poor Internet journalism part. Another thing that bugs me in general about the journalistic aspects of these enthusiast sites (and perhaps the community as a whole) with regard to the tech industry is the annoyingly shallow memory of anything before '95...

Well, the problem I have with going back in time to frame present situations is that those situations are no longer occurring and no longer relevant. For instance, one site used the word "Quack2" to describe what nVidia did this year relative to 3DMark03. IMO, references to past events are only used to obfuscate current events, and to make the current events seem less important than they actually are. I don't see such references as being helpful at all in describing current events.

I mean, how does saying "ATi cheated, too, years ago" have any relevance in a description of what nVidia does today, for instance? I can't see how it has any more relevance for nVidia's present conduct than it does for ATi's.

I remember what you're talking about very well and would certainly agree with you that those things happened. But I just can't see any direct connection between then and now--I mean, even in the respective companies, many of the people and policies in place then are long gone; the technologies are vastly different, as are the issues themselves. I just think that in the interests of clarity it's better to talk about issues occurring in '02-'03 without confusing them with the dead, long-resolved issues of yesterday. I.e., what ATi did, or didn't do, years ago, is no justification for what nVidia does today, and vice-versa.
 
Bjorn said:
As you yourself said, "Speaking of lighting, which scenes in the D3 alpha are outside on a rooftop in broad daylight with the sun peering down through the clouds?.."
Since this is probably the case, why would Id bother with beefing up their outdoor scenes when they're not going to have that many of them anyway?
Because the HL2 indoor scenes that I've looked at (video, screenshots...) imo don't come close to D3 quality.

I was talking about the HDR movie Valve released--it was on the roof in direct sunlight, which makes the lighting much different than what you're seeing in the D3 alpha, which is my point. You're talking about how much "better" you think D3 looks, but I was trying to draw your attention to how different the lighting is. If you had a scene in D3 on the roof in the sunlight, you wouldn't see the kind of shadows visible in the D3 screenshots--because they wouldn't be there. The reflectivity of the materials in the frame--everything--would look much different. This is *why* D3 is restricted to the dark, dank, dungeon-type atmosphere of the game, with low lighting, heavy shadows, etc.

The mood of the setting for D3 is totally different from what Valve presents in HL2--they are completely different games with different scene settings and atmosphere, and of course much different lighting accordingly. I think D3 looks great from what I've seen--but I think HL2 looks just as good--because I don't confuse the two games as to their settings, moods, and lighting. They are totally different in that respect. I don't think it's fair to say "D3 looks way better" because HL2 is not trying to imitate D3 with respect to lighting and atmosphere--it is representing something entirely different.
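WaltC's point can be put in terms of the usual local-illumination split. As a rough sketch (a generic model, not either engine's actual lighting code), the lit color of a surface point is approximately

$$ L_o \;\approx\; k_a L_{\text{amb}} \;+\; v \, k_d \max(N \cdot L,\, 0)\, L_{\text{sun}} $$

where $v$ is the shadow visibility term (0 in shadow, 1 in light). Under a noon-day sun the ambient term $L_{\text{amb}}$ is large, so even fully shadowed points ($v = 0$) stay bright and shadows lose contrast; in Doom 3's dark interiors $L_{\text{amb}}$ is close to zero, so the $v$ term dominates and stencil shadows read as near-black. Same math, very different look.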
 
I don't know if this has already been posted, but Anandtech posted some more benchmarks from HL2 on the mobile platforms from NV and ATI.
This time what Anandtech did was to even the settings out completely and force the NV GFX5600-based part to run DX9 PS2.0 with trilinear enabled.

It looks ugly for NVIDIA:

http://www.anandtech.com/mobile/showdoc.html?i=1866&p=9

Shockingly, only 3 out of 8 times was the GeForce FX Go5650 able to surpass the 10 fps barrier. Even at its best, the GeForce FX Go5650 was only able to close the gap between the Mobility Radeon 9600 to 234%. “Slow as a pregnant yak” was a phrase that we often heard in reference to these scores. While we wouldn’t put it in this exact context, the Mobility Radeon 9600 beats the GeForce FX Go5650 “no questions asked” in all of these scenarios, with the highest difference of 415% (36.6fps vs. 7.1fps).
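Those percentages are just relative differences, and the figures from the quote above check out:

```python
# Checking Anandtech's percentages (figures from the quote above).
radeon, geforce = 36.6, 7.1         # fps: Mobility Radeon 9600 vs GeForce FX Go5650

gap = (radeon - geforce) / geforce  # the Radeon's relative advantage
print(f"{gap:.0%}")                 # -> 415%, the quoted "highest difference"
# The "best case" 234% works the same way from the FX's strongest result.
```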
 
Valve, the developer of Half-Life 2, is the first developer to voice its displeasure with the NV3x architecture with such intensity, because the architecture forced it to write an additional codepath specifically for NVIDIA hardware, costing time, money, and extra resources. Nothing of the sort was needed to run on ATI hardware, which is why Valve entered into an agreement with ATI. The order of events ran from existing hardware benchmark scores to a marketing agreement, not the other way around, as some have speculated.

That paragraph above caught my eye too. One of the reasons it caught my eye is that Anand has traditionally made blanket statements that have proved to be false. Now it seems he is doing it again, but from the other side. Perhaps Anand should learn to take everything he hears from the horse's mouth (whoever that horse is) with a pinch of salt.

At the end of the day it is all about money, baby ;) I could be wrong though, so please feel free to disagree, guys. Not that some of you would NOT disagree with my blanket statement, hehe.
 
Tahir said:
That paragraph above caught my eye too. One of the reasons it caught my eye is that Anand has traditionally made blanket statements that have proved to be false. Now it seems he is doing it again, but from the other side. Perhaps Anand should learn to take everything he hears from the horse's mouth (whoever that horse is) with a pinch of salt.

At the end of the day it is all about money, baby ;) I could be wrong though, so please feel free to disagree, guys. Not that some of you would NOT disagree with my blanket statement, hehe.

Well, Valve explicitly stated it had chosen ATi as its bundling partner based on ATi's technology. Very similar to what M$ said when choosing ATi for the xBox2 contract. I cannot in a million years imagine nVidia "turning Valve down" had Valve elected to go with nVidia--regardless of the cost. nVidia's already smarting from a lot of things this year--not the least of which is Doom 3, which it had obviously been counting on being released this year. The HL2 title is of such stature that I would imagine Valve had its pick of bundling partners among the IHVs. I'm equally certain ATi ponied up some cash--but nVidia would have joyfully done the same, had it been asked.

The thing I absolutely do not believe is that Valve put their deal up to the highest bidder....;) It might have been a great deal for ATi, though, to have lost the bid had that been the case--since word would quickly have spread how much faster the R3x0 ran HL2's DX9 code path--and ATi wouldn't have had to pony up anything...;) Nah, it just doesn't make much sense to assume that Valve did anything but choose ATi, and that ATi acquiesced to that proposal.

I don't mean to say that this is something that happened quickly, but rather something that firmed up over time as Valve worked on its software with products from both IHVs.
 
Tahir said:
I don't know if this has already been posted, but Anandtech posted some more benchmarks from HL2 on the mobile platforms from NV and ATI.
This time what Anandtech did was to even the settings out completely and force the NV GFX5600-based part to run DX9 PS2.0 with trilinear enabled.

It looks ugly for NVIDIA:

http://www.anandtech.com/mobile/showdoc.html?i=1866&p=9

Shockingly, only 3 out of 8 times was the GeForce FX Go5650 able to surpass the 10 fps barrier. Even at its best, the GeForce FX Go5650 was only able to close the gap between the Mobility Radeon 9600 to 234%. “Slow as a pregnant yak” was a phrase that we often heard in reference to these scores. While we wouldn’t put it in this exact context, the Mobility Radeon 9600 beats the GeForce FX Go5650 “no questions asked” in all of these scenarios, with the highest difference of 415% (36.6fps vs. 7.1fps).
Anand is sure changing his tune fast. :rolleyes:
 