TechReport chimes in on the HL2 benchmark

digitalwanderer

Hmmmm...a lot of sites seem to be picking this story up, but Tech Report's article has some new info on it.

It's a good read with lots of slide pictures from Gabe Newell's presentation; I highly recommend it. :)

EDITED BITS: Dang it, Doomtrooper beat me to it. Would a mod please delete this thread?
 
As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

The quote from the article should be put in then. 8)
 
Yep, that is exactly what was speculated was being done with the Futuremark improvements, since IQ was being used to detect it ;)
 
So now everyone who's ever shown an image comparison screenshot needs to go back and redo them. It's simply staggering what nVidia is willing to do to trick people (particularly reviewers) into thinking the nV3x is good.
 
Ratchet said:
So now everyone who's ever shown an image comparison screenshot needs to go back and redo them. It's simply staggering what nVidia is willing to do to trick people (particularly reviewers) into thinking the nV3x is good.

but then we'd be back at square one...
 
TechReport said:
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.
:oops: :oops: :oops: :oops: :oops: :oops: :oops:

This has to be the bombshell of the day. Apparently in addition to recording their own secret demos, 3d card reviewers are now going to have to carry around digital cameras with them???

Of course we need to wait for more details, but combined with the explicit instructions by Valve not to use the Det 50's.......

I've tried to keep an even keel throughout this whole mess, to separate legitimate criticism from fanboyist piling-on...but this is truly beyond the pale.
 
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

This is getting pathetic, really.
It's simply staggering what nVidia is willing to do to trick people
 
Dave H said:
This has to be the bombshell of the day. Apparently in addition to recording their own secret demos, 3d card reviewers are now going to have to carry around digital cameras with them???

I agree.

I was quite literally floored when I read this.

IMO, it's coming to the point where the hardware review press needs to get together and take a stand. If Valve (or anyone else) positively identifies such things with proof, web review sites should just REFUSE to give nvidia ANY press. No reviews...act like they don't exist.

nVidia is basically making hardware reviewers' jobs near impossible. So the attitude should be "if we can't trust your drivers, we won't touch your card. We can't spend countless hours trying to figure out how you're trying to cheat. That's not our job, nor should it be...so we won't bother."

As if static clip planes were alarming enough...
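For anyone who missed that episode: the whole point of a static clip plane "optimization" is that it only holds up as long as the camera follows the canned benchmark path. A purely hypothetical sketch of the idea (my own illustration, not actual driver code):

```python
# Hypothetical illustration: a "static clip plane" cheat hard-codes a plane
# tuned to a canned benchmark camera path and discards geometry beyond it.
# On the fixed path the image looks normal; move the camera (free-look)
# and the culled geometry is visibly missing.

CLIP_X = 10.0  # plane position tuned to the fixed demo path (made-up value)

def cheat_cull(objects, camera_x):
    # The driver skips anything past the hard-coded plane, regardless of
    # where the camera actually is -- which is exactly why the cheat
    # breaks the moment a reviewer moves the camera off the rails.
    return [o for o in objects if o["x"] < CLIP_X]

world = [{"name": "corridor", "x": 4.0}, {"name": "far_room", "x": 25.0}]

on_rails = cheat_cull(world, camera_x=0.0)    # canned path: looks fine
free_look = cheat_cull(world, camera_x=20.0)  # camera moved: far_room gone
```

Trivially simple, and invisible in any benchmark run that never leaves the demo path.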
 
These two slides from the article tie in with the fallacy of nVidia's argument toward Futuremark "not being like game developers, who optimize for our games."

image12.jpg


image13.jpg


Valve is the exception to the rule, in that they HAVE the resources to optimize for nVidia hardware.

And the conclusion? It's not worth it. They wish they had just treated the FX series as DX8 cards and left it at that.

Note to nVidia: Developers used the GeForce as the "de facto standard" DX8 platform...this doesn't mean they will even optimize for your DX9 platform at all.

Futuremark, take heed.
 
Joe DeFuria said:
Futuremark, take heed.
Heck, Futuremark say "bye-bye"!

Looks like we'll have a much better DX9 benchmark soon and the people who put it out will have a hell of a lot more integrity than FM ever did!

It's kind of funny considering the FM debates yesterday; they just became irrelevant along with Futuremark. 8)
 
Valve is the exception to the rule, in that they HAVE the resources to optimize for nVidia hardware.

And the conclusion? It's not worth it. They wish they had just treated the FX series as DX8 cards and left it at that.

I think that the way this played out is perfect. Now they don't have to defend their DX9 path as "optimized for ATI" and neither does any other DX9 game.
From now on other developers should just treat NV30 as DX8 and NV35 as DX8.1.

If you look at it, NV3x is really a Doom3 accelerator and a good OGL workstation chipset. I heard some Nvidia people drop hints almost two years ago about NV30 and how fast Doom3 would be on it. IMO they made a mistake and thought that was the direction other games would go.

OTOH, I fully expect the next major Nvidia product generation to fix this and be quite good at DX9.
 
Yep, I think Nvidia was betting this time around that DX8 games would be the norm, and that true DX9 games wouldn't arrive until the second generation of DX9 products (as was the case with DX6, 7, and 8).

They clearly thought that, like the Quake3 engine, the Doom3 engine was going to be used for a lot of games, so they targeted that. Nvidia did well in the past by focusing on id. I guess they underestimated Valve, and no one expected a "killer app" DX9 title to arrive only 9 months after DX9 hit the market.


However, unlike the ole days when people only benchmarked the Q3 and UT engines, now we have the Source engine, and I bet all future video card reviews will include Source engine benchmarks as well as Doom3.

Prepare for flame wars over which numbers are more important. ATI will do well in both D3 and Source, NV30 will only do well in D3.
 
Novdid said:
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

This was disturbing! :oops:
Question: Is there any chance at all that this could be a bug? (I don't think so, but there are a lot more learned people here than me. ;) )
 
digitalwanderer said:
Novdid said:
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

This was disturbing! :oops:
Question: Is there any chance at all that this could be a bug? (I don't think so, but there are a lot more learned people here than me. ;) )
A very convenient bug, don't you think? "Hey, did you notice there's no fog?" "No, I didn't. Let's take a screenshot and send it over so they can check it out." "What the... There's fog in the screenshot!"

I'd say this sort of bug is about as convenient as the clip plane "bugs" of the past.
 
OpenGL guy said:
digitalwanderer said:
Novdid said:
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

This was disturbing! :oops:
Question: Is there any chance at all that this could be a bug? (I don't think so, but there are a lot more learned people here than me. ;) )
A very convenient bug, don't you think? "Hey, did you notice there's no fog?" "No, I didn't. Let's take a screenshot and send it over so they can check it out." "What the... There's fog in the screenshot!"

I'd say this sort of bug is about as convenient as the clip plane "bugs" of the past.

I don't want to get you in trouble, so I'll ask you as discreetly as possible. ;) Is the last part speculation about what is occurring? Yes, No, or No Comment would be appreciated. :D
 
OpenGL guy said:
A very convenient bug, don't you think? "Hey, did you notice there's no fog?" "No, I didn't. Let's take a screenshot and send it over so they can check it out." "What the... There's fog in the screenshot!"

I'd say this sort of bug is about as convenient as the clip plane "bugs" of the past.

I would agree there's only an infinitesimal chance that something like this could be a "bug". I would think that, generally, a bug in the drivers relative to frame grabs would result in a grab inferior to the image displayed on screen; I can't see the reverse happening other than with premeditation.

I think your guess about the fog and frame grab being related was intuitive--and possibly right on the money...;) I wouldn't have thought of that--heck, I wouldn't have thought it would be easy to manipulate the driver to do something like that, either. I can't see Gabe making a statement like that unless he'd been able to repeat it at will a number of times to validate the behavior. It just seems bizarre because of the amount of work and thought that would have to go into something like that--spooky, almost... :oops:
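To make the mechanism concrete, here's a purely hypothetical sketch (my own toy illustration, nothing to do with any real driver's code) of how a driver could serve one image to the screen and a better one to a screenshot request:

```python
# Hypothetical sketch of the alleged behavior: the driver takes a cheap
# shortcut on the normal render path, but re-renders at full quality when
# it detects a readback (screenshot) request. A scene is modeled as a
# simple list of effects for illustration.

class CheatingDriver:
    def render_frame(self, scene):
        # Normal path: silently drop an expensive effect (e.g. fog)
        # to boost benchmark performance.
        return [effect for effect in scene if effect != "fog"]

    def read_pixels(self, scene):
        # Readback path: return the full-quality frame, so a screenshot
        # shows fog the player never actually saw on screen.
        return list(scene)

scene = ["geometry", "textures", "fog"]
drv = CheatingDriver()
on_screen = drv.render_frame(scene)   # fog missing in-game
screenshot = drv.read_pixels(scene)   # fog present in the grab
```

The nasty part is that the screenshot, the one piece of evidence a reviewer would normally trust, is exactly the thing being falsified.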
 
The saddest part about all of this is that there are a ton of people out there who still defend Nvidia and their actions to the very end. They blame Valve, they blame ATI, they blame Microsoft etc.... Everyone but the culprit who is pulling such nasty deeds as raising IQ only in screen grabs. If that isn't outright fraud I don't know what is.
 