TechReport chimes in on the HL2 benchmark

Vince said:
If they cheated, then show verifiable and reproducible proof and we'll all support it. But just making half-assed assumptions is low and highly unprofessional.

People are viewing Gabe as an "expert witness". He represents Valve, and Valve are writing one of the most technically advanced and anticipated games of the last few years. They've worked with Nvidia for years, they've seen and used the mythical Det 5 drivers for months. They spent a lot of time following Nvidia guidelines and optimising the game for NV3x. They have a lot to *lose* by coming out and saying what they did.

And yet a well respected developer like Gabe, representing an experienced cutting-edge company, who has done the work from inside the Nvidia hardware, *still* risked his reputation by breaking ranks and saying what he said.

In addition, this is all backed up by the other evidence that has been mounting since the end of last year, and by the clearly established pattern of Nvidia's unethical behaviour.

If you're talking about burdens of proof, Gabe seems to be a pretty strong expert witness to me, one who supports (and is supported by) the big pile of evidence we already have.

Now you may say that Gabe has some sort of axe to grind. It wouldn't surprise me, because Valve have been screwed by Nvidia like everyone else. They put a lot of work into the NV3x path and still the game is going to suck because Nvidia hardware is bad - and Valve is going to carry the can for that, no matter what they say. That doesn't mean that Gabe is wrong or lying. It just means that he's decided to get the truth out in the open, in the hope that it gets the message out and gives his customers a chance to enjoy HL2 via an informed choice, instead of ending up with the wrong hardware because of Nvidia's lying PR.

I mean, if John Carmack was writing a DX9 game and came out and clearly said the things he's hinted at and implied in his .plans, would you cast him aside as an expert witness because he hasn't come down to your house and shown you the disassembled code?

Besides, I have no doubt that when the Det 50's arrive, they will be examined and cheating will be found and proved, just as Gabe says - he does after all have the advantage of working with them for months already. If it's the case that no cheats are found then I'm sure that B3D will be the first group of people to be amazed, pat Nvidia on the back and pass the word on to others, not because we are suddenly Nvidia supporters, but because it would be the *truth*, and so the right thing to say - something Nvidia should think more about doing.
 
Dave H said:
TechReport said:
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.
:oops: :oops: :oops: :oops: :oops: :oops: :oops:

This has to be the bombshell of the day. Apparently in addition to recording their own secret demos, 3d card reviewers are now going to have to carry around digital cameras with them???

I'm not that surprised, though I didn't expect it to go quite that far. I know I mentioned the possibility of doing just that some time ago on this forum, though I didn't expect it to actually happen, since I guessed it would not be that trivial to implement in drivers.
 
Vince said:
While I should have learned my lesson when questioning anything that could possibly be construed as not pro-ATI ;),...

Nah, you should have learned your lesson when making wholly irrelevant analogies, ignoring a year's worth of evidence....

I find your comment to be quite obtuse.

And I find your arguments to be quite irrational and impractical....but they've been exposed as such already by others...so I have no reason to respond further. :(

Except this one particular gripe you have:

His customers don't "happen" to have that hardware as if God created it on the 8th day and destined them to have it as if some determinist means were responsible.

True. His customers bought nVidia hardware for a variety of reasons. His points (as they pertain to "nVidia owners"), since you haven't figured it out:

1) First and foremost, unless you are part of the very small portion of the "100 million" nVidia owners who own a DX9 "featured" card, the DX9 paths have no relevance to you. You just use the DX8 path like everyone else, and we have no issues with nVidia hardware on the DX8 path.

2)...if his customers bought nVidia hardware to play Half-Life, that's not a particularly good choice.

3)...If his customers are buying nVidia cards because of supposed DX9 performance and features touted by nVidia, and bluffed via cheating benchmarks....they bought it for the wrong reason.

4) If his nVidia FX customers are happy with the DX8 gaming performance and quality on every other game it's offered to date, Half-Life 2 will be just fine.
 
Vince said:
You're right, it's nonsensical to ask why he's criticizing an IHV for making him work to produce a product that allows him to expand his userbase by 100 million.

Well, first, the 100M quote hardly pertains to their DX9 series – he's happy to optimise for the GeForce 4 series, and the benchmarks indicate that they actually perform admirably in relation to NVIDIA's DX9 parts. That optimisation for the mainstay of the installed base is done and it's there.

I'd say that one of the issues they were addressing is the fact that the benchmark data out there doesn't really correlate to the actual DX9 performance of NVIDIA's DX9 parts. Without accurate data these boards are going to become more and more widespread, and developers will be forced to make these optimisations, which are costly – why should they make these optimisations when people can buy hardware that doesn't require any optimisation and runs DX9 and HLSL fine? It only costs them money to support boards that have been misrepresented in benchmarks.

There are other issues at stake here as well. One is the direct issue with sales of the HL2 game – if a user purchases it and then finds out that it performs half as well on some levels where NVIDIA haven't optimised the shaders, then many may return the game saying it runs like shit (hence the "Our users will be pissed"). Valve's users are also other games developers, who may look at the HL2 engine for their own game – these titles may not be as popular (or may not contain benchmark modes), and when these developers start looking at the performance of the engine on the FX series (without being aware of the "pre-NVIDIA-optimised" performance) they are likely to blame the engine.

I think these statements from Gabe are about getting a level of awareness out there of what the genuine DX9 performance of the FX series is, and about pointing out to their users that when they download new shaders from Steam, or when game devs use their engine, a huge performance disparity between what the benchmarks say and what actually happens isn't an issue with the engine, but an issue with the way the hardware is designed and its interaction with DX9.
 
Well, the real pisser is that some of us have done "benchmarking as best as we damn well can" but inevitably we're just pawns in this whole thing. The best thing that has happened out of all of this is that it's been raised to such a level that users are aware of it, OEMs are aware, MS is aware, and now AAA developers such as Valve are willing to speak out in such a forthright manner.

I was sitting next to Dave Salvator during the presentations and I think both of us were a little relieved when he spoke in such a manner.
 
DaveBaumann said:
There are other issues at stake here as well. One is the direct issue with sales of the HL2 game – if a user purchases it and then finds out that it performs half as well on some levels where NVIDIA haven't optimised the shaders, then many may return the game saying it runs like shit (hence the "Our users will be pissed"). Valve's users are also other games developers, who may look at the HL2 engine for their own game – these titles may not be as popular (or may not contain benchmark modes), and when these developers start looking at the performance of the engine on the FX series (without being aware of the "pre-NVIDIA-optimised" performance) they are likely to blame the engine.

Excellent point--I had not considered it from the view of Valve selling the Source engine to other developers. Certainly, they are Valve's customers, too.
 
bdmosky said:
I need proof that isn't based on your damn speculation or anyone else's. Speculation is what it is... and if that's all you've got to show, then you can either call it what it is, or keep it to yourself. YES, Nvidia cheated in 3DMark 2003 with certain drivers. This was proven through systematic testing procedures. Now, whether or not Nvidia is willing to admit that is a whole different matter. Is Nvidia still cheating in 3DMark 2003 with current drivers? I don't know! Why? Because as far as I've read, it can no longer be systematically shown it is doing this. Same thing goes for the Detonator 50 series. No proof was offered and that is what is pissing me off. I really, really hate how many people will abuse this word. To prove something as fact, it takes systematic reasoning. You keep skirting the issue here by throwing in all sorts of marketing gimmicks and personal emotions, but that isn't proof, so stop pretending it is. I expect more out of the Beyond3D community than this crap and I should hope you would too. If Valve isn't willing to offer proof for us, then I think we as a community need to test these drivers in Half-Life 2 so that the issue CAN be resolved despite Valve's opinion on the matter, just like the Reverend already said.

Sorry, but your problem is that you don't know what constitutes proof and what doesn't. Valve is putting its credibility on the line and doing it *before* their game goes on sale--that's enough proof for me. I think the proof is grossly abundant over the past 10 months--but not of course if you don't wish to see it. I'd say Valve has been extremely systematic in its approach to nV3x hardware over the past several months.

Face it: you are not in a position to prove or disprove anything yourself. All you can do is rely on the word of people who are in such a position, like Valve. If Valve has not "systematically" proven the issues to their satisfaction, I'll eat my hat....;)
 
Look, what Gabe did I think was despicable; there is already plenty of evidence to point users in the direction of ATI products, which is why my last 2 cards have been such. I really have more of a problem with what he did and how it looks than I can adequately describe. As for personal attacks on him, like that he needs a haircut, I mean geez, that was pretty minor, and as OpenGL guy said, if it walks like a duck... oh wait, I think he is still a man.
 
DaveBaumann said:
Well, the real pisser is that some of us have done "benchmarking as best as we damn well can" but inevitably we're just pawns in this whole thing. The best thing that has happened out of all of this is that it's been raised to such a level that users are aware of it, OEMs are aware, MS is aware, and now AAA developers such as Valve are willing to speak out in such a forthright manner.

I was sitting next to Dave Salvator during the presentations and I think both of us were a little relieved when he spoke in such a manner.

Honestly, I rather think that the investigative approach you take to benchmarking allows you to rise above the level of being a pawn. In fact, it is the only way to avoid becoming one. Some web sites remind me of Pilate when he said "What is truth?" They are the pawns, in my view. Trying to determine the truth, instead of merely asking what it might be and making no determination, is what sets B3D apart, IMO.
 
bdmosky said:
Perhaps I'm also a little disappointed with the lack of clarity of the allegations "implied" by Gabe Newman, on the behalf of Valve as well. I find it just a little irresponsible to imply cheating by Nvidia without positively identifying the driver version or offering absolute proof of the misbehavior.

FYI, Gabe (Newell) listed a number of optimisations that have been seen for benchmarks so far; these are just a list of general issues that have been present in a variety of benchmarks and drivers - he's bringing awareness of the types of optimisations that have been seen already. We have proven without much doubt that most of these have occurred with 3DMark and UT2003 alone - the one that comes as a surprise and warrants further investigation is the screen grab issue.

Those issues did not specifically pertain to the Det50's and HL2, but I have the impression that the presentation he gave was coloured by the fact that these drivers were dropped to the press and to them the day before ATI's Shader Day, and that they had discovered fog was missing from the driver. (And FYI, Matt's 3DGPU Aquamark preview shows that colour precision has probably been dropped for Aquamark in the 50's.)
 
LOL! Scott from The Tech Report was asking about the existence of such a thing whilst there!

Sadly there's no price yet, and I suspect it will be fairly pricey.
 
Sxotty said:
Look, what Gabe did I think was despicable; there is already plenty of evidence to point users in the direction of ATI products, which is why my last 2 cards have been such. I really have more of a problem with what he did and how it looks than I can adequately describe. As for personal attacks on him, like that he needs a haircut, I mean geez, that was pretty minor, and as OpenGL guy said, if it walks like a duck... oh wait, I think he is still a man.

He wasn't trying to point out that ATi was rendering faster with greater precision. If that had been true, he would have put up some frame-rate bar graphs and said nothing else. He was attempting to educate his customer base on exactly why there is such a DX9 performance differential between competing IHV products in HL2. The frame-rate bar graphs were only a small fraction of his overall presentation.

To characterize this as pimping for ATi is inaccurate and unfair. What he was doing was pimping for the benefit of Valve, as he did not want anyone to think that Valve had deliberately slanted its software to favor ATi. Of course, some are going to think that regardless...;) But it is far better for him to get that out of the way prior to shipping the game than for him to try and explain it after the game ships. Far better, from Valve's perspective, as they would have had to explain it either way.

I think what you are not looking at is the fact that Newell wanted to *explain* the bundling deal with ATi--explain why Valve made a deal with ATi. Had he not made this presentation, and simply shipped the game, then people would have *wrongly* assumed that Valve slanted its software in favor of ATi for the sake of the deal. He wanted it known that was not the case at all. He distinctly wanted it known that Valve spent 5x as much time on the nV3x code path as they did on the DX9 code path, which is what R3x0 uses in the game by default. For some reason this is not registering with some people--people who should be clapping Valve on the back and congratulating them for the herculean effort they made for nVidia's nV3x hardware when developing the game. It's irrational to expect that Valve can turn nVidia's hardware into something it's not.
 
WaltC said:
He wasn't trying to point out that ATi was rendering faster with greater precision.
Are they? I am honestly not sure. Like, if nv3x is using only fp16 then obviously yes, but since it is mixed mode, who knows.


WaltC said:
To characterize this as pimping for ATi is inaccurate and unfair. What he was doing was pimping for the benefit of Valve, as he did not want anyone to think that Valve had deliberately slanted its software to favor ATi.
Look I don't think they deliberately tried to make nvidia look bad in the actual development as it would be shooting themselves in the foot. I personally think that he was pissed at nvidia b/c he had to take forever to optimize for them and b/c he saw them cheating so it made him seem pissed and grumpy.

WaltC said:
I think what you are not looking at is the fact that Newell wanted to *explain* the bundling deal with ATi--explain why Valve made a deal with ATi.
I thought they bid on it and ATI won and this was why they were bundling it.

WaltC said:
Had he not made this presentation, and simply shipped the game, then people would have *wrongly* assumed that Valve slanted its software in favor of ATi for the sake of the deal.
Perhaps this is so but I am not sure how helpful it will be for this.

WaltC said:
He wanted it known that was not the case at all. He distinctly wanted it known that Valve spent 5x as much time on the nV3x code path as they did on the DX9 code path, which is what R3x0 uses in the game by default.
The way they talk about this is confusing to me. It is like one day in August they put an nv3x card in and were like, "OMG it is slow, how could this happen, it never crossed my mind as a possibility, what a tragedy, what shall we do." Really, that is how it comes across to me.

WaltC said:
For some reason this is not registering with some people--people who should be clapping Valve on the back and congratulating them for the herculean effort they made for nVidia's nV3x hardware when developing the game. It's irrational to expect that Valve can turn nVidia's hardware into something it's not.

In this you are 100% correct; Valve deserves applause for working to make the nv30 work well and look good at the same time.
 
bdmosky said:
I need proof that isn't based on your damn speculation or anyone else's. ....

http://www.driverheaven.net/articles/aquamark3/index3.htm

Now, I'm sure most of you have read Gabe's recent comments regarding the Detonator 51.75s, and Nvidia's official response, but I really do have to say, having seen this first hand, it confirms to both myself and Veridian that the new Detonators are far from a high-quality IQ set. A lot of negative publicity is currently surrounding Nvidia, and here at DriverHeaven we like to remain as impartial and open minded as we possibly can, but after reading all the articles recently, such as here, coming from good sources, and experiencing this ourselves first hand, I can no longer recommend an Nvidia card to anyone. I'll be speaking with Nvidia about this over the coming days and if I can make anything public I will.

Like I said, Bdmosky, the proof is abundant everywhere--you might say it's ubiquitous...it has been for months...
 
The way they talk about this is confusing to me. It is like one day in August they put an nv3x card in and were like, "OMG it is slow, how could this happen, it never crossed my mind as a possibility, what a tragedy, what shall we do." Really, that is how it comes across to me.

I asked Gary Mactaggart when they first used an FX and he said it was just before E3 (presumably an NV35, given the timing), and he said that when they put it in, it just failed to render in DX9. It's at that point they realised they'd have some work to do.
 
Sxotty said:
Are they? I am honestly not sure. Like, if nv3x is using only fp16 then obviously yes, but since it is mixed mode, who knows.

The DX9 full precision code is not mixed, and taking the same DX9 code path as ATi the 5900U is 1/3 to 1/2 as fast. It's nVidia's *recommendation* to every developer that a mixed mode, nV3x code path be created in games to support their hardware (called an "optimized code path" in nVidia's parlance.) nVidia does much better in framerates using the non-mixed DX8.1 code path for the game, but still not as good as R3x0 does with the DX9 code path. There is no R3x0 code path in the game.
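To make that concrete, here's a rough HLSL-style sketch (purely illustrative, my own made-up shader, not anything from Valve's code) of what the full-precision DX9 path looks like next to the kind of partial-precision (FP16) variant nVidia asks developers to write:

    // Generic DX9 (PS 2.0) code path: everything at full precision (float).
    // Hypothetical specular term, just to show the idea.
    float4 SpecularFull(float3 normal : TEXCOORD0, float3 halfVec : TEXCOORD1) : COLOR
    {
        float nDotH = saturate(dot(normalize(normal), normalize(halfVec)));
        return pow(nDotH, 32.0) * float4(1.0, 1.0, 1.0, 1.0);
    }

    // The sort of hand-tuned nV3x "optimized" variant nVidia recommends:
    // the same maths declared with half (FP16) registers wherever the
    // precision loss is judged acceptable.
    half4 SpecularPartial(half3 normal : TEXCOORD0, half3 halfVec : TEXCOORD1) : COLOR
    {
        half nDotH = saturate(dot(normalize(normal), normalize(halfVec)));
        return pow(nDotH, 32.0) * half4(1.0, 1.0, 1.0, 1.0);
    }

Multiply that kind of per-shader judgement call across every shader in the game and you can see where Valve's 5x development time went.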


Look I don't think they deliberately tried to make nvidia look bad in the actual development as it would be shooting themselves in the foot. I personally think that he was pissed at nvidia b/c he had to take forever to optimize for them and b/c he saw them cheating so it made him seem pissed and grumpy.

Agreed.

I thought they bid on it and ATI won and this was why they were bundling it.

Nope--according to Newell--he said verbatim that ATi was chosen as bundling partner for the game by Valve because of the R3x0 product line and its performance with the DX9 code path. The "bid" spin seems to have originated in a circulated email which nVidia has purportedly sent out to some of its employees.


The way they talk about this is confusing to me. It is like one day in August they put an nv3x card in and were like, "OMG it is slow, how could this happen, it never crossed my mind as a possibility, what a tragedy, what shall we do." Really, that is how it comes across to me.

That's not the impression I got--at all. The slide illustrating 5x the development effort on the nV3x code path compared to the DX9 code path pretty much eliminates that possibility. You should look at the slides he presented during the presentation.

In this you are 100% correct; Valve deserves applause for working to make the nv30 work well and look good at the same time.

Yep....with the caveat that it's understood they were able to make it work as well and look as good as possible under the circumstances....

But their final conclusion after all that work was pretty interesting...they felt simply approaching the nV3x from the standpoint of DX8 was really the optimal path to take for the cards, from their point of view as developers.
 
The mixed mode doesn't just refer to some of it being forced to FP16; some shaders are also dropped down to PS1.4 rather than PS2.0. Also, some operations like vector normalisation will be done via cubemaps on the NV30 path (burning texture reads/writes and bandwidth) whereas the HLSL path will use maths within the PS (burning ALU cycles).
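For anyone wondering what the cubemap trick actually looks like, here's a minimal illustrative sketch (my own names, not the Source engine's shaders) of the two ways to normalise a per-pixel vector:

    // ALU route (what the straight HLSL/DX9 path compiles to): do the maths.
    float3 NormalizeALU(float3 v)
    {
        return v * rsqrt(dot(v, v));    // equivalent to normalize(v)
    }

    // NV30-path route: burn a texture fetch instead. NormCube is assumed to be
    // a cubemap whose texels store pre-normalized direction vectors packed
    // into the 0..1 range at load time.
    sampler NormCube;

    float3 NormalizeCube(float3 v)
    {
        return texCUBE(NormCube, v).rgb * 2.0 - 1.0;    // unpack back to -1..1
    }

Same result, to within the cubemap's resolution, but one costs ALU cycles and the other costs a texture fetch and bandwidth, which is exactly the trade-off between the HLSL path and the NV30 path described above.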
 
Toaster

I recently bought a toaster; the man at the corner of the street from whom I purchased the toaster promised me that the toaster would not only allow me to play Half-Life 2 at 2048 x 1536 at a constant 120 FPS, but it would also double the length of my penis. I trust the man at the corner of the street because I have no proof that he is lying. Why would he lie to me?

Gabe recently told me that Half-Life 2 would not run on my toaster. He told me that even if he spent 5x as long optimizing a special toaster path, it would still not run on my toaster. Gabe is clearly a liar and a lazy bum; had he actually cared about the 3 BILLION people who own toasters, he could have easily made Half-Life 2 run on it. Gabe is clearly anti-toaster; he obviously has an axe to grind.

By the way, have you seen that guy's hair? I refuse to trust any man who has more important things to do than to keep his hair in perfect condition. I personally get a haircut every day because I have nothing better to do.

Anyways, I'm not worried. I went to talk to the man at the corner of my street again, and he assured me that the toaster would be able to run Half-Life 2 without any problems using the secret Breadanator 50 drivers he has in his trench coat. He of course did not let me see the Breadanator 50 drivers, but I trust him because I have no proof that he would lie to me.

La la la – I can’t hear you – la la la.
La la la – There is no reason to think that the man at the corner of the street would lie to me – la la la.
La la la - there is no proof that my toaster can not run Half-Life 2 at 2048 x 1536 – la la la.
La la la – the Breadanator 50 will fix everything – la la la.

This is clearly an anti-toaster conspiracy.
 