nVidia's response to Valve

(ripped from http://www.nvnews.net/)

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - of which reviewers currently have a beta version. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers.

Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
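
For context on the FP32/FP16 paragraph above: in DX9 HLSL the partial-precision hint is the 'half' type. Below is a minimal sketch of what that looks like, assuming a ps_2_0 compile target; every name is made up for illustration and nothing here is from HL2.

// Minimal DX9 HLSL pixel shader sketch showing where partial precision
// is requested. Compile target ps_2_0, e.g. fxc /T ps_2_0 /E DiffusePS.

sampler2D DiffuseMap;                 // hypothetical texture

struct PSIn
{
    float2 uv       : TEXCOORD0;      // texture coordinates stay full precision
    float3 normal   : TEXCOORD1;      // interpolated surface normal
    float3 lightDir : TEXCOORD2;      // interpolated light vector
};

float4 DiffusePS(PSIn i) : COLOR
{
    // Colour math tolerates FP16, so it is declared 'half'. The compiler
    // marks these operations with the _pp (partial precision) modifier in
    // the ps_2_0 code it emits. NV3x can then execute them at FP16, while
    // R3x0 ignores the hint and runs everything at its native FP24, so
    // image quality on ATI hardware is unchanged.
    half3 albedo = tex2D(DiffuseMap, i.uv).rgb;
    half3 n = normalize(i.normal);
    half3 l = normalize(i.lightDir);
    half  ndotl = saturate(dot(n, l));
    return float4(albedo * ndotl, 1.0);
}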

And this from Uttar:

http://www.notforidiots.com/GPURW.php

nVidia has released the response as seen in the link. Particularly interesting, however, is this part of the e-mail sent to certain nVidia employees (this was not posted at the given link):

We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPUs, and until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.

You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.
 
As someone else said on another forum (I've been reading a lot), I also find it strange that Nvidia makes a point of mentioning their 100 million+ customers twice in their PR statement. It's almost like an attempt to remind Valve of the size of their customer base.
 
Screen-capture detection by the driver, which then applies some form of IQ improvement to the output, is considered 'good', and this was found during the development of the game ITSELF, by VALVE themselves.

This gets funnier all the time
 
Unbelievable. Nvidia yet again stoops to imply that the reason NV3x runs PS 2.0 like molasses on a cold day on the dark side of Pluto is after all only because ATI has paid off first Futuremark and now Valve with their...wait for it.....one MILLION dollars!

(Never mind that Nvidia paid the same membership fee to Futuremark, and spends many many millions more on cross-marketing deals with every other game publisher on the planet.)

Meanwhile, I found it interesting to see at [H] that Gabe didn't start his PowerPoint presentation until Sept. 8, just a couple of days ago. And by all accounts even ATI was very surprised to hear what came out of his mouth at what was supposed to be a presentation on shader techniques.

Apparently something Valve found in the Det 50's did not make Gabe very happy...

[EDIT: realized that the dark side of Pluto is necessarily at night, not in the day.]
 
Well, let's wait for confirmation on that. As Simon said in another thread, it could be an artifact of downconversion.

I can also easily believe fog getting dropped due to a bug.

I don't expect the 5900 to perform at the level of the 9800, but if Valve can hand-optimize shaders and tweak that much performance out of them, then obviously there is something seriously wrong with Nvidia's driver optimization in handling shaders and precision hints.

I still expect an assembly language programmer to beat a C compiler, but if the ratio were 2:1, I'd have serious questions about the C compiler's quality.


Valve should have just let NVidia do all the shader fixing with detection, or worked with Nvidia engineers on a "port", instead of doing it themselves. NVidia has a huge customer base and is already embarrassed by their performance in HL2, and I bet they would dedicate tremendous resources to making HL2 run better on their cards.

Hopefully, when HL2 comes out, we'll be able to analyze the PS2.0 shaders (and the NV3x-specific ones) and see if Nvidia's drivers are missing obvious optimizations.
 
I fail to see how drivers can miraculously make up for the fact that ATI 9500+ cards have twice as many shader units as NV3x cards in a shader limited game. Am I missing something here while reading every nvidia site say "just wait for the 50.xx drivers"? What legitimate optimization could be done to overcome this fact of architecture? :?
 
How can they say they don't know why Valve didn't use the Det 50 drivers?

Gabe Newell came out and said that he thought Nvidia had gone too far with their optimisations and that he was pissed at them about it.

And I fail to see it as well, Natoma.

Though I can see illegitimate drivers doing it :p
 
DemoCoder said:
Well, let's wait for confirmation on that. As Simon said in another thread, it could be an artifact of downconversion.

I can also easily believe fog getting dropped due to a bug.
True, but what are the odds that the screencapture/detection/image quality improvement is a bug? :LOL:
 
I got $20 USD on certain sites using the Det 50s here real soon. 8)

Edit: I should add that while I personally would be reluctant to use these new drivers due to Gabe's comments, I probably would for comparison purposes against older Dets. What I wouldn't do is rush my own benchmark #s up tonight using just the 50s.
 
Why did Nvidia send out this response? God, it almost seems to me that one of their goals recently has been to alienate the developer community. Sorry Nvidia, I don't feel that it's my responsibility to keep you filled in on every decision my company makes. Perhaps Gabe feels the same. And if you really meant half of what you said in this response, it would have been directed towards Valve and not the public. Does that mean this post is hypocritical? Perhaps, but I think a company should be held to a higher standard than a lone developer.

And this smear tactic of saying ATI is paying Valve $2 million is ridiculous. While I don't know anything about this rumor, I know for a fact that Nvidia paid EA over twice that amount in a marketing deal. Did we hear ATI whining about that (and I'm sure they knew about a deal of that size)? The answer, as far as I know, is no.
 
You misunderstand what I am saying. I am not claiming that there is anything NVidia can do that will make the NV30 run shaders as fast as the R300 cards.

I'm saying that if hand-coding PS2.0 shaders to be NV3x specific can produce an almost 2x speedup for the NV3x, AND, if Valve is using DX9 HLSL with hinting, then something is seriously wrong with NVidia's driver.

NVidia's driver should be able to deal with PS2.0 + PP output from Microsoft's FXC compiler and do the necessary register packing, instruction reordering, and FP32->FP16 conversions (due to hints).
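
To put "register packing, instruction reordering, and FP32->FP16 conversions" in concrete terms, here is a sketch of hinted HLSL and the work the driver is left with; the shader is invented for illustration and isn't from any actual game.

// Hinted HLSL that FXC compiles to ps_2_0 with _pp modifiers on the
// 'half' arithmetic. All names are made up.

sampler2D SceneTex;
sampler2D BloomTex;

float4 CompositePS(float2 uv : TEXCOORD0) : COLOR
{
    half3 scene  = tex2D(SceneTex, uv).rgb;
    half3 bloom  = tex2D(BloomTex, uv).rgb;
    half  lum    = dot(scene, half3(0.299, 0.587, 0.114));
    half3 tinted = lerp(scene, half3(lum, lum, lum), 0.2);

    // Once FXC has flagged these ops with _pp, it is the driver's job to
    // (a) actually run them at FP16 rather than FP32 on NV3x, (b) pack
    // pairs of FP16 temporaries into single registers - the NV3x register
    // file is small enough that register count is widely reported to
    // dominate its shader throughput - and (c) reorder instructions so
    // texture fetches overlap ALU work.
    return float4(saturate(tinted + bloom * 0.5), 1.0);
}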

If it can't do it, then there are two conclusions

a) either Valve didn't include the hints

or

b) They included them, but the drivers fail to deal with them properly


So I'm not expecting Det50s to be miracles. I'm saying that currently, things look doubly bad for Nvidia: #1 bad DX9 HW, and #2 poor optimizations

I base my comments about poor optimizations by looking at how Cg compiles to NV_fragment_program2 under OpenGL. If they're doing such a bad job with the full shader source, imagine what they're doing with PS2.0 as input to the optimizer.
 
DemoCoder said:
You misunderstand what I am saying. I am not claiming that there is anything NVidia can do that will make the NV30 run shaders as fast as the R300 cards.

I'm saying that if hand-coding PS2.0 shaders to be NV3x specific can produce an almost 2x speedup for the NV3x, AND, if Valve is using DX9 HLSL with hinting, then something is seriously wrong with NVidia's driver.

NVidia's driver should be able to deal with PS2.0 + PP output from Microsoft's FXC compiler and do the necessary register packing, instruction reordering, and FP32->FP16 conversions (due to hints).

If it can't do it, then there are two conclusions

a) either Valve didn't include the hints

or

b) They included them, but the drivers fail to deal with them properly


So I'm not expecting Det50s to be miracles. I'm saying that currently, things look doubly bad for Nvidia: #1 bad DX9 HW, and #2 poor optimizations

I base my comments about poor optimizations by looking at how Cg compiles to NV_fragment_program2 under OpenGL. If they're doing such a bad job with the full shader source, imagine what they're doing with PS2.0 as input to the optimizer.

You forgot c): at the current quality level the hardware is maxed out and the drivers cannot squeeze any more performance out of it. A quality drop is required to gain more speed.

To me a quality drop means both lowering image quality and dropping the FP precision below what DX9 calls for, which Valve already did for the card.
 
jvd said:
You forgot c): at the current quality level the hardware is maxed out and the drivers cannot squeeze any more performance out of it. A quality drop is required to gain more speed.

To me a quality drop means both lowering image quality and dropping the FP precision below what DX9 calls for, which Valve already did for the card.

Well, partial precision isn't necessarily a "quality drop"; it depends on the shader. But my point is, I've looked at the output from Cg's compiler vs MS's, and it stinks.

I derive from this that NVidia's drivers aren't really doing that good a job of dealing with shaders. If Valve can get a near 2x performance boost from "mixed mode" (which presumably means using DX9 partial precision hints), but using mixed mode required 5x the labor, I suspect Valve was hand-tweaking shaders instead of letting NVidia's driver deal with the partial precision hints.

Presumably, if Valve wrote HLSL shaders with the "HALF" datatype, they would automagically run at full precision on ATI hardware, and potentially at HALF precision (FP16, if the driver deals with it properly) on NV3x hardware, and there would be no need to maintain two separate sets of shaders at all. The same HLSL shader would run at max quality on ATI HW, and FP16 on NV3x HW as the driver allows.

Let's wait and see what Mixed Mode looks like compared to PS2.0 Full, but I'd really like clarification from Valve on exactly what changes they had to make to the shaders themselves.

I don't think anyone at the current moment has enough data to say what the real problem is. We know the HW won't match up with an R300, but I'd also like to know why the hand tweak shaders get such a large boost.

(outside of the fact of using normalization maps)
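
On the normalization-map aside, for anyone who hasn't run into the trick: it replaces a per-pixel normalize() with a cube-map lookup. A sketch follows, assuming a cube map whose texels store normalized directions packed into 0..1 RGB; all names are invented.

samplerCUBE NormCube;    // assumed: texels hold normalize(dir) * 0.5 + 0.5
sampler2D   NormalMap;

float4 BumpPS(float2 uv : TEXCOORD0,
              float3 lightDir : TEXCOORD1) : COLOR
{
    half3 n = tex2D(NormalMap, uv).rgb * 2 - 1;   // tangent-space normal

    // ALU version:      half3 l = normalize(lightDir);
    // Cube-map version: one texture fetch plus a decode, trading ALU work
    // for a texture lookup - reported to be a win on NV3x, where texture
    // ops are cheap relative to FP arithmetic.
    half3 l = texCUBE(NormCube, lightDir).rgb * 2 - 1;

    half ndotl = saturate(dot(n, l));
    return float4(ndotl, ndotl, ndotl, 1.0);
}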
 
At what point does missing fog become a driver bug or a deliberate removal to gain a few more fps? Some may remember the comparison reviews with the 5800 Ultra vs the 9700 Pro where yet again it was shown that fog was missing. (I believe it was HardOCP that demonstrated it.)
 
THe_KELRaTH said:
At what point does missing fog become a driver bug or a deliberate removal to gain a few more fps? Some may remember the comparison reviews with the 5800 Ultra vs the 9700 Pro where yet again it was shown that fog was missing. (I believe it was HardOCP that demonstrated it.)

The point when it gets put back in at no performance cost.

* Edit Quote Fix *
 
Dean said:
As someone else said on another forum (I've been reading a lot), I also find it strange that Nvidia makes a point of mentioning their 100 million+ customers twice in their PR statement. It's almost like an attempt to remind Valve of the size of their customer base.

It's about the silliest part of their response--which was all pretty silly, IMO. Problem is, the issues Gabe talked about have to do with DX9, which rules out 99% of the chips nVidia's sold in the last decade anyway--so what's the point. The point with respect to Valve is that nVidia's having great difficulty making ONE DX9 chip, let alone 100M of them...:D

I would imagine ATi has sold a similar number of chips since 1988, but what that has to do with DX9 support in HL2 is apparently an enigma to which only nVidia has the answer... :rolleyes:
 
John Reynolds said:
I got $20 USD on certain sites using the Det 50s here real soon. 8)

Edit: I should add that while I personally would be reluctant to use these new drivers due to Gabe's comments, I probably would for comparison purposes against older Dets. What I wouldn't do is rush my own benchmark #s up tonight using just the 50s.

I think it would be OK to use them provided nVidia officially releases them for download from its website. OTOH, I don't think using a non-officially released driver set fits with Derek P's many reminders to nVidia users to "only use officially released drivers"...:)

How convenient, though, for nVidia to tell everyone that the 50's, which are not publicly available to anyone, are the "solution" whereas the current driver set (which is the one officially available) is "invalid" for HL2. But wait--it's obvious that HL2 is not a DX9 game because nVidia's been advertising nV3x as "DX9" for several months and--but wait--Valve says that HL2 *is* a DX9 game--but wait--how can it be that nVidia's current DX9 drivers for its nV3x DX9 products are invalid for the DX9 HL2 game?---But wait....:)
 
I wish I could benchmark Half Life 2 with both my 5800 Ultra and my 9800 Pro, esp. since I have the 3.7 Cats and the 51 Dets.
 
Joe DeFuria said:
(ripped from http://www.nvnews.net/)

And this from Uttar:

http://www.notforidiots.com/GPURW.php

nVidia has released the response as seen in the link. Particularly interesting, however, is this part of the e-mail sent to certain nVidia employees (this was not posted at the given link):

We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPUs, and until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.

You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.

You know, if this is something nVidia's been sending its employees--my gosh--somebody at nVidia has gone bonkers and thinks that nVidia employees, potential nVidia 3d-card customers, and the online community are pure-T idiots....

Of course, that's why nVidia loves non-officially released betas--so that whenever they get caught cheating they can deny it as a "bug"--because what else can you expect from a beta...? OTOH, nVidia maintains that this buggy beta is good enough to declare their current officially released DX9 drivers "invalid" for HL2. Also, according to Newell's presentation as I read it, he directly stated that the reason ATi got the deal was its superior DX9 technology--not because nVidia did not "bid." (Shades of xBox2.) What Newell said, as I gathered it, is that nVidia wasn't asked to "participate," which ostensibly is why nVidia "elected" not to. I got the distinct impression from Newell that if the positions of the IHVs had been reversed from the hardware standpoint, nVidia would have gotten the deal--because Valve based it on the best hardware for their software. And he amply demonstrated which hardware was best, I thought.

I wonder how many nVidia employees would actually believe this kind of an email (assuming it was actually sent out)...? But who knows...maybe in addition to providing a coat room for its employees nVidia also asks them to check their minds at the door each morning, as well...
 
Walt, it's beginning to look like nVidia employees are the only potential nVidia customers left. After all, I'm sure they get a good employee discount. ;)

In all seriousness, I look forward to hearing Gabe's response to this. IMHO nVidia picked the wrong guy to have a pissing match with.
 