The usual funny nonsense from THG...

WaltC said:
.. for HL2 and recommend its use over the 45.xx driver set, but yet not catch the fact that the 50.xx's have a "bug" which doesn't render fog properly (if at all) in HL2? I would presume that if they were able to optimize the 50.xx's for HL2 to the extent claimed--such that they disown their existing driver set for use with the game--they would have noticed the problem with fog prior to sending the drivers along to Valve with their recommendation to use those drivers.
..

As you can read above, Nvidia says that fog is rendered correctly on the 5900 but not on the 5600, although we don't know whether that's true at this moment. Well, maybe some of us do :)

But I agree with you about the invalid part. They should simply have said that they think the 45.xx drivers aren't representative of the performance that users will see when the game/new drivers are released. Much like Ati did when the Doom3 benchmarks were released.
 
DaveBaumann said:
Much like Ati did when the Doom3 benchmarks were released.

Did they actually release an official statement?

Hmm, I don't remember. I only remember that they commented on the benchmarks with something like "by the time Doom 3 is released, our drivers will be optimized..".
 
AFAIK they didn't say that ("optimized"), but they said "the D]|[ benchmark was running on machines provided by NVIDIA, tuned by NVIDIA, optimized by NVIDIA, and so on... ATI can't comment on anything due to these circumstances... ATI will be ready by the time D]|[ is released..." or something like that.
 
OpenGL guy said:
nonamer said:
THG has a point, believe it or not. HL2 benchmarks have currently left a bad taste in my mouth because the performance of the NV cards against the ATI cards in HL2 is at or below the theoretical minimum for a pure pixel shading test.

This doesn't make sense at all. Pixel shading tests have shown that there is a large performance difference; HL2 is no different.

Yes, but it shouldn't perform at the absolute worst it could, unless there's not a single other bottleneck in the game other than pixel shading. Then add in the refusal to use the Det51 drivers, which seem to improve PS performance significantly and, for a game like HL2, should improve performance greatly. The cheating issue makes no sense, why would they cheat? They couldn't possibly get away with it. Oh well, it's a beta piece of software anyway, so performance can't really be judged yet.
 
T2k said:
AFAIK they didn't say that ("optimized"), but they said "the D]|[ benchmark was running on machines provided by NVIDIA, tuned by NVIDIA, optimized by NVIDIA, and so on... ATI can't comment on anything due to these circumstances... ATI will be ready by the time D]|[ is released..." or something like that.

This is all that i found (Terry Makedon):

Anyways.... Doom III.
Interesting little game, even more interesting that reviews would appear on an unreleased game. All I can say at this point is that we have not had that particular benchmark before the review sites started using it. What a shame that people are getting to look at something that we haven't had a chance to play around with a bit.

Anyways please don't pay any attention to that benchmark. It is on an unfinished product that we have not looked at yet. Wait till it really comes out and we will be ready to rock.

This is rather funny:

Interesting little game, even more interesting that reviews would appear on an unreleased game.

Although to be fair, HL2 seems to be just weeks away, which wasn't exactly the case with Doom 3 at the time of those benchmarks.
 
Well, there doesn't appear to be much in the way of an official ATI statement - unlike NVIDIA's statement, which was mailshotted to just about every site. The circumstances are also different in that ATI hadn't received any D3 builds since E3, whereas Valve had been cooperating with NVIDIA.
 
nonamer said:
Yes, but it shouldn't perform at the absolute worst it could, unless there's not a single other bottleneck in the game other than pixel shading.

First, what makes you think that's worst-case? It may not be. Second, it's a bona fide fact that nVidia officially recommends game developers write customized nV3x code paths for nV3x hardware. nVidia even has a developer program set up to assist developers in doing that. nVidia's last statement, for instance, talks about them moving 2.0 shader code to 1.4. I think that 2.0 nV3x performance is sub-par, which is why nVidia is doing all of this.

Then add in the refusal to use the Det51 drivers, which seem to improve PS performance significantly and, for a game like HL2, should improve performance greatly. The cheating issue makes no sense, why would they cheat? They couldn't possibly get away with it. Oh well, it's a beta piece of software anyway, so performance can't really be judged yet.

So, if nVidia PR people like D. Perez constantly tell people to use only officially released nVidia drivers--and not betas--is Valve wrong in following that advice? Did Valve need to use an other-than-currently-shipping DX9 driver from ATi? This was a *beta* driver, not released to the public, and it has, at the very least, *bugs* in it--like fog dropping out in areas--that coincidentally just so happen to relate to performance. I see nothing unfair at all about using shipping DX9 drivers for shipping DX9 products for the DX9 code path in HL2 for the comparison.

It's beginning to look very bad for nVidia here--very much like every time a DX9 game is released, nVidia is going to have to release a new set of drivers which have been "optimized" for that game to get any kind of performance out of it at all under DX9. But it's probably not that bad, since most DX9 games will just drop nV3x back to the DX8.1 code path used by nv25--and so unless a game is a blockbuster like HL2 will be, nVidia will probably live with the 8.1 path for those games for the lifespan of nV3x, I would imagine.
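For illustration only, here's a minimal, purely hypothetical C++ sketch (my own names, not Valve's actual code) of the kind of per-card path selection this all boils down to:

```cpp
// Hypothetical sketch -- illustrative names only, not Valve's code.
enum class RenderPath { DX9_Full, DX9_MixedNV3x, DX8_1 };

struct GpuInfo {
    bool supportsPS20;  // pixel shader 2.0 support reported by the driver caps
    bool isNV3x;        // detected from the vendor/device ID, for example
};

RenderPath ChoosePath(const GpuInfo& gpu) {
    if (!gpu.supportsPS20)
        return RenderPath::DX8_1;          // pre-DX9 cards: PS 1.1/1.4 path
    if (gpu.isNV3x)
        return RenderPath::DX9_MixedNV3x;  // partial precision, some shaders dropped to 1.4
    return RenderPath::DX9_Full;           // full-precision PS 2.0 path
}
```

The point being that the DX8.1 fallback costs image quality and the mixed mode costs development time, which is exactly the trade-off being argued over here.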

As to "why would they cheat?" Come on--why does nVidia to this day defend the removal of full trilinear texturing in UT2K3? Frame-rate benchmark scores--that's what it's all about as far as nVidia's concerned these days.
 
DaveBaumann said:
Well, there doesn't appear to be much in the way of an official ATI statement - unlike NVIDIA's statement, which was mailshotted to just about every site. The circumstances are also different in that ATI hadn't received any D3 builds since E3, whereas Valve had been cooperating with NVIDIA.

Agreed. I guess that Ati wasn't that worried about final Doom3 performance. The same can't be said for Nvidia and HL2, though.
 
Bjorn said:
Although to be fair, HL2 seems to be just weeks away, which wasn't exactly the case with Doom 3 at the time of those benchmarks.

The intriguing aspect to me here is that when the nv35-promo-cum-D3-preview was published at a few select sites, the expectation of those sites, along with everyone else's, was that D3 was imminent and would ship by year's end at the latest. The phrase "near-final build" was bandied about to describe it. Of course we now know that the "D3 preview" was far closer chronologically to the release of nv35 than to Doom3. I'm satisfied that the D3 preview was in actuality merely a promotional marketing gimmick cooked up for nv35.

Even more interesting is that not only is HL2 imminent, but Valve is also releasing an HL2 benchmark to the community at the end of this month, which means that everything Valve has alluded to will be quite provable, in public, in just a couple of weeks. Contrast this with the D3 preview, which was not and will not be open to the public, and there's quite a different picture, IMO.
 
Well, the problem is that we need to find out what happened in the background between NV and Valve.

One explanation could be:

- Valve found out that the performance with NV3x is bad
- They optimized, but it still wasn't enough
- NV has no more options
- They published this info to let everybody know

One other could be:

- They want something from NVIDIA
- NVIDIA did not react or they rejected it
- Valve published the info to put pressure on NV

The first one would be my choice in a good and beautiful world. But we don't live in such a world. If so, why did Valve choose an ATI event for the announcement? Valve wants to make money, and the Source engine means a lot of money for them. It would be a very bad idea on their part to scare all owners of NV cards. They would not buy the game. And every developer would have the fear that using the Source engine means shutting out customers with NV cards.

I think the second one is also not correct. But it's not impossible. I believe something happened in the background between NV and Valve. I would like to know what it is. We all know that ATI cards run better when it comes to DX9 shaders. But there are also ways to improve shader code for FX cards. And that's not only using FP16 instead of FP32; there's a lot more in shader design/compiling you can do to make it run better on the CineFX architecture. It does not make sense to me that a 9600P runs faster than a 5900U with optimized code. I think you also saw the CineFX article at 3DCenter... Gabe explained in his presentation that he will only code PS2 shaders in the future. Other game developers say that they chose a different way, since there's no need to code everything in 2.0. An engine must also cover older cards to reach a larger customer range... etc.
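To give a tiny, hypothetical example of the FP16-versus-FP32 point (not code from HL2): in DX9 HLSL, the `half` type asks the compiler for partial-precision instructions, which the CineFX architecture handles much better than full FP32 math.

```cpp
// Hypothetical HLSL pixel shaders held as C strings; the only difference is the
// precision hint. 'half' maps to FP16/partial precision on NV3x, 'float' to FP32.
const char* kFullPrecisionPS =
    "sampler2D tex;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    float4 c = tex2D(tex, uv);\n"
    "    return c * c;\n"                    // all math at FP32 on NV3x
    "}\n";

const char* kPartialPrecisionPS =
    "sampler2D tex;\n"
    "half4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    half4 c = tex2D(tex, uv);\n"
    "    return c * c;\n"                    // FP16 where the visual result allows it
    "}\n";
```

Whether that, plus instruction scheduling for CineFX, is enough to close the gap to a 9600P is exactly the open question.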

I would like to get some more detailed info from Valve on what they did in the mixed mode, and confirmation of whether there's really no hope for owners of NV cards - meaning no more optimization is possible. That's why I say: it's up to Valve to find a solution together with NV. It will help them to sell more copies of their game as well. If they don't want to invest more time (=money) in optimizing the code, then it's up to NV to pay the price. I can't tell people: Hey, you need to pay another $300 to go with ATI and throw away your FX card. They ask: Are you sure? All I can say right now is: Well, it looks like it... but is "it looks like" really enough? HL2 is not out yet. So why should we shock NV card owners if it's not certain yet?

NVIDIA pays the price for how they handled the situation over the past months. It might be entertaining for some readers to see NV being bashed now, but it won't help people who already own an FX card. It's much more the time to find explanations and to find solutions. That's what I mean with that sentence in the conclusion. Valve and NV have to sit down together and try to find a solution. Yes, it's much more up to NV to come up with a solution, but it will help Valve as well (in sales numbers). In the end, the customers will benefit from such a solution.

I'm pretty sure that the performance numbers as we see them right now are not the end of the story.

Lars
 
Hey Lars,

There is an issue with those two scenarios: the former is pretty much the picture that Valve painted, while the latter is a theory that you appear to have put forth without any actual information. The manner in which they spoke of the ATI OEM deal was pretty much there to head this sort of thing off. I'm certain there is some behind-the-scenes friction between NV and Valve at the moment (and not necessarily because they "want" something), but it strikes me that Gabe already pretty much outlined what had happened, and it's close to your first scenario.

The other issue is that I actually don't think there are any solutions WRT the DX9 performance. What Gabe was saying is that as more is added to the engine, this will be DX9-only content, and thus the performance difference will get worse than it is with the mixed modes. HDR is a great example of this: it will force any board that it's enabled on to run exclusively in high-precision float (i.e. over the PS2.0 pipeline) - and because the FX series is so lacking in FP performance in the PS, the relative performance will be what we are seeing in the current DX9 results (or even worse). I think it's more a case of building that awareness, because they just aren't that great at floating-point rendering performance.
 
Dear Lars, no offense here, but I think it's STILL ridiculous that you think Valve has anything more to do on this issue after they SPENT A LOT OF TIME creating a SPECIAL MODE for the FX line, following NV's claims.

My guess, sorry to say this, is that you guys are just trying to keep up a nice face for both IHVs - in this case, especially the troubled one - under ANY circumstances, which is pretty obvious to everybody here but you, I think.

This situation, right here, right now, is CLEAR. All this yak-yak-yak seems to me like a weak attempt to justify your real intention (described above), even if it's well-intentioned.

FYI: that's why I gave this topic this title.

EDIT: That's the same reason I gave up reading THG a LONG, LONG time ago.
 
Borsti said:
Well, the problem is that we need to find out what happened in the background between NV and Valve.

One explanation could be:

- They want something from NVIDIA
- NVIDIA did not react or they rejected it
- Valve published the info to put pressure on NV

Come on, this is just complete speculation. The problem with this kind of blackmail (which is what you are describing) is that it only works as a threat. Once you actually reveal the information you are using as a threat, the blackmailer loses all power over the blackmailee.

You are suggesting here that Valve blackmailed Nvidia for something, and when Nvidia didn't give in, for pure revenge Valve issued a statement (which appears to be factually correct) that NV3x cards cannot run DX9 very well at all.

I'm sorry Lars, this is complete nonsense. You have posted no evidence of the incredible claim you make. You are simply throwing around vague and veiled accusations to try and deflect from the shattering (to Nvidia) statements that Valve has made.

Your statement reads just like the "leaked" internal email from Nvidia saying that ATI paid Valve for a cross-marketing deal, which then pointedly states that Nvidia of course has no evidence that this means ATI paid Valve to make Nvidia cards perform poorly, or that Valve took a bribe to do so.

How about this instead:

1. Nvidia wanted something from Valve.
2. Valve did not react or they rejected it.
3. Valve published the info in order to take away the power for Nvidia to continue to demand things from them.

Evidence? Why should you of all people need any evidence?
 
nonamer said:
Yes, but it shouldn't perform at the absolute worst it could, unless there's not a single other bottleneck in the game other than pixel shading. Then add in the refusal to use the Det51 drivers, which seem to improve PS performance significantly and, for a game like HL2, should improve performance greatly.
You mean it seems to improve performance in HL2 and Aquamark 3.
The cheating issue makes no sense, why would they cheat?
Because it's to their benefit to cheat?
They couldn't possibly get away with it.
That hasn't stopped them from abusing 3D Mark 2003, UT2003, etc.
 
Borsti said:
The first one would be my choice in a good and beautiful world. But we don't live in such a world. If so, why did Valve choose an ATI event for the announcement? Valve wants to make money, and the Source engine means a lot of money for them. It would be a very bad idea on their part to scare all owners of NV cards. They would not buy the game. And every developer would have the fear that using the Source engine means shutting out customers with NV cards.

Exactly. Therefore, if Valve were misrepresenting something, it would have behooved them to say nothing until some point after the game shipped. By talking about these things prior to shipment, knowing it could possibly hurt their sales through no fault of their own, they have proven they are releasing the info without bias, IMO.

As to why it was released at an ATi event, that's simple. Valve has a bundling deal with ATi for HL2, and they wanted to make it plain to people why Valve chose ATi to be its bundling partner. I thought Gabe did a good job of explaining it.

I think the second one is also not correct. But it's not impossible. I believe something happened in the background between NV and Valve. I would like to know what it is. We all know that ATI cards run better when it comes to DX9 shaders. But there are also ways to improve shader code for FX cards. And that's not only using FP16 instead of FP32; there's a lot more in shader design/compiling you can do to make it run better on the CineFX architecture. It does not make sense to me that a 9600P runs faster than a 5900U with optimized code. I think you also saw the CineFX article at 3DCenter... Gabe explained in his presentation that he will only code PS2 shaders in the future. Other game developers say that they chose a different way, since there's no need to code everything in 2.0. An engine must also cover older cards to reach a larger customer range... etc.

However, it is not Valve's fault in any way, shape, or form that nV3x is in the state it's in with respect to 2.0 shader support. The choice of things like shaders is the choice of the developer. One developer will certainly disagree with another, but neither is wrong regarding the choice it makes for shader support in its own software. The only problem in this case is that nV3x doesn't do 2.0 very well. Also, Gabe was speaking of HL2 and not pretending to speak for anybody else's software, so what other developers choose to do is irrelevant to the state of HL2.

I would like to get some more detailed info from Valve on what they did in the mixed mode, and confirmation of whether there's really no hope for owners of NV cards - meaning no more optimization is possible. That's why I say: it's up to Valve to find a solution together with NV. It will help them to sell more copies of their game as well. If they don't want to invest more time (=money) in optimizing the code, then it's up to NV to pay the price. I can't tell people: Hey, you need to pay another $300 to go with ATI and throw away your FX card. They ask: Are you sure? All I can say right now is: Well, it looks like it... but is "it looks like" really enough? HL2 is not out yet. So why should we shock NV card owners if it's not certain yet?

What confirmation would you expect other than what has already been given? It seemed pretty concrete to me. I think Gabe enumerated Valve's future intentions very clearly with respect to what it will do with HL2 going forward.

NVIDIA pays the price for how they handled the situation over the past months. It might be entertaining for some readers to see NV being bashed now, but it won't help people who already own an FX card. It's much more the time to find explanations and to find solutions. That's what I mean with that sentence in the conclusion. Valve and NV have to sit down together and try to find a solution. Yes, it's much more up to NV to come up with a solution, but it will help Valve as well (in sales numbers). In the end, the customers will benefit from such a solution.

The people who own FX cards should be looking to nVidia, not Valve, to solve whatever problems they have with nV3x. Secondly, regarding nVidia's handling of the situation over the last months, there has been ample information published throughout the Internet about the state of nV3x hardware, DX9, shader support, etc., beginning with the FutureMark exposé. Perhaps some readers were misled by inaccurate information circulating at some web sites during that time which painted a much rosier picture for nV3x than it deserved? I think that is certainly true. So maybe those people who bought nV3x cards on the advice of some of those web sites should look not only to nVidia but also to the web sites which hyped those products with glowing recommendations? Just a thought.

I'm pretty sure that the performance numbers as we see them right now are not the end of the story.

Lars

I wholeheartedly agree--just as I agree we haven't yet seen the bottom for IQ, either.
 
@Dave,

you're right. All I want to say is that there "might" be hope for owners of NV cards. The benchmarks and the presentation from Valve show pretty clearly what your choice is at the moment if you want to buy a new card. But I want to find out if there's really no chance for owners of NV cards. HL2 is not out yet, and it's time to find out if there's a solution and what it could look like. I don't care about the IHV behind it. I care about the people who spent their hard-earned money on a product.

It reminds me a little of the A-Class fiasco in Europe years ago, where the Mercedes rolled over in the "elk test". That was a horror story for Mercedes, but it was also the time to put pressure on the company to find a solution and to think about what that solution could look like (different tires, ESP).

It's pretty clear that NV has to do the work for a solution. But it won't be possible without Valve.

Do you feel safe at the moment telling owners of NV cards that the performance numbers won't change and that they should throw away their cards?

@Bouncing Zabaglione Bros.
I am not saying it happened that way, nor am I suggesting anything. I thought this is a forum, and a forum normally means that you can discuss, read, learn, think, and hear different opinions.

Lars
 
WaltC said:
Perhaps some readers were misled by inaccurate information circulating at some web sites during that time which painted a much rosier picture for nV3x than it deserved? I think that is certainly true. So maybe those people who bought nV3x cards on the advice of some of those web sites should look not only to nVidia but also to the web sites which hyped those products with glowing recommendations? Just a thought.

Interesting that you should say that. Take a look at this Guru3D thread from a new FX5900 Ultra owner that I spotted in my referrals list. As a reviewer, this is the type of response that really concerns me - after all, what are we here to do? Give consumers an accurate representation of what they can expect.
 
Borsti said:
Do you feel safe at the moment telling owners of NV cards now that the performance numbers won't change and that they should throw away their cards?
Lars

It has NOTHING to do with your preposterous statement regarding necessary steps from Valve.

You're still trying to paint a better picture in favor of one IHV - and I'm still asking: why?

EDIT: typo
 
Borsti said:
Do you feel safe at the moment telling owners of NV cards now that the performance numbers won't change and that they should throw away their cards?

I think that there will unquestionably be more performance to be gained from more "mixed modes" programming; however, that will only bring limited gains, especially when there are further updates and other developers use this engine.

The real question is whether there is much more to be gained in terms of more "pure" DX9 performance, and now I'm not so sure. We've seen a multitude of benchmarks and cases now that just go to indicate that the PS2.0/floating-point performance of the FX series just isn't too hot - having looked at NV30 specifically, I think it's clear that it was woefully underpowered in float performance in relation to what I knew about R300 at that point. After having spoken with John Spitzer at ECTS, I genuinely expected that there could be some real performance gains for NV35 with the Det50s; however, these gains look to be very limited in real terms, and you have to say that if they were going to make large performance strides, then these should have been found by now - they have had the hardware for half a year (and were simulating it before then).

However, for me Gabe's presentation at Shader Day wasn't actually the most eye-opening - it was Eric's presentation that really interested me, because his R300 shader diagram actually shows that for some operations the R300 shader core can handle twice as many ops per cycle as we had previously thought. To put this in context: with the changes from NV30 to NV35, this puts the float performance of one NV35 pipe at its best at more or less the number of ops of one R300 pipe at its worst, and R300 has twice as many pipes (this is oversimplified).
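To put that in back-of-the-envelope numbers (normalised, assumed units only, following the simplification above):

```cpp
// Back-of-the-envelope only: relative, assumed units -- not vendor specs.
#include <cstdio>

int main() {
    // Working assumption from the post above: one NV35 pipe at its *best* pushes
    // roughly as many float shader ops per clock as one R300 pipe at its *worst*,
    // and R300 has twice as many pipes.
    const double opsPerPipePerClock = 1.0;  // deliberately equal, relative units
    const int nv35Pipes = 4;
    const int r300Pipes = 8;

    const double nv35 = nv35Pipes * opsPerPipePerClock;
    const double r300 = r300Pipes * opsPerPipePerClock;
    printf("Relative float throughput per clock: NV35 %.0f vs R300 %.0f (~%.1fx)\n",
           nv35, r300, r300 / nv35);  // roughly 2x before clocks or drivers enter the picture
    return 0;
}
```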
 