The usual funny nonsense from THG...

Heh imagine if ATI had done the same thing as NV and said "Yes our next card will be fully DX9 compliant" whilst making something that was extremely fast at DX8/8.1 but could only just about handle DX9 if written a certain way. We would all be screwed.... HL2, TRAOD, Halo, Vamp 2, etc etc - none of them would be playable with full features on any card.

Three cheers for ATI!!
 
Your whole post is silly.

Borsti said:
Well, the problem is that we need to find out what happened in the background between NV and Valve.
Why? What does this change? And why do you feel something "happened" that affects all this, especially when we've seen the same results from every true PS/VS 2.0 DX9 app or benchmark? Although I have no evidence, I'm sure it's nothing more than NVIDIA trying to use its influence to tell Valve what to do, especially given the way the NVIDIA press release was written. But again, how does this lead you to the conclusions in your article? Why do you lay the burden of proof on Valve just because of a few words from NVIDIA, which has been involved in a number of scandals recently?

Borsti said:
One explanation could be:

- Valve found out that the performance with NV3x is bad
- They optimized but still no chance
- NV has no more options
- They published this info to let everybody know
This is what appears to have happened, judging by the info released and by comparing the results to the original 3DMark scores, TR:AOD, AquaMark, etc.

Borsti said:
One other could be:

- They want something from NVIDIA
- NVIDIA did not react or they rejected it
- Valve published the info to put pressure on NV
What could Valve possibly want from NVIDIA? And how would publishing this out of spite, because NVIDIA refused them something, benefit Valve?

Borsti said:
The first one would be my choice in a good and beautiful world. But we don't live in such a world. If so, why did Valve choose an ATI event for the announcement? Valve wants to make money and the Source engine means a lot of money for them. It would be a very bad idea on their part to scare all owners of NV cards. They would not buy the game. And every developer would have the fear that using the Source engine means to crowd out customers with NV cards.
I think you're living in an imaginary world. In the past there wasn't such strong allegiance to particular hardware among developers, EDITORS, and gamers. People simply bought, followed, and developed for the best. What's funny is the double standard you hold ATI to; why not complain to NVIDIA & EA about the TWIMTBP program?

Borsti said:
I think the second one is also not correct. But it's not impossible. I believe something happened in the background between NV and Valve. I would like to know what it is. We all know that ATI cards run better when it comes to DX9 shaders. But there are also ways to improve shader code for FX cards. And that's not only using FP16 instead of FP32. There's a lot more in shader design/compiling you can do to let it run better on the CineFX architecture. It does not make sense to me that a 9600P runs faster than a 5900U with optimized code. I think you also saw the CineFX article at 3D Center... Gabe explained in his presentation that he will only code PS2 shaders in the future. Other game developers explain that they chose a different way. Since there's no need to code everything in 2.0. An engine must also cover older cards to get a larger customer range... etc.
Simply put, the 9600 has true PS 2.0 support and the FXs don't. How to code something is a developer's choice. Funny how when ATI cards have problems with certain games and need patches, it's automatically deemed bad hardware regardless of whose fault it is. But when NVIDIA ships bad hardware and gets bad performance, then the sky is falling, not enough was done, something happened between NVIDIA and the developer, ATI paid them off, etc. This shows you to be really slow, stupid, or simply biased.
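For what it's worth, the FP16-vs-FP32 tuning mentioned above looks roughly like this in HLSL. This is a hypothetical sketch, not Valve's actual shader code: on NV3x, full-precision `float` math runs much slower than `half`, so marking values as half precision (which compiles to the `_pp` partial-precision modifier in ps_2_0) is one of the per-vendor optimizations in question:

```hlsl
// Full-precision version: what a straight DX9 path produces (FP32 everywhere).
float4 SpecularPS(float3 N : TEXCOORD0, float3 H : TEXCOORD1) : COLOR
{
    float s = pow(saturate(dot(normalize(N), normalize(H))), 32.0f);
    return float4(s, s, s, 1.0f);
}

// Partial-precision version: 'half' maps to FP16 / the _pp instruction
// modifier, which NV3x executes considerably faster. Visually identical
// here, since a specular highlight doesn't need 24 bits of mantissa.
half4 SpecularPS_pp(half3 N : TEXCOORD0, half3 H : TEXCOORD1) : COLOR
{
    half s = pow(saturate(dot(normalize(N), normalize(H))), 32.0h);
    return half4(s, s, s, 1.0h);
}
```

On ATI's R300 hardware both versions run at the same speed (it computes everything at FP24), which is part of why this work only pays off on one vendor's cards.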

Borsti said:
I would like to get some more detailed info from Valve on what they did in the mixed mode and confirmation that there's really no hope for owners of NV cards, meaning no more optimization is possible. That's why I say: it's up to Valve to find a solution together with NV. It will help them to sell more copies of their game as well. If they don't want to invest more time (=money) in the optimization of the code, then it's up to NV to pay the price. I can't tell people: Hey, you need to pay $300 bucks again to go with ATI and throw away your FX card. They ask: Are you sure? I can only say right now: Well, it looks like it... but is "it looks like" really enough? HL2 is not out yet. So why should we shock NV card owners if it's not sure yet?
$300 isn't needed; get a 9600 Pro. And again, this is all NVIDIA's fault for lying and using cheating drivers to hide true performance anyway.

Borsti said:
NVIDIA pays the price for how they handled the situation over the past months. It might be entertaining for some readers to see the bashing of NV now. But it won't help people who already own an FX card. It's much more the time to find explanations and to find solutions. That's what I mean with that sentence in the conclusion. Valve and NV have to sit together and try to find a solution. Yes, it's much more up to NV to get a solution, but it will help Valve as well (in sales numbers). In the end, the customers will benefit from such a solution.

I'm pretty sure that the performance numbers as we see them right now are not the end of the story.

Lars
You're right: NVIDIA will continue to use its influence on people such as yourself to point the blame at those telling the truth, in an attempt to discredit them, while it comes up with better cheats in its drivers.
 
Bottom line: HL2 is the first game that actually follows and uses a lot of the DX9 API, and FX cards suck in DX9. Either blame NVIDIA or blame MS for writing an API that was too difficult for NVIDIA to follow, but in no way should Valve be blamed.

BTW, the correct answer is simple: the FX series of cards sucks, and the only way NVIDIA can make them playable in new games is to write drivers for each specific game that lower IQ and take choice away from the consumer.

Why doesn't NVIDIA just produce game-specific drivers and be done with it, since this is obviously their way of thinking?

Maybe "HalfLife2 v.3" drivers or "Doom3.9" drivers. NVIDIA seems to think it can push out crappy hardware and then speed things up in driver releases.
 
I haven't a clue if it was me or not. I may have heard it somewhere else and then passed it on...
 
swanlee said:
Bottom line: HL2 is the first game that actually follows and uses a lot of the DX9 API, and FX cards suck in DX9. Either blame NVIDIA or blame MS for writing an API that was too difficult for NVIDIA to follow, but in no way should Valve be blamed.
It's not about it being "too difficult" for nVidia to follow. It has nothing to do with that. It's about producing a spec that is simply contrary to the abilities of the processor (which is true when talking about the NV30-NV34...the NV35 fares much better).

With the rest of it, nVidia needs to put a lot of time and effort into putting forth good compilers for the NV3x architecture. To put it simply, it's a VLIW instruction architecture. The performance of a VLIW architecture depends very heavily upon the compiler.

If anything, the most damning thing about DX9 for the NV3x architecture is the intermediate assembly. The NV3x would have a much greater capacity for performance with compilation (through the graphics drivers) directly to machine code. After all, with VLIW, machine code typically has control over far more than is exposed in assembly.

So, put another way, I think:
A. Developers should focus solely on using HLSL or another high-level language.
B. Microsoft's next version of DirectX should do away with assembly shader programming altogether.
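A sketch of what (A) and (B) amount to in practice, assuming the standard DX9 SDK toolchain. The shader below is illustrative, not from any actual game; the point is that one HLSL source can be compiled to whichever profile suits the hardware, instead of shipping hand-written ps_2_0 assembly that locks in the intermediate representation:

```hlsl
// One HLSL source, compiled per target profile with the DX9 SDK's fxc:
//
//   fxc /T ps_2_0 /E DiffusePS shader.hlsl   // generic DX9 path
//   fxc /T ps_2_a /E DiffusePS shader.hlsl   // profile tuned for NV3x-class hardware
//
// The compiler (or, ideally, the driver itself) handles instruction
// selection and scheduling for the target, which is exactly where a
// VLIW-style architecture like NV3x gets its performance.

float4 lightColor;

float4 DiffusePS(float3 N : TEXCOORD0, float3 L : TEXCOORD1) : COLOR
{
    float ndotl = saturate(dot(normalize(N), normalize(L)));
    return ndotl * lightColor;
}
```

This is also why compiling directly to machine code in the driver, skipping the assembly stage entirely, would help NV3x most: the assembly profiles expose less than the hardware can actually schedule.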
 