Gabe Newell on Graphics companies

Stryyder

E-mail to Gabe and his response

I sent this mail to the CEO of Valve Software because I'm a real Nvidiot who wants to play HL2 in full detail with his new GeForce 6800 Ultra from Asus. You can read what he sent me back; it's an ATI commercial. But that's my opinion.


I have just one or maybe two questions about the cooperation with ATI.

1. Was it a clever move to team up with one of the big graphics companies?

The point of this question is that you joined with ATI, but there are fans who are really Nvidia-minded. And those people are going to play Half-Life 2 too. Are you going to tell those people to buy an ATI card because those cards are a (better) choice for this game? People just won't buy a card for one game (engine), and I'm one of those people.

2. What's going to be the solution for this clash of the graphics cards?

I know that Nvidia plays the same ugly game too, but why must the gamer be the one who suffers? My opinion is that the developers of games are the ones who can stop this behaviour from vendors like Nvidia and ATI, because if a developer loses part of his fanbase, his business is in big trouble.

This is what he sent me back.


I was a long term NVIDIA fan. All of my machines had NVIDIA graphics cards for home and work back to the TNT. The last hardware survey I checked, about 70% of our users had NVIDIA hardware.

However their DX9 generation of cards were very slow running DX9 games. ATI's hardware was very fast at DX9. ATI's hardware was the best solution for performance, image quality, and functionality and not by a small amount.

Our customers win and Valve wins when there is great fast hardware out there. In the past, NVIDIA set the pace. Now it's ATI. In the future, it may be NVIDIA again or maybe Intel or maybe somebody we've never heard of.

Will we somehow unfairly disadvantage NVIDIA owners to benefit ATI? Of course not. We've spent a lot more time trying to get NVIDIA's hardware to run faster than ATI's. Will we be really clear to gamers what we think is the best current hardware out there? Of course.

Our fundamental responsibility is to the people who pay the bills - the gamers who buy our games.
__________________
AMD XP 3200
1 gig ddr 400 dual channel
Audigy2 Platinum EX
Nvidia GeForce4 Ti 4400 (NV40 soon)

Sorry, I don't buy ATI, I just dislike those things.

Link: http://www.halflife2.net/forums/showthread.php?t=27817

Thoughts
 
Everything he said about video cards is common knowledge. Re: FX vs R3xx, it truly was no contest. Anyone who's been paying attention since the days of 3DMark03 knows that the FX cards are, umm... "far from ideal" for running shader-heavy programs.

NV4x vs R4xx is another story. But it appears that HL2 won't use Shader Model 3.0 until it's widely supported by more cards in the hands of more gamers. Therefore, the main deciding factors are PS 2.0 support, DX9 driver quality, and the speed/quality of AA.

I'm sure any NV40 will handle whatever HL2 throws at it very well. But I expect the R420 will handle it better.
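
For anyone wondering what "PS 2.0 support" actually means on the developer side, here's a rough sketch (my own illustration, not anything from Valve) of the Direct3D 9 caps check a game could use to decide whether a card is eligible for the full DX9 shader path:

#include <windows.h>
#include <d3d9.h>

// Illustrative only: asks Direct3D 9 whether the default adapter reports
// pixel shader 2.0 support, the main thing that separates the full DX9
// path from the fallback paths.
bool SupportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps) != D3D_OK)
        return false;

    // PixelShaderVersion packs major/minor; D3DPS_VERSION(2, 0) builds the same encoding.
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}

Of course, a caps check only tells you the card can run PS 2.0 shaders, not that it can run them fast, which is the whole NV3x problem.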
 
I agree.

NV40 vs R420 is a whole different story compared to last generation's R3x0 vs NV3x.

One thing is for sure, IMO: the performance difference between NV40 and R420 in Half-Life 2 will be A LOT smaller than their performance difference in DOOM 3.
 
Really don't see anything wrong with his comments at all, just seems like common knowledge to me.
 
PowerK said:
One thing is for sure, IMO: the performance difference between NV40 and R420 in Half-Life 2 will be A LOT smaller than their performance difference in DOOM 3.

With or without AA?
 
I also believe him when he mentions the work put into making NV hardware perform competitively with ATI hardware. It's not just Gabe doing this; it's also Carmack and, I'm sure, other less vocal developers.

To sell the most software, it must run on as many platforms as possible. That means not only the most current hardware from ATI and NVIDIA, but all the old stuff from Matrox and Intel and whatever other goofy setups are out there. If a developer created a DX9-only game right now, their sales would be abysmal and their customer perception would be worse. Nobody wants to buy a $40 game and have it either completely fail to run or run in such a degraded state as to be unplayable...

There aren't any (good) developers out there who ignore an entire hardware market segment, or purposely write bad code to spite a certain IHV. It would be like Pirelli making tires that refuse to be installed on any American manufactured vehicle; it's bad business practice.

But I also believe that if a certain prominent IHV is making your life miserable, you're allowed to call them out. If Pirelli were having a hell of a time making tires for a certain popular automobile company because it uses horrible stock camber adjustments that consistently wear the tires wrong, I think Pirelli is allowed to say so out loud.

As mentioned before, we all know the truth behind the NV3x and the R3x0 line, so why is it so hard to swallow? The NV40 is doing great, so there shouldn't be any more worries.
 
Someone drop Gabe another email asking about the performance of NV40 and R420? ;) It'll be interesting to hear from him, because the last time he mentioned the performance of the new generation of cards, R420 was said to have an edge.
 
PowerK said:
I agree.

NV40 vs R420 is a whole different story compared to last generation's R3x0 vs NV3x.

One thing is for sure, IMO: the performance difference between NV40 and R420 in Half-Life 2 will be A LOT smaller than their performance difference in DOOM 3.

You've played HL2 and Doom 3 on NV40 and R420 already?
 
Off on a slight tangent....

Well, this isn't what we're looking for, but I found it interesting.

Gary McTaggart said:
5600 runs HL2 at 5-10fps with ps_2_0 in common maps; 5600 runs HL2 at 30-40fps with ps_1_1 in the same map. This is running at 1024x768. The ps_2_0 shader may be a little bit more complex in this case, but not much. I could push down the resolution to get a decent framerate, but why, if it runs great as ps_1_1 hardware?
(from the directxdev mailing list)

At least it gives us an idea of what to expect with lower-end hardware. I shudder to think what those poor souls with FX 5200 "DX9" cards are gonna experience...
No worries for me, though. :D
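
To make that quote concrete, the decision being described is roughly the one sketched below. This is a hypothetical illustration (the function name, the enum, and the forceDx8Path flag are mine, not Source engine code): keep the ps_2_0 path where it's fast, and drop FX-class cards to the ps_1_1 path where it isn't.

#include <windows.h>
#include <d3d9.h>

// Hypothetical sketch of a per-card shader path decision, in the spirit of the
// McTaggart quote above. Nothing here is actual engine code.
enum ShaderPath { PATH_DX9_PS20, PATH_DX8_PS11, PATH_FIXED_FUNCTION };

ShaderPath PickShaderPath(const D3DCAPS9& caps, bool forceDx8Path)
{
    if (!forceDx8Path && caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9_PS20;       // full DX9 path
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8_PS11;       // the ps_1_1 fallback the 5600 runs at 30-40fps
    return PATH_FIXED_FUNCTION;     // DX7-class hardware
}

The interesting part is the forceDx8Path override: going by the numbers in that quote, a 5600 is better off treated as ps_1_1 hardware even though its caps say it can do ps_2_0.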
 
Unit01 said:
Wow Gabe Newell is CEO of VALVE?
You, sir, get a cookie for making me laugh on the most boring day I've had in months.

PREF
ID=0c2bbc3f26415ff6:LD=en:TM=1089472339:LM=1089472339:S=_jie4nGiA4gMo0PO
google.co.uk/
1536
2618878336
32111634
4035259712
29648528
*

^ yours ^
 
Gabe said:
Will we somehow unfairly disadvantage NVIDIA owners to benefit ATI? Of course not. We've spent a lot more time trying to get NVIDIA's hardware to run faster than ATI's. Will we be really clear to gamers what we think is the best current hardware out there? Of course.

Now WTF is that supposed to mean? :?
 
LeStoffer said:
Gabe said:
Will we somehow unfairly disadvantage NVIDIA owners to benefit ATI? Of course not. We've spent a lot more time trying to get NVIDIA's hardware to run faster than ATI's. Will we be really clear to gamers what we think is the best current hardware out there? Of course.

Now WTF is that supposed to mean? :?

What's unclear about it? They spent more time optimizing for nvidia hardware than they did for ATI. Most likely because the FX series has issues with DX9.
 
AlphaWolf said:
What's unclear about it? They spent more time optimizing for nvidia hardware than they did for ATI. Most likely because the FX series has issues with DX9.

No, I meant what is up with them 'trying to get NVIDIA's hardware to run faster than ATI's'? Any decent game developer tries to get the best performance and image quality out of a given hardware generation and card. You don't try to get card A 'to run faster' than card B.

I don't care if NV3x is only half as fast as R3x0 in shader-limited areas; I just don't want a game developer to have as a goal getting one card 'to run faster' than another. A case of bad wording from Mr. Gabe? I hope so!
 
LeStoffer said:
AlphaWolf said:
What's unclear about it? They spent more time optimizing for nvidia hardware than they did for ATI. Most likely because the FX series has issues with DX9.

No, I meant what is up with them 'trying to get NVIDIA's hardware to run faster than ATI's'? Any decent game developer tries to get the best performance and image quality out of a given hardware generation and card. You don't try to get card A 'to run faster' than card B.

I don't care if NV3x is only half as fast as R3x0 in shader-limited areas; I just don't want a game developer to have as a goal getting one card 'to run faster' than another. A case of bad wording from Mr. Gabe? I hope so!

Ok, I think you are misunderstanding. They spent more time optimizing for NVIDIA than they spent optimizing for ATI; the goal wasn't to make NVIDIA's hardware run faster than ATI's.
 
I'm a real Nvidiot who wants to play HL2 in full detail with his new GeForce 6800 Ultra from Asus

If the performance of HL2 is so important to him, why doesn't he wait until the game is benchmarked before buying that card? The "Sorry, I don't buy ATI, I just dislike those things" comment demonstrates how rational his thought process is.


[edit] Reworded as he hasn't bought the 6800 Ultra yet.
 