FP16 and market support

The 5200 runs Half-Life 2 - it may run it slowly, but that's beside the point - if you want performance, buy a more expensive card, not a value/entry-level card.

I believe it was mentioned that the 5200s will be running HL2 on a DX8 path, as will the 5600. The 5900 will be running it with _pp (partial precision).

How is a card that devs are defaulting to a DX8 path affecting the coding for DX9?
 
AlphaWolf said:
The 5200 runs Half-Life 2 - it may run it slowly, but that's beside the point - if you want performance, buy a more expensive card, not a value/entry-level card.
I believe it was mentioned that the 5200s will be running HL2 on a DX8 path, as will the 5600. The 5900 will be running it with _pp.
I believe Valve said the 5900 will be running with a mixed DX8/DX9 mode.
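For anyone unfamiliar with the notation: _pp is the DX9/HLSL partial-precision hint, which lets the driver evaluate a shader instruction at FP16 instead of full FP32. Here is a toy sketch of the per-card tiering described above - the card-to-path mapping follows the posts, but the function and profile strings are purely illustrative, not Valve's actual code:

```python
# Toy sketch of per-card codepath tiering as described in the thread.
# The mapping follows the posts above; everything else is illustrative.
RENDER_PATH = {
    "GeForce FX 5200": "dx8",        # defaulted to the DX8 path
    "GeForce FX 5600": "dx8",        # likewise DX8
    "GeForce FX 5900": "dx9_mixed",  # mixed DX8/DX9, with _pp (FP16) hints
}

def shader_profile(card: str) -> str:
    """Pixel-shader profile an engine might compile for a given card."""
    path = RENDER_PATH.get(card, "dx9_full")  # unknown DX9 cards: full precision
    return {
        "dx8": "ps_1_4",
        "dx9_mixed": "ps_2_0 with _pp hints",
        "dx9_full": "ps_2_0",
    }[path]

print(shader_profile("GeForce FX 5200"))  # ps_1_4
```

The point of the sketch is just that the tier is chosen per card by the developer, which is exactly why the argument below is about what developers will allow, not what the hardware can nominally execute.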
 
radar1200gs said:
The only FUD around here is coming from the ATi fanboys - "FP16 is not supported by DX9", "Dawn is a DX9 showcase"...

The 5200 runs Half-Life 2 - it may run it slowly, but that's beside the point - if you want performance, buy a more expensive card, not a value/entry-level card.
No, it's not beside the point.
No developer is going to develop DX9 content to run on the GFFX 5200. It's simply TOO FREAKING SLOW.

Ergo, your argument about marketshare (if even true) is what is beside the point: developers are treating the 5200 like a DX8 video card, because that's the only way they CAN treat it if they want people to play their games.
Do you get it now, or do you want me to draw you a DX9 surface to show you where the point is?
 
The 5200 is capable of running HL2 at DX9 levels if the developer allows it to.

I'm sure that if Valve hardcodes 5200 & 5600 to DX8 with no option for the user to alter it, some enterprising person will devise a patch that tells the game the card is really a 5900 or whatever.
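The "patch" idea above would work because an engine typically chooses its codepath from the PCI vendor/device ID the driver reports. A minimal sketch of that mechanism - the hex IDs and the table are made up for illustration, not real detection code:

```python
# Hypothetical sketch: codepath selection keyed on a reported PCI device ID.
# IDs and table are invented for illustration; real detection tables differ.
PATH_BY_DEVICE_ID = {
    0x1111: "dx8",  # stand-in ID for a 5200: forced to the DX8 path
    0x2222: "dx8",  # stand-in ID for a 5600
    0x3333: "dx9",  # stand-in ID for a 5900: allowed the DX9 path
}

def pick_codepath(reported_id: int) -> str:
    """Codepath chosen from whatever device ID the game is told."""
    return PATH_BY_DEVICE_ID.get(reported_id, "dx8")

# A spoofing patch simply changes the ID the game sees:
print(pick_codepath(0x1111))  # dx8 (5200 detected honestly)
print(pick_codepath(0x3333))  # dx9 (5200 reporting itself as a 5900)
```

Note that spoofing the ID changes only which path is compiled, not how fast the card actually runs it - which is the crux of the disagreement that follows.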
 
radar1200gs said:
The 5200 is capable of running HL2 at DX9 levels if the developer allows it to.
Ok, talk about missing the point.
Sure, it can run it. But if it's TOO SLOW, then no developer will have it run in DX9, because people who own one will COMPLAIN that it's slow and cause more problems.
Or are you gonna be satisfied with 5 fps?
And if your solution is "buy another card", as you said earlier - you've just proved my point that the 5200 will not/should not/cannot be considered a DX9 card for gaming.
 
radar1200gs said:
The 5200 is capable of running HL2 at DX9 levels if the developer allows it to.
That's nice, care to back it up somehow? :rolleyes:

edited bits: "roll" for "rolleyes", there needs to be a universal emoticon system...
 
And again you completely miss the point.

I never said it was incapable of running the code; I (and others) merely pointed out that developers are not treating the 5200 as a DX9 card. Therefore it is not having the effect you claim.

<edit> damn, you guys are fast. :p
This post is obviously replying to radar1200gs.
 
radar1200gs said:
If I had $6 million from ATi in my bank account, I wouldn't be treating their competitors products fairly either...
How, exactly, is the 5200 unfairly treated when Valve defaults to a DX8 codepath so as not to totally castrate its performance in HL2?
 
Althorin: What part of "entry level" or "value" product do you not understand?

Baron: you need to read what I originally posted:
The 5200 is capable of running HL2 at DX9 levels if the developer allows it to.

I'm sure that if Valve hardcodes 5200 & 5600 to DX8 with no option for the user to alter it, some enterprising person will devise a patch that tells the game the card is really a 5900 or whatever.
 
radar1200gs said:
If I had $6 million from ATi in my bank account, I wouldn't be treating their competitors products fairly either...

Just like the $4 million paid by nVidia for Doom 3, eh?

Yes, I know it's not the same thing. My point is that giving money to a developer to have their game bundled with all of your new products does not mean they are screwing the other IHV - unless radar1200gs has the proof...
 
radar1200gs said:
Althorin: What part of "entry level" or "value" product do you not understand?
What part of TOO SLOW TO RUN DX9 EFFECTIVELY do you not understand?

If DEVELOPERS don't code with the 5200 as a DX9 part, then for DX9 marketshare purposes IT ISN'T one.
Do you comprehend that that defeats your comment:
Developers will almost always code for what is the most popular hardware, not what is the technically superior hardware. There are more nVidia owners than ATi owners, even in the DX9 class.

By your own admission, the 5200 isn't DX9 class, because according to you (you've said it three or so times now), anyone who wants to play a DX9 game shouldn't own the 5200.
 
To the best of my knowledge, nVidia hasn't used John Carmack to attack the competition at any recent nVidia sponsored events, unlike ATi, Gabe Newell and the ATi shader (shady) day.

By the way has anyone managed to prove so much as one point that Gabe brought up about the Release 50 Detonators so far?
 
radar1200gs said:
To the best of my knowledge, nVidia hasn't used John Carmack to attack the competition at any recent nVidia sponsored events, unlike ATi, Gabe Newell and the ATi shader (shady) day.

By the way has anyone managed to prove so much as one point that Gabe brought up about the Release 50 Detonators so far?
holy irrelevance, batman!

*aussie jungle guide voice*
"Today, we are stalking the elusive radar1200gs. When cornered, he spits out absurd off-topic rants as a smokescreen. Don't be fooled, folks - ignore the FUD and ask again."
"Ah, here he is in his pseudo home, pretending to be part of a discussion! A rare shot of him cornered and sputtering! You'll want to remember this one, folks!"
*/aussie jungle guide voice*
 
Althornin said:
radar1200gs said:
Althorin: What part of "entry level" or "value" product do you not understand?
What part of TOO SLOW TO RUN DX9 EFFECTIVELY do you not understand?

If DEVELOPERS don't code with the 5200 as a DX9 part, then for DX9 marketshare purposes IT ISN'T one.
Do you comprehend that that defeats your comment:
Developers will almost always code for what is the most popular hardware, not what is the technically superior hardware. There are more nVidia owners than ATi owners, even in the DX9 class.

By your own admission, the 5200 isn't DX9 class, because according to you (you've said it three or so times now), anyone who wants to play a DX9 game shouldn't own the 5200.

Actually, if you re-read my posts, I stated that no one who wants to play DX9 games at a high performance level should buy the 5200. The 5200 will play the games, just not at high performance levels, since it is a value-oriented part.
 
radar1200gs said:
To the best of my knowledge, nVidia hasn't used John Carmack to attack the competition at any recent nVidia sponsored events, unlike ATi, Gabe Newell and the ATi shader (shady) day.

By the way has anyone managed to prove so much as one point that Gabe brought up about the Release 50 Detonators so far?

Dave has made this point many different times. The "attack" on nVidia at ATi's shader day was not meant, nor planned, by anyone at ATi; Gabe did that of his own free will. If you have an issue with it, take it up with Gabe. Also, IIRC, all of the cheats he posted have been reported in detail many, many times (lowering IQ, shader replacement, clip planes, etc.), except for the HL2 fog issues and screen grabs (and I hope you can understand why we, the public, could not verify the HL2 ones)...
 
radar1200gs said:
Actually, if you re-read my posts, I stated that no one who wants to play DX9 games at a high performance level should buy the 5200. The 5200 will play the games, just not at high performance levels, since it is a value-oriented part.

Prove it.
All the benchies I've seen indicate it won't play them at *playable* rates, period.
Getting 17 fps at 640x480 in TR:AOD is NOT "playing the game", sorry.
 
No, of course not; they just release PDF files trying to smear the competition, brought up the whole Quack issue, tried to bury Futuremark, forced Eidos to remove a benchmark mode from a game, etc.

So, stating the facts:

1. When you force the GFFX series to run at full precision, it is slower than even a 9500 Pro in some cases.

2. You have to hand-code your game to get acceptable performance, whereas you do not have to do that with ATi's offerings.

3. nVidia also tried to smear ATi with accusations that UT, Aquamark, and various programs are being cheated in, when it has been clearly stated that the UT issue is a bug (ATi and Epic said so) and that Aquamark was being looked into, as they were not sure of the reason... all the while nVidia is FORCING LOWER IQ for speed in just about all of the popular applications...

4. Did you know that in the majority of cases a Radeon 8500 is FASTER than a 5200, not to mention that the 9100 is as well?

radar1200gs said:
To the best of my knowledge, nVidia hasn't used John Carmack to attack the competition at any recent nVidia sponsored events, unlike ATi, Gabe Newell and the ATi shader (shady) day.

By the way has anyone managed to prove so much as one point that Gabe brought up about the Release 50 Detonators so far?
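On point 1 above, forcing full precision matters because _pp (FP16) is both faster on the GFFX and less accurate: half floats carry only a 10-bit mantissa. Python's standard struct module supports the IEEE 754 half format ('e'), so the rounding is easy to demonstrate:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision (FP16)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_fp16(0.1))     # 0.0999755859375 - only ~3 decimal digits survive
print(to_fp16(2049.0))  # 2048.0 - integers above 2048 are no longer exact
```

FP32, by contrast, stores 0.1 as roughly 0.100000001, so a shader forced to full precision trades speed for that extra accuracy - which is exactly the trade the GFFX series handles poorly.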
 
radar1200gs said:
Althornin said:
radar1200gs said:
Althorin: What part of "entry level" or "value" product do you not understand?
What part of TOO SLOW TO RUN DX9 EFFECTIVELY do you not understand?

If DEVELOPERS don't code with the 5200 as a DX9 part, then for DX9 marketshare purposes IT ISN'T one.
Do you comprehend that that defeats your comment:
Developers will almost always code for what is the most popular hardware, not what is the technically superior hardware. There are more nVidia owners than ATi owners, even in the DX9 class.

By your own admission, the 5200 isn't DX9 class, because according to you (you've said it three or so times now), anyone who wants to play a DX9 game shouldn't own the 5200.

Actually, if you re-read my posts, I stated that no one who wants to play DX9 games at a high performance level should buy the 5200. The 5200 will play the games, just not at high performance levels, since it is a value-oriented part.
Dude, no one will play a game that gets 1-3 fps...NO ONE!

It's like you're trying to argue with me that, with the proper wrapper/driver, my daughter's Voodoo3 could run HL2 in DX9 mode. It might just be technically possible, but there ain't no one who is going to do it except for the challenge of the technical achievement.

It will be an unplayable slideshow, capiche?
 
radar1200gs said:
The 5200 is capable of running HL2 at DX9 levels if the developer allows it to.

I'm sure that if Valve hardcodes 5200 & 5600 to DX8 with no option for the user to alter it, some enterprising person will devise a patch that tells the game the card is really a 5900 or whatever.

(attached benchmark graph: hl21.gif)


Yeah, it runs it all right: 8 fps. That's with a 2.8 GHz CPU, which is not entry-level at all. Imagine what it will run like with a 2 GHz CPU. That's pathetic on nVidia's part, and on yours for defending such a crap part. The 5600 should be in the price zone of the 5200, and even then I'd have trouble recommending it for any DX9 code.
 