More 3DMark from ExtremeTech

It's not even remotely a *fair* review. Not to mention they don't know what they are talking about. How can you say it's a fair review when in the conclusion they post this???
Given the difficulty of programming in DX8, the greatly improved development tools for DX9, and the fact that this PS 1.1 vs. PS 1.4 issue goes away in DX9 thanks to PS 2.0, it's unlikely that we'll ever see that many games using PS 1.4.

This benchmark does overemphasize PS 1.4, and this probably wasn't the best way to go for all three tests. It might have been better to start with a PS 1.1 test, then move on to a PS 1.4/1.1 test (as they do now), and then finish with a PS 2.0 only test. This would more accurately model how games are developed now, and will be built in the future.
The above statement is complete poppycock. It shows that ExtremeTech has no actual freaking clue about PS 1.4, PS 2.0, or DX9, or what game developers are going to do.

And I am frankly sick of sites doing nothing but propagandizing Nvidia's misleading, one-sided bull****.
 
Mmm... but it's rather meaningless, as they're using Nvidia's cheat drivers. I'll bet the NV scores will be similar to the 42.63 driver results when the tests are performed with WHQL drivers. With that in mind, I expect the NV30 WHQL drivers will take a tad longer than usual to appear.
 
Dave,

Why isn't ATI sending the rather large list of PS 1.4-supported games out to all these sites? They are just sitting by while Nvidia systematically propagandizes the entire internet.

Why can't this list be publicly posted??? Why should it be a secret?
 
Hellbinder[CE] said:
Dave,

Why isn't ATI sending the rather large list of PS 1.4-supported games out to all these sites? They are just sitting by while Nvidia systematically propagandizes the entire internet.

Why can't this list be publicly posted??? Why should it be a secret?

Maybe they have the same problem as Matrox... Surround Gaming / DualHead gaming is pretty well supported, as is their list of games supporting EMBM, but they just don't want to make a big noise about something they think is obvious.
 
They are just sitting by while Nvidia systematically propagandizes the entire internet.

ATI is likely just smarter than that. They know any resources put toward thwarting propaganda are just resources wasted, since people will always believe what they want to believe.

No amount of marketing is going to raise a target viewer's IQ by 50 points, nor undo 5+ years of conditioning.

"Fair" (chuckle) reviews like the one in the link above aren't going to sway anyone's ideal one way or another. People go into such an "article" with a pre-conceived side of things and either agree with it or oppose it. It's as simple as that. There just isn't any substantiation nor evidence presented, so it makes a fine piece of commentary from which no real tangible persuasion is exerted.
 
london-boy said:
Are you going to buy a game because it supports PS1.4 or because you like the game?

That's not the point. The point is that one IHV is saying there will not be many games that use PS 1.4, and some web sites are saying no games will use it. I am not buying a game based on the PS level it supports, but I would at least like to know how many are out there, not some spoon-fed PR crap from either side.
 
Sharkfood said:
There just isn't any substantiation nor evidence presented, so it makes a fine piece of commentary from which no real tangible persuasion is exerted.

No evidence presented?

"And more to the point, it appears that developer acceptance of PS 1.4 has been lukewarm at best, since using it necessitates writing two different shader programs to get optimal performance out of both nVidia and ATI DX8 hardware. One developer I interviewed who requested anonymity stated that his company had essentially targeted PS 1.1 for "less aggressive" games, and DX9's PS 2.0 for more "forward-looking" graphics technologies. But they also added that in one instance they are using PS 1.4 for one shader effect in their current game."

They at least talked to one dev, which is more than I can say for most sites.

And yes, I do think it was a fair article due to the lack of taking sides on the issue. They didn't say 3DMark was worthless like Nvidia did, but they thought it overemphasized PS 1.4. I thought their suggestion to use a test with 1.1, another with 1.4, and the final with 2.0 would be a good idea until it becomes clear what kind of support the different versions have. It would also be a good test of how coding for each version affects performance.

They also bothered to find out how 3DMark was viewed by Dell.

We spoke with company officials from ATI and Dell, and both companies believe that 3DMark03's methodology and implementation are essentially sound. Dell, in particular, will be using the benchmark as one of its metrics for qualifying 3D GPUs.

So it seems that nVidia may just have to suck it up and live with whatever issues they have, since the biggest system OEM in the world will be judging its GPUs with this benchmark.

Good news for ATI.
 
"And more to the point, it appears that developer acceptance of PS 1.4 has been lukewarm at best, since using it necessitates writing two different shader programs to get optimal performance out of both nVidia and ATI DX8 hardware. One developer I interviewed who requested anonymity stated that his company had essentially targeted PS 1.1 for "less aggressive" games, and DX9's PS 2.0 for more "forward-looking" graphics technologies. But they also added that in one instance they are using PS 1.4 for one shader effect in their current game."

It's easy to post BS with no evidence to back it up. Sorry, but no developer wants more work if they don't need it... browse through www.cgshaders.org or do a search for Pixel Shader 1.4 complaints (or lack of support)...

Hey, am I the only one who is having "look" problems with per pixel specular lighting in ps1.1 over Cg??? But, if I change my code to DX(non Cg) ps1.4 on my Radeon 8,5k... looks good... are really ps.1.1 so bad?

See, that's exactly the problem. It generates PS 1.1 code; why not PS 1.4 code? Developers who use this in a game will automatically end up using shaders which are sub-optimal on competing hardware.

What you really want from a high-level language is that it automatically adapts to whatever capabilities the hardware has. As it stands now, all Cg does is output NVIDIA-tailored programs which happen to also run on competing hardware (at least on the DX side).
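To illustrate the idea, here is a minimal sketch of what a capability-adaptive toolchain would look like: one high-level source compiled against every target profile at build time, with the runtime picking the best binary the device reports. This is purely hypothetical; the function names, the stub "compiler", and the profile list are all invented for illustration and do not reflect how Cg actually worked.

```python
# Hypothetical sketch: compile one high-level shader source for several
# target profiles, then pick the best one the device can run.
# Profile names mirror DX8/DX9 pixel-shader versions; the compiler is a stub.

# Profiles ordered from most to least capable.
PROFILES = ["ps_2_0", "ps_1_4", "ps_1_1"]

def compile_for_profile(source: str, profile: str) -> str:
    """Stand-in for a real shader compiler back-end."""
    return f"; compiled [{source}] for {profile}"

def build_all(source: str) -> dict:
    """Build-time step: emit one binary per supported profile."""
    return {p: compile_for_profile(source, p) for p in PROFILES}

def select_shader(binaries: dict, device_profile: str) -> str:
    """Run-time step: use the most capable binary at or below the device's level."""
    usable = PROFILES[PROFILES.index(device_profile):]
    for p in usable:
        if p in binaries:
            return binaries[p]
    raise RuntimeError("no compatible shader profile")

shaders = build_all("specular.cg")
print(select_shader(shaders, "ps_1_4"))  # a Radeon 8500-class part gets the 1.4 path
```

The point of the sketch is that once every profile back-end exists, supporting an extra hardware level costs the developer nothing; the complaint in the thread is that Cg simply omitted the PS 1.4 back-end.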

So should ATI, Matrox, SiS, PowerVR and all the others now suddenly jump and start coding a complete compiler to match the NVIDIA Cg syntax, or should developers use common sense and wait for a true industry standard from Microsoft or the OpenGL ARB?

Also, what if hardware supports features not supported by current or future NVIDIA hardware? Will NVIDIA allow, say, ATI to extend the Cg syntax with extra commands so they can expose this functionality? Obviously not, so the whole standard is no standard; it's for NVIDIA, by NVIDIA, and that's where it ends.

The only use I currently see is the offline compiler that generates shader code that can then be tweaked/modified by hand, and even that is of limited use since PS 1.4 is quite different.

THESE ARE game developers :rolleyes:
 
I agree that the concept of Cg is broken - nVidia should allocate its programming resources to something more useful.

I do not think, however, that PS 1.x shaders are complicated enough to require high-level programming.

It's VS >= 1.1 and PS >= 2.0 where high-level languages are more useful.
 
Doomtrooper said:
It's easy to post BS with no evidence to back it up. Sorry, but no developer wants more work if they don't need it... browse through www.cgshaders.org or do a search for Pixel Shader 1.4 complaints (or lack of support)...

Hey, am I the only one who is having "look" problems with per pixel specular lighting in ps1.1 over Cg??? But, if I change my code to DX(non Cg) ps1.4 on my Radeon 8,5k... looks good... are really ps.1.1 so bad?

See, that's exactly the problem. It generates PS 1.1 code; why not PS 1.4 code? Developers who use this in a game will automatically end up using shaders which are sub-optimal on competing hardware.

THESE ARE game developers :rolleyes:

What exactly does Cg not supporting PS 1.4 tell you? It looks like you're supporting ET's argument. Devs will almost always support the lowest common denominator. Sometimes they will code multiple paths to take advantage of superior hardware.

If the LCD is pre-PS, will they make a separate codepath for 1.1, let alone 1.4? Would they bother to do just 1.4 and forget all the 1.1-only cards?
If the LCD is PS 1.1, will they make a separate codepath for 1.4?
Will they use an LCD of 1.4 and spurn the 1.1 cards?
If they want to be really advanced, would they target 1.4 or jump straight to 2.0?

PS 1.4 is superior, but the question is how often will devs make the extra effort to use 1.4 considering the installed base of cards? To me, it seems unlikely they will do so as often as 3DMark represents, and until we have a clearer picture I think ET's suggestion of three tests, 1.1, 1.4/1.1, and 2.0, is a good idea. Maybe a second 2.0 test would be good as well, since it is Futuremark. Older cards can just run the older 2001 version.
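The codepath questions above are, at bottom, a startup-time caps check. As a rough sketch (hypothetical helper names; the version packing just mimics the idea of D3D's pixel-shader version field, where higher values mean more capable hardware):

```python
# Hypothetical sketch of the codepath choice a DX8-era engine makes at startup.
# ps_version packs major/minor into a comparable integer, in the spirit of
# Direct3D's pixel-shader version caps; all names here are invented.

def ps_version(major: int, minor: int) -> int:
    # Pack so that higher shader versions compare as larger numbers.
    return (0xFFFF << 16) | (major << 8) | minor

def choose_codepath(device_ps_version: int) -> str:
    """Return the most capable render path this device supports."""
    if device_ps_version >= ps_version(2, 0):
        return "ps_2_0 path"      # DX9-class parts
    if device_ps_version >= ps_version(1, 4):
        return "ps_1_4 path"      # R200 family (8500/9000/9100)
    if device_ps_version >= ps_version(1, 1):
        return "ps_1_1 path"      # GeForce3 / GeForce4 Ti
    return "fixed-function path"  # pre-shader cards, the real LCD

print(choose_codepath(ps_version(1, 4)))  # prints "ps_1_4 path"
```

Each extra branch is a shader set the developer has to write and test, which is exactly why the installed base per branch matters.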

Another thing to note from the article is that the FX is perfectly capable of supporting 1.4, but Nvidia has not enabled it in the FX's original drivers. That may be because the drivers are early, or they might be protecting their older cards. If the FX did support 1.4, more devs might make the decision to use it.
 
PS 1.4 is superior, but the question is how often will devs make the extra effort to use 1.4 considering the installed base of cards?

Huh?

The point is, if Cg supported PS 1.4, then they could support the installed base of PS 1.4 hardware with little to no effort.

What dev would not want to get their code to run as well on ALL hardware as possible?

No one is suggesting that Cg should support PS 1.4 and NOT support PS 1.1. As for the "installed base" of cards, the ONLY cards that support PS 1.1 and NOT PS 1.4 are the GeForce3 and GeForce4 Ti. All R200-based cards, and ALL DX9-based cards, will also support PS 1.4.
 
What exactly does Cg not supporting PS 1.4 tell you?

It tells me they are holding graphics progression back because they chose not to support it (we know the advantages of PS 1.4), nothing more.

but the question is how often will devs make the extra effort to use 1.4 considering the installed base of cards?

The installed base of PS 1.4 is also very large:

8500
9000
9100
9500NP
9500
9700NP
9700 Pro

I think there is significant reason to support it, don't you? Why should these card owners not be supported by devs? A list coming soon will show you there is significant support; in fact, the two games that Borsti used as examples, "Tiger Woods" and "UT 2003", support PS 1.4.

If future Nvidia cards support it, then this entire argument about the use of PS 1.4 is a moot point.
 
PS 1.4 is superior, but the question is how often will devs make the extra effort to use 1.4 considering the installed base of cards?

What people seem to forget is the effect of the IHVs' developer relations. Sometimes developers may not implement specific paths in their code themselves, but that doesn't mean that an IHV's developer relations won't drop a bit of code into their engine that will make things run faster/better on their hardware.

Somehow, even if developers overlook PS1.4, I don't think ATI's dev rel will...
 
I don't think the installed base will become a moot point until the 1.4-capable cards vastly outnumber those that aren't.

And that won't happen for a long time because of the large number of GF3/GF4s that have been sold to date.
 
I don't think the installed base will become a moot point until the 1.4-capable cards vastly outnumber those that aren't.

But that's only 1/3 of the equation.

The decision to implement a particular path depends on three things:

1) The advantage that path has on one architecture compared to another path.

2) The number of customers that would impact.

3) The ease of implementing that path.

If there is any advantage to implementing a PS 1.4 path over 1.1 on particular hardware, any dev will do it if the effort is minimal, almost regardless of the installed base. It's not like there aren't "many" Radeon 8500, 9000, and 9500+ cards out there. The number is certainly significant, even if not as large as the GeForce3/4's.
 
DaveBaumann said:
PS 1.4 is superior, but the question is how often will devs make the extra effort to use 1.4 considering the installed base of cards?

What people seem to forget is the effect of the IHVs' developer relations. Sometimes developers may not implement specific paths in their code themselves, but that doesn't mean that an IHV's developer relations won't drop a bit of code into their engine that will make things run faster/better on their hardware.

Somehow, even if developers overlook PS1.4, I don't think ATI's dev rel will...

I don't pretend to have the sure answer to my question. I'd be glad to see ATI's dev rel active enough to make sure most games do support 1.4 where they can.
 
Deflection, remember that unless a review says ATI is the best and godlike, most people here will think it is biased. So before you call it fair, check whether it says something to that effect.

Everything is an irony. Companies patent certain things, or put their research into producing them, and then, once it is done, the other company is "holding back development" if it doesn't follow suit. But really, the marketplace will decide in the end what is better; some things will just be skipped over, and it doesn't really matter. Nvidia cards did not have PS 1.4; that is life. They were built without it, and Nvidia is not going to go back and put it in. If ATI and Nvidia did not do everything differently, then they would both work the same and there would be no point in having two companies. Of course it is rather a pain that they each have to do things their own way, but hey, the 9700 was ahead of Nvidia, so this time all the stuff that Nvidia does differently will be left out of the next set of games.

You could say ATI was holding back progression by not writing a back-end for Cg, which I understand they are going to do with the R350.

Nvidia is holding it back by not writing a back-end for RenderMonkey, but does it really matter?

Why doesn't someone write a conversion program and get rich off it?
 