NVIDIA Are the Industry Standard Shader Driving Force?

radar1200gs said:
I can. NOLF 2.

OK, so which 3D cards on sale when NOLF 2 shipped did not support hardware T&L? I was talking about the period in which nVidia originally hyped T&L--back in '99-'00. NOLF 2 hadn't been written at that time.

The simple fact is the 5200 will become the baseline standard by sheer force of numbers. The reason is simple - cost. At the 5200's price point, ATi can't touch nVidia.

So you are saying that ATi has nothing to sell in the 5200 price range...? I don't understand your point. As stated, the people for whom a primary use of their platforms is 3D gaming are simply not going to be interested in what a 5200 has to offer, because it is too slow. It seems to me you are mischaracterizing cards in this price range as being sought after by people for whom a primary focus of computing is 3D gaming. I disagree. I think people for whom 3D gaming is a primary consideration will be looking at R9600/FX5600-level cards and up.
 
Walt, the great thing about the 5200 is that if developers support it, then they by default support the rest of the GF-FX line. That works in reverse too - If a developer supports NV35, he automatically supports NV34.

nVidia really can have their cake and eat it too, here. Unified drivers are a wonderful thing.
 
radar1200gs said:
No, my beef was more with Mr Huddy trying to compare the speed of a hacked and incomplete Dawn on ATi with the real Dawn on NV3x. You can't compare apples and oranges. Mr Huddy should know that.

Yeah, let's look at Dawn again. It uses CUSTOM OPENGL CALLS FOR NV ONLY!!!

Do I need to repeat myself, or don't you really understand that point? Thus no matter what ATI does, it will NEVER BE an "apples to apples" comparison. Yeah, Mr. Huddy was not 100% correct, but most people here know what point he was driving at and don't have an issue with it. Only die-hard fanboys will even try to argue this fact. Get some common sense, please.
 
The thing that is important as regards the 5200 is not the DX9 performance, but the DX8 performance. Let's face it, almost no games are going to support DX9 any time in the near future. Even then, I expect many of them will be similar to 3DMark2003 i.e. using DX8 shaders whenever possible with just a few PS/VS 2.0 shaders where necessary.

The problem is that the performance of the 5200 with DX8 shaders isn't too hot, either, if I remember correctly. Kudos to nVidia for bringing out a cheap DX9 chip, but it's a pity about the overall performance. I, personally, wouldn't consider anything less than an RV350 or NV31 if I were looking to buy a card now, unless I could get a good price on a more established DX8 card, i.e. a GF4 4200 or Radeon 8500/9100.
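
To illustrate what that kind of fallback usually looks like in practice, here's a minimal sketch against the DX9 caps API - not taken from any particular game, and the effect names in the comments are made up:
Code:
// Hedged sketch: keep DX8-class (ps_1_x) shaders as the default path and only
// enable the handful of PS 2.0 effects when the hardware reports support.
#include <d3d9.h>

bool SupportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // PixelShaderVersion is a packed major/minor value; D3DPS_VERSION builds
    // the same encoding for comparison.
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}

// Hypothetical usage during engine init:
//   bool usePS20 = SupportsPS20(d3d);
//   waterEffect = usePS20 ? water_ps20 : water_ps11;  // the "few where necessary"
//   ...everything else stays on the ps_1_1/ps_1_4 versions regardless.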
 
jb said:
radar1200gs said:
No, my beef was more with Mr Huddy trying to compare the speed of a hacked and incomplete Dawn on ATi with the real Dawn on NV3x. You can't compare apples and oranges. Mr Huddy should know that.

Yeah, let's look at Dawn again. It uses CUSTOM OPENGL CALLS FOR NV ONLY!!!

Do I need to repeat myself, or don't you really understand that point? Thus no matter what ATI does, it will NEVER BE an "apples to apples" comparison. Yeah, Mr. Huddy was not 100% correct, but most people here know what point he was driving at and don't have an issue with it. Only die-hard fanboys will even try to argue this fact. Get some common sense, please.

I'm not entirely sure here (Uttar would know), but I think Dawn uses OpenGL through Cg. If that is the case, the custom OpenGL calls should not matter, nor should a custom wrapper be required. The only requirement would be for ATi to implement a backend Cg compiler that properly exploits their hardware.

Yes, ATi is permitted, in fact encouraged, to write their own Cg compiler - it's no use ATi complaining that Cg does not support, say, PS 1.4 - it is ATi's job to make it support it, not nVidia's.
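
I can't say for certain exactly how Dawn drives it, but as a rough sketch of that split using the public Cg runtime - the source string and profile choice below are purely illustrative, not lifted from the demo:
Code:
// Hedged sketch: one piece of Cg source, compiled through whichever back-end
// profile fits the installed hardware. Not Dawn's code - just the mechanism.
#include <Cg/cg.h>
#include <stdio.h>

// Dawn-style shaders also mix half/fixed precision types; each back end maps
// those to whatever precision the hardware actually provides.
static const char* kSource =
    "float4 main(float4 col : COLOR) : COLOR { return col * 0.5; }";

int main(void)
{
    CGcontext ctx = cgCreateContext();

    // Generic ARB back end; an NV30 path would use CG_PROFILE_FP30 instead,
    // and with a live GL context cgGLGetLatestProfile() picks the best profile
    // the driver exposes. An ATi-tuned back end would simply be another profile.
    CGprofile prof = CG_PROFILE_ARBFP1;

    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kSource, prof, "main", NULL);
    CGerror err = cgGetError();
    if (err != CG_NO_ERROR) {
        printf("Cg error: %s\n", cgGetErrorString(err));
        return 1;
    }

    // The emitted assembly differs per profile; the language input does not.
    printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    cgDestroyContext(ctx);
    return 0;
}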
 
No, I think it's the dependence on the NV_ extensions! As for Cg/PS1.4, there's no point exposing it if there's no syntax for it within Cg, which I believe is the case - NVIDIA has control of the syntax.

This "apples-to-apples" argument is slightly specious anyway since Dawn uses a mix of precisions, mostly lower than FP24 and ATI's hardware will be doing them all at FP24.
 
Oh boy, do I have to go through all the Cg explanation again, like on Anandtech?

The syntax is generic, and independent of the compiler.

Cg is split into two parts - the language and the back end compiler for the language.

The actual language is the same for everybody. The back-end compiler differs for everybody - it is where the vendor provides support for his chipset's features. It is up to each individual vendor to create his own optimised back-end compiler - that is not nVidia's responsibility (though I bet they would love to write an ATi backend :LOL: :devilish: )
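
If you want to see the split for yourself, the offline cgc compiler that ships with the Cg toolkit makes it fairly obvious - same input file, different profile, different output (the file names here are just placeholders):
Code:
cgc -profile arbfp1 -entry main -o skin_arb.txt  skin.cg
cgc -profile fp30   -entry main -o skin_fp30.txt skin.cg
cgc -profile ps_2_0 -entry main -o skin_ps20.txt skin.cg
The front end parses the identical source each time; only the profile-specific back end changes what comes out.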

Of course ATi would rather claim that Cg has problems than support something nVidia co-developed - but that is another kettle of fish entirely.
 
radar1200gs said:
Oh boy, do I have to go through all the Cg explanation again, like on Anandtech?
Cg has nothing to do with the problems with Dawn running on Radeon cards. Here's what I get when I run Dawn without the wrapper:
Code:
This program requires an NVIDIA GeForceFX-class GPU or better to run.
Unsupported features:
NV_primitive_restart
NV_vertex_array_range
NV_pixel_data_range
NV_fence
NV_register_combiners
NV_register_combiners2
NV_vertex_program
NV_fragment_program
WGL_EXT_allocate_memory
Now can you please stop talking about things you have no clue about? Thanks.
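
For what it's worth, that refusal is just a plain extension-string check done before any shading happens; something along these lines - illustrative only, I don't have Dawn's source:
Code:
// Rough sketch of the kind of check that prints a message like the one above.
// Needs a current OpenGL context; not Dawn's actual code.
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

static int HasExtension(const char* name)
{
    // The driver reports every extension it exposes in one space-separated string;
    // a wrapper fools this by adding the NV_ names and translating the calls.
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

int main(void)
{
    // Subset of the list above, with the GL_ prefix they carry in the string.
    const char* required[] = {
        "GL_NV_register_combiners", "GL_NV_vertex_program",
        "GL_NV_fragment_program",   "GL_NV_vertex_array_range",
    };
    int missing = 0;
    for (unsigned i = 0; i < sizeof(required) / sizeof(required[0]); ++i) {
        if (!HasExtension(required[i])) {
            printf("Unsupported feature: %s\n", required[i]);
            missing = 1;
        }
    }
    if (missing)
        printf("This program requires an NVIDIA GeForceFX-class GPU or better to run.\n");
    return missing;
}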

-FUDie
 
radar1200gs said:
Notice the ATi model has coarser hair and the lighting/coloring in it isn't quite right. Note also the total absence of eyelashes.
I see eyelashes on my system. And the lighting and coloring in the hair are different because the two screenshots you posted are taken from different positions.

-FUDie
 
I'm always amazed at the lengths fanboys will go to in order to learn techno-babble they understand strictly nothing about...
 
Well, if Kyle did not talk so much crap and would remove his head from the sand, people would not spit on him so much. It takes INTEGRITY to admit when you are wrong, and he does not have it. Brent, on the other hand, seems to be a straight shooter and tells it like it is..

Kyle lets his friendship with nVidia employees get in the way of the facts, and that is why everyone gets on his case so much....

RussSchultz said:
I liked it when both Brent and Kyle used to come over here and talk to you guys.

Well, if every time I came and visited some place people spat on me, I'd stop visiting. And that is exactly what happens here when Kyle comes and visits. Whether he's right or wrong in his facts, or counter to the prevailing opinion, the amount of vitriolic antagonism spewed in his direction is downright embarrassing.
 
radar1200gs said:
It does not matter whether the wrapper is broken or not. The fact is, not everything is being rendered as it should be; therefore (as a developer like Mr Huddy should well realise) you cannot claim that Dawn runs faster on ATi than on nVidia - you are comparing apples with oranges...

Did someone say eyelashes? They are only missing in the ultra mode; they are present in the normal mode of the demo....
 
I guess the same could be said about nVidia and 3DMark, SC and a few other games that have trouble with missing features, but nVidia still points to the fact that they are faster... LOL, cannot have it both ways, pal....
RussSchultz said:
Squega said:
So radar1200gs, were you trying to say that because ATi's cards are running the Dawn Demo with a hacked wrapper that was not created by them, and the wrapper has problems with the demo, that ATi's shaders have something wrong with them? :rolleyes:
No. Christ, don't you people read? He said it (the wrapper) was having problems and wasn't rendering the same thing that the NVIDIA cards were--pieces of the model, for example, were missing. Therefore, it's disingenuous to point to the fact that it's running faster.
 
Actually, ATi did NOT tip off anyone to 3DMark 2K3... and that is what started this whole mess...

radar1200gs said:
I notice Mr Huddy claimed in his response that ATi renders the Dawn demo faster than nVidia can.

Perhaps he should examine ATi's output more closely. The hair is incorrectly rendered and the eyelashes are missing altogether. These are computationally expensive components of the Dawn demo - little wonder ATi hardware runs the demo faster!

For those who will claim that ATi's shaders are superior I ask "then why can you not render Dawn's hair and eyelashes correctly?" If your reply is that the extensions are unsupported I ask "why are they unsupported, or equivalent extensions not in place? After all you claim to have a superior product..."

nVidia got what they wanted: ATi dragged into this whole cheating scandal.....
Actually, it was ATi who tipped tech-report off and started the current round of squabbling...
 
MuFu said:
My list looks like this:

1. Iraqi Information Minister
2. Bose
3. nVidia


They still have a way to go to beat Bose, IMO. At least nVidia actually make pretty decent products.

MuFu.


There is no ATi !

There is no FutureMark !

There is only nVIDIAMark !

We got them all surrounded !
 
radar1200gs said:
[...]although you would reasonably expect a card widely touted as being superior to the competition to run the Dawn demo without problems, especially when a developer claims the "superior" card's advantage is clearly in the shaders.
Now, why would you assume that nVidia would code a demo that would run flawlessly on ATi hardware? And if nV hardware is less capable than ATi in terms of FP shaders, don't you think nV would tend to use fewer rather than more FP shaders in their demos?

I agree that you can't really compare Dawn performance on both cards if they're not processing the same thing, but the fact that ATi cards are running Dawn faster through a wrapper is slightly impressive, especially when you compare it to nV's FP32 speed.
 
radar1200gs said:
Walt, the great thing about the 5200 is that if developers support it, then they by default support the rest of the GF-FX line. That works in reverse too - If a developer supports NV35, he automatically supports NV34.

nVidia really can have their cake and eat it too, here. Unified drivers are a wonderful thing.

Radar, I'm not quite getting your drift. You yourself pointed out in an earlier post that the "lowest common denominator" is what is programmed for, and I responded with a conditional agreement that developers often support both low-end and high-end hardware in the same software release, but not always, of course. On the assumption that the low end (lowest denominator) is programmed for, you're looking at DX6-DX8 game support, probably more so than any strict DX9 support, for at least the rest of this year. So at the low end of the market I see no advantages for the 5200 over competing hardware.

I don't know what you mean by your reference to unified drivers...;)
 
YeuEmMaiMai said:
Well, if Kyle did not talk so much crap and would remove his head from the sand, people would not spit on him so much.
And you, sir, have just proven my point.
 
Doomtrooper said:
RussSchultz said:
And you, sir, have just proven my point.

What, calling a spade a spade?? Please, people have a right to voice their opinion... you do that very often, don't ya know.
There's a difference between voicing your opinion and being a complete dick. Feel free to voice your opinion, just be somewhat polite about it.

The lament was "why doesn't Kyle come here?" and the answer is "because we've got some people on this board that are complete dicks to him".
 