NVIDIA Are the Industry Standard Shader Driving Force?

Dave Baumann

Following the ShaderMark results at 3DVelocity, the site spoke to NVIDIA concerning the poor results; this (paraphrased) was the response:

NVIDIA to 3DVelocity (http://www.3dvelocity.com/cgi-bin/ikonboard/ikonboard.cgi?s=427ea9650d0cebab2f8aa618df3a67cf;act=ST;f=2;t=7) said:
NVIDIA works closely with games developers and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input. How can they be accused of not conforming to industry standard shader routines when they ARE the driving force that sets the industry standards in shaders? Games developers are not likely to go shuffling instructions the way benchmark creators are, and any games developer that wants to succeed will write their code so that it runs the way NVIDIA's shaders expect it to. The fact that their shaders don't cut it on rarely heard of benchmarks and code that's been "tinkered" with is of little or no concern to them. They won't spend valuable time defending themselves against something they don't see as worth defending. Changing code doesn't expose cheating, it simply feeds code to their shaders in a way that they're not designed to handle it. Games developers will never do that, and if it's games performance that matters then NVIDIA is where the clever money is being spent.

When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it.
In a nutshell they're saying that you can analyse all you want, future titles will be coded to run at their best on NVIDIA hardware and they suggested we ask the developers who they think is on top right now.
 
3DVelocity should get ATI's opinion on their "9 out of 10, and eventually nearer 10 out of 10" statement.

Hey Dave, make sure you ask Huddy this!
 
Let me get this straight - According to nVidia, reviewers should now:

  a. Not use 3DMark
  b. Not use any other synthetic benchmarks
  c. Not use real games to benchmark with FRAPS

I guess we can look forward to reviews based on how nice the PCB looks now, can we, or would that be unfair too? :rolleyes:


I also thought there was a shade of Iraqi Information Minister in there too: 'ATi do not exist'....'The infidels using ATi to develop games will burn in their tanks'...
 
I think what nVIDIA are doing/saying is right.

Right to their PR dept.

Of course, let's just hope that gamers don't get screwed in the end.
 
Richard was happy to provide a response! ;)

Richard Huddy said:
3DVelocity said:
NVIDIA works closely with games developers and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input. How can they be accused of not conforming to industry standard shader routines when they ARE the driving force that sets the industry standards in shaders? Games developers are not likely to go shuffling instructions the way benchmark creators are, and any games developer that wants to succeed will write their code so that it runs the way NVIDIA's shaders expect it to. The fact that their shaders don't cut it on rarely heard of benchmarks and code that's been "tinkered" with is of little or no concern to them. They won't spend valuable time defending themselves against something they don't see as worth defending. Changing code doesn't expose cheating, it simply feeds code to their shaders in a way that they're not designed to handle it. Games developers will never do that, and if it's games performance that matters then NVIDIA is where the clever money is being spent.

It's fair to say that NVIDIA hardware is used in the development of most games. That's true - but it's not as spectacular a domination as NVIDIA would have people believe. ATI is also used in the development of most games. In fact I suspect that you couldn't find a single example of a game which was developed without some use of both vendors' hardware. That's the way the process works. No game could be released through a respectable QA process without involving everyone's hardware.

But what NVIDIA are trying to claim is that developers produce code which is specifically tuned to work best on their hardware - and that claim is completely bogus. Sure they have an active DevRel team who try to intervene in the development process and steer things NVIDIA's way - but we all know that the two real dominating forces in games are (a) schedule and (b) the game. For that reason most shaders are written by the games developers in general kinds of ways - most of them are not tuned for NVIDIA hardware at all. NVIDIA don't control this industry nor will they ever.

They claim to be "the driving force that sets the industry standards in shaders". If that's the case then it's odd that they arrived late with their DX9 support (about 6 months behind ATI), that they have been shown to re-write several DX9 benchmark shaders to run on their DX8-style fixed point interfaces, that the OpenGL ARB declined to use Cg as the basis for OpenGL's high level shading language, that their own demo 'Dawn' runs faster on ATI hardware than on NVIDIA hardware even with the extra layers of software involved etc., etc.

NVIDIA are trailing for the moment. Now I don't think they're going to trail forever - but they still haven't come to terms with the fact that they're very much second best at the moment. And in that sense they're like alcoholics... The first step to recovery is to come to terms with the truth in the situation. Right now NVIDIA can't seem to do that, instead they're just saying that everyone else is wrong.

As for the claim that games developers don't shuffle shader ops around - well, that's an odd statement. Developers clearly do tweak shaders during the development process, mostly to make experimental changes to the functionality - and often that means that reordering happens many times along the way. But sure, when a game is released it tends to remain unchanging. Then, if NVIDIA become interested in the benchmarking built into the game, they can, if they want to, go in and detect the shaders and consider substituting in other shaders that do 'similar' things (usually with some image quality reduction) but run faster. At that point NVIDIA would describe them as Application Specific Optimisations - but since they are authored solely with the purpose of getting higher benchmark scores, the so-called optimisations may be totally inactive during actual game play.

It's also clear that NVIDIA have been involved in this process in a very extensive way. The revelations regarding both ShaderMark and 3DMark03 make it abundantly clear that NVIDIA do re-write their shaders specifically to raise their scores in synthetic benchmarks. Clearly they're very interested in benchmark scores no matter how synthetic.

The statement that, "Changing code doesn't expose cheating, it simply feeds code to their shaders in a way that they're not designed to handle it" is also very obviously not true. If this were the case you might expect to see a small reduction in shader performance - but that cannot explain the massive performance drops that have been seen in recent cases. It would be remarkable indeed if NVIDIA had designed hardware that could only run the shaders from this "rarely heard of benchmark" at decent speed, and any changes to that setup would be many times slower. That would suggest that their hardware was perhaps the most badly designed you could imagine. Where's all this programmability they keep claiming to have? If you use it then you lose all your performance?

Actually on reflection I guess that you could argue that the above quote from NVIDIA _is_ true. Take it literally - and don't worry about the word 'cheating' in there - we'll let them use their "Get Out Of Jail Free" card for that. What the NVIDIA defence could be claiming is that their hardware is not designed to handle DX9 shaders. Something I guess I'd be happy to accept.

3DVelocity said:
"When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it. In a nutshell they're saying that you can analyse all you want, future titles will be coded to run at their best on NVIDIA hardware and they suggested we ask the developers who they think is on top right now."

It's a fine sight isn't it? A company that used to lead by example with innovative technology and honest product positioning is reduced to saying that anyone who uses FRAPS to check on NVIDIA's story is unreliable. There's no reason I know of to doubt FRAPS - it's widely used and well respected.

It reminds me of the guy who was talking to his psychologist and his psychologist said, "You're in denial". To which the guy's simple response was, "No I'm not".

Developers genuinely like the fact that there's some intense competition in graphics these days. They see that as a good thing - and many of them like the spectacle of the struggle for technological supremacy. I don't think they're impressed by this kind of nonsense.
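To make the mechanism Huddy describes concrete: a driver-side "Application Specific Optimisation" can be as simple as fingerprinting each incoming shader and swapping in a hand-written replacement on an exact match. Here is a minimal sketch - the language (Python), the names and the table contents are all invented for illustration, not NVIDIA's actual driver code:

```python
import hashlib

# Hypothetical table mapping fingerprints of known benchmark shaders to
# hand-tuned replacements (often lower precision, hence the image
# quality reduction Huddy mentions). Contents are invented.
REPLACEMENTS = {
    "example-fingerprint": "hand_tuned_low_precision_shader",
}

def fingerprint(shader_source: str) -> str:
    """Hash the shader text so it can be recognised byte-for-byte."""
    return hashlib.md5(shader_source.encode("utf-8")).hexdigest()

def select_shader(shader_source: str) -> str:
    """Substitute a tuned shader if this exact one is recognised."""
    key = fingerprint(shader_source)
    if key in REPLACEMENTS:
        # Known benchmark shader: return the substituted version.
        return REPLACEMENTS[key]
    # Any change at all - reordered instructions, renamed registers, a
    # stray space - alters the hash, misses the table, and falls back
    # to the generic (slow) path. That is exactly why "shuffling
    # instructions" produces the massive performance drops Huddy
    # describes, rather than the small dip you might otherwise expect.
    return shader_source
```

This sketch also illustrates why such an "optimisation" can be totally inactive during actual game play: the detection only ever fires on the exact shaders the benchmark ships with.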
 
:oops: The ramifications of such an arrogant line of reasoning are unbelievable. Do game devs really like the idea that Nvidia wants to set the industry's standards, and that only with extra effort will their games run well? Truth has a funny way of enduring over hype. Nvidia has had two kicks at the can to make a GPU with good PS 2.0 performance. They have failed. If I were developing a game right now, I do not think I would be able to disregard ATI's line of cards. Of late, Nvidia has been on the road to nowhere. That quote sounds like desperation and leaves me with the impression that they have serious "delusions of grandeur".

sorry if it sounds like I am babbling. I am going on very little sleep. I will elaborate later. :D
 
Omg, now even fraps is shitlisted :oops:

i wonder what Kyle will say :)

we will probably be suggested to ask a shaman to measure card's performance. :rolleyes:
 
mczak said:
Sabastian said:
Mummy said:
we will probably be suggested to ask a shaman to measure card's performance. :rolleyes:

lol, I think nvidia would approve. hehe.
I just imagine a nvidia-approved shaman, with a big "The Way it's Meant to be Benchmarked" tattoo.

Nvidia approved Reviewer
[image: partikle-voodoo_t.jpg]
 
this is the part that bothers me

claim to have doubts on the reliability of FRAPS and the reliability of those using it

I have corresponded with the maker of Fraps in the past and he seems very knowledgeable. He had full DX9 support back in December. Maybe B3d should interview him :D Fraps does some real cool stuff; you can even record movies, for those not familiar with it.
However, back to the point: why does Nvidia have doubts as to the reliability of people using Fraps? You think there was a hidden "message" for them?
 
indio said:
However, back to the point: why does Nvidia have doubts as to the reliability of people using Fraps? You think there was a hidden "message" for them?
Maybe they're just foreshadowing an upcoming driver release with Fraps app detection... :rolleyes:
 
Re: Fraps...
Accuracy is not an issue; I've loaded up games with an in-game counter and run Fraps alongside, and Fraps matched the in-game counter exactly.

There have been some classic statements coming out of NV PR lately; that award Nvidia received for best 'GPU' of 2002 should have been an Academy Award for best 'comical PR' of 2002/2003.
 
wow, that's all i can say

FRAPS accuracy is excellent, I don't think that's in question

the only issue I have with fraps is the human error factor

when using fraps to manually take the average of a recorded timedemo, there is human error in the exact time of starting and the exact time of stopping. but really, if you keep an eye on it and watch carefully you can hit it pretty close each time, close enough that it works well for me

personally i think fraps is a great tool if the game doesn't have a built in frame counter
 
Brent said:
wow, that's all i can say

FRAPS accuracy is excellent, I don't think that's in question

the only issue I have with fraps is the human error factor

when using fraps to manually take the average of a recorded timedemo, there is human error in the exact time of starting and the exact time of stopping. but really, if you keep an eye on it and watch carefully you can hit it pretty close each time, close enough that it works well for me

personally i think fraps is a great tool if the game doesn't have a built in frame counter

Well, the margin of error will decrease as the size of the sample increases. A 200 frame difference doesn't matter much when the total frames counted is over 18000 (which is 5 minutes at 60 fps): the margin of error is about 1.1%.
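For what it's worth, the arithmetic checks out. Here's a quick sketch of the same estimate (Python, and the function name, is mine, not from the thread):

```python
def manual_timing_error(fps: float, run_seconds: float, stray_frames: float) -> float:
    """Relative error from mistimed start/stop when averaging a FRAPS
    run by hand: stray frames divided by total frames counted."""
    total_frames = fps * run_seconds
    return stray_frames / total_frames

# The numbers from the post above: a 5-minute run at 60 fps,
# with 200 frames of start/stop slop.
err = manual_timing_error(fps=60, run_seconds=300, stray_frames=200)
print(f"{err:.1%}")  # -> 1.1%
```

The same formula shows why Brent's eyeballing works in practice: the longer the timedemo, the smaller the relative damage from any fixed amount of start/stop slop.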
 