Shadermark

mczak said:
indio said:
I'm pretty sure I saw a quote somewhere where an Nvidia rep was asked why the NV30 performed so badly in ShaderMark, and the guy responded that he had never heard of it. Then the next driver set tripled the performance.
Anyone know where that interview was?
Do you mean the quote by Ante P in this thread http://www.beyond3d.com/forum/viewtopic.php?t=6019&postdays=0&postorder=asc&start=0 ?

It appears they did exactly what he said and "checked it out". :?

Ante P said:
exactly, that's why I was so interested in the results

btw the card is a FX 5800 Ultra from Gainward

I spoke to nVidia a month ago about their low ShaderMark 1.7 performance; at the time they were unaware that the benchmark even existed and said they'd check it out

as for quality, it's identical with both drivers, which in turn is identical to ATI's
 
http://www.3dvelocity.com/cgi-bin/i...7ea9650d0cebab2f8aa618df3a67cf;act=ST;f=2;t=7

Seems nVidia replied to Pixelat0r (3dvelocity.com) regarding his concerns over the ShaderMark results. The relevant post is the fifth one in the thread I have linked to above. I'll copy-paste it in full below.


" Okay, having spoken to NVIDIA I'm going to try and spell out their thoughts with as little of my own opinion thrown in as possible so you can decide for yourselves. Here's what they told me.

NVIDIA works closely with games developers, and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input. How can they be accused of not conforming to industry standard shader routines when they ARE the driving force that sets the industry standards in shaders? Games developers are not likely to go shuffling instructions the way benchmark creators are, and any games developer that wants to succeed will write their code so that it runs the way NVIDIA's shaders expect it to. The fact that their shaders don't cut it on rarely heard-of benchmarks and code that's been "tinkered" with is of little or no concern to them. They won't spend valuable time defending themselves against something they don't see as worth defending. Changing code doesn't expose cheating, it simply feeds code to their shaders in a way they're not designed to handle. Games developers will never do that, and if it's games performance that matters then NVIDIA is where the clever money is being spent.
When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it.
In a nutshell they're saying that you can analyse all you want, future titles will be coded to run at their best on NVIDIA hardware and they suggested we ask the developers who they think is on top right now.

As I said, I have many, many thoughts on this statement but for now I'm biting my tongue to let others have their say first. "

Edited by pixelat0r on June 18 2003,18:31

END.
 
I've just discovered it was a telephone conversation, and Pixelat0r has since stated in the thread I linked above that it didn't sound as arrogant as it reads. :?
 
NVIDIA works closely with games developers, and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input.

Somebody has been smoking something hallucinogenic again?

When I asked about the FX's relatively poor performance in our "real" game tests the reply wasn't entirely clear but they certainly claim to have doubts on the reliability of FRAPS and the reliability of those using it.

Yeah right. By the way, reviewers should make sure their monitors are Nvidia-approved. As well as their mice.
 
2senile said:
NVIDIA works closely with games developers, and 9 out of 10, and eventually nearer 10 out of 10, games will be either developed on their hardware, developed with Cg or developed with their direct input.

Read: "9 out of 10 games used to be developed on their hardware." Any game that targets DirectX 9 features such as pixel shader 2.0 or float render targets/textures is now being developed on ATI R3XX hardware and will continue to be until NV40 is released (assuming NV40 doesn't have the problems NV3X does).
 
2senile, that response from Nvidia deserves its own thread. :oops: The ramifications of such an arrogant line of reasoning are unbelievable. Do game devs really like the idea that Nvidia wants to set the industry's standards, and that their games will only run well with extra effort? Truth has a funny way of enduring over hype. Nvidia has had two kicks at the can to make a GPU with good PS 2.0 performance. They have failed. If I were developing a game right now, I don't think I could disregard ATI's line of cards. Lately Nvidia has been on the road to nowhere. That quote sounds like desperation.

sorry if it sounds like I am babbling. I am going on very little sleep. I will elaborate later. :D
 
This lends some interesting insight into the xbox2/ATI situation, and especially shows how much control they want MS to have over 3D graphics (and programming) standards.

By making statements like those, it actually seems to be more of a pot shot at microsoft than at anyone else.

Nite_Hawk
 