nVidia's CEO admits NV30 failed

http://www.theinquirer.net/?article=9173

THE CEO OF NVIDIA confirmed in front of a crowded room of partners in Cannes that the NV30 Ultra, also known as the GeForce FX 5800 Ultra, failed to fulfil many of the expectations that both it and its partners had for it.

I think it's about time they came to grips with reality. Although the NV35 might put them back on track in terms of performance, I'm pretty well convinced that it's still going to trail the 9800P when it comes to image quality.
 
Not in AF situations :)
Application is still the best AF available today, and the NV35 will be fast enough to use it. I don't like those reduced-quality modes on either side (ATI and nVidia). If there isn't enough performance I would use a lower AF level, like 4x Application, but I would never consider modes like Balanced or Aggressive or whatever.
But in AA, for sure.
 
As you might know, I've been using nVidia solutions for a long, long time...

I just picked up a 9800 Pro on Friday, and my opinion is that ATI is much better on all counts... be it filtering or AA.
 
Typedef Enum said:
I think it's about time they came to grips with reality. Although the NV35 might put them back on track in terms of performance, I'm pretty well convinced that it's still going to trail the 9800P when it comes to image quality.



I highly doubt it was ever an issue of NVIDIA's relationship with reality, unless anyone truly believes NVIDIA is impervious to mistakes. Nah, they just waited as long as possible before admitting the goof publicly, which is nothing new.
 
Typedef Enum said:
As you might know, I've been using nVidia solutions for a long, long time...

I just picked up a 9800 Pro on Friday, and my opinion is that ATI is much better on all counts... be it filtering or AA.

NVIDIA seems to be at their best when fighting an uphill battle.
 
Lol.. go Nvidia :rolleyes:

Let's just see what they produce... if it ends up with them being 2nd to ATi, let's all just live with it... mmmkay :D
 
Kinda sad seeing people question NV35 vs R350 still...
NV35 vs R350 256MB is questionable.

NV35 vs R390 isn't very funny either. The R390 will likely win.
The question is not who's gonna win there. The question is when the R390 will be available.

Of course, you could always find a case where the NV35 loses to the R350, and that's where there's massive FP operation usage. But it'll be less of a difference than with the NV30, and in things which got some nVidia optimizations, either because the developer wants it (Doom 3) or because nVidia is "cheating" via drivers, the NV35 will win heavily.


Uttar
 
Uttar said:
...in things which got some nVidia optimizations, either because the developer wants it (Doom 3) or because nVidia is "cheating" via drivers, the NV35 will win heavily.

Are you encouraging nVidia to cheat via drivers?
 
Uttar, I'm not sure it's actually going to be a total walkover in all cases, even if ATI don't up the clocks. DX9 PS/VS is a shoo-in for the R350, assuming NV sticks to spec (and it's still only 4 FP PS units). If NV40 is still 4x2 there may be cases that favour the R350 there, and going by NV31's pace, I'm not sure NV's colour compression is quite as effective as ATI's.
 
BoddoZerg: Actually, my stance on this is a bit more complex. IMO, in things like 3DMark, cheating is "okay" if it doesn't degrade IQ. At all. And I mean, nothing. I *have* said several times that I found the 42.68 release to be completely lame, for example. Anyway, that's because 3DMark claims it's a game benchmark. If it wasn't, and they claimed it was more theoretical, then I would be against any type of driver hack, even if it didn't degrade IQ. But in a game, only the result counts, not the method used to get there.
Thus, I *heavily encourage nVidia to "cheat" in games, and not to focus too much on 3DMark and stuff, and to cheat in a way that does not degrade IQ* - and I'm serious. What I want, as a gamer, is higher performance and high IQ. I don't care how that's achieved, as a gamer (although I do as a techie).
Cheating on theoretical benchies is unacceptable, however, IMO.

DaveBaumann: Psst, it's NV35, not NV40 ;)

Well, the 256MB R350 is 400MHz according to rumors, and it can do 16 ops/clock total, considering FP & TEX and no scalar special cases and stuff.
The NV35 is 450MHz according to rumors, and I think it can do 12 ops/clock total, considering FP & TEX and no FX.
That means (in millions of ops per second):
NV35: 5400
R350: 6400
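
For what it's worth, here's the arithmetic behind those two numbers as a tiny Python sketch. The clocks and ops/clock figures are just the rumored ones quoted above, not confirmed specs.

def mops_per_sec(clock_mhz, ops_per_clock):
    # clock in MHz times ops issued per clock = millions of ops per second
    return clock_mhz * ops_per_clock

print("NV35 (450MHz x 12):", mops_per_sec(450, 12))        # -> 5400
print("R350 256MB (400MHz x 16):", mops_per_sec(400, 16))  # -> 6400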

This deficit is certainly a lot less serious than with the NV30, and the NV3x get huge performance advantages with FX. Although register usage remains a serious part of the equation... We can't know that before we get benchies.

It's hard to judge Color Compression right now. nVidia claims the NV31 uses the same Intellisample 2.0 as the NV35 - but at the same time, you've got to realize it's a performance part, so things like caches might not be as big. You remember your R9600 conclusions, don't you? :)

But yeah, there might be cases where the R350 wins. But I believe they'll be quite rare. As I said, the 256MB version is a completely different matter. I'd predict a mixed bag, but I don't know for sure either.


Uttar
 
How could you even consider it cheating if it doesn't degrade image quality? The problem is that nVidia has been sacrificing image quality for speed.
 
kyleb said:
How could you even consider it cheating if it doesn't degrade image quality? The problem is that nVidia has been sacrificing image quality for speed.

Well, it's cheating because it doesn't do what the application asks of it.
Anyway, rumor has it that with the 43.51 drivers there are already a lot fewer IQ problems in 3DMark than with 43.45.

Also, I never heard of IQ problems in GT2 & GT3 for 3DMark 2003 & NV3x. Did it ever cross your mind that this might be because there aren't any? Few games in the coming months are gonna be as shader-heavy as GT4. So optimizations similar to the GT2 & GT3 ones, which give a very nice performance advantage vs. the R3xx, could in fact be done in many games.


Uttar
 
Uttar said:
Also, I never heard of IQ problems in GT2 & GT3 for 3DMark 2003 & NV3x. Did it ever cross your mind that this might be because there aren't any? Few games in the coming months are gonna be as shader-heavy as GT4. So optimizations similar to the GT2 & GT3 ones, which give a very nice performance advantage vs. the R3xx, could in fact be done in many games.


Uttar
AFAIK the only optimization is that the faster-performing drivers use PS1.3 instead of PS1.4 in GT2 & GT3. Because those tests don't use any PS1.4-specific features, there is no IQ difference.
 
CEO admitting to product failure?

Do some of you read what you post?

What's worse is that you guys "trust" the Inquirer.
 
AFAIK the only optimization is that the faster-performing drivers use PS1.3 instead of PS1.4 in GT2 & GT3. Because those tests don't use any PS1.4-specific features, there is no IQ difference.
Maybe no visual difference, but why run with extra passes (which is what would happen using PS1.3 over PS1.4)? Is the NV3x architecture going to be so much better using 1.3 over 1.4 that the performance hit of having to do the additional vertex processing is totally outweighed by the supposed increase in speed of the pixel processing?
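
Just to spell out the trade-off being described: an extra pass means the geometry gets transformed again (plus extra fill and blending). Here's a rough Python sketch of that reasoning - every number in it is invented purely for illustration, not a measured NV3x figure.

# Assumed, made-up workload figures for illustration only.
verts_per_frame  = 500_000    # vertices submitted per frame
vs_cost_per_vert = 30         # vertex shader cycles per vertex
pixels_per_pass  = 1_000_000  # shaded pixels per pass

def frame_cost(passes, ps_cost_per_pixel):
    # Each extra pass repeats the vertex work and adds more pixel work.
    vertex_work = passes * verts_per_frame * vs_cost_per_vert
    pixel_work  = passes * pixels_per_pass * ps_cost_per_pixel
    return vertex_work + pixel_work

one_pass_ps14 = frame_cost(passes=1, ps_cost_per_pixel=8)   # 23,000,000
two_pass_ps13 = frame_cost(passes=2, ps_cost_per_pixel=3)   # 36,000,000

# The PS1.3 path only pays off if its cheaper per-pass pixel work more
# than covers the doubled vertex (and framebuffer) cost - which is
# exactly the question being asked above.
print(one_pass_ps14, two_pass_ps13)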
 
Neeyik said:
AFAIK the only optimization is that the faster-performing drivers use PS1.3 instead of PS1.4 in GT2 & GT3. Because those tests don't use any PS1.4-specific features, there is no IQ difference.
Maybe no visual difference, but why run with extra passes (which is what would happen using PS1.3 over PS1.4)? Is the NV3x architecture going to be so much better using 1.3 over 1.4 that the performance hit of having to do the additional vertex processing is totally outweighed by the supposed increase in speed of the pixel processing?

My guess is they aren't really using PS1.3 or anything. More likely their own proprietary system (kinda like NV_fragment_program in OpenGL), where they'd have their 1024-instruction limit and FX12/FP16/FP32. My guess is also that they're using FX12 everywhere in GT2/GT3 :)
Just guesses, really.


Uttar
 