radar1200gs said:
You really can't blame nV for focussing more on DX8 than DX9 with nV3x.
nV3x has more DX9 functionality than R3xx; it just has less DX9 performance.
Why would nVidia do things this way? Look back to DX8 and how long it took for games to use its features. Until quite recently most games only really used DX7 features. Then you have the question of DX9 usage: prior to TR:AoD and Far Cry (both of which only use DX9 as a tinsel garnish at best), there was absolutely nothing on the games front that required it.
Don't forget developers also targeted the Xbox, which is DX8-ish.
So, IMO nVidia was wise to focus more on DX8 performance. In the real world, that's what the chips were required to run. Theoretically R3xx may have had performance advantages in DX9 games, but given the number of DX9 games available during R3xx's lifetime, theory is exactly what it remained.
Greg, this was a surprisingly incoherent and outrageously tangential post, WRT this thread. But if you feel the need to continue your fight, I'm here to please:
1. No one blames NV30 for focusing on DX8 performance. People blame nVidia more for not improving AA quality, and for jerking everyone around with their D3D drivers.
2. NV3x does seem to have larger instruction allowances, but I thought it was still debatable whether it had greater functionality (R3xx's centroid sampling, MRTs, and geometry instancing vs. NV3x's conditionals and half-precision floats).
3. That most games focused on DX7 "until recently" is probably nVidia's fault due to the GF2MX and then GF4MX dominating the market. I'm not sure how old budget hardware is an excuse for limiting next-gen products, though.
4. What do you mean "prior" to TR:AoD and Far Cry? Did you expect to see DX9 effects in games before the hardware or even API was released?
5. Point taken WRT Xbox and DX8, but ...
6. ... nV's wisdom was proven rather short-sighted upon R300's arrival, which offered equivalent DX8 performance AND superior DX9 performance AND nicer AA AND faster AF. So nV was wise, short a few ANDs. I'll be nice and ignore their wisdom re: manufacturing processes and HSFs.
Anyway, NV30 would've been a very nice leap all by itself, and pretty much in keeping with the improvements b/w GF3 and GF4. People would've been happy with a doubling of performance alone. But NV30 arrived alongside a certain R300, which offered all that and then some.
BTW, I'm pretty sure digi's owned an nV card or two in his time. If it makes you feel better, though, go ahead and assume his criticism is solely out of ignorance.
radar1200gs said:
You have a GPU plagued by manufacturing problems, some hardware features not fully realised, with half the pipelines of R300, shockingly bad early drivers and no hardware shortcuts taken (filtering precision, precision truncation, etc.), and yet a lot of the time it draws level with or beats the competition.
You forgot to mention that, despite nV's heroic struggles on a gimpy, half-pipe retread, they still budgeted more transistors than ATi and yet managed to mangle the initial drivers on a three-generation-old (DX8 focus, you said) architecture. That's wisdom, baby!
As for your very anecdotal evidence of more ppl selling 9700s than 5900s, that may be b/c 9700s are a frickin' half year older. Your 5900XT may scream along, but how does that equate to a 9700P not doing the same (and with gamma-adjusted edges)? And you neglected to mention exactly how NV3x would "influence the industry for years to come." Specifically, you forgot to mention that the influence would be more negative than positive, mainly in slowing DX9's acceptance.
Just ... just let it go. Three out of four Moms/dentists agree that NV40 rocks, so why do you feel the need to drag NV30 back into this? The fact that nV is ditching the FX series and name in favor of a 6200 at the low end should show you that even nV has realized the FX's failure WRT the competition and public consciousness. nV's almost perfect rebound with the 6800 should make it easy to accept NV3x's shortcomings and move on.