Beyond3D Wolves Needed to Tear This One Apart

radar1200gs,

JoshMST and I have discussed things further via PM.

There's no need to restart your partisan NV3X crusade!
 
PeterAce said:
radar1200gs,

JoshMST and I have discussed things further via PM.

There's no need to restart your partisan NV3X crusade!

I wasn't really intending to discuss nv3x myself - the discussion was dragged that way by certain people (non-nvidia owners)...

Anyhow, I'm still waiting to see some specific examples of that "nvidia centric way of explaining things" that you say Josh suffers from.
 
radar1200gs said:
You really can't blame nV for focusing more on DX8 than DX9 with nV3x.

nV3x has more DX9 functionality than R3xx; it's just that there is less DX9 performance.

Why would nVidia do things this way? Look back to DX8 and how long it took for games to use its features. Until quite recently most games only really used DX7 features. Then you have the question of DX9 usage. Prior to TR:AoD and Far Cry (both of which only use DX9 as a tinsel garnish at best) there was absolutely nothing on the games front that required it.

Don't forget developers also targeted the Xbox, which is DX8-ish.

So, IMO nVidia was wise to focus more on DX8 performance. In the real world that's what the chips were required to run. Theoretically R3xx may have had performance advantages in DX9 games, but, given the number of DX9 games available during R3xx's lifetime, theory is exactly what it remained.
Greg, this was a surprisingly incoherent and outrageously tangential post, WRT this thread. But if you feel the need to continue your fight, I'm here to please:

1. No one blames nV for focusing NV30 on DX8 performance. They blame them more for not improving AA quality, and for jerking everyone around with their D3D drivers.

2. NV3x does seem to have larger instruction allowances, but I thought it was still debatable as to whether it had greater functionality (centroid sampling, MRTs, geometry instancing vs. conditionals and half-precision floats). (There's a rough caps-query sketch after this list if you want to check for yourself.)

3. That most games focused on DX7 "until recently" is probably nVidia's fault due to the GF2MX and then GF4MX dominating the market. I'm not sure how old budget hardware is an excuse for limiting next-gen products, though.

4. What do you mean "prior" to TR:AoD and Far Cry? Did you expect to see DX9 effects in games before the hardware or even API was released?

5. Point taken WRT Xbox and DX8, but ...

6. ... nV's wisdom was proven rather short-sighted upon R300's arrival, which offered equivalent DX8 performance AND superior DX9 performance AND nicer AA AND faster AF. So nV was wise, short a few ANDs. I'll be nice and ignore their wisdom re: manufacturing processes and HSFs.
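
Since point 2 is really a caps argument, here's a rough D3D9 caps-query sketch for anyone who'd rather check than argue (struct and flag names are straight out of d3d9caps.h; nothing here is vendor-specific, and the printout is only illustrative):

Code:
// Rough sketch: query how many simultaneous render targets (MRTs) and
// PS 2.x instruction slots a D3D9 device actually exposes.
#include <d3d9.h>
#include <cstdio>

void PrintShaderCaps(IDirect3D9* d3d)  // caller creates d3d via Direct3DCreate9()
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return;

    printf("Simultaneous RTs:         %lu\n", caps.NumSimultaneousRTs);
    printf("PS 2.x instruction slots: %d\n",  caps.PS20Caps.NumInstructionSlots);
    printf("PS 2.x temp registers:    %d\n",  caps.PS20Caps.NumTemps);
    printf("Predication:              %s\n",
           (caps.PS20Caps.Caps & D3DPS20CAPS_PREDICATION) ? "yes" : "no");
}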

Anyway, NV30 would've been a very nice leap all by itself, and pretty much in keeping with the improvements b/w GF3 and GF4. People would've been happy with a doubling of performance alone. But NV30 arrived alongside a certain R300, which offered all that and then some.

BTW, I'm pretty sure digi's owned an nV card or two in his time. If it makes you feel better, though, go ahead and assume his criticism is solely out of ignorance.

You have a GPU plagued by manufacturing problems, some hardware features not fully realised, with half the pipelines of R300, shockingly bad early drivers and no hardware shortcuts taken (filtering precision, precision truncation, etc., etc.), and yet a lot of the time it draws level with or beats the competition.
You forgot to mention that, despite nV's heroic struggles on a gimpy, half-pipe retread, they still budgeted more transistors than ATi and yet managed to mangle the initial drivers on a three-generation-old (DX8 focus, you said) architecture. That's wisdom, baby!

As for your very anecdotal evidence of more ppl selling 9700s than 5900s, that may be b/c 9700s are a frickin' half-year older. Your 5900XT may scream along, but how does that equate to a 9700P not doing the same (and with gamma-adjusted edges :p)? And you neglected to mention exactly how NV3x would "influence the industry for years to come." Specifically, you forgot to mention that the influence would be more negative than positive, mainly in slowing DX9's acceptance.

Just ... just let it go. Three out of four Moms/dentists agree that NV40 rocks, so why do you feel the need to drag NV30 back into this? The fact that nV is ditching the FX series and name in favor of a 6200 at the low end should show you that even nV has realized the FX's failure WRT the competition and public consciousness. nV's almost perfect rebound with the 6800 should make it easy to accept NV3x's shortcomings and move on.
 
Don't care about the NV3x; it's been over for a while.

In performance and features, ATI's R3xx architecture blows away the NV3x.

There should be no argument on this as it is fact.
So Radar, I disagree with what you are saying.
 
Pete said:
BTW, I'm pretty sure digi's owned an nV card or two in his time. If it makes you feel better, though, go ahead and assume his criticism is solely out of ignorance.
GF2 MX400, GF3, GF4ti4200, GF4ti4400, and a 5900 non-ultra flashed to 5950 ultra.

The GF4ti4400 is in my 4-year-old daughter's 1GHz Celeron and the 5950U is in my 7-year-old son's 2500+/K7N2-L rig.

With the exception of the 5950 I've been exceptionally pleased with all the nVidia cards I've used, and I got the 5950 knowing ahead of time what I was getting. I got it used for a decent price off a friend because I wanted to see an FX in action for myself.

My son loves the 5950 though, and it is a hell of a gamer for him I must admit. :)
 
digitalwanderer said:
With the exception of the 5950 I've been exceptionally pleased with all the nVidia cards I've used, and I got the 5950 knowing ahead of time what I was getting. I got it used for a decent price off a friend because I wanted to see an FX in action for myself.

Okay, that might qualify as ignorant actually (albeit forced ignorance)! ;)

I've owned a few Nvidia cards (Riva 128, GeForce 2, 3, 4), but ATI really won me over with the R3XX series (9700Pro).
When the X800's and the 6800's were announced, I was inclined to actually go back to Nvidia, but I ultimately felt that their solution was not as "elegant" as ATI's, and so I bought the X800 Pro (which is running as an XT now) :p Interestingly enough, I usually recommend 6800GT's to my buddies as they don't upgrade as often as I do, and I feel that Nvidia still has a leg up on ATI in the future-proof department right now. :)
 
Pete,

First off, what led you to believe I was targeting DW in any of my posts, let alone that I considered him ignorant??? (I'd love to know where some of you guys get your ideas, I really would...)

Okay, briefly, NV3x's AA is just fine. It works as advertised.

Initial drivers were disappointing. Don't know what nVidia was thinking there, very unlike them. However, that is no longer a valid criticism; current drivers work very nicely.

Older GeForces dominating the market is certainly nothing to be ashamed of, and it's not the reason games weren't using new features. Just compare DX6 to DX7, for instance, or the transition period from Glide to D3D/OpenGL.

You can thank TSMC for NV3x's lateness to market. It was intended to launch well before R300. Lots of the transistors in NV3x have never been fully utilized. Thanks to TSMC and its "wonderful" 0.13 micron process, NV3x lacked an FP framebuffer (which went on to become OpenEXR support in NV40), and this also impacted MRTs and render-to-texture, amongst other things.
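
If it helps make the FP framebuffer point concrete, this is roughly how an app asks D3D9 whether it can even create a float render target in the first place (just a sketch; FP16 is used as the representative format):

Code:
// Sketch: can this device create an FP16 render target, and blend into it?
#include <d3d9.h>
#include <cstdio>

void CheckFloatRenderTarget(IDirect3D9* d3d)
{
    const D3DFORMAT fp16 = D3DFMT_A16B16G16R16F;

    HRESULT rt = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE, fp16);
    HRESULT blend = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                           D3DFMT_X8R8G8B8,
                                           D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
                                           D3DRTYPE_TEXTURE, fp16);

    printf("FP16 render target: %s\n", SUCCEEDED(rt)    ? "yes" : "no");
    printf("FP16 blending:      %s\n", SUCCEEDED(blend) ? "yes" : "no");
}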

As for gamma-adjusted AA edges, that to me smacks of the GPU altering data it has no right to alter in the first place. (No one explicitly gave the GPU permission to alter the data.)

I don't see nVidia "ditching" the nV3x series at all. It has had a life at least as long as many nvidia products prior to it and is now being phased out by the next generation, just like those prior products were.
 
Just where have I heard this deluded pile of BS....???? OH! Yes, in about 30 different threads .......... :rolleyes: :rolleyes: :rolleyes:
 
NV3x's AA is not just fine--you know, that whole ordered grid thing at 4x, a lack of higher multisampling modes, etc.? Its AA sucks.
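
To make the grid complaint concrete, here's a toy comparison (the offsets are illustrative, not either vendor's exact pattern): an ordered 4x grid only covers two distinct positions per axis, while a rotated/sparse grid covers four, which is why near-vertical and near-horizontal edges get noticeably smoother gradients on the latter.

Code:
// Illustrative 4x AA sample offsets within a pixel (NOT vendor-exact).
#include <cstdio>

struct Offset { float x, y; };

// Count distinct x positions among 4 samples (crude, but fine for 4 values).
static int DistinctX(const Offset s[4])
{
    int n = 0;
    for (int i = 0; i < 4; ++i) {
        bool dup = false;
        for (int j = 0; j < i; ++j)
            if (s[j].x == s[i].x) dup = true;
        if (!dup) ++n;
    }
    return n;
}

int main()
{
    const Offset ordered[4] = { {-0.25f,-0.25f}, { 0.25f,-0.25f},
                                {-0.25f, 0.25f}, { 0.25f, 0.25f} };
    const Offset rotated[4] = { {-0.375f,-0.125f}, { 0.125f,-0.375f},
                                { 0.375f, 0.125f}, {-0.125f, 0.375f} };

    printf("ordered grid: %d distinct x positions\n", DistinctX(ordered)); // 2
    printf("rotated grid: %d distinct x positions\n", DistinctX(rotated)); // 4
    return 0;
}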

Oh, and:

As for gamma-adjusted AA edges, that to me smacks of the GPU altering data it has no right to alter in the first place. (No one explicitly gave the GPU permission to alter the data.)
Maybe I'm loopy from the fumes of having two computers run in close proximity in a closed room for several hours, but weren't you one of the people defending NV's shader replacement in 3DM03?
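
And since "gamma-adjusted" keeps coming up: all it means is that the edge blend is done in linear light instead of directly on the gamma-encoded framebuffer values. A back-of-the-envelope sketch (2.2 is just a stand-in for the display curve):

Code:
// Why gamma-adjusted AA edges look different: blending in linear light
// vs. blending the stored (gamma-encoded) values directly.
#include <cmath>
#include <cstdio>

int main()
{
    const double gamma = 2.2;                 // stand-in display transfer curve
    const double white = 1.0, black = 0.0;    // a 50%-covered edge pixel

    double naive = 0.5 * (white + black);     // blend encoded values: 0.50

    double linear  = 0.5 * (pow(white, gamma) + pow(black, gamma)); // decode, blend
    double correct = pow(linear, 1.0 / gamma);                      // re-encode: ~0.73

    printf("naive edge value:          %.2f\n", naive);
    printf("gamma-adjusted edge value: %.2f\n", correct);
    return 0;
}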
 
radar1200gs said:
NV3x's AA is just fine. It works as advertised.
Hold on there just a second, I have trouble with this one.

It was advertised as equal to ATi's by too many places for me to agree with that, 'specially after spending money just to see it for myself.

The nV3x does AA, but if you compare it to ATi's AA it doesn't do AA very well at all.

That's probably my personal biggest issue with the FX, the AA sucked. I can even still tell a difference with the 6800's AA, but it is muchly improved over the FX and almost up to ATi's in quality.
 
radar1200gs said:
Pete,


I don't see nVidia "ditching" the nV3x series at all. It has had a life at least as long as many nvidia products prior to it and is now being phased out by the next generation, just like those prior products were.
Hello! You will not have an nV3x GF4MX-type card going for 3 years. The nV3x chip is dead, dead, dead. Well, except for all the 5200s lying around. And, well, the AA issue... you don't have a clue. Someday they might even give you and your lil' 5900XT a REAL driver with Bill Gates' sig. Though that's a nice Linux card (listening, Dave O.?)
 
Nobody from nVidia ever said that nV3x's AA was equal to ATi's. Other, more ignorant third parties may have. That's not nVidia's fault.

And once again the AA works as advertised. It may not be rotated grid AA, but it is antialiasing nonetheless.
 
Like talking to the wall.........

Don't worry, take signature with 2 grains of salt and you will feel much better in the morning...... ;)
 
I (and presumably many others) am still waiting to see some specific examples of that "nvidia centric way of explaining things" that Pete insists Josh suffers from.

There is an eerie silence on that issue, but frantic attacks on nVidia and its supporters from certain forum members instead. I'd say I'm surprised, but, sadly, I'm not...
 
radar1200gs said:
I (and presumably many others) am still waiting to see some specific examples of that "nvidia centric way of explaining things" that Pete insists Josh suffers from.

There is an eerie silence on that issue, but frantic attacks on nVidia and its supporters from certain forum members instead. I'd say I'm surprised, but, sadly, I'm not...
digitalwanderer said:
"If I have to explain it to you, I'm afraid you wouldn't understand."
BTW, I'm pretty sure you're the only one waiting, Radar.
 
You can thank TSMC for NV3x's lateness to market.

Um, unfortunately, that's actually NVidia's fault. They had to take into consideration the possible problems of using a newer process. The GF3 was a gamble on a newer process... and they hit it big... however, the NV30 was a gamble too... but that one became the big flop.

I don't see nVidia "ditching" the nV3x series at all. It has had a life at least as long as many nvidia products prior to it and is now being phased out by the next generation, just like those prior products were.

They aren't exactly ditching it... just hiding all traces of its existence. Unlike previous generations, this was a series to forget.

There's nothing NVidia-centric in these posts. It's just that when you compare the hardware to the other parties' (specifically ATI's), the hardware that was delivered falls short of the competition. Even the NV35 was only a decent (but still not great) attempt to fix some of the problems. If the GF6 series has told you anything, it's that the NV3x series is COMPLETELY the OPPOSITE of how the GF6 series will affect the industry.

Far Cry with NV3x - uses LOTS MORE PS 1.1 shaders than PS 2.0 (supporting DX8 more than DX9) - very backwards in moving to DX9
Far Cry with NV4x - uses some PS 3.0 (more in future uses of the engine) - moving forward (in contrast to the NV3x)
 
You can force Far Cry to run the full DX9 path on NV3x using 3DAnalyze (set the device ID to R300).

Developers should quit trying to force rendering paths down game players' necks and instead give them the option of which path they would like to use, just like driver optimizations.
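
For what it's worth, the 3DAnalyze trick works because the engine keys its path off the reported adapter identity and caps instead of asking the player. Something vaguely like this (purely illustrative of the idea, not Far Cry's actual code; the vendor IDs are the standard PCI ones, error checks omitted):

Code:
// Illustrative path selection from adapter identity + caps (NOT Far Cry's code).
// Spoofing the reported IDs (which is all 3DAnalyze does) flips the branch.
#include <d3d9.h>
#include <cstdio>

void ChooseShaderPath(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    D3DCAPS9 caps;
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    const bool ps20     = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
    const bool isAti    = (id.VendorId == 0x1002);   // standard PCI vendor IDs
    const bool isNvidia = (id.VendorId == 0x10DE);

    if (ps20 && isAti)
        printf("mostly PS 2.0 path\n");
    else if (ps20 && isNvidia)
        printf("mostly PS 1.1 path, some PS 2.0\n"); // the NV3x-style fallback
    else
        printf("DX8 (PS 1.1) path\n");
}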
 