What was good about NV3x?

I actually was pretty pleased with the 2D and video acceleration on my NV3x, and its OpenGL support was pretty decent overall. And for some reason, older games (pre-DX8) seemed to have much nicer color overall, but that might have been monitor calibration.
 
NV30 purportedly inherited the GF4 MX's integrated video encoder, so in contrast to the GeForce4 Ti, where there was a choice of external encoder chips of varying quality, output quality became a reliable constant. And >=5 MHz of video bandwidth is still state of the art for non-HDTV outputs.

Gradient instructions are nice, though uses are rare. Emulating these is difficult and slow.
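
For anyone who hasn't played with them: a gradient instruction just returns the screen-space rate of change of a shader value, which the hardware gets almost for free because pixels are shaded in lockstep 2x2 quads. A minimal sketch of the concept (illustrative only, not NV3x's actual ISA):

```c
/* Sketch of what a pixel-shader gradient instruction (ddx/ddy) returns:
   finite differences taken across the 2x2 quad the hardware shades in
   lockstep. All names here are invented for illustration. */

#include <stdio.h>

typedef struct { float v[2][2]; } Quad;  /* one shaded value per pixel */

/* ddx: difference between horizontally adjacent pixels in a quad row. */
float ddx_quad(const Quad *q, int y) { return q->v[y][1] - q->v[y][0]; }

/* ddy: difference between vertically adjacent pixels in a quad column. */
float ddy_quad(const Quad *q, int x) { return q->v[1][x] - q->v[0][x]; }

int main(void) {
    /* e.g. a texture coordinate growing by 0.25 per pixel in x, 0.5 in y */
    Quad q = { { {0.00f, 0.25f}, {0.50f, 0.75f} } };
    printf("ddx = %.2f, ddy = %.2f\n", ddx_quad(&q, 0), ddy_quad(&q, 0));
    return 0;
}
```

Without the instruction you're stuck writing the value out to a texture and re-sampling neighbouring texels in a second pass, which is exactly why emulation is slow.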

But the only thing I seriously like is the ability to arbitrarily underclock the chip in 2D mode :?
 
When it boils down to it, the only thing bad about the NV3x was performance. It had ridiculous features for its performance, making most of those features basically unusable. Overall I was pleased with my GFfx, but then again I bought it with the full knowledge that the Radeon 9700 I had owned for over half a year would outperform it by a significant margin.

I bought it just to play with, because I like having new hardware, and basically because I wanted a solid OpenGL performer; I've owned enough ATI cards to know that they might randomly break something (OpenGL-wise) for a driver release or two. And on that note, I will say that of all the R300-series cards I've owned, they were definitely a turning point for ATI in software (and hardware) quality. Pretty much everything they promised they delivered, with an exception here and there.
 
DaveBaumann said:
ChrisRay said:
Besides, your point is irrelevant since he asked what was "good" about the card, not what exclusive good features it had.

If that's the way you took the question, then why not just include things like texture mapping, filtering, Z-buffering, alpha blending, triangle setup, transformation, lighting, etc., etc. ;)

Sure, but none of those actually stood out during the card's lifetime. Its filtering, however, did; it was one of the few real advantages the card had, aside from supersampling, which is no longer an advantage now that the NV4x is around. I'd still rather have NV3x AF on the NV4x.
 
It had enough good points to make the 5900XT pretty popular when it debuted for ~$200 with Call of Duty. This list of positives covers the whole NV3x range (well, except maybe the 5200 :p):

+faster 2D than ATi (according to Firingsquad)
+better multimon support than ATi (especially in Win2K)
+lower idle power usage than ATi thanks to dynamic clocking
+good "legacy" and DX8 performance (whether due to engines geared toward nV's 4x2 architecture, faster OGL drivers, or just incredible clock speeds)
+initially better (then worse, thanks to brilinear [and UT2K3]) AF than ATi (brilinear sketched below)
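
For anyone who missed the "brilinear" fuss: it narrows the blend band between mip levels that full trilinear filtering covers, saving texture fetches at the risk of visible mip transitions. A rough sketch of the idea (the constants are invented for illustration):

```c
/* Rough sketch of "brilinear" filtering: full trilinear blends two mip
   levels across the whole fractional-LOD range, while brilinear snaps
   to the nearest level outside a narrow band. Constants are invented. */

#include <stdio.h>

/* Full trilinear: blend weight is just the fractional LOD. */
float trilinear_weight(float lod_frac) {
    return lod_frac;
}

/* Brilinear: blend only inside a band of half-width `band` around the
   mip midpoint; elsewhere use a single level alone (weight 0 or 1). */
float brilinear_weight(float lod_frac, float band) {
    float lo = 0.5f - band, hi = 0.5f + band;
    if (lod_frac <= lo) return 0.0f;
    if (lod_frac >= hi) return 1.0f;
    return (lod_frac - lo) / (hi - lo);
}

int main(void) {
    for (float f = 0.0f; f <= 1.001f; f += 0.25f)
        printf("frac %.2f: tri %.2f, bri %.2f\n",
               f, trilinear_weight(f), brilinear_weight(f, 0.15f));
    return 0;
}
```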

And, for devs (AFAIK):

+the chance to play with longer shaders and FP32 (way) ahead of time

Really, its biggest weakness was slow DX9 performance, a weakness exposed by 3DM03 and only evident in a handful of later titles (TR:AoD, Far Cry, ... um, I'm drawing a blank, but I guess there was the odd DX9 RTS). I'm not even sure HL2 is a real weakness, as apparently there's a DX8 water reflection hack that brings the FX's IQ just about up to DX9 level, and--per Anand's DX8 numbers--at speeds pretty close to a 9700 (but, sadly, closer to a 9600XT).
 
It's definitely not a bad card at all. I think Nvidia got muscled around by Microsoft a bit on the DX side of things and they paid for it. The AF on the cards was the best I've seen thus far, like ChrisRay said. My Radeon 9700 was definitely faster in most cases, but I enjoyed using the 5900 and it never gave me a problem or a glitch.
 
And, for devs (AFAIK):

+the chance to play with longer shaders and FP32 (way) ahead of time

I highlighted "play" on purpose. I'd figure that developers would prefer to use them (as in real game code for future games) on NV4x.
 
ondaedg said:
It's definitely not a bad card at all. I think Nvidia got muscled around by Microsoft a bit on the DX side of things and they paid for it.
More like nVidia tried to muscle M$ on the DX side of things and utterly failed. ;)
 
ondaedg said:
It's definitely not a bad card at all. I think Nvidia got muscled around by Microsoft a bit on the DX side of things and they paid for it. The AF on the cards was the best I've seen thus far, like ChrisRay said. My Radeon 9700 was definitely faster in most cases, but I enjoyed using the 5900 and it never gave me a problem or a glitch.
That's what I thought at first, too, but most of the problem was that they decided to go for a VLIW architecture with the NV3x, but didn't start designing the compiler until very late in development. In other words, they didn't realize until they were done with the card that they couldn't write a shader compiler that would make it perform to reasonable levels when using floating point precision.
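
To illustrate the point with a toy example (an invented ISA, nothing NV3x-specific): a VLIW part only hits its peak rate when the compiler packs independent operations into each wide issue slot, so a late or naive compiler leaves slots empty and throughput on the floor:

```c
/* Toy illustration of why VLIW throughput hinges on the compiler: two
   ops can dual-issue only if neither consumes the other's result, so a
   naive in-order packer leaves issue slots empty that a scheduling
   compiler could fill by reordering. */

#include <stdbool.h>
#include <stdio.h>

typedef struct { int dst, src0, src1; } Op;

/* Ops may share a bundle only if neither reads the other's destination. */
static bool independent(Op a, Op b) {
    return a.dst != b.src0 && a.dst != b.src1 &&
           b.dst != a.src0 && b.dst != a.src1;
}

int main(void) {
    /* two dependent chains: r2=r0+r1, r3=r2*r2, r6=r4+r5, r7=r6*r6 */
    Op prog[] = { {2,0,1}, {3,2,2}, {6,4,5}, {7,6,6} };
    int n = 4, i = 0, bundles = 0;

    while (i < n) {                    /* naive in-order pairing */
        if (i + 1 < n && independent(prog[i], prog[i + 1])) i += 2;
        else i += 1;
        bundles++;
    }
    /* Prints 3; a compiler that interleaves the chains into (op0,op2)
       and (op1,op3) needs only 2 bundles -- a 1.5x difference from
       scheduling alone. */
    printf("%d bundles for %d ops\n", bundles, n);
    return 0;
}
```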

Anyway, the anisotropic filtering on the NV2x was definitely better than that offered with the NV3x.
 
Reverend said:
BTW, just wanted to add that the NV3x wouldn't have sucked if the R300 had never come out.

:LOL:
Good point and a fair one. It wasn't so much that nVidia came out with a bad card with the FX as how the FX looked next to the R300, which I still think is gonna go down as a watershed card in video card history.

nVidia had their low moment just when ATi had their epiphanic one.
 
I don't know so much about that, Rev. The performance was below everybody's expectations. I mean, the GeForce 6600 GT really shows what the GeForce FX 5800 Ultra should have performed like, what everybody expected it to perform like, if they hadn't massively screwed up.
 
Chalnoth said:
I don't know so much about that, Rev. The performance was below everybody's expectations. I mean, the GeForce 6600 GT really shows what the GeForce FX 5800 Ultra should have performed like, what everybody expected it to perform like, if they hadn't massively screwed up.
But isn't that because nVidia paper-launched it so early and with such massively overblown expectations from the card? :|
 
The performance was below everybody's expectations. I mean, the GeForce 6600 GT really shows what the GeForce FX 5800 Ultra should have performed like, what everybody expected it to perform like, if they hadn't massively screwed up.
Well... nVidia kinda hinted at the direction they were heading when they released the GF3. During those press releases they stated that they were concentrating on "higher quality pixels" as opposed to more pixels, like they had in the past. If you look at the GF3, for example, it really wasn't any faster than the GF2 Ultra in DX7 games; in fact, I seem to remember some games favoring the GF2 Ultra. No one really complained about the speed of the GF3 when it came out, because ATI's offering at the time (the Radeon VIVO) was so far behind. Basically the same thing happened with the 5800: it wasn't really any faster than the Ti 4800, but it offered more programmability, faster FSAA, and better 2D. The difference was that ATI had a competing product that was released half a year earlier and outperformed it in most situations.

The bottom line was that nVidia seriously underestimated ATI, and with good cause, seriously. Look at the past, when ATI's 8500 (released months after the GF3 with ridiculously better paper specs) could barely keep pace with the GF3 in games, or the Radeon VIVO (and later the 7500) that had trouble keeping up with the GF2.
 
digitalwanderer said:
But isn't that because nVidia paper-launched it so early and with such massively overblown expectations from the card? :|
Not even that. One would expect that the new card, having more than twice the transistors of the previous part, and running at higher clock speeds, would more than double the performance. And it was completely unexpected that floating point would be so much slower than integer processing (though we should have guessed after the programming specs were released around August or so).
 
The bottom line was that nVidia seriously underestimated ATI, and with good cause, seriously. Look at the past, when ATI's 8500 (released months after the GF3 with ridiculously better paper specs) could barely keep pace with the GF3 in games, or the Radeon VIVO (and later the 7500) that had trouble keeping up with the GF2.

The 8500 was as fast as, and sometimes faster than, the GeForce3 Ti 500, which was the first refresh of the product. About a month after both hit the market, the 8500 started pulling ahead; then nVidia put out the GeForce4.

The Radeon VIVO was better than the GeForce2 in many ways, but it came out too late and they couldn't get the clock speeds high enough.

The 9700 Pro was special for ATI because they were able to get it out before nVidia, and nVidia not only came out later but fumbled the design.
 
jvd said:
The 8500 was as fast as, and sometimes faster than, the GeForce3 Ti 500, which was the first refresh of the product. About a month after both hit the market, the 8500 started pulling ahead; then nVidia put out the GeForce4.
Sorry to take this a bit off topic, but since so much time has passed, I think I can bring something new to light: how I eventually found my way here, and how I came to be known as I am today...

Between the 8500 (in July 2001) and the GF4 Ti (in February 2002) there's room for two chips that would have mixed up the bag quite a lot if things hadn't gone as badly as they did... anyone dare to guess which those are? ;) (We're talking based on the original launch plans.) And feel free to also guess the planned clocks, planned release time frames, etc...



(I lived for those back then. It was beyond anything that could be called fanatic behaviour. All this because a 2nd-year BSc software engineering student got a leak too big to handle in June 2001...)
 
digitalwanderer said:
ondaedg said:
It's definitely not a bad card at all. I think Nvidia got muscled around by Microsoft a bit on the DX side of things and they paid for it.
More like nVidia tried to muscle M$ on the DX side of things and utterly failed. ;)

I personally don't like Microsoft having the final say when it comes to API standards. They could at any time make or break an IHV at the push of a white paper. If only the OpenGL body had gotten their heads out of their you-know-what...
 