What was good about NV3x?

The 8500 was as fast as, and sometimes faster than, the GeForce 3 Ti 500, which was the first refresh of the product. About a month after both hit the market the 8500 started pulling ahead, and then Nvidia put out the GeForce 4.
umm....
http://www.anandtech.com/video/showdoc.html?i=1544
The Radeon 8500 had basically GF3 Ti 200 speeds at launch. And about a month later....
http://www.anandtech.com/video/showdoc.aspx?i=1558
It got new drivers that actually made it more in tune with the vanilla GF3, but it usually lost to the Ti 500. On paper, though, the R8500 should have been on par with the GF4 Ti 4400 (clock speed, pipelines, poly and shader throughput, and even bandwidth were competitive), but the 8500 fell seriously short of that mark.
 
Their drivers have historically never been that great. However, their recent commitment to improving them has paid great dividends. They are definitely on par with Nvidia's now, that's for sure.
 
ANova said:
Reverend said:
BTW, just wanted to add that the NV3x wouldn't have sucked if the R300 had never come out.

:LOL:

In DX8.1 and lower, yes. In DX9, on the other hand... :|
Would the FX have sucked against the R300 if DX9 had never come out and they were both competing in DX8.1? :| (I really don't know and am curious.)
 
QuadroFX 1100. Wonderful workstation card. Also, FP32 with long, complicated shaders made things possible in real time for DCC work that were never possible before. The CgFX plugins let the creative director see what shaders will look like in the final render in real time, which really does wonders for communication between the artist and the CD. Also, Gelato is a very powerful rendering solution that can make huge differences in render times.

Now, it's not that the R3x0 isn't capable of these things, BUT the development support isn't there to make it happen. Also, FP24 WILL rear its ugly head sometimes in these cases.
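
To make the FP24 point concrete, here is a rough sketch (my own illustration, not anything from the drivers) of how a 16-bit-mantissa format like ATI's FP24 (s16e7) runs out of sub-unit precision on large values much sooner than FP32 does, which is where long shader chains or big texture-address math can start to show artifacts:

```python
import math

def quantize_significand(x, sig_bits):
    """Round x to sig_bits total significand bits (stored mantissa plus
    the implicit leading 1); exponent range is ignored for simplicity."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)            # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** sig_bits
    return math.ldexp(round(m * scale) / scale, e)

x = 70000.5                          # an arbitrary large intermediate value
print(quantize_significand(x, 17))   # FP24-ish (16 + 1 bits): 70000.0
print(quantize_significand(x, 24))   # FP32-ish (23 + 1 bits): 70000.5
```

With only 17 significand bits, anything at or above 2^16 can no longer hold a half-unit offset, which is exactly the kind of error that accumulates over long, complicated shaders.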
 
see colon said:
The 8500 was as fast as, and sometimes faster than, the GeForce 3 Ti 500, which was the first refresh of the product. About a month after both hit the market the 8500 started pulling ahead, and then Nvidia put out the GeForce 4.
umm....
http://www.anandtech.com/video/showdoc.html?i=1544
The Radeon 8500 had basically GF3 Ti 200 speeds at launch. And about a month later....
http://www.anandtech.com/video/showdoc.aspx?i=1558
It got new drivers that actually made it more in tune with the vanilla GF3, but it usually lost to the Ti 500. On paper, though, the R8500 should have been on par with the GF4 Ti 4400 (clock speed, pipelines, poly and shader throughput, and even bandwidth were competitive), but the 8500 fell seriously short of that mark.

Not to derail this too much, but:
http://graphics.tomshardware.com/graphic/20020522/ti4400_4600-12.html#benchmarks (winning 2 out of 3)

And if I search harder it will become more apparent: the 8500 fell between the Ti 500 and the GeForce 4.
 
digitalwanderer said:
ANova said:
Reverend said:
BTW, just wanted to add that the NV3x wouldn't have sucked if the R300 had never come out.

:LOL:

In DX8.1 and lower, yes. In DX9, on the other hand... :|
Would the FX have sucked against the R300 if DX9 had never come out and they were both competing in DX8.1? :| (I really don't know and am curious.)

It would certainly have been a closer race, though the R300 was still the faster design.
 
jvd said:
see colon said:
The 8500 was as fast as, and sometimes faster than, the GeForce 3 Ti 500, which was the first refresh of the product. About a month after both hit the market the 8500 started pulling ahead, and then Nvidia put out the GeForce 4.
umm....
http://www.anandtech.com/video/showdoc.html?i=1544
The Radeon 8500 had basically GF3 Ti 200 speeds at launch. And about a month later....
http://www.anandtech.com/video/showdoc.aspx?i=1558
It got new drivers that actually made it more in tune with the vanilla GF3, but it usually lost to the Ti 500. On paper, though, the R8500 should have been on par with the GF4 Ti 4400 (clock speed, pipelines, poly and shader throughput, and even bandwidth were competitive), but the 8500 fell seriously short of that mark.



Not to derail this too much, but:
http://graphics.tomshardware.com/graphic/20020522/ti4400_4600-12.html#benchmarks (winning 2 out of 3)

And if I search harder it will become more apparent: the 8500 fell between the Ti 500 and the GeForce 4.

That's not what I'm seeing in those graphs. Don't tell me you're using 3DMark2001SE as your basis for comparison?
 
I think there are a lot of bugs in the R200/8500's architecture. It has been mentioned here and there.
 
jvd said:
Not to derail this too much, but:
http://graphics.tomshardware.com/graphic/20020522/ti4400_4600-12.html#benchmarks (winning 2 out of 3)

And if I search harder it will become more apparent: the 8500 fell between the Ti 500 and the GeForce 4.
Well, I looked up tomshardware's VGA charts:
http://graphics.tomshardware.com/graphic/20030120/index.html

Looks more like the 8500 fell in between the GeForce3 Ti 200 and Ti 500, sometimes underperforming the Ti 200. But that's just in benchmarks without antialiasing or anisotropic filtering: with either GeForce you could enable anisotropic filtering and 2x AA and significantly outperform the 8500 at the same settings, due to the 8500's use of supersampling.
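
For anyone wondering why supersampling hurts so much here, a back-of-envelope sketch (my numbers, purely illustrative) of the difference in shading work: supersampling shades every sample, while multisampling shades once per pixel and only replicates coverage/depth work, so this ignores the bandwidth and ROP costs that both schemes still pay:

```python
def shading_cost(width, height, samples, supersampled):
    """Relative pixel-shading work per frame. SSAA shades every sample;
    MSAA shades once per pixel and replicates coverage/depth instead."""
    pixels = width * height
    return pixels * (samples if supersampled else 1)

base = shading_cost(1024, 768, 1, False)
print(shading_cost(1024, 768, 4, True) / base)   # 4x SSAA -> 4.0x the work
print(shading_cost(1024, 768, 4, False) / base)  # 4x MSAA -> 1.0x the work
```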
 
see colon said:
Basically the same thing happened with the 5800. It wasn't really any faster than the Ti 4800, but it offered more programmability, faster FSAA, and better 2D.

Not really; the FX 5800 Ultra was ~50% faster purely due to clock speeds, and with AF/FSAA it was a huge amount faster. That's significantly larger than any of these transitions: GeForce DDR --> GeForce 2 GTS, GeForce 2 Ultra --> GeForce 3, GeForce 3 Ti 500 --> Ti 4600.

In a way the GFFX was the most impressive card Nvidia has ever released in terms of pushing performance forward; ATI just managed to do even better. Both the 9700 and the FX 5800 were large upgrades from the GeForce 4 Ti series.
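
A quick sanity check on those numbers (specs quoted from memory, so treat them as approximate): both cards were 4-pipe, 128-bit designs, so the core clock alone suggests ~67% on paper, while memory bandwidth grew only ~54%, which is roughly why real games landed nearer the ~50% mark:

```python
def bandwidth_gb_s(mem_mhz, bus_bits, transfers_per_clock=2):
    """Peak memory bandwidth: effective transfer rate times bus width in bytes."""
    return mem_mhz * 1e6 * transfers_per_clock * (bus_bits // 8) / 1e9

ti4600 = {"core": 300, "mem": 325, "bus": 128}   # GeForce4 Ti 4600 (DDR)
nv30u  = {"core": 500, "mem": 500, "bus": 128}   # FX 5800 Ultra (DDR2)

print(nv30u["core"] / ti4600["core"])             # ~1.67x fill rate on paper
print(bandwidth_gb_s(nv30u["mem"], nv30u["bus"])
      / bandwidth_gb_s(ti4600["mem"], ti4600["bus"]))  # ~1.54x bandwidth
```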
 
dan2097 said:
In a way the GFFX was the most impressive card Nvidia has ever released in terms of pushing performance forward
Had, perhaps. Certainly not has after NV40. And of course the 5800's performance came hand in hand with some pretty major sacrifices.
 
dan2097 said:
Not really; the FX 5800 Ultra was ~50% faster purely due to clock speeds, and with AF/FSAA it was a huge amount faster. That's significantly larger than any of these transitions: GeForce DDR --> GeForce 2 GTS, GeForce 2 Ultra --> GeForce 3, GeForce 3 Ti 500 --> Ti 4600.
Not really, if you keep in mind that the FX 5800 Ultra was already "factory-overclocked" to reach higher clock speeds than the chip could comfortably sustain. I don't think I need to mention what disastrous consequences this had...
Without some competition, you would never have seen a 500 MHz NV30 card, guaranteed.
 
I still think Rev made the best point here. Had the R300 not been out, we really would not have known how good or bad the NV3x's performance was back then. If there had been no R300, most of us here would have said: well, it's DX9 stuff, which is new, next-gen, and NV has a history of first introducing a new, next-gen feature with questionable performance and then following it up with top-notch performance in the next version of their card. So we would have said, "well, yeah, it's slow; it's DX9 stuff, and since it's new it's going to be slow." It wasn't until we compared it to the R300 that we said... hey, wait a second, this is also doing DX9 stuff but it "ain't slow"......
 
dan2097 said:
In a way the GFFX was the most impressive card Nvidia has ever released in terms of pushing performance forward; ATI just managed to do even better. Both the 9700 and the FX 5800 were large upgrades from the GeForce 4 Ti series.
Was it? IIRC, the GF4 was also twice as fast as the GF3 when it came to AA performance, as was the 5800U compared to the GF4. (The 9700P was more than twice as fast as the 8500, but that's not taking into account MSAA vs. SSAA.)

"Had R300 not launched" is not a very interesting question. "Had R300 not launched with DX9 or gamma-corrected, sparse-sampled AA" (sketched below) is a more interesting one. Would the FX 5800/U still have been an (obnoxiously) loud, double-decker solution? Would it have been clocked anywhere near where it was? How would performance have fared in that case?

The FX series wasn't that interesting (from my end-user perspective) because nV basically presented a 4x2 DX8 architecture for the third time, with not many improvements besides speed (raw clocks, double stencil). OTOH, this relative feature stability allowed nV to maintain its driver advantage, and it probably helped devs by keeping the featureset from changing too greatly.

(Edit: I have to say that last paragraph was written with the R300 in mind. If nV had released the 5800/U to no competition, I'm sure I would've welcomed its huge speed boost over the GF4.)

Ailuros, are you just saying that I was being (unduly) flippant, or that DX9 features so far ahead of hardware that can actually run them quickly aren't important to devs? If the latter, Doom 3 was built with the GF1 featureset in mind, and apparently JC's next engine will be built with the FX's featureset in mind. If the former, well, it's easy to get carried away on a forum. :)

BTW, as to the R300 without DX9: aren't Deus Ex: IW and Thief 3 running on DX8 engines? IIRC, R300 trounced (still trounces?) NV30 in those two games.
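
Since gamma-corrected AA came up, here's a minimal sketch of the idea (my own illustration with an assumed display gamma of 2.2, not ATI's actual hardware path): resolving edge samples in linear light instead of averaging the raw framebuffer values is what makes R300-style edge gradients look evenly stepped.

```python
def resolve(samples, gamma_correct, gamma=2.2):
    """Average AA samples (values in 0..1). A gamma-correct resolve
    converts to linear light, averages, and re-encodes; a naive
    resolve just averages the gamma-encoded values directly."""
    if gamma_correct:
        linear = [s ** gamma for s in samples]
        return (sum(linear) / len(linear)) ** (1.0 / gamma)
    return sum(samples) / len(samples)

edge = [1.0, 1.0, 0.0, 0.0]     # a half-covered white-on-black edge
print(resolve(edge, False))     # 0.5   -> displays darker than half-bright
print(resolve(edge, True))      # ~0.73 -> displays as perceptual mid-gray
```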
 
Chalnoth said:
jvd said:
Not to derail this too much, but:
http://graphics.tomshardware.com/graphic/20020522/ti4400_4600-12.html#benchmarks (winning 2 out of 3)

And if I search harder it will become more apparent: the 8500 fell between the Ti 500 and the GeForce 4.
Well, I looked up tomshardware's VGA charts:
http://graphics.tomshardware.com/graphic/20030120/index.html

Looks more like the 8500 fell in between the GeForce3 Ti 200 and Ti 500, sometimes underperforming the Ti 200. But that's just in benchmarks without antialiasing or anisotropic filtering: with either GeForce you could enable anisotropic filtering and 2x AA and significantly outperform the 8500 at the same settings, due to the 8500's use of supersampling.
I got my 8500 in early January, and never had any driver bugs until I started playing the free game Racer; one driver set broke it for a few releases.
The other bug is that I never got DVD assist to work until I got a 2500+; it didn't play well with my A 750 setup.
Then Richard Burns, but that got fixed also.
So 2 games in 3 years ain't half bad.
If you liked AA, you bought a GeForce 3, but how many games could you use 4X in anyway?
If you liked AF, you got an 8500, whose AF performs well in every game, although you could see the mipmap borders in games with negative LOD; part of that is also because ATI sets a lower (sharper) LOD in their drivers (see the mip-selection sketch below).
I played around with my brother-in-law's Ti 200: you couldn't really use 4X AA in games, and you could use 2X AA on both cards, where it of course looks better on the 8500 because of the supersampling.
The Ti 4200 finally fixed the balance of power, however.
Here's an old review of the AIW 8500 128MB to show performance in April of 2002:
http://hardocp.com/article.html?art=Mjgy
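
On the negative-LOD point, here's a tiny sketch of textbook mip selection (the standard formula, not ATI's exact driver behavior) showing why a negative bias sharpens the image but also moves mip transitions closer to the eye, where the borders are easier to spot:

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0):
    """Classic mip selection: level = log2(texel-to-pixel footprint) + bias.
    A negative bias picks a sharper (lower-numbered) mip than the
    footprint calls for, which can alias and expose mip borders."""
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

print(mip_level(4.0))         # 2.0 -> mip 2, the "correct" level
print(mip_level(4.0, -0.5))   # 1.5 -> between mips 1 and 2, sharper
```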
 
radeonic2 said:
I got my 8500 in early January
I got mine in September '02, and most games were basically unplayable due to bugs and stuttering until Catalyst 3.4. I upgraded from a GF2 MX, as it didn't have enough grunt for competitive RTCW, but the R8500 was on average no better for a good six months. From then on, though, it steadily got better, with pretty much no glaring issues through 2004, and most remaining bugs I could attribute to the card slowly dying (the capacitors ended up popping and spraying the inside of my case with goo :?). Nonetheless, when people say ATI's drivers were garbage back then, they're 100% correct.
 
Morrowind ran horribly on my 8500/KT266A/Athlon XP. It would develop a pause in the menus where it would take a few seconds for the game to respond to clicks, and it would stutter. I'm not sure if it runs better with the newer drivers today.

My 8500 was a pretty big jump from my Radeon DDR. But the 9700 was just a huge jump from the 8500, in every way, and was (and is) a much less buggy experience. The 9700 PRO has got to be the longest-lasting video card in history, IMO. The NV3x may have contributed to its longevity, in that game developers couldn't really jump on DX9 and go until now.
 