Beyond3D Wolves Needed to Tear This One Apart

I really don't recall an nV bias. I mean, Josh came right out and said NV30 was a GF4 + some DX9 bits. That's ballsy in this politically charged atmosphere, what with the election around the corner. He was pretty much spot-on when he noted that nVidia voted against the DX9 appropriations before they voted for them. He outed nV as a flip-flopper, a San Jose liberal with ties to TSMC, and you've gotta respect that.

Calling ATi "shallow" and nV "deep" was a bit much, I admit. But everyone misunderestimates ATi, so it's somewhat understandable. At least you know where ATi stands on the issues: DX9, in times of change. They're like an instruction-limited, unconditional, pixel-shaded rock. nV, OTOH, is like the ocean, all undulating specularity and no stable feature set.
 
You really can't blame nV for focussing more on DX8 than DX9 with nV3x.

nV3x has more DX9 functionality than R3xx; it's just that there is less DX9 performance.
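
To put "more functionality" in concrete terms, an app can simply read the shader caps each chip advertises. A minimal sketch (D3D9; the exact figures vary by driver, but NV3x advertises far more PS 2.x instruction slots than the base PS 2.0 figure of 96 that R3xx reports):

    #include <d3d9.h>
    #include <stdio.h>

    // Print the pixel shader limits a D3D9 device advertises.
    // NV3x reports extended PS 2.x caps (hundreds of instruction
    // slots), while R3xx reports the base PS 2.0 limits.
    void PrintShaderCaps(IDirect3DDevice9 *device)
    {
        D3DCAPS9 caps;
        if (FAILED(device->GetDeviceCaps(&caps)))
            return;
        printf("PS version: %lu.%lu\n",
               D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
               D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        // Only meaningful on PS 2.0+ parts.
        printf("PS 2.x instruction slots: %d\n",
               caps.PS20Caps.NumInstructionSlots);
    }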

Why would nVidia do things this way? Look back to DX8 and how long it took for games to use its features. Until quite recently most games only really used DX7 features. Then you have the question of DX9 usage: prior to TR:AOD and Far Cry (both of which use DX9 as a tinsel garnish at best), there was absolutely nothing on the games front that required it.

Don't forget developers also targeted the Xbox, which is DX8-ish.

So, IMO nVidia was wise to focus more on DX8 performance. In the real world that's what the chips were required to run. Theoretically R3xx may have had performance advantages in DX9 games, but given the number of DX9 games available during R3xx's lifetime, theory is exactly what it remained.
 
But the R300 was still loads faster in most of these DX8 games; it had superior DX8 performance most of the time.
 
Radar, I think you are vastly underestimating the lifetime of both the R300 and NV35 derivatives, particularly the high end.
 
No, I'm not underestimating their lifetimes.

I expect nV3x will influence the industry for years to come, whereas a very large proportion of R300 owners have upgraded to something else in ATi's lineup or a 6800-series chip already.

With regard to DX8 performance, I think you'll find the ridiculous nVidia driver situation was more responsible for R300's wins in DX8 than any shortcoming of the actual GPU. My 5900XT absolutely screams along under the latest drivers.
 
radar1200gs said:
No, I'm not underestimating their lifetimes.

I expect nV3x will influence the industry for years to come, whereas a very large proportion of R300 owners have upgraded to something else in ATi's lineup or a 6800-series chip already.

With regard to DX8 performance, I think you'll find the ridiculous nVidia driver situation was more responsible for R300's wins in DX8 than any shortcoming of the actual GPU. My 5900XT absolutely screams along under the latest drivers.
Your logic is daft, to say the least. Of course people who care about performance will upgrade: that's why they are called enthusiasts. But you can't say with a straight face that people are upgrading 9700-9800 cards any faster than people are upgrading NV3x cards. As noted above, the NV3x was slower in nearly every application, and the extra features don't mean a whit when they aren't used (i.e. games using lower quality paths) or are too slow when used (see your own comment).

-FUDie
 
radar1200gs said:
No, I'm not underestimating their lifetimes.

I expect nV3x will influence the industry for years to come, whereas a very large proportion of R300 owners have upgraded to something else in ATi's lineup or a 6800-series chip already.

With regard to DX8 performance, I think you'll find the ridiculous nVidia driver situation was more responsible for R300's wins in DX8 than any shortcoming of the actual GPU. My 5900XT absolutely screams along under the latest drivers.

Did it possibly occur to you that some people are satisfied with the performance of said cards? Even today the NV35 series and R300 series provide performance capable of playing the latest games. Not everyone buys these cards needing 1600x1200 with 4xAA/16xAF, and rightfully so; it's just not necessary. These cards are for the most part capable of acceptable frame rates even in the latest games, and haven't really outlived their usefulness.

I can't for the life of me understand why you think someone would be more prone to upgrade an R300 than an NV3x series card, and tbh I think it would be the other way around, assuming the card owner was actually concerned about performance.

Chris
 
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.

Speaking for myself, I'll be keeping my 5900XT for a fairly long while yet. It isn't the limiting factor in my system. My XP2400 and PC2100 memory (I bought 1 gig of PC2100 when I bought my nForce1) are more pressing concerns upgrade-wise.
 
radar1200gs said:
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.

I think this can be explained easily... The nVidia FX owners don't care much about performance; those with an R300 care more. That's why the R300 owners have upgraded, while the FX owners still keep the same ol' card.
 
radar1200gs said:
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.
Funny, that's not what I've seen in forums. Head to Rage3D and look for all the posts by people who have upgraded from an NV3x to a 9800 or X800 board.
Speaking for myself, I'll be keeping my 5900XT for a fairly long while yet. It isn't the limiting factor in my system. My XP2400 and PC2100 memory (I bought 1 gig of PC2100 when I bought my nForce1) are more pressing concerns upgrade-wise.
Good for you. I guess you won't miss the DX9 effects that your card supports but that never get used.

-FUDie
 
FUDie said:
radar1200gs said:
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.
Funny, that's not what I've seen in forums. Head to Rage3D and look for all the posts by people who have upgraded from an NV3x to a 9800 or X800 board.
Speaking for myself, I'll be keeping my 5900XT for a fairly long while yet. It isn't the limiting factor in my system. My XP2400 and PC2100 memory (I bought 1 gig of PC2100 when I bought my nForce1) are more pressing concerns upgrade-wise.
Good for you. I guess you won't miss the DX9 effects that your card supports but that never get used.

-FUDie

What use do I have for Rage3D? I own an nVidia product, not an ATi product...

I don't know what you mean by unused DX9 effects either. The only DX9 effects I'm unable to take advantage of in current games are the HDR & SM3.0 effects in Far Cry.
 
radar1200gs said:
FUDie said:
radar1200gs said:
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.
Funny, that's not what I've seen in forums. Head to Rage3D and look for all the posts by people who have upgraded from an NV3x to a 9800 or X800 board.
Speaking for myself, I'll be keeping my 5900XT for a fairly long while yet. It isn't the limiting factor in my system. My XP2400 and PC2100 memory (I bought 1 gig of PC2100 when I bought my nForce1) are more pressing concerns upgrade-wise.
Good for you. I guess you won't miss the DX9 effects that your card supports but that never get used.
What use do I have for Rage3D? I own an nVidia product, not an ATi product...
Then you really don't know everything, do you?
I don't know what you mean by unused DX9 effects either. The only DX9 effects I'm unable to take advantage of in current games are the HDR & SM3.0 effects in Far Cry.
Sure. That's why HL2 defaults your card to DX8.1 mode, hmm? That's why the 5900XT gets clobbered in Tomb Raider when all the effects are cranked (of course, the 5900XT still doesn't support float buffers so you can't even use all of the effects in TRAOD).

Go ahead and lie to yourself, but no one else buys what you say.

-FUDie
 
There isn't much that can be done about developers who refuse to code to the capabilities of the hardware, unfortunately (NV3x does support limited FP buffers), or about developer stupidity (Valve).
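
For what it's worth, "limited FP buffer" support is something a game can probe directly rather than assume. A minimal sketch, with the display format and the FP16 target format as assumptions:

    #include <d3d9.h>

    // Probe whether the HAL device can render to a 16-bit-per-channel
    // float texture (the kind of buffer TRAOD-style effects want).
    bool SupportsFP16RenderTarget(IDirect3D9 *d3d)
    {
        return SUCCEEDED(d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,           // assumed display mode format
            D3DUSAGE_RENDERTARGET,
            D3DRTYPE_TEXTURE,          // render-to-texture case
            D3DFMT_A16B16G16R16F));    // FP16 RGBA
    }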

In HL2's case I'm sure DX9 mode will be able to be forced on with a tool like 3DAnalyze, just like early on with Far Cry.
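
(Source also reportedly accepts a -dxlevel launch switch that overrides the detected level, e.g. "hl2.exe -dxlevel 90"; assuming the shipping build keeps it, no external tool should even be needed.)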

And let's not forget that NV3x is capable of running ATi's Ruby demo with no shader alterations whatsoever, something ATi's own R3xx series certainly can't do...
 
radar1200gs said:
And let's not forget that NV3x is capable of running ATi's Ruby demo with no shader alterations whatsoever, something ATi's own R3xx series certainly can't do...

So, a simple question: Do you think NV3x is better than R3xx?
 
Personally, yes I do.

You have a GPU plagued by manufacturing problems, some hardware features not fully realised, with half the pipelines of R300, shockingly bad early drivers and no hardware shortcuts taken (filtering precision, precision truncation, etc.), and yet a lot of the time it draws level with or beats the competition.
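
On the precision point, it's worth spelling out the mechanism being argued about: DX9 shaders can mark instructions as partial precision, which NV3x executes at FP16 instead of FP32, while R3xx runs everything at FP24 regardless. A minimal sketch of compiling one shader both ways with D3DX (the shader source is a made-up example, for illustration only):

    #include <d3dx9.h>   // link with d3dx9.lib

    // Hypothetical one-instruction shader, for illustration only.
    static const char kSrc[] =
        "float4 main(float2 uv : TEXCOORD0,\n"
        "            uniform sampler2D tex) : COLOR\n"
        "{ return tex2D(tex, uv) * 0.5f; }\n";

    // Compile with or without the partial-precision hint. With the
    // flag set, instructions carry the _pp modifier, which NV3x runs
    // at FP16 (often much faster); R3xx ignores it and runs FP24.
    HRESULT CompilePS(bool partialPrecision, LPD3DXBUFFER *byteCode)
    {
        LPD3DXBUFFER errors = NULL;
        HRESULT hr = D3DXCompileShader(
            kSrc, sizeof(kSrc) - 1, NULL, NULL, "main", "ps_2_0",
            partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0,
            byteCode, &errors, NULL);
        if (errors) errors->Release();
        return hr;
    }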
 
radar1200gs said:
Personally, yes I do.

You have a GPU plagued by manufacturing problems, some hardware features not fully realised, with half the pipelines of R300, shockingly bad early drivers and no hardware shortcuts taken (filtering precision, precision truncation, etc.), and yet a lot of the time it draws level with or beats the competition.

For me, that's a bad product...
 
ClyssaN said:
radar1200gs said:
Personally, yes I do.

You have a GPU plagued by manufacturing problems, some hardware features not fully realised, with half the pipelines of R300, shockingly bad early drivers and no hardware shortcuts taken (filtering precision, precision truncation, etc.), and yet a lot of the time it draws level with or beats the competition.

For me, that's a bad product...

For anyone not wearing green-tinted lenses, that's a bad product.

Still, at least it spurred NV on to create the very capable NV4X series after the complete balls-up with NV3X.
 
radar1200gs said:
No, I'm not underestimating their lifetimes.

I expect nV3x will influence the industry for years to come, whereas a very large proportion of R300 owners have upgraded to something else in ATi's lineup or a 6800-series chip already.

With regard to DX8 performance, I think you'll find the ridiculous nVidia driver situation was more responsible for R300's wins in DX8 than any shortcoming of the actual GPU. My 5900XT absolutely screams along under the latest drivers.
:LOL: :LOL: :LOL: :LOL:
 
FUDie said:
radar1200gs said:
FUDie said:
radar1200gs said:
Hey, I'm simply stating what I predominantly see in the forums - those who had a 9500 or 9700 tend to have moved on. Those who had a 5700 or better have kept it longer.
Funny, that's not what I've seen in forums. Head to Rage3D and look for all the posts by people who have upgraded from an NV3x to a 9800 or X800 board.
Speaking for myself, I'll be keeping my 5900XT for a fairly long while yet. It isn't the limiting factor in my system. My XP2400 and PC2100 memory (I bought 1 gig of PC2100 when I bought my nForce1) are more pressing concerns upgrade-wise.
Good for you. I guess you won't miss the DX9 effects that your card supports but that never get used.
What use do I have for Rage3D? I own an nVidia product, not an ATi product...
Then you really don't know everything, do you?
I don't know what you mean by unused DX9 effects either. The only DX9 effects I'm unable to take advantage of in current games are the HDR & SM3.0 effects in Far Cry.
Sure. That's why HL2 defaults your card to DX8.1 mode, hmm? That's why the 5900XT gets clobbered in Tomb Raider when all the effects are cranked (of course, the 5900XT still doesn't support float buffers so you can't even use all of the effects in TRAOD).

Go ahead and lie to yourself, but no one else buys what you say.

-FUDie

You missed the biggest one of all: even the biggest nVidia fan there is (Chalnoth) says the FX line was garbage and the 6800 series is vindication.
 