Nvidia withholding the launch of GeForce4 Ti 4200

On 2002-02-22 16:30, pascal wrote:
Who is buying this ugly thing ???

Not me, that's for sure :smile:
 
On 2002-02-22 16:50, pascal wrote:
Well GF4-MX has great FSAA performance. Spanking most other cards.
It is not faster than a GF3 Ti200: http://www.anandtech.com/video/showdoc.html?i=1583&p=13
Not me, that's for sure :smile:
Not me too

I hate/love to say this, but not anyone who actually knows what the heck it is. Personally I think the GF4MX is a scam (my ass it is a GF4; it is a GF2 with some GF4 tech, but it is still only DX7). People will pay the cash thinking they are getting a new-generation card, but all they are doing is buying a butchered-up GF2MX. Nvidia is doing this so they can keep their margins, that's all. But I think the deal stinks, personally.
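
To make the DX7 point concrete, here is roughly the Direct3D 8 caps check a game runs at startup (a minimal sketch, error handling trimmed; the verdict in the comments follows from the GF4 MX being a DX7-class part, not from anything NVIDIA documents for this check):

    #include <d3d8.h>
    #include <stdio.h>

    /* Minimal sketch: ask Direct3D 8 what pixel shader version the card
     * exposes. A GF3/GF4 Ti reports 1.1/1.3; a DX7-class part such as
     * the GF4 MX reports version 0.0, whatever the box says. */
    int main(void)
    {
        IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
        D3DCAPS8 caps;

        if (!d3d)
            return 1;
        IDirect3D8_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        if (caps.PixelShaderVersion < D3DPS_VERSION(1, 1))
            printf("No DX8 pixel shaders -> fall back to the DX7 path.\n");
        else
            printf("DX8 pixel shaders available.\n");

        IDirect3D8_Release(d3d);
        return 0;
    }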

Sabastian
 
These name insignias don't mean anything. I didn't see anyone jumping up and down about the fact the Geforce 2 MX series is a step down from the original Geforce DDR.
 
On 2002-02-23 16:41, Exposed wrote:
These name insignias don't mean anything. I didn't see anyone jumping up and down about the fact the Geforce 2 MX series is a step down from the original Geforce DDR.

That is not true at all. The MX had features not found on the GF DDR as well as almost all features of the normal GF2. The GF4 MX does not have the features of the GF4 cards.
 
On 2002-02-23 17:50, jb wrote:
On 2002-02-23 16:41, Exposed wrote:
These name insignias don't mean anything. I didn't see anyone jumping up and down about the fact the Geforce 2 MX series is a step down from the original Geforce DDR.

That is not true at all. The MX had features not found on the GF DDR as well as almost all features of the normal GF2. The GF4 MX does not have the features of the GF4 cards.

Which were those, except "TwinView" and one more TMU? And which "features" did the GF2MX miss from the GF2 GTS, apart from two more pixel pipes and higher memory bandwidth (hardly a feature)?

A GF1 DDR is faster in today's games than a GF2 MX/400 in 32-bit because of its much higher bandwidth (4.8 GB/s compared to 2.8 GB/s).
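
The arithmetic behind those figures, for anyone checking (a quick sketch; the 150 MHz DDR and 175 MHz SDR clocks are my assumed values chosen to match the numbers above, both cards on a 128-bit bus):

    #include <stdio.h>

    /* Peak memory bandwidth in GB/s: clock (MHz) x bus width (bytes) x
     * transfers per clock (2 for DDR, 1 for SDR), divided by 1000. */
    static double bandwidth_gbs(double clock_mhz, int bus_bits, int ddr)
    {
        return clock_mhz * (bus_bits / 8.0) * (ddr ? 2 : 1) / 1000.0;
    }

    int main(void)
    {
        /* GeForce DDR: 150 MHz DDR, 128-bit -> 4.8 GB/s */
        printf("GeForce DDR: %.1f GB/s\n", bandwidth_gbs(150.0, 128, 1));
        /* GF2 MX/400: 175 MHz SDR, 128-bit -> 2.8 GB/s */
        printf("GF2 MX/400:  %.1f GB/s\n", bandwidth_gbs(175.0, 128, 0));
        return 0;
    }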
 
On 2002-02-23 16:41, Exposed wrote:
These name insignias don't mean anything. I didn't see anyone jumping up and down about the fact the Geforce 2 MX series is a step down from the original Geforce DDR.

But a GF2 has the same feature set as the original GF DDR, just enhanced to make it faster. A GF4MX has the feature set of a GF2, not a GF3. If it had GF3 features the name wouldn't be such a misnomer.

As has been stated by others, it just looks like a scam for the uninformed.
 
Galilee,

I understand your point of view; it looks like the GF4MX is highly tuned for Q3.

IMHO one should not base his/her decision about a new card on an old benchmark like Q3 when a benchmark based on a real next-generation game engine like Unreal 2 is available.

I upgraded from my Radeon 32MB DDR to a GF3 Ti200 thinking about next-generation games.

Probably any good DDR card will play Q3 very well.
 
True
I'm not saying that GF4-MX is a good choice, but it looks to me like it gets some of the same FSAA performance increase as GF4-Ti does over GF3. Probably due to the two vertex things ;) (2X and Quincunx same performance)
But in general GF4-MX is a lousy performer.
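
The 2X/Quincunx observation above fits how multisampling works: the chip shades each pixel once and stores two samples either way, and Quincunx just adds a wider filter at resolve time. A toy model of the sample traffic (my own illustrative numbers, 1024x768 at 32-bit):

    #include <stdio.h>

    /* Toy model: color samples written per frame (MB) for one pass.
     * Multisampling stores N samples per pixel but shades each pixel
     * once, so 2X MSAA and Quincunx (2 samples + a wider resolve
     * filter) write the same amount. */
    int main(void)
    {
        const double pixels = 1024.0 * 768.0;
        const double bpp = 4.0;              /* 32-bit color */
        int samples;

        for (samples = 1; samples <= 4; samples *= 2)
            printf("%dx: %.1f MB of color samples per frame\n",
                   samples, pixels * bpp * samples / (1024.0 * 1024.0));
        return 0;
    }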

My guess on what will happen:
OEM machines that usually ship with an MX/TNT2 card will ship with the GF4-MX, but it will not sell that well at retail. Hopefully they will continue the GF3 Ti200 line.

[ This Message was edited by: Galilee on 2002-02-23 20:37 ]
 
The problem is that the GF4MX is much more expensive than the GF2MX. My guess is it's not good for OEMs either.

My hope is that ATI pushes Nvidia with low-cost DX8 cards.
 
But a GF2 has the same feature set as the original GF DDR, just enhanced to make it faster. A GF4MX has the feature set of a GF2, not a GF3. If it had GF3 features the name wouldn't be such a misnomer.

The GF4 MX has the improved crossbar memory system, Quincunx plus the new Accuview engine, and nView. It's more than a GF2, but not a full-blown GF4. Arguing about the naming scheme is pointless.
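
On the crossbar point: four independent 32-bit controllers round small requests up to 4 bytes instead of 16, so less of the bus is wasted. A toy efficiency model (my own simplification; it ignores bursts and DDR):

    #include <stdio.h>

    /* Toy model of memory-controller efficiency: a request of n bytes
     * is rounded up to the controller's transfer granularity. A
     * monolithic 128-bit controller moves 16 bytes at a time; each
     * 32-bit crossbar channel moves 4. */
    static double efficiency(int request_bytes, int granularity)
    {
        int moved = ((request_bytes + granularity - 1) / granularity) * granularity;
        return 100.0 * request_bytes / moved;
    }

    int main(void)
    {
        int sizes[] = { 4, 8, 32 };
        int i;

        for (i = 0; i < 3; i++)
            printf("%2d-byte access: monolithic %5.1f%%, crossbar %5.1f%%\n",
                   sizes[i], efficiency(sizes[i], 16), efficiency(sizes[i], 4));
        return 0;
    }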
 
Wait wait wait... don't you remember the mystery FSAA methods tested by Reverend on a GeForce 3... wasn't that "Accuview"?
 
Ask any game developer or publisher if the GeForce naming scheme is "pointless."

Blanket statement. How many game developers and publishers have you interviewed? ;)

Wait wait wait... don't you remember the mystery FSAA methods tested by Reverend on a GeForce 3... wasn't that "Accuview"?

No.
 
Blanket statement. How many game developers and publishers have you interviewed?

Heh, and I suppose "Arguing about the naming scheme is pointless" is not a blanket statement? ;)

I have not interviewed any developers / publishers, but I have read interviews / statements by at least 2 of them: Carmack and Hook. And both were extremely disappointed.
 
Heh, and I suppose "Arguing about the naming scheme is pointless" is not a blanket statement?

No, it's not a blanket statement. It's an opinion. "Ask any game developer or publisher if the GeForce naming scheme is 'pointless'"... now THAT's a blanket statement ;)

I have not interviewed any developers / publishers, but I have read interviews / statements by at least 2 of them: Carmack and Hook. And both were extremely disappointed.

From one company? That doesn't quite support your statement above ;)

Carmack was also extremely disappointed in the naming convention of the GF3 and GF2, yet I don't see you evangelizing that aspect ;)
 