What does everyone think about the ATI video presentation?

My mistake, so I'll slide the 9100 up a hair on the graph. :) For some odd reason my point on the 9000 (RV2xx) was twisted to mean the 9100, and in my hurried graph throw-together I didn't proofread the reply about the 9100, which, if you see my previous post, was entirely referencing the 9000.

And yes, I have to link the images as text files, since freebie Tripod won't allow redirects of jpg/gif/png or other image formats, but freely allows html or txt MIME types to redirect fine and dandy. IE/Netscape handle this correctly to the best of my knowledge. :)
 
The 9200 isn't as problematic as the GF4 MX.

The 9200 came to market less than 6 months after DX9
The GF4 MX came to market more than 1 year after DX8

If you don't see the difference, may I suggest glasses? :)
So, from a technological POV, the 9200 is less problematic than the GF4 MX.

On the naming scheme, however, it is *more* problematic.

"GF4 MX" doesn't say it's DX7 or DX8. Unless you looked at a review or at the specs on the box, you couldn't guess it.
"9x00" , with the '9' , pretty much implies DX9 support.

Yes, you can explain ATI's & nVidia's naming schemes whichever way you want. But if they really didn't want to confuse people about DX support, then they'd make sure it's very clear. And they didn't.

So, you've got to take into account the first impression the name gives (both for someone slightly familiar with GPUs and for someone who has no idea what a GPU is but kinda knows what DX is). That's all that matters, IMO.
And considering that, the 9200 is worse than the GF4MX.

Thus, in my opinion:
From a technological POV, the GF4MX is worse than the 9200.
From a naming POV, the 9200 is worse than the GF4MX.

There's no clear winner, it seems :)


Uttar
 
You guys are absolutely wrong about nvidia and its MX support of DX8. Nvidia claims that indeed the GF4MX series cards fully support DX8. Pure deception on nvidia's part. How nvidia got away with this outright false advertising I don't know.
http://www.nvidia.com/docs/lo/1468/SUPP/PO_GeForce4_MX_92502.pdf
EDIT: on the second page of that PDF, under API support, it lists DX8.1a.

Nor did the card even act as though it supported DX8. But nvidia was able to slide this crap under the radar and get away with it.
http://www.sharkyextreme.com/hardware/videocards/article.php/3211_1470621__6

ATi, on the other hand, while it does include the Radeon 9000 in the 9000 family, never advertised the card as a DX9 part in any of its technical specifications. While the naming convention is borderline deceptive, I would also argue that full DX8.1a compliance, including support for PS1.4, makes the Radeon 9000 Pro more akin to DX9 than the GeForce 4 MX was to a DX8 card. http://mirror.ati.com/products/pc/radeon9000pro/index.html

Let's also take note of the fact that nvidia never really did release a low-end DX8 part. But now that DX9 is coming out, the focus is on the low end. Why? Because nvidia doesn't hold the high-end parts? Give me a break. :rolleyes:
 
Sabastian said:
You guys are absolutely wrong about nvidia and its MX support of DX8. Nvidia claims that indeed the GF4MX series cards fully support DX8. Pure deception on nvidia's part. How nvidia got away with this outright false advertising I don't know.
http://www.nvidia.com/docs/lo/1468/SUPP/PO_GeForce4_MX_92502.pdf
EDIT: on the second page of that PDF, under API support, it lists DX8.1a.

The wording is bothersome: "Complete DirectX support, including DX 8.1".

Gotta love marketing documents.
 
Sabastian said:
You guys are absolutely wrong about nvidia and its MX support of DX8. Nvidia claims that indeed the GF4MX series cards fully support DX8. Pure deception on nvidia's part. How nvidia got away with this outright false advertising I don't know.

Maybe I'm missing something, but can you tell me where anyone has stated (in this thread) that the GF4 MX is a DX8 card?
(Too lazy to go through the whole thread.)

Let's also take note of the fact that nvidia never really did release a low-end DX8 part. But now that DX9 is coming out, the focus is on the low end. Why? Because nvidia doesn't hold the high-end parts? Give me a break. :rolleyes:

Or maybe Nvidia actually learned from all the criticism they got with the release of the GF4 MX and decided to do something about it for their next generation?
 
John Reynolds said:
Sabastian said:
You guys are absolutely wrong about nvidia and its MX support of DX8. Nvidia claims that indeed the GF4MX series cards fully support DX8. Pure deception on nvidia's part. How nvidia got away with this outright false advertising I don't know.
http://www.nvidia.com/docs/lo/1468/SUPP/PO_GeForce4_MX_92502.pdf
EDIT: on the second page of that PDF, under API support, it lists DX8.1a.

The wording is bothersome: "Complete DirectX support, including DX 8.1".

Only if you are bothered by someone telling bald lies about their product in hopes of maximizing their profit.

Entropy
 
Sabastian, could you tell me what DX8.1a means, as opposed to DX8.1? I've seen you using that quite a bit, but I have no idea what it means.
 
Bjorn said:
Sabastian said:
You guys are absolutely wrong about nvidia and its MX support of DX8. Nvidia claims that indeed the GF4MX series cards fully support DX8. Pure deception on nvidia's part. How nvidia got away with this outright false advertising I don't know.

Maybe I'm missing something, but can you tell me where anyone has stated (in this thread) that the GF4 MX is a DX8 card?
(Too lazy to go through the whole thread.)

There are some posts in this thread that equate the Radeon 9000 to the GF4MX scenario. (Sorry, I'm not going to pull those quotes; if you simply read through, you can see this.) But again, the naming scheme of the Radeon 9000 is somewhat deceptive, though not nearly as deceptive as the GF4MX's. The Radeon 9000 fully supports DX8.1a with PS1.4 (which, BTW, is something no nvidia card supports outside the DX9 parts, of which there are none you can buy yet unless you buy a Quadro FX). Pixel shader 1.4 goes a long way toward the pixel shader 2.0 spec of DX9. So while the card is still advertised as a DX8.1a card and is technically a DX8 card, it is considerably more akin to DX9 than the name suggests.
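
To make that PS1.4 point concrete, here is a minimal sketch (my own illustration, not taken from any post in this thread) of the kind of capability check a DX8 game does against the Direct3D 8 caps structure; the per-card expectations in the comments are my assumptions. A Radeon 9000 should report pixel shader 1.4 and get the full shader path, while a GF4 MX reports no pixel shader support and drops to a DX7-class fallback.

// Minimal illustrative sketch: querying the pixel shader version the
// driver reports through the DirectX 8 caps, and picking a render path.
// Build against the DirectX 8 SDK and link with d3d8.lib.
#include <windows.h>
#include <d3d8.h>
#include <cstdio>

int main()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        d3d->Release();
        return 1;
    }

    // Assumption: a Radeon 9000 reports PS 1.4 here, while a GeForce 4 MX
    // reports no pixel shader support, so shader-based paths are skipped.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        printf("Using the PS 1.4 rendering path\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        printf("Using the PS 1.1 rendering path\n");
    else
        printf("No pixel shaders reported: DX7-class fixed-function fallback\n");

    d3d->Release();
    return 0;
}

This caps check is also why the naming matters more to buyers than to games: the runtime reports what the hardware actually does, regardless of what the box says.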

Bjorn said:
Sabastian said:
Let's also take note of the fact that nvidia never really did release a low-end DX8 part. But now that DX9 is coming out, the focus is on the low end. Why? Because nvidia doesn't hold the high-end parts? Give me a break. :rolleyes:

Or maybe Nvidia actually learned from all the criticism they got with the release of the GF4 MX and decided to do something about it for their next generation?

No. Christ, they could have put out a low-end DX8 part ages ago, and they put up with a lot of critique about it for a reason. They did it on purpose; just why is another matter. I guess they were so addicted to the low-end margins on the MX OEM deals that they simply didn't want to compete directly with ATi on low-end DX8 parts. They marketed the GeForce 4 MX as a DX8 part even though it was not a DX8 part at all. Nvidia didn't "learn" anything; they did it on purpose.
 
ET said:
Sabastian, could you tell me what DX8.1a means, as opposed to DX8.1? I've seen you using that quite a bit, but I have no idea what it means.

LOL, AFAIK there is no DX8.1; there was only one substantial upgrade to DX8, and that was DX8.1a.
 
Instead of just reposting my opinion again, I'll just link to the many times we (as a forum) have discussed this before and my opinions have been stated:

Impressions of the GF 4 MX...

Feb 2002

Feb 2002

Mar 2002

First impression of the 9000...

Jul 2002

My slightly different impression after a chart showing that ATI had described the number as not being based on DX 9...

Nov 2002

Note that my "defense" of the 9000 kicks in when people say it is the "same as" or "worse than" the GF 4 MX...I pretty much tend to agree with criticisms otherwise, especially as based on performance (see my comments on the "9100" below).

Now, to put it in the context of the 4800 naming problems from nvidia...

Nov 2002

Note that the "8500 OEM" I'm criticizing is not the 8500LE, but the reduced-speed cards released before that naming effort took effect. ATI's only "saving grace" (as far as them being negligent rather than intentionally malicious in not identifying the down-clocked cards) in that regard is that they have a history of going "both ways" (i.e., they released the original Radeon with an unannounced speed increase at one point...dubbed by the community as "SE" cards).

And to repeat my outlook on the "9100" name...

Dec 2002

My view on the 9200 name, besides its being a continuation of the above problems, will depend on release clock speeds, as I've said recently. That makes the parallel to the 4800 SE even stronger, because it goes against their own naming consistency.

Oh, and IMO, the 5200 is similar to the 9000 (and likely 9200) problem (theoretically, depending on the vertex shader issue), but better, since the feature set is consistent with the "GF FX" moniker. The performance relative to the prior-generation high end (4600) may be lacking like the 9000's was (though we don't know to what degree as of yet), and therefore some of the features it offers may only be theoretical, but this is nowhere near the same league as the GF 4 MX! See the above posts for how I view such arbitrary equivalency. :-?

Of course, a name like GF 5 MX might have been best to illustrate something was lacking, since the vertex shader assumptions we are making at this point could make the card a significant problem for some people. Also, GF FX 5000 would have been less misleading than the implied "4200" parallel, but I think we should put the GF 4 MX parallels behind us. (Hmm...though I wonder if we can put the GF 4 MX itself behind us as of yet?)
 
The Diamond Edge (NV1, with integrated audio), bundled with Panzer Dragoon and Virtua Fighter, shipped first in 1995. GLINT also shipped in 1995, before the Verite. The Verite 1000L did not ship in mid '95; it was announced in October of '95 and did not ship until the end of the year. The Orchid Righteous 3D shipped 4Q '96 (I owned one of the very first off the production line), and the glQuake beta didn't hit until the end of Jan '97. Between 1995 and when the Glide and miniGL APIs took hold, HW 3D gaming by and large sucked. You had to scavenge websites for news of any game that had support for your chipset.


The level of your anti-Nvidia rhetoric is astounding. You make it sound as if Nvidia is a company of morons with no achievements.

Why are you so hell-bent on proving NVidia loses in any contest or horserace you try to come up with?
Total nonsense.... :rolleyes:

My statement had zippo to do with "Anti-Nvidia Rhetoric".

http://firingsquad.gamers.com/features/nvidiahistory/page2.asp

Also, John Carmack has some very strong statements contending with much of what FiringSquad was saying about Microsoft killing the NV1 by going with polygons. FS's message was that quadratic texture mapping was somehow "better", which is simply not true. In fact, there is a whole HOST of things that the NV1 could not do that were included in the V1000 and Voodoo1. Unfortunately, the only place I can find Carmack's comments is via a Russian website.

http://www.3dnews.ru/reviews/video/nvidia-nv1/

I guess this is why you must also think that a GF4MX is a *true* DX8 card too, eh?? ;)

Everyone, I mean EVERYONE, who was a serious gamer/3D enthusiast back in the day knows that the Verite 1000 and Voodoo 1 WERE the first true industry-standard consumer 3D accelerators, far more capable than anything the NV1 could do, for both Direct3D and OpenGL. You want to throw in a Righteous Orchid as well??? Puhlease... :rolleyes: And GLINT was a workstation developer board, not a retail consumer graphics card.
 
Hellbinder[CE] said:
You want to throw in a Righteous Orchid as well??? Puhlease... :rolleyes: And GLINT was a workstation developer board, not a retail consumer graphics card.

Uhh, the Orchid Righteous was a Voodoo 1 card.
8)
 
John Reynolds said:
Hellbinder[CE] said:
You want to throw in a Righteous Orchid as well??? Puhlease... :rolleyes: And GLINT was a workstation developer board, not a retail consumer graphics card.

Uhh, the Orchid Righteous was a Voodoo 1 card.
8)

Correct, and the "GLint" chip being talked about is not 3Dlabs' full-blown "GLint", but the "GiGi", or "Gaming Glint". This chip was on the original Creative Labs' 3D Blaster, and appeared right about the same time as the NV1. The Gaming Glint was directly targeted at consumers, not the workstation market.

That being said, the first round of real, honest-to-goodness "actually useful" 3D chips included the Voodoo Graphics, Rendition Verite 1000, and the PowerVR PCX 1/2.
 
Anyone who was a serious gamer "back in the day" would have known what an Orchid Righteous 3D was. I owned one of the very first ones off the production line.

You would also know when the Rendition Verite 1000 shipped as well.



I never claimed the NV1 was what kick-started the 3D revolution. I merely claimed that the Verite was not the first 3D card to ship; I was correcting your factually inaccurate history. The 3D revolution didn't take off until glQuake shipped.

You can drone on all you want about the Verite1000, but vquake didn't build the market. It was the Voodoo1 with miniGL and Glide, plus 3dfx's developer relations program, which got so many 3D titles to ship.
 
Correct, and the "GLint" chip being talked about is not 3Dlabs' full-blown "GLint", but the "GiGi", or "Gaming Glint". This chip was on the original Creative Labs' 3D Blaster, and appeared right about the same time as the NV1. The Gaming Glint was directly targeted at consumers, not the workstation market.
The "Gaming Glint" in Creative Labs' 3D Blaster VLB is the 3Dlabs 300SX chipset; it and the Diamond Edge3D were the only consumer 3D hardware available in '95 at Fry's, ComputerCity, CompUSA... It came with 5 games, and the most memorable was the famous Flight Unlimited.

The 3Dlabs 300SX chipset was also used by Intergraph for their lower-end professional graphics workstations.

The very first Win95 hardware-accelerated D3D driver didn't belong to either 3dfx's Voodoo1 or Rendition's V1K; both only showed up in '96.

That being said, the first round of real, honest-to-goodness "actually useful" 3D chips included the Voodoo Graphics, Rendition Verite 1000, and the PowerVR PCX 1/2.
The NV1 and 300SX were the very first pioneering consumer-grade 3D hardware chipsets to kick-start the PC world (Diamond Edge3D_NV1 - $349, Creative Labs' 3D Blaster VLB_300SX - $329 - ComputerCity, Thanksgiving '95).
 
The GeForce 3 Ti 200 was/is a "real" (budget, or whatever to call it) DX8 part; had someone forgotten that one?

And as someone said earlier, the GeForce 4 MX had the crossbar and Z-check memory optimizations from the REAL GeForce 4, the LMA II.

The difference was that while the REAL GeForce 4 had a 4×32-bit memory controller, the MX has 2×64-bit controllers.

But Nvidia did have one real budget DX8 card, and many seem to forget that.
 
JC said:
Joe DeFuria said:
That being said, the first round of real, honest-to-goodness "actually useful" 3D chips included the Voodoo Graphics, Rendition Verite 1000, and the PowerVR PCX 1/2.
The NV1 and 300SX were the very first pioneering consumer-grade 3D hardware chipsets to kick-start the PC world (Diamond Edge3D_NV1 - $349, Creative Labs' 3D Blaster VLB_300SX - $329 - ComputerCity, Thanksgiving '95).

You didn't say anything that I didn't. ;)

Yes, the NV1 and GiGi were first in actual retail boards. However, their actual "usefulness" was limited (with no exaggeration) to the pack-in titles that came with the hardware. No actual D3D support (these cards shipped before the first version of D3D was available), and no third-party support beyond the pack-ins IIRC. Panzer Dragoon, Flight Unlimited... (Though it wouldn't surprise me to learn there were one or two other titles...)

The first chips that had any non-trivial developer support were the Voodoo Graphics, Rendition Verite, and PowerVR PCX1/2.
 