What does everyone think about the ATI video presentation?

Well, my beef is that from the sounds of it I don't think it will be worth $80. Granted, until I see it put through some testing I can't rightly say exactly how much I think it should be worth. However, I freely admit that I may well wind up impressed by the performance of the card and have to eat my words. But such is life. ;)
 
I don't think there are many people here who could even relate to buying a low-end card outside of using it in a server. :)

I mean, $80 vs. $150 for most likely many times the performance is a no-brainer for anyone with any money at all. The $80 card matches up well with a DX9 version of Deer Hunter. :)

I also have to wonder whether NVIDIA, being the dominant provider of crap DX9 cards, will have some impact on PS 1.4 getting supported if they only do a half-assed job of supporting it. They could leverage more support for 1.1-1.3 (is there really a difference between them?) if their 1.1 is twice as fast as their 1.4. Having to detect cards when deciding which shader to use would suck for developers, I would imagine.
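Just to illustrate what that detection ends up looking like in practice, here's a minimal sketch of a caps-based shader fallback in D3D9, where pD3D is the usual IDirect3D9 pointer and the Load*Path() helpers are hypothetical placeholders, not any real API:

// Pick a pixel shader path from what the driver reports, not from the card name.
D3DCAPS9 caps;
pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
    LoadPS14Path();           // hypothetical: single-pass PS 1.4 shaders
else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
    LoadPS11Path();           // hypothetical: multi-pass PS 1.1-1.3 shaders
else
    LoadFixedFunctionPath();  // hypothetical: no pixel shaders at all

The ugly part is when the caps say 1.4 but the card runs 1.4 slower than its own 1.1 path; then you're back to sniffing vendor IDs, which is exactly the mess I mean.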
 
So is it official that the FX5200 has no hardware DX9 support? In that case I'm impressed that it ran Dawn at the presentation, even if that was a low-poly version. Hmm. Also, as of DX9 isn't there the option of falling back to the reference rasterizer for pixel shaders? So couldn't you market ANY card, even say a TNT or whatever, as a DX9 card as long as it can run DX9 at all? In that case, if the FX5200 doesn't have hardware DX9 support it really shouldn't be marketed as a DX9 card any more than, say, the R9200 (which is correctly marketed as DX8.1). What a mess.
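For what it's worth, the reference rasterizer isn't anything exotic: it's just a different device type you pass at creation time, so in principle any machine with the DX9 runtime can execute PS 2.0 in software, at a few frames per minute. A rough sketch, assuming the usual D3D9 boilerplate (pD3D, d3dpp, and hWnd already set up):

// Create a reference (software) device instead of a HAL device.
IDirect3DDevice9* pDevice = NULL;
HRESULT hr = pD3D->CreateDevice(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_REF,                      // reference rasterizer: everything in software
    hWnd,
    D3DCREATE_SOFTWARE_VERTEXPROCESSING,
    &d3dpp,
    &pDevice);

Which is why "can run DX9" on its own is a pretty meaningless marketing bar.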
 
On-topic for a moment, I thought ATi's presentation was better than nVidia's (yes, even taking the opening and closing cheesiness into account). :) It was just livelier, with better game and tech presentations, IMO. Seemed ATi put more effort into it, which is fine, as nV is playing catch-up.

I didn't get Jen Hsun's [sic] comment after Tamasi left the stage--something to the joking effect that Tamasi had raised the $79 price by $20 as he was leaving the stage. Was that in reference to its purported featureset, or a sneaky way of revealing the true MSRP? ;)

I find it interesting that both companies aim/advertise their entry-level cards for 12x10 w/4xAA. Pretty high for entry level, IMO. They both showed relative performance numbers, IIRC, so are these just extreme cases chosen to show unusually high performance improvements?

Bleh, don't answer, I've had enough of speculating--I'll just shut my speculative yap until we see benches on Monday. I would like to know if a 9500P would be a better bet than a 9600P at the same price, though. If AA is more fillrate-limited, will the 9500P's superior fillrate yield better performance than a 9600P, even though the 95P theoretically has insufficient bandwidth to drive all its pipes? If AF is more bandwidth-limited, will the 96P outperform the 95P in AF b/c of faster memory?
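Back-of-the-envelope, using the clocks I've seen quoted (treat these numbers as my assumptions, not confirmed specs):

9500 Pro: 8 pipes x 275 MHz = ~2200 Mpixels/s fillrate; 270 MHz DDR (540 effective) x 128-bit = ~8.6 GB/s
9600 Pro: 4 pipes x 400 MHz = ~1600 Mpixels/s fillrate; 300 MHz DDR (600 effective) x 128-bit = ~9.6 GB/s

So roughly 35-40% more fillrate for the 95P against roughly 10% more bandwidth for the 96P, which is where my AA-vs-AF question comes from.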
 
It was in reference to him saying the card cost $99 the second he got up there.

I thought ATi's presentation was better too. Call it unprofessional if you want, but it was a party, not a business meeting. Any company that throws launch parties with strippers (or whatever those were ;p) and Tool is fine by me :)
 
Pete said:
I didn't get Jen Hsun's [sic] comment after Tamasi left the stage--something to the joking effect that Tamasi had raised the $79 price by $20 as he was leaving the stage. Was that in reference to its purported featureset, or a sneaky way of revealing the true MSRP? ;)

I think Tamasi made a speak-o; that's what the joke was about. The MSRP of the extreme low-end 5200 will probably be $79, but we'll see variants of it for up to $149 MSRP. Likely this price will have to drop dramatically when ATI starts lowering the 9600 non-Pro. But I think the margin on a 40M-transistor 5200 Ultra is still likely to be a lot better than on the 9600 non-Pro. I really hope nVidia doesn't have to try pitting the 5200 Ultra against a 9600, though; I seriously don't see how it could compete. I'm wondering when we'll see vanilla 5600 variants.


I don't understand how hypocritical some of the people on this board are being right now. I've seen people definitively say the 9200 will be faster than the 5200, so what's the point of buying a 5200... AND that the 9200 is faster at PS 1.4, so it's somehow better than the 5200 that supports DX9.

All of you dislike the GF4MX as much as I do for holding developers back. Now nVidia is finally doing the right thing, and you're still going to try and crucify them. It's really not logical however you think about it. At least not until we see at least a single benchmark (old ones from TheINQ can't really count, that's absurd). I hate to throw around the word "fanboy", but if the shoe fits, some of the people here might be displaying their new shoes. :rolleyes:
 
I think a number of you can't see the forest for the trees wrt the GFFX 5200. It's the first sign of forward-looking strategy/execution from Nvidia in 12 months... This will underpin revenues for Nvidia for the next 12-24 months in the OEM & low-end sphere. Why? They've targeted an excess-capacity process, so they can extract decent GMs. They will be looking to actively run down TNT2 M64, GF2MX, GF4MX, and GF4Ti inventories & capitalization... The bonus will be that ATI will need to undercut with their low-end DX8 part, regardless of performance, until they too push a DX9 part in this segment.

BTW, I too had heard the NV34's lineage is from NV30: FP16 PS 2.0, 128-bit DDR, lacking "advanced logic units", and not full hardware VS 2.0.
 
I think people may be jumping the gun here a bit unless there is proof that the FX5200 will not do PS2.0 in hardware (other than the Inq. benchmarks). If it does not do the shaders in hardware then there really is a problem because they are outright lying about it being a DX9 card.

Even more Off-Topic:
As for the GF4MX, I don't really see what the problem is. Sure, it's a DX7 card, but it was marketed as a low-end card and so it's down a generation from the DX8 GF4Ti cards. It's no different to the DX8 cards (R9200) that ATI are selling in their DX9 line (R9600, R9800). Where are the people claiming that ATI is holding development back? The jump from DX8 to DX9 is just as big as the jump from DX7 to DX8, IMO, so it's the same thing.

Also, there are very few places in games where pixel shaders would be indispensable, and pixel shaders are the thing people are most worried about the GF4MX lacking, as far as I can see. It is trivial to have pixel-shaded water with a simple texture fallback if no PS is available. Same for any other material or even lighting effects. I've played Warcraft3 and UT2k3 on a GF4MX and they ran fine, so it's a reasonable card for the low-end market. Even Doom3 will run on the MX (sure, it will look like crap, but it will be playable, and that's the main point for low-end cards).

And to finish off On-Topic:
ATI vs Nvidia presentation: I think the ATI one had much more of a party feel, and while a little cheesy it was fun to watch. The Nvidia one by comparison was much more boring, which would have been fine if it had been technical, but it wasn't. Both were just about showing off new products, creating a bit of excitement, and of course PR, and in that respect I think the ATI one did much better (it was certainly more memorable, and that is the whole point, right?). Sure, the Nvidia one was more "professional", but it was in no way more informative than the ATI one, so who cares. Also, the Nvidia one felt like they were grasping at straws with the whole DX9-for-$79 angle. They were showing off with the extreme low end while ATI were gloating about their performance lead ('k, granted, the Nvidia launch was about their budget range only, but still, they were talking extreme low end).
 
sabeehali said:
Second, my point is the 5200 is reported to have LMA2 missing (from some stuff on nVnews). I mean, look at the Parhelia; the lack of HSR has crippled it too much.

Yeah, I'm the one who started the whole thread to discuss that issue there... It did turn into a flame war rather quickly :(

Anyway...
The final word seems to be:
- The NV34 supports everything in LMA II but Z Compression

And what does that include, then? Well...

- A crossbar-based memory controller: Ensures that every aspect of the memory system is balanced and that all memory requests by the graphics processor are handled properly. Under complex loads, LMA II's memory crossbar architecture delivers 2-4 times the memory bandwidth of other standard architectures.
- A Quad Cache memory caching subsystem: High-speed access buffers that store small amounts of data and operate at tremendously high bandwidth, ensuring that data is queued and ready to be written to the memory. These caches are individually optimized for the specific information they deal with, resulting in almost instantaneous retrieval of key data.
- A visibility subsystem: Determines whether or not a pixel will be visible in a scene. If it determines a pixel will not be visible, the pixel is not rendered, saving valuable frame buffer bandwidth.
- Fast Z-clear technology: Minimizes the time it takes to clear the old data in the Z-buffer, boosting frame rates up to 10% without compromising image quality.
- Auto pre-charge: Warns the memory device of areas of memory likely to be used in the very near future, allowing the GPU to spend less time waiting for memory and more time rendering pixels.



Uttar
 
I bought a Rendition Verite 1000 in mid '95, which was a FULL 3D accelerator. It even introduced VQuake, which JC himself had a great deal to do with. It was a mutual agreement between Rendition and id that lasted the first several months after Quake shipped.

Later, the Voodoo 1 introduced OpenGL Quake, which the Verite 1000 also supported.

Both offered REAL 3D acceleration LONG before Nvidia had anything remotely resembling true 3D on the market.
 
Hellbinder[CE] said:
I bought a Rendition Verite 1000 in mid '95.

[image: tom.history.02.gif]


:rolleyes:
 
Dave H said:
Hellbinder[CE] said:
I bought a Rendition Verite 1000 in mid '95.

:rolleyes:

Yeah, I was kinda thinking the same thing. I really don't recall the Verite 1000 line being out until the end of '96, a full year after the NV1, which IIRC was the first real consumer card to support any kind of texture mapping. I don't believe anyone is saying the NV1 was great (it wasn't), but it does prove that nVidia had been around for a while before the TNT.
 
Hellbinder[CE] said:
I bought a Rendition Verite 1000 in mid '95, which was a FULL 3D accelerator. It even introduced VQuake, which JC himself had a great deal to do with. It was a mutual agreement between Rendition and id that lasted the first several months after Quake shipped.

Later, the Voodoo 1 introduced OpenGL Quake, which the Verite 1000 also supported.

Both offered REAL 3D acceleration LONG before Nvidia had anything remotely resembling true 3D on the market.

The Diamond Edge (NV1, with integrated audio), bundled with Panzer Dragoon and Virtua Fighter, shipped first, in 1995. GLINT also shipped in 1995, before the Verite. The Verite 1000L did not ship in mid '95; it was announced in October of '95 and didn't ship until the end of the year. The Orchid Righteous 3D shipped in Q4 '96 (I owned one of the very first off the production line), and the glQuake beta didn't hit until the end of January '97. Between 1995 and when the Glide and miniGL APIs took hold, HW 3D gaming by and large sucked. You had to scavenge websites for news of any game that had support for your chipset.


The level of your anti-Nvidia rhetoric is astounding. You make it sound as if Nvidia is a company of morons with no achievements.

Why are you so hell bent on proving NVidia loses in any contest or horserace you try to come up with?
 
As for the GF4MX, I don't really see what the problem is. Sure, it's a DX7 card, but it was marketed as a low-end card and so it's down a generation from the DX8 GF4Ti cards. It's no different to the DX8 cards (R9200) that ATI are selling in their DX9 line (R9600, R9800).
Not to beat a dead horse, but simply naming the MX as a GF4 markets it as a DX8 card. Yes, ATi is guilty of this idiocy, too, with their 9000. I don't know why they did it--perhaps they thought the 8x00 name was getting too crowded, though they seem to be packing the 9x00 line much tighter. It is different from the 9000/9200 in that the MX was very old tech by then--the base architecture was already about 1.5 years old. And yes, ATi is pushing it by releasing a "new" DX8 product about 1.5 years after the 8500 debuted, although it is much more comparable in speed to the 8500 than the GF4MX is to the GF4Ti, or even the GF2MX to the GF2. I don't think the perception-to-reality stretch from 8500 to 9000/9200 is as big as that from GF2MX to GF4MX. Now we're getting into technicalities, though, and I'll gladly admit ATi isn't showing much class in following the same naming low-road as nVidia.

Bah, enough of the past. Names aside, it should be interesting that both ATi and nVidia are synchronizing their releases, and competing at every price point. If the 9600 does indeed dip as low as $99, as ATi said in their GDC presentation, then they will indeed be matching nV with an all-DX9 lineup at all price points--good news for everyone. (So the 9200 just seems like unnecessary and confusing clutter--they should've just phased out the 9000 for a 9000-8x, as nVidia did.)

*Yes, I had a little fun with color, identifying who I was faulting with their company color. It's not like what I was saying was that important, anyway. The goofy colors should serve as a warning not to take my comments that seriously. ;)
 
Not to beat a dead horse, but simply naming the MX as a GF4 markets it as a DX8 card.

Not at all; it merely markets it as belonging to the GF "level 4" class, for which a (weak) argument can be made in that it has the same MSAA, memory controller, and z-optimizing features that GF4 did. In any case, the GeForce "levels" had not been previously reserved for DX version numbers (i.e. GF and GF2 were both DX7; GF3 and GF4 both DX8).

On the other hand, ATI naming DX8 cards 9x00 does market them as DX9 cards, insofar as, with all previous 4-digit Radeon names, the first digit referred to the DX version number.

Yes, ATi is guilty of this idiocy, too, with their 9000. I don't know why they did it--perhaps they thought the 8x00 line was getting too crowded.

Yeah, 8500 and...8500 LE!!! I take it this is a joke?

If the 9600 does indeed dip as low as $99, as ATi said in their GDC presentation, then they will indeed be matching nV with an all-DX9 lineup at all price points

Considering the PR info ATI sent on the 9200/9600 indicates that the 9600 will only go down to $150 MSRP, that comment would seem to have been either a mistake (a "speak-o", as someone else called it), or perhaps a Freudian slip referring to how low the street price will go to compete with GFfx 5200. In any case, it still won't go down to $80, which likely prices it out of the OEM market.

So the 9200 just seems like unnecessary and confusing clutter--they should've just phased out the 9000 for a 9000-8x, as nVidia did.

To be fair to ATI, Nvidia actually did the same thing (if not more confusing) in some markets (Europe IIRC), calling the Ti4400-8x the "Ti4800SE". The amusing comment was that "SE" stood for "Slower Edition".
 
Democoder-
The Diamond Edge (NV1, with integrated audio), bundled with Panzer Dragoon and Virtua Fighter, shipped first, in 1995.

Actually, Descent (which originally shipped in 1994) later had a special 3D version before this, but it was very obscure and far from mainstream. A year later, they came out with a special S3-only version as well, which is equally difficult to find a copy of today.

Why are you so hell bent on proving NVidia loses in any contest or horserace you try to come up with?

There was no "horserace" as far as NVIDIA was concerned- and that's a fact. Their NV1 didn't put 3D graphics on the roadmap nor was there any major push. If anything, Rendition, S3 and 3dfx were responsible for pushing the industry this direction... albeit it was more an effort from game developers than anything else, with these three being the chosen vehicles to make their point.

Pete-
simply naming the MX as a GF4 markets it as a DX8 card. (Yes, ATi is guilty of this idiocy, too, with their 9000. I don't know why they did it--perhaps they thought the 8x00 line was getting too crowded.)

Actually, people harping on the 9000 truly show they didn't pay attention to ATI's clear and concise definition of their naming convention. The first number denotes the generation, and the remaining numbers denote the performance/feature level within that generation.

For example, using ATI's defined numbering scheme: if a Radeon 10000 were released, it wouldn't be logical to assume it would beat a 9700 in performance or features. The reason being, you are comparing the lowest-end product of one generation with the highest-end product of the previous generation. The same goes for a 9000 compared to an 8500: the low end of the 9th generation compared to the mid/high end of the 8th generation. Instantly assuming a 9000 should beat out an 8500 in performance or features isn't a given when their clearly defined naming scheme is applied to their products.

If ATI could even vaguely be compared with NVIDIA's GF4 MX, it would be the same as taking a Radeon 64DDR and naming it a Radeon 9000. A GF4 MX isn't a GF4-generation product, nor is it even a GF3-generation product. What they have done is PR a GF2-generation product up two generations and relabel it.
 
So a 9100 is a different generation from an 8500, but a GF4MX is the same generation as the GF2???

:oops: :oops: :oops:

I see you're not letting Hellbinder and his "mid 95" Verite 1000 get the prize for Most Preposterous Anti-Nvidia Historical Revisionism without a fight...
 
So a 9100 is a different generation from an 8500, but a GF4MX is the same generation as the GF2???

Yes, it's not so hard to see once the blinders come off. It is a different chip, but no clock bump. So by this logic you would consider the GF and GF2 the same generation? It's the same thing: a small chip tweak BUT with a clock bump. I guess if ATI bumped the clock by 5 MHz, this whole generation thing wouldn't be an issue?

The comparison basis is that the GF4 (when crediting NVIDIA for what they consider "product generations") falls way out in left field on their product roadmap of features/performance. The best way to illustrate this is with a good graph:
roadmap.txt


I see you're not letting Hellbinder and his "mid 95" Verite 1000 get the prize for Most Preposterous Anti-Nvidia Historical Revisionism without a fight...

No, Historical Revisionism would involve revising history. Unless you are about to counter that the PC 3D gaming revolution is widely known to have been founded by a bunch of geeks with NV1s... I'd be interested to hear your viewpoint if you somehow consider mine "revisionism"... and I'm betting it's about as far out in left field as the GF4 MX is by its namesake. :)
 
Sharkfood, 8500 and 9100 are the same chip...

btw, you should not link txt files as images ;)
 