What does everyone think about the ATI video presentation?

Well, I guess you are right, Bjorn; we have to wait and see the performance of these cards. I may be jumping the gun, but from what I have seen and heard I am making a judgement call on the GFFX 5200. It could be that I am wrong... but I don't think so. Wait and see is all we can do, and I may end up eating some serious crow. Then again, maybe not.
 
Sabastian said:
DX9 for $79..... that is their marketing on this bugger. But it won't play a DX9 game. Maybe it will play some shiny water effect. (albeit a crappy one.)

Well, that's better than not getting that effect, isn't it? And again, I don't think anyone buying a $79 card really believes that it's going to be an "all features enabled" card 2-3 years in the future.

Now you know I think that the GFFX5200 "DX9 for $79" is nothing but a marketing gimmick. I sincerely think, however, that a solution based on the RV350 would clearly be a better choice for someone who wants to buy a DX9 card.

That might be true. But the RV350 doesn't cost $79 now, does it?

Purely a marketing scam aimed at OEMs and at maintaining graphics market share with a crappy low-end product.... that is how it is like the MX. Further, if the silicon isn't under the hood, doesn't that make it technically not DX9 hardware?

Sure, it might end up being a big piece of crap. But again, why judge it so prematurely? (Edit: was a bit too slow :))
 
Sure, it might end up being a big piece of crap. But again, why judge it so prematurely? (Edit: was a bit too slow :))

The reasons are obvious if you do a statistical sampling of the most prolific posters. :)

Interesting how DX9 games don't even exist yet, and people are claiming that a card no one has been able to benchmark, but which stands to offer 2x the performance of the MX, won't be able to run nonexistent games for which no one knows the requirements.

Since the MX can already run games like UT2003 at acceptable framerates for value customers in low resolution, it seems ridiculous to claim that the NV34 will lack the power to run some unspecified DX9 game.

(DX9 does not imply Doom3 or mega-pass stenciled shadow volumes. An engine like Q3, Halo, or UT2003 utilizing VS 2.0 and PS 2.0 would run fine with the fillrates possible on this card.)

I can't help but feel that if the NV34 were a DX7 or DX8 card, and the 9200 were full DX9, the opinions of the detractors would be the exact opposite. We'd now be talking about the "pathetic" NV34 that lacks features and locks the market into DX7/8 for another generation, and we'd be talking about deceptively labeled chips, etc.

Instead, shipping a low-end DX9 chip is now a "marketing gimmick". SiS, S3, Trident, et al, may as well quit and ship DX6/7 cards instead.
 
With Nvidia redefining standard graphics definitions, who knows what the words they say really mean? With that said... I still have a few reservations about the GF-FX 5200 [NV34]. First, I'm still hesitant to believe it's really DX9 compliant. Second, I'm concerned about the speed at which it will execute DX8.1, let alone DX9.

It is a good thing to bring DX9 to the lowest-cost sector, but if it detracts from developers wanting to use DX8.1/DX9 features then it's a disservice. Likewise, if developers have to resort to checking card device IDs to determine which features to use, then it's a disservice.
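To make that last point concrete, here's a rough sketch (purely my own illustration in plain Direct3D 9 C++, not anyone's shipping engine code) of what caps-based detection looks like: you pick the shader path from the capability bits the driver reports, so an unknown chip that exposes PS 2.0 still gets the DX9 path instead of being filtered out by a device-ID table.

Code:
// Hypothetical helper: choose a rendering path from reported caps,
// not from PCI device IDs. Assumes an already-created IDirect3D9 object.
#include <windows.h>
#include <d3d9.h>

int ChooseShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return 0;                                   // no HAL device: fixed function

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 2;                                   // DX9-class path (PS 2.0)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return 1;                                   // DX8.1-class path (PS 1.4)
    return 0;                                       // legacy / fixed-function path
}

That way the game doesn't care whether the chip is an NV34, an RV280 or something nobody has heard of yet; it only cares what the driver says it can do.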
 
Of course, but why not reserve judgement until you see performance benchmarks? Is it really logical to start bashing something before you know anything about it?

Edit: Alright, I just realized I'm doing the same thing I left this board for last time. I may as well quit now while I'm ahead, since we know where these discussions ultimately lead anyway.
 
Will Nvidia ever distribute information telling us what it is, I mean other than the name, target price, and claimed features? They haven't been forthcoming and honest with the NV30, so why would they be with the NV31?

As for why I think most people are 'bashing' it already: it relates to trust. There are only so many times one is willing to give them the benefit of the doubt. Once it's repeatedly abused [will ship by November, erm, December, erm, Christmas, uhm, January, uhm, February, oh wait, maybe not at all ... will be 8 pipelines ... will be the all-singing, all-dancing card of the industry], why should anyone trust them this time around? It's much akin to the boy who cried wolf.
 
BRiT said:
As for why I think most people are 'bashing' it already: it relates to trust.


I don't know; I mean, the trust issue is surely relevant, but it seems the comments here are based more on what we do know of the card. Sure, it is speculative, but there is nothing wrong with speculation; if we turn out to be wrong, I am sure everyone will freely admit it. That is, unless they are the sort who stick to their guns despite the facts and don't mind looking like idiots for doing so. ;)
 
Bjorn said:
Sabastian said:
DX9 for $79..... that is their marketing on this bugger. But it won't play a DX9 game. Maybe it will play some shiny water effect. (albeit a crappy one.)

Well, that's better than not getting that effect, isn't it?

I assume you are addressing the comparisons to the GF 4 MX that have been mentioned (which I agree it doesn't deserve), and not drawing a comparison to the 9200, right? Because the gap between PS 1.4 and PS 2.0 as implemented in real time on the 5200 is going to be very narrow or nonexistent for the majority of effects. Let me explain...

I think the use of a term like "DX 9 level" is muddying the discussion. The 5200 is a "DX 9 level" featureset card, with benchmarks (still unconfirmed and greatly in need of verification) showing sub-"DX 8 level" performance. I think many DX 9-targeted effects are going to be DX 9 level in both senses (with the 5600/9600 performance level targeted, hopefully), and many effects are going to be PS 1.4 level (i.e., short, but prettier on the 5200 than on the 9200). In the sense of competing with the 9200, that isn't so bad, and in the sense of offering something to the consumer, worse than the 9200 doesn't necessarily mean worthless (do people forget that the 9200 and GF 3 class cards are still good cards?), and I think we absolutely have to wait for independent benchmarks to evaluate further. Note that PS 1.4 being shown to be a good match for "DX 9 functionality shaders at a DX 8 level performance target" does seem a perfect fit for the usable levels of the 5200, but it allows the 9200 to compete and, apparently, win... I think this ties in with the efforts they seem to be going through to attack PS 1.4.

As far as competition between the two companies goes, what it looks like based on reported results (and what clarifies nvidia's complaint about 3dmark03, as several people have stated before) is that we'll have a situation where the 9200 will run its PS 1.4 quite a bit faster than the 5200 will run its PS 2.0, and dependence on vertex shading will tend to severely cripple the 5200's performance. The result is that the 5200 looks like it will theoretically be a higher image quality card (as long as no new "Aggressive"-aniso-style tricks are pulled, which I don't expect), but (if the benchmark results are verified) a much lower performing card than the 9200. I expect this ties into all of their related marketing pushes about "higher quality pixels".

As for the VS functionality...many seem to be jumping up and down against the idea of the 5200 using the CPU for vertex shaders...but, to me, this makes a great deal of sense for OEM deals if it can deliver respectable performance (the CPU performance only has to compete against the 1 vertex shader on the 9200...well, depending on the penetration of the RV350 into the OEM market). It doesn't matter how it achieves it except as it pertains to performance (with actual games, with heavy CPU workloads, I'd expect the performance to be low, and I suspect the UT benchmarks were flyby runs, but it is too early to be sure of anything of this nature).
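(For anyone wondering what "using the CPU for vertex shaders" would look like from an application's point of view, here is a rough sketch of my own, not anything nvidia has confirmed about the NV34: the DX9 runtime can already emulate vertex shaders in software, so an application just creates its device with software vertex processing when the caps don't report hardware VS support.)

Code:
// Hypothetical sketch only: fall back to the DX9 runtime's CPU vertex
// shader path when the chip reports no hardware vertex shaders.
#include <windows.h>
#include <d3d9.h>

HRESULT CreateDeviceForCard(IDirect3D9* d3d, HWND hwnd,
                            D3DPRESENT_PARAMETERS* pp,
                            IDirect3DDevice9** device)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return E_FAIL;

    // Use the hardware vertex pipeline if VS 2.0 is exposed; otherwise let
    // Direct3D run the vertex shaders on the CPU.
    DWORD behaviour = (caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
        ? D3DCREATE_HARDWARE_VERTEXPROCESSING
        : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    return d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                             behaviour, pp, device);
}

Whether the performance of that path is acceptable with a real game's CPU load is exactly the open question.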

And again, I don't think anyone buying a $79 card really believes that it's going to be an "all features enabled" card 2-3 years in the future.

The question is whether it will run any games that use its advanced featureset at acceptable levels. I happen to think it will, so while I disagree with your line of reasoning, I agree with your point.

Now you know I think that the GFFX5200 "DX9 for $79" is nothing but a marketing gimmick. I sincerely think, however, that a solution based on the RV350 would clearly be a better choice for someone who wants to buy a DX9 card.

That might be true. But the RV350 doesn't cost $79 now, does it?

Yeah, but the RV280 does. As my comments above indicate, I think the 5200 is "DX 9 level featureset with DX 8 level, or lower, performance". This applies to the 9200 in that I think PS 1.4 offers lower image quality (likely invisibly so in most cases, and, if the 5200 depends on integer processing for acceptable performance, perhaps not any lower at all for the purposes of comparison with the 5200) but is a "DX 9 level featureset, in the realm of what it can execute rapidly, at good DX 8 level performance".

Purely a marketing scam aimed at OEMs and at maintaining graphics market share with a crappy low-end product.... that is how it is like the MX. Further, if the silicon isn't under the hood, doesn't that make it technically not DX9 hardware?

Sure, it might end up being a big piece of crap. But again, why judge it so prematurely? (Edit: was a bit too slow :))

I agree with this part, and would go so far as to disagree with the comments you reply to. I do think it is quite likely to be "crappier" than the 9200, for performance, but I don't think that will be shown to be unacceptable, and unless fp16 performance is completely hosed, it will likely offer "mathematically" better image quality to offset its performance disparity.

Oh, and my confidence in its performance being acceptable to some segment of the marketplace is based on 1) the transistor count in association with the lack of a vertex shader, presuming OEMs will pair it with high-speed CPUs, and 2) what they showed in the nvidia video presentation (unless they were pulling some extreme chicanery in what they were running), since I have to think running Dawn at >1 fps indicates a fair bit of PS performance.

In any case, I certainly disagree with the comparison to the GF 4 MX.
 
BRiT said:
Will Nvidia ever distribute information telling us what it is, I mean other than the name, target price, and claimed features? They haven't been forthcoming and honest with the NV30, so why would they be with the NV31?

As for why I think most people are 'bashing' it already: it relates to trust. There are only so many times one is willing to give them the benefit of the doubt. Once it's repeatedly abused [will ship by November, erm, December, erm, Christmas, uhm, January, uhm, February, oh wait, maybe not at all ... will be 8 pipelines ... will be the all-singing, all-dancing card of the industry], why should anyone trust them this time around? It's much akin to the boy who cried wolf.

Precisely, BRiT. Now, to top it all off, they are showing us demos of that pixie with four or five of them dancing on their next-gen hardware, and saying nothing else about it... just "hold off, we've got something coming," bla bla bla. Now we've got this snake-oil salesman slogan, "DX9 for $79"... and we are supposed to take what they say seriously?

But let's talk about this GeforceFX 5200 a little more. I hear that it has approximately 45 million transistors. (Please correct me if I am wrong here.) If this is the case, it certainly does not have the hardware it needs to be a fully DX9-compliant part, in which case it certainly does begin to smell like a "GeforceMX 5200". I mean, the Geforce MX 440 was touted and advertised as a fully compliant DX8 GPU. Why should we all of a sudden begin to believe that nvidia is done with its marketing mania? Clearly, if these benchmarks from the Inquirer ( http://www.theinquirer.net/?article=8031 ) hold any truth, the "DX9 for $79" is purely a marketing gimmick, and snake oil did come to mind when nvidia's CEO mentioned it at their conference. There are equally low-priced DX8.1a cards that outperform the "DX9 compliant" Geforce FX 5200 on a DX9 test, no less.

"The only thing an end user compromises with the Geforce FX 5200 when it comes to DX9 is performance." LOL My sakes go and listen to him say them vary words or something along them lines. BTW we will see how much of a compromise this "DX9 for $79" really is. Snake oil indeed.
http://biz.yahoo.com/oo/030307/77806.html
 
Well, in the spirit of your EDIT later down the page, I'll skip around the inflammatory comments...

DemoCoder said:
...
Interesting how DX9 games don't even exist yet, and people are claiming that a card no one has been able to benchmark, but which stands to offer 2x the performance of the MX, won't be able to run nonexistent games for which no one knows the requirements.

There are benchmarks that have been discussed in which it scored lower than the 9200 in 3dmark03 even though it (presumably) ran an extra test. Since these results can fairly be considered pretty dubious, I tend to agree with the sentiment that we should wait, but I'm pointing out that it is the likely reason people are making the conclusions they are.

Since the MX can already run games like UT2003 at acceptable framerates for value customers in low resolution, it seems ridiculous to claim that the NV34 will lack the power to run some unspecified DX9 game.

Well, if you look at just the card, and if it really lacks vertex shaders, it isn't ridiculous at all, since if someone buys it as an upgrade they could very easily be disappointed in it compared to a GF 4 MX, depending on their CPU, especially for a game like UT 2k3 (I don't know about Doom 3). Looking at it from that direction (again, I'm working on the assumption that the vertex shader functionality really is CPU-based, based on the transistor count and featureset), it would certainly stink.

That means that outside of this (theoretical at this point) and similar situations, I agree with your point.
 
The chip codename tells us that it is based on the CineFX architecture, which is a clear indication that it will be able to perform DX9.0 operations.

I think the areas that may need answering are Vertex Shader support and precision. It's quite possible that it doesn't have Vertex Shaders, but that won't mean it's not DX9, since these can still be done via the CPU - perhaps they may even go the MX route here and have a system in which the transformation is done via the hardware but other elements are done via the CPU, meaning it will be hardware assisted - that's purely speculation, but as I said, it would still mean it's DX9. The other area is PS precision - it may be the case that this will only be FP16 in the Pixel Shader; however, if they can still do the texture co-ordinates with at least FP24 precision (which will likely mean FP32 in the case of CineFX), then this will not mean it's non-compliant either.
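To put the FP16 part in developer terms (a made-up example of my own, not anything from nvidia or a shipping title): in DX9 HLSL the half type is just a hint that partial (16-bit float) precision is acceptable for that calculation. Compiled for ps_2_0 it produces _pp-modified instructions, and hardware that only runs at full precision simply ignores the hint, so using it doesn't make a shader any less DX9.

Code:
// Hypothetical shader compiled with the D3DX utility library; the colour
// math is marked half (partial precision), the texture coordinate stays float.
#include <windows.h>
#include <string.h>
#include <d3dx9.h>

static const char* kShader =
    "sampler tex0 : register(s0);                  \n"
    "half4 main(float2 uv : TEXCOORD0) : COLOR     \n"
    "{                                             \n"
    "    half4 c = tex2D(tex0, uv);                \n"
    "    return c * half4(0.5, 0.5, 0.5, 1.0);     \n"
    "}                                             \n";

HRESULT CompilePartialPrecisionShader(LPD3DXBUFFER* code)
{
    LPD3DXBUFFER errors = NULL;
    HRESULT hr = D3DXCompileShader(kShader, (UINT)strlen(kShader),
                                   NULL, NULL, "main", "ps_2_0",
                                   0, code, &errors, NULL);
    if (errors) errors->Release();                 // dump the error text in real code
    return hr;
}

The interesting question for the 5200 is simply how fast those _pp instructions actually run.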
 
Hey, I think I'm pretty far ahead in the line of people who dislike nvidia as a company and as a marketing force, and I think they fully deserve the complete lack of trust they are receiving (though my hope that it would cause them to modify their marketing is slowly dying :( )...but I don't think that makes them any worse engineering-wise. Of course, when comparing to the 9700 (or to their marketing descriptions of their hardware :-? ) it is easy to think they stink engineering-wise as well, but I don't think they do in an objective sense.

The NV34 will still be there to bash once benchmark results are actually available.
 
Wavey,
I dropped my concerns about the color precision based on them quite clearly and explicitly stating "128-bit through the full pipeline" very emphatically and directly in their video presentation, EDIT: and based on believing they dropped vertex shaders from the design. Of course, there are some wild and crazy ways they could be lying when they say that, but despite past insanity, I am not prepared to believe they are still insane enough to think they will get away with it without even more negative backlash.
 
I think it's good; the GF FX 5200 will bring the "feature set" to the entry level, making the adoption of DX9 games much easier in a GD's (game developer's) eyes.
If you just want to play around with some fancy demos and have the ability to turn all the DX9 features on, but at a lower resolution and so on, it's great for many consumers.

edit: spelling
 
overclocked said:
I think it's good; the GF FX 5200 will bring the "feature set" to the entry level, making the adoption of DX9 games much easier in a GD's (game developer's) eyes.
If you just want to play around with some fancy demos and have the ability to turn all the DX9 features on, but at a lower resolution and so on, it's great for many consumers.

edit: spelling

Hrm, I don't think that game devs will create a DX9 game with the Geforce MX 5200 in mind, simply because there is already a wider proliferation of considerably more powerful, fully DX9-compliant hardware on the market, and clearly better DX9 solutions based on the RV350 are coming in at lower prices. By the time there are DX9 games on the market, the Geforce MX 5200 will be so outdated and overlooked as a viable DX9 solution that when end users do attempt to use it in one of these DX9 games, they will think either there is something wrong with the game or they need to upgrade their video card. I know that is speculative, but I honestly don't think it will be that great of a solution for end users. IMO the RV350 is a far better answer for end users, and I think ATi will make that part price-competitive as well for even greater proliferation.
 
But if the card were DX8.1 (and 25% faster), that would be better, right?
I mean, if you buy a card that cheap you're not a "hardcore" gamer; the hardcore gamer should go for the 9800/FX 5800.
That's not the point; everyone here has wanted the latest DX pushed into the entry-level market so that GDs can make games that use the "latest" effects.
Then whether you want to run a DX9 application at 20 fps or 100 fps is up to the buyer.
 
Well, I hope not, Sabastian. I really am looking forward to seeing DX9 stuff in action, but it is bound to be some time before people can develop the content to really make use of the API, and if that is where the card is really going to choke then it does a lot to make the point moot. But let me just add *official benchmarks pending* for all of you who dislike speculation. ;)

demalion said:
I think they fully deserve the complete lack of trust they are receiving (though my hope that it would cause them to modify their marketing is slowly dying :( )

Why ever would you want such a thing, demalion? What is wrong with you? :devilish:
 
overclocked said:
But if the card were DX8.1 (and 25% faster), that would be better, right?
I mean, if you buy a card that cheap you're not a "hardcore" gamer; the hardcore gamer should go for the 9800/FX 5800.
That's not the point; everyone here has wanted the latest DX pushed into the entry-level market so that GDs can make games that use the "latest" effects.
Then whether you want to run a DX9 application at 20 fps or 100 fps is up to the buyer.

Well, I won't argue that a hardcore gamer wouldn't want a 9800/5600. (BTW, whether you are a "hardcore gamer" or not, you won't be able to buy the Geforce FX 5800 Ultra; AFAIK it is canceled.) But you can get great DX8.1a-compliant hardware for the same price as a Geforce FX 5200, and it is very possible that it will run better.

Yes, that is the point. They are already trying to make use of the features of DX9, but they are not doing it with the Geforce FX 5200 in mind; it isn't even shipping yet. ATi has had DX9 parts available for over six months and has sold over a million of the DX9-compliant R300 cores. Now ATi will start selling their mainstream solution with the RV350, and I believe it will be a popular OEM product, considering ATi's brand-name recognition at the high end.

Yeah, in the end it is up to the buyer. "Buyer beware" comes to mind with the Geforce FX 5200, IMO, especially if they are planning on playing DX9 games on that hardware.
 
The Nvidia presentation did have Dawn running fine on a GFFX 5200, so that would point to acceptable DX9 performance (unless they were being deceptive in some way, e.g. using a modified version of the Dawn demo). The sub-1000 3DMark03 score in the Inquirer would point to some terrible performance, but I'm not trusting that just yet. I really want Nvidia to make this work, because a clear lead for ATI will hurt the industry as much as the years of Nvidia domination have. Competition is a good thing; let's hope both companies can keep it up.
 