Bjorn said:
Sabastian said:
DX9 for $79..... that is their marketing on this bugger. But it won't play a DX9 game. Maybe it will play some shiny water effect. (albeit crappy.)
Well, that's better than not getting that effect, isn't it?
I assume you are addressing the comparison to the mentions of the GF 4 MX (which I agree it doesn't deserve to be compared to), and not drawing the comparison to the 9200, right? Because the gap between PS 1.4 and PS 2.0 as implemented in real time on the 5200 is going to be very narrow or nonexistent for the majority of effects. Let me explain...
I think the use of a term like "DX 9 level" is muddying the discussion. The 5200 is a "DX 9 level" featureset card, with benchmarks (still unconfirmed and greatly in need of verification) showing sub "DX 8 level" performance. I think many DX 9 targeted effects are going to be DX 9 level in both senses (with the 5600/9600 performance level targeted, hopefully), and many effects are going to be PS 1.4 level (i.e., short, but prettier on the 5200 than on the 9200). In the sense of competing with the 9200, that isn't so bad, and in the sense of offering something to consumers, worse than the 9200 doesn't necessarily mean worthless (do people forget that the 9200 and GF 3 class cards are still good cards?), and I think we absolutely have to wait for independent benchmarks to evaluate further. Note that PS 1.4 being shown to be a good match for a "DX 9 functionality shader at a DX 8 level performance target" does seem a perfect fit for the usable levels of the 5200, but allows the 9200 to compete and, apparently, win... I think this goes in line with the efforts they seem to be going through to attack PS 1.4.
As far as competition between the two companies goes, what it looks like based on reported results (and what clarifies nvidia's complaint about 3dmark03, as has been stated by several people before) is that we'll have a situation where the 9200 will run its PS 1.4 quite a bit faster than the 5200 will run its PS 2.0, and dependence on vertex shading will tend to severely cripple the 5200's performance. The result of this is that the 5200 looks like it will theoretically be a higher image quality card (as long as no new "Aggressive" aniso style tricks are pulled, which I don't expect), but (if the benchmark results are verified) a much lower performing card than the 9200. I expect this ties into all of their related marketing pushes of "higher quality pixels".
As for the VS functionality... many seem to be jumping up and down against the idea of the 5200 using the CPU for vertex shaders... but, to me, this makes a great deal of sense for OEM deals if it can deliver respectable performance (the CPU only has to compete against the single vertex shader on the 9200... well, depending on the penetration of the RV350 into the OEM market). It doesn't matter how it achieves it except as it pertains to performance (with actual games, with heavy CPU workloads, I'd expect the performance to be low, and I suspect the UT benchmarks were flyby runs, but it is too early to be sure of anything of this nature).
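For readers unfamiliar with what "using the CPU for vertex shaders" means in practice: the driver just runs the per-vertex transform in software before handing geometry to the card. A minimal sketch in Python (my own illustration, not nvidia's driver code) of transforming object-space vertices to clip space on the CPU:

```python
# Minimal software "vertex shader": run every vertex through a
# model-view-projection transform on the CPU, the way a driver can
# emulate a missing hardware vertex unit. Hypothetical sketch only.

def mat_vec4(m, v):
    """Multiply a 4x4 matrix (row-major, list of rows) by a 4-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

def transform_vertices(mvp, vertices):
    """Apply the same transform to every vertex, like a VS program."""
    return [mat_vec4(mvp, (x, y, z, 1.0)) for (x, y, z) in vertices]

# A simple uniform scale-by-2 "MVP" matrix for illustration.
mvp = [
    [2.0, 0.0, 0.0, 0.0],
    [0.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 2.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
clip = transform_vertices(mvp, tri)
# clip[1] == (2.0, 0.0, 0.0, 1.0)
```

The point is that this work scales with vertex count and competes with the game's own CPU load, which is why flyby benchmarks (light CPU work) would flatter it and real gameplay would not.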
And again, I don't think anyone buying a $79 card really believes that it's going to be an "all features enabled" card 2-3 years in the future.
The question is whether it will run any games that use its advanced featureset at acceptable levels. I happen to think it will, so while I disagree with your line of reasoning, I agree with your point.
Now you know I think that the GFFX 5200's "DX9 for $79" is nothing but a marketing gimmick. I sincerely think, however, that a solution based on the RV350 would clearly be a better choice for someone who wants to buy a DX9 card.
That might be true. But the RV350 doesn't cost $79 now, does it?
Yeah, but the RV280 does. As my comments above indicate, I think the 5200 is "DX 9 level featureset with DX 8 level, or lower, performance". This applies to the 9200 in that I think PS 1.4 is lower image quality (likely invisibly so in most cases, and if the 5200 depends on integer processing for acceptable performance, perhaps not any lower in image quality at all for the purposes of comparison with the 5200), but offers a "DX 9 level featureset, in the realm of what it can execute rapidly, at good DX 8 level performance".
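To make the "likely invisibly so" claim concrete: for simple effects, the rounding error from an 8-bit fixed-point shading pipeline (the DX8-era integer path) is already below what an 8-bit framebuffer can display, so extra floating-point precision often can't show up on screen. A toy sketch (my own illustration, assuming an 8-bit fixed-point register format, not measured hardware behavior):

```python
# Quantize a shading result to 8-bit fixed point and compare against
# the full-precision float, to show why integer vs floating-point
# precision can be visually indistinguishable for simple effects.
# Toy illustration, not a model of any specific chip.
import math

def quantize_8bit(x):
    """Clamp to [0, 1] and round to the nearest of 256 levels."""
    x = max(0.0, min(1.0, x))
    return round(x * 255) / 255

# A toy diffuse term: N.L for a few sample angles.
samples = [math.cos(math.radians(a)) for a in (0, 30, 60, 85)]

max_error = max(abs(s - quantize_8bit(s)) for s in samples)
# Worst-case rounding error is half a level, 1/510 (about 0.002),
# below the smallest step an 8-bit framebuffer can even display.
assert max_error <= 0.5 / 255
```

Where the extra precision *does* matter is long shaders that feed intermediate results into further math, which is exactly where the 5200's performance is in question anyway.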
It's purely a marketing scam aimed at OEMs and maintaining graphics market share with a crappy low end product... that is how it is like the MX. Further, if the silicon isn't under the hood, doesn't that make it technically not DX9 hardware?
Sure, it might end up being a big piece of crap. But again, why judge it so prematurely? (Edit: was a bit too slow.)
I agree with this part, and would go so far as to disagree with the comments you reply to. I do think it is quite likely to be "crappier" than the 9200 in performance, but I don't think that will be shown to be unacceptable, and unless fp16 performance is completely hosed, it will likely offer "mathematically" better image quality to offset its performance disparity.
Oh, and my confidence in its performance being acceptable to some segment of the marketplace is based on 1) the transistor count in association with the lack of a vertex shader, presuming OEMs will pair it with high speed CPUs, and 2) what they showed in the nvidia video presentation (unless they were pulling some extreme chicanery in what they were running), since I have to think running Dawn at >1 fps indicates a fair bit of PS performance.
In any case, I certainly disagree with the comparison to the GF 4 MX.