Ailuros said:
That I want to see. I haven't seen a 5200 Ultra yet being able to outperform an NV25 in preliminary Doom3 benchmarks. I'll be generous and leave the 64-bit versions of the former out of the discussion.

It doesn't need to outperform the NV25. The NV25 has quite a lot more fillrate. All I'd like to see is it outperforming cards that perform similarly to it in DX8 games. The GeForce4 MX would be an okay comparison, but I'm not sure that's really fair. A better comparison may be against a product in ATI's R2xx line (I'm not sure...is there a card there that performs similarly in DX8 games? I would hope so, for the 5200's sake...).
"Cope" can take different meanings. Remember that DOOM3 is supposedly designed with a GeForce2 MX or GeForce SDR in mind as the minimum. Many already have the feeling that these cards will not be enough to feel the full experience of DOOM3. Remember that this is coming from a mindset of people who want to be able to play games at high resolution, with high levels of FSAA and anisotropic, and with all effects turned on. Not all users need all of these things, and one who ons the bare minimum to play a game certainly won't be able to do all of these things.As for real dx9 games only time will tell. But to be honest I don't even expect todays high end dx9 games to be able to cope adequately with true dx9 games, let alone a budget iteration of those.
Ailuros said:
I'd say someone is lucky if he gets 30fps with high detail at 1024x768 in Doom3 with a 5200 Ultra, unless we mean some weird version of point-sampled AF here.

I wouldn't expect a 5200 Ultra to be playing at 1024x768 in DOOM3. I would expect somewhere between 640x480 and 800x600 with all details turned on (and probably without FSAA/anisotropic..."all details" simply meaning full shaders, i.e. full specular highlights, or whatever other unannounced effects there may be).
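To put rough numbers on the resolution argument, here's a back-of-envelope sketch in Python. The core clock and pipeline count are the commonly quoted 5200 Ultra specs, and the per-pixel cost is a made-up placeholder standing in for overdraw, multiple lighting passes, and shader cycles; none of this is measured data.

```python
# Back-of-envelope: how a fixed per-frame pixel budget translates into fps
# at different resolutions. All numbers here are illustrative assumptions.

CORE_CLOCK_HZ = 325e6          # assumed 5200 Ultra core clock
PIXEL_PIPES = 4                # assumed pixel pipelines
CLOCKS_PER_SCREEN_PIXEL = 45   # hypothetical: overdraw * passes * shader cycles

pixel_clocks_per_sec = CORE_CLOCK_HZ * PIXEL_PIPES   # ~1.3G pixel-clocks/s

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    fps = pixel_clocks_per_sec / (w * h * CLOCKS_PER_SCREEN_PIXEL)
    print(f"{w}x{h}: ~{fps:.0f} fps (fill-limited estimate)")
```

The only real takeaway is the scaling: dropping from 1024x768 to 800x600 or 640x480 buys roughly 1.6x and 2.6x more per-pixel budget, which is why the lower resolutions look like the realistic target.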
Update:
I just looked at Anand's review of the 5200 Ultra and 5600 Ultra. It looks to me like the Radeon 9000 Pro would be a good card to compare against the 5200 Ultra. The only issue is that the results from just that one review are all over the place, with the 9000 sometimes coming out way ahead and the 5200 sometimes coming out way ahead. So, we'd need a number of data points to show a change in trend.
So, I propose a specific definition: the GeForce FX 5200 Ultra can be described as a "DX9 card" if its performance becomes noticeably higher than that of the Radeon 9000 Pro in the majority of games that use DirectX 9-level effects not to do additional math, but to improve performance (i.e. games where DX9 shaders are used to do the same work in fewer passes).
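Spelled out as a test, here's a minimal sketch in Python of how that definition could be applied. The 10% margin, game names, and fps figures are arbitrary placeholders, not results from any review.

```python
# Does the 5200 Ultra beat the 9000 Pro by a noticeable margin in the
# majority of qualifying games? All numbers below are placeholders.

MARGIN = 1.10  # "noticeably higher" taken as at least 10% faster (arbitrary)

# (game, 5200 Ultra fps, 9000 Pro fps) -- hypothetical data points
results = [
    ("Game A", 42.0, 35.0),
    ("Game B", 28.0, 31.0),
    ("Game C", 55.0, 44.0),
]

wins = sum(1 for _, fx, r9k in results if fx >= r9k * MARGIN)
print("Counts as a 'DX9 card' under this definition:", wins > len(results) / 2)
```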
Second edit:
Yes, I realize that this may be hard to really use as a definition. An easier one may be to just say that it's a "DX9 card" if its performance is still acceptable in games that use DX9-level shaders to actually do more math (and hence run slower); those games are bound to be more numerous. The only problem is that this is a highly subjective judgement (as an example, I thought the 32-bit color on the original TNT was quite usable). I think the only thing close to an objective judgement would be the one I listed above. Hopefully we'll have enough sample points to apply it...