Windfire said: I figured Nvidia, with their history of "image quality is king", would one-up ATI as the second punch in the old one-two.
I don't think that's really the case. It seems to me that nVidia is more concerned with "rendering correctness is king," which has really served them well for anisotropic filtering (I still think it's quite a bit better than ATI's implementation), but they really have been slacking off for a long time with FSAA. While they were the first to introduce usable FSAA with the GeForce DDR (in the 5.xx drivers, if I remember correctly), and were also the first to introduce MSAA with the GeForce3, they clearly haven't done as much as they could have in terms of AA quality. In particular, they only have one mode that is not ordered-grid (2x, which was okay with the GeForce4, but is just sad with the FX). This is, I feel, the one thing that nVidia is clearly behind in. Hopefully they'll fix it ASAP.
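[Editor's note: to illustrate the ordered-grid point above, here is a minimal sketch of why a 2x mode with both samples on one row only smooths edges of one orientation, while a diagonally offset 2x pattern smooths both. The sample offsets are made up for illustration; they are not NVIDIA's or ATI's actual patterns.]

    # Count the distinct x and y levels a 2x sample pattern provides within a pixel.
    # More distinct x positions -> more coverage gradations on near-vertical edges;
    # more distinct y positions -> more gradations on near-horizontal edges.
    def distinct_levels(offsets):
        xs = {x for x, _ in offsets}
        ys = {y for _, y in offsets}
        return len(xs), len(ys)

    ordered_2x  = [(0.25, 0.5), (0.75, 0.5)]    # both samples on one row (ordered grid)
    diagonal_2x = [(0.25, 0.25), (0.75, 0.75)]  # samples offset in both x and y

    print(distinct_levels(ordered_2x))   # (2, 1): only near-vertical edges benefit
    print(distinct_levels(diagonal_2x))  # (2, 2): both edge orientations benefit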
RussSchultz said: Where's the "I'm a cheap bastard and won't buy any card over $150!" option?
Althornin said: Windfire said: I figured Nvidia, with their history of "image quality is king", would one-up ATI as the second punch in the old one-two.
What history of "IQ is king"??
nVidia has always been "framerate is king" - not IQ.
SirPauly said: Back in the early nVidia TNT days, when 3dfx had the performance crown, it seemed nVidia was more about overall IQ, imho.
Yes, well, their FSAA has never been the best in quality, and they've never really spent much time improving it. Now I think it's finally caught up to them. Amazing how it took a full three years for the competition to finally put out a product that could truly best nVidia in the FSAA department (previous products were too low-performing to do so).
Windfire said (replying to SirPauly): That is what I was thinking more about.
Chalnoth said: ...but they [Nvidia] really have been slacking off for a long time with FSAA. While they were the first to introduce usable FSAA with the GeForce DDR...
Windfire said: I figured Nvidia, with their history of "image quality is king", would one-up ATI as the second punch in the old one-two.
Doomtrooper said: ...so yes, in the early years Nvidia centered on IQ, then took a corner and headed for speed only...
That's not true either. First of all, Quincunx isn't that bad, as long as you're looking at a pure 3D scene. The only problem is, pretty much every game in existence has text, so that makes the FSAA mode rather pointless. Maybe if some game moved to polygonal text Quincunx would work, but that's not going to happen.
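[Editor's note: the text-blurring complaint follows from how a Quincunx-style blend works: the resolve mixes each pixel with its diagonal neighbours, so one-pixel-wide detail such as glyph strokes loses contrast. The sketch below applies such a filter to final pixel values for simplicity (the real mode blends multisamples), and the weights of 1/2 for the centre and 1/8 per corner are the commonly cited figures, not taken from NVIDIA documentation.]

    # Blend a pixel with its four diagonal neighbours, quincunx-style.
    def quincunx(img, x, y):
        corners = img[y-1][x-1] + img[y-1][x+1] + img[y+1][x-1] + img[y+1][x+1]
        return 0.5 * img[y][x] + 0.125 * corners

    # A one-pixel-wide vertical "text stroke": white column on a black background.
    img = [[1.0 if x == 2 else 0.0 for x in range(5)] for y in range(5)]

    print(img[2][2], "->", quincunx(img, 2, 2))  # 1.0 -> 0.5: the stroke is dimmed
    print(img[2][1], "->", quincunx(img, 1, 2))  # 0.0 -> 0.25: the adjacent background brightens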
John Reynolds said: OGSS was the worst FSAA implementation I've ever seen. It was hardly "usable", IMO.
Yes, please notice the "widely-usable" in the post above. I think I used the FSAA on my GeForce DDR for Freespace 2, and that's about it.
Chalnoth said: Now I think it's finally caught up to them. Amazing how it took a full three years for the competition to finally put out a product that truly could best nVidia in the FSAA department...
The Bottom Line: The GeForceFX 5800 Ultra is a very hot and noisy beast that may give you a bit of an edge over the current king of the hill, the ATI 9700 Pro, in some applications. If you are an NVIDIA f@nboy, this of course has your name all over it. At the current US$400.00 price point, the GFFX simply does not seem worth it to us. If NVIDIA can work some driver magic and pull an extra 20% increase in frame rate out of the bag like we have seen in the past, they had best start pulling. Either that or pull out the NV35 chipset, and quick.