Define 'rotated'. Relative to what?
GF3 sample pattern (2X and QX):
GF4 sample pattern (2X and QX):
R9700 sample pattern (2X):
FX5800 sample pattern (2X):
FX5900 sample pattern (2X):
Source for all images: B3D
And if it was limited to 2x RGMS AA while it used OGSS for higher AA sample counts, that still wouldn't invalidate my argument with regard to overall IQ.
So you're saying that RGMS AA is the same as RGSS AA? Again, let's not let those facts get in the way, eh?
Geeforcer said: GF3 and GF4 had 2x rotated-grid AA (RGMS), which provided largely identical edge-aliasing reduction to the 2x RGSS of the V5, while AF filtering did a better job of cleaning up the textures than 2x SS + bilinear filtering ever could.
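For anyone unclear on why "rotated" matters here, this is a minimal sketch. The sample offsets below are illustrative placements I've chosen to show the principle, not the exact positions any of these chips use: a rotated (diagonal) 2x pattern gives two distinct coordinates on *both* axes, while an ordered 2x pattern leaves one axis with a single coordinate, so one edge orientation gets no smoothing at all.

```python
# Illustrative 2x sample offsets within a pixel (assumed, not exact hardware values).
# Rotated grid: the two samples sit on a diagonal, so x and y each get
# two distinct coordinates. Ordered grid: both samples share one y value.
ROTATED_2X = [(0.25, 0.25), (0.75, 0.75)]
ORDERED_2X = [(0.25, 0.50), (0.75, 0.50)]

def distinct_positions(samples, axis):
    """Count distinct sample coordinates along one axis (0 = x, 1 = y).

    A near-vertical edge only benefits from distinct x coordinates,
    a near-horizontal edge only from distinct y coordinates.
    """
    return len({s[axis] for s in samples})

for name, pattern in (("rotated", ROTATED_2X), ("ordered", ORDERED_2X)):
    print(name,
          "near-vertical edge steps:", distinct_positions(pattern, 0),
          "near-horizontal edge steps:", distinct_positions(pattern, 1))
# rotated grid: 2 steps on both edge orientations
# ordered grid: 2 steps on vertical edges, only 1 on horizontal edges
```

This is why 2x RGMS and 2x RGSS land in the same place on edge aliasing: both distribute their two samples diagonally, so both edge orientations gain an intermediate coverage level; the SS/MS distinction affects texture sampling, not the edge geometry.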
What is the point of this argument again? Voodoo 5 is better than GF3? Maybe it is for retro gaming, where you can run Glide games with AA.
But if you want to play a game that uses T&L heavily or uses shaders, you aren't going to want a Voodoo 5. From a simple performance standpoint, the Voodoo 5 cannot compete at all. From a quality and feature standpoint, other than its interesting (but slow) AA capability, it's missing bunches of features.
It sure was a disappointment to see how awful the 2D output remained on GF3 though. I skipped every GeForce from 2-5 because of how blurry the earlier cards were. I went through the Radeons, from the original to the 9700. They all had excellent 2D. I finally got my first GF card in years with my laptop and its 6800 Go, back in '04. And 6800 has its own problems when it comes to texture filtering (compared to 9700 and X800).
Voodoo5, on the other hand, while lacking a zillion 3D features because of the company's inability to make progress, had fantastic 2D quality. I picked one up a couple of years ago for a Win98 rig. However, I've found that a GeForce FX 5600 I have can be a better choice, with its T&L/DX8, AF and AA support in combination with a Glide wrapper or OpenGL renderer replacement (UT). NV's OpenGL support is also better than 3dfx's ever was.
I still wanna know where people are buying these bad GF-based cards with crappy 2D quality. I've owned at least one GF card from each line made, and none of them ever had crappy 2D quality. Text was always crisp, clean, and clear for me on my 17" and later 21" monitors at 1600x1200 in 32-bit color.
Actually, my recollection back then was that prior to the GeForce 4, all NVIDIA cards had a really low-quality, low-frequency RAMDAC, including the reference card. What I can't remember, however, is whether the RAMDAC was integrated onto the video processor or not. And I'm too lazy to look it up right now.

Nah, they had a fully integrated 350 MHz RAMDAC in the GeForce 2. That's just shy of the 360 MHz RAMDAC in a G400 MAX. Even the Riva 128 has an integrated RAMDAC. You'll find a separate RAMDAC for the second display on lots of these GeForce cards, though, because the chips only have a single RAMDAC.