I just think it's absolutely stupid how you people want a drawback to continue just because ATI's current hardware does it. That is idiocy.
Just because you can live with a problem today doesn't mean you should have to with future hardware.
For example, I can currently live with the 16-bit dithered decoding of DXT1 in my GeForce4. That doesn't mean I wouldn't immediately praise nVidia if they (finally) used higher-precision (32-bit) decoding in the NV30. Nor do I think it's a problem that shouldn't be fixed.
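To make it concrete, here's a minimal sketch of where the precision matters (my own function names, not any real driver or hardware path): DXT1 endpoints are packed RGB565, and the banding comes from whether the palette blend is done after expanding to 8 bits per channel or while still squeezed into a 16-bit pipeline that then has to dither the rounding error away.

```c
#include <stdint.h>
#include <stdio.h>

typedef struct { uint8_t r, g, b; } rgb8;

/* Expand a packed RGB565 endpoint to 8 bits per channel. */
static rgb8 expand565(uint16_t c)
{
    rgb8 o;
    o.r = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    o.g = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);
    o.b = (uint8_t)(( c        & 0x1F) * 255 / 31);
    return o;
}

/* One of the two interpolated DXT1 palette entries (2/3 of a, 1/3 of b).
 * Done at 8 bits per channel the blend keeps its precision; rounded back
 * to 5/6 bits per channel it bands, which the dithering tries to hide. */
static rgb8 blend_2_1(rgb8 a, rgb8 b)
{
    rgb8 o;
    o.r = (uint8_t)((2 * a.r + b.r) / 3);
    o.g = (uint8_t)((2 * a.g + b.g) / 3);
    o.b = (uint8_t)((2 * a.b + b.b) / 3);
    return o;
}

int main(void)
{
    rgb8 c0 = expand565(0xF800);   /* pure red endpoint  */
    rgb8 c1 = expand565(0x001F);   /* pure blue endpoint */
    rgb8 c2 = blend_2_1(c0, c1);   /* interpolated entry */
    printf("palette[2] = %u %u %u\n", c2.r, c2.g, c2.b);
    return 0;
}
```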
You people who are claiming that the 45-degree "bug" shouldn't be fixed are on the brink of idiocy. It's just like saying we should never have 32-bit color because "I can live with 16-bit... I don't see the problem in my games."
And multisampling is fundamentally better than supersampling, because the performance hit for the same edge quality is far smaller. If you want less texture aliasing, you should be looking for better texture filtering, not SSAA.
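A back-of-the-envelope sketch of that cost argument (the resolution and sample count are just illustrative): both schemes store the same number of samples per pixel, but only supersampling runs the texture fetches and shading for every one of them, while multisampling shades once per pixel and only replicates coverage/depth.

```c
#include <stdio.h>

int main(void)
{
    const long width = 1024, height = 768;
    const long samples = 4;                  /* 4x AA            */
    const long pixels  = width * height;

    long ssaa_shaded = pixels * samples;     /* shade every sample   */
    long msaa_shaded = pixels;               /* shade once per pixel */
    long stored      = pixels * samples;     /* same storage either way */

    printf("4x SSAA: %ld texture/shader samples, %ld stored samples\n",
           ssaa_shaded, stored);
    printf("4x MSAA: %ld texture/shader samples, %ld stored samples\n",
           msaa_shaded, stored);
    return 0;
}
```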