A World Without Limits

I just think it's absolutely stupid how you people want a drawback to continue just because ATI's current hardware does it. That is idiocy.

Just because you can live with a problem today doesn't mean you should have to with future hardware.

For example, I can currently live with the 16-bit dithered decoding of DXT1 in my GeForce4. That doesn't mean I wouldn't immediately praise nVidia if they (finally) used 32-bit decoding in the NV30. Nor do I think it's a problem that shouldn't be fixed.
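For reference, a toy sketch of where that precision loss comes from (my own illustrative Python, not actual hardware or driver code; the function names are made up): DXT1 stores two RGB565 endpoints per block and interpolates the in-between colors. Blending in 16-bit 565 space before expanding to 8 bits per channel loses a couple of bits per channel compared to expanding first.

```python
# Illustrative model only -- not real decoder code.

def expand565(c565):
    """Expand a packed RGB565 color to 8-bit-per-channel RGB."""
    r = (c565 >> 11) & 0x1F
    g = (c565 >> 5) & 0x3F
    b = c565 & 0x1F
    # replicate high bits into low bits so 0x1F -> 255, 0x3F -> 255
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def blend_888(c0, c1, w):
    """Interpolate after expanding to 8 bits/channel (higher quality)."""
    a, b = expand565(c0), expand565(c1)
    return tuple(round(x * (1 - w) + y * w) for x, y in zip(a, b))

def blend_565(c0, c1, w):
    """Interpolate in 16-bit 565 space first, then expand (banding-prone)."""
    def ch(c, shift, mask):
        return (c >> shift) & mask
    r = round(ch(c0, 11, 0x1F) * (1 - w) + ch(c1, 11, 0x1F) * w)
    g = round(ch(c0, 5, 0x3F) * (1 - w) + ch(c1, 5, 0x3F) * w)
    b = round(ch(c0, 0, 0x1F) * (1 - w) + ch(c1, 0, 0x1F) * w)
    return expand565((r << 11) | (g << 5) | b)
```

Blending black and white at 1/3 gives (85, 85, 85) via the 8-bit path but an off-gray like (82, 85, 82) via the 565 path, which is the kind of error dithering then tries to hide.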

You people that are claiming that the 45-degree "bug" shouldn't be fixed are on the brink of idiocy. It's just like saying we should never have 32-bit color because "I can live with 16-bit...I don't see the problem in my games."

And multisampling is definitely better than supersampling fundamentally, because the performance hit for the same image quality will be superior. If you want less texture aliasing, you should be looking for better texture filtering, not SSAA.
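To make the cost claim concrete, here's a back-of-envelope model (my own simplification, not vendor data; the function is made up for illustration): with N samples per pixel, supersampling runs the full pixel/texture pipeline once per sample, while multisampling shades each pixel once and only replicates the depth/coverage test per sample.

```python
# Toy cost model -- illustrative only.

def shading_cost(width, height, samples, mode):
    """Rough count of full shader/texture evaluations for one frame."""
    pixels = width * height
    if mode == "ssaa":
        return pixels * samples  # every sample is fully shaded and textured
    if mode == "msaa":
        return pixels            # shade once; extra samples only hit Z/coverage
    raise ValueError(mode)

# e.g. at 1024x768 with 4 samples, SSAA does 4x the shading work of MSAA
```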
 
I just think it's absolutely stupid how you people want a drawback to continue just because ATI's current hardware does it. That is idiocy.

What is idiocy is someone jumping to conclusions when no one actually stated what you're suggesting.

I read it as an improved or tweaked anisotropy feature in the future.
 
Entirely too much speculation in this post.

Let's just get some things straight that are pretty much 99% fact.

R300 will be faster than the GF4.
R300 seems to be impressing review sites/programmers.
ATi WILL be the speed king from the time the R300 is released until whenever the NV30 comes out (or an earlier solution, which I see as highly unlikely (drivers) :rolleyes:)
 
All that we know is that the R300 will be faster in DOOM3. This may just mean that it has some optimized paths for stencil shadows. We cannot yet be certain that it will be faster in today's games.

Of course, I would be highly, highly disappointed in ATI if they couldn't outperform the GeForce4 at launch in nearly every benchmark there is. I expect they will be able to get performance a fair bit higher than the GeForce4.
 
Chalnoth said:
I just think it's absolutely stupid how you people want a drawback to continue just because ATI's current hardware does it. That is idiocy.

Just because you can live with a problem today doesn't mean you should have to with future hardware.

For example, I can currently live with the 16-bit dithered decoding of DXT1 in my GeForce4. That doesn't mean I wouldn't immediately praise nVidia if they (finally) used 32-bit decoding in the NV30. Nor do I think it's a problem that shouldn't be fixed.

You people that are claiming that the 45-degree "bug" shouldn't be fixed are on the brink of idiocy. It's just like saying we should never have 32-bit color because "I can live with 16-bit...I don't see the problem in my games."

And multisampling is definitely better than supersampling fundamentally, because the performance hit for the same image quality will be superior. If you want less texture aliasing, you should be looking for better texture filtering, not SSAA.

Problem ONE: No one claimed they want the issue to remain. We are just pointing out your vendor-specific hypocrisy, which has become quite annoying.

Problem TWO: Your last bit about MS is your opinion, and is 100% wrong as far as I am concerned.
No amount of anisotropic + MS will succeed in reducing aliasing as much as a lower level of aniso + SSAA. Anisotropic filtering makes textures look clearer; it doesn't do that much to decrease aliasing. In fact, making them look clearer increases the high-frequency noise that IS aliasing.
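That sharpness-versus-aliasing tradeoff can be sketched with a 1D toy model (entirely my own illustration; `sample_texture` and `shimmer` are made-up names, not real engine code): a blurrier filter attenuates texture frequencies above the sampling rate, so a tiny sub-pixel shift changes the samples less, i.e. less temporal shimmer.

```python
import math

# Toy 1D model of texture shimmer -- illustrative only.

def sample_texture(phase, blur, n=8):
    """Sample a high-frequency stripe pattern at n points across a pixel row.
    blur in [0, 1] attenuates the pattern (0 = sharp filtering)."""
    freq = 13.0  # cycles across the row: above the Nyquist limit for n=8
    return [(1 - blur) * math.sin(2 * math.pi * freq * (i / n + phase))
            for i in range(n)]

def shimmer(blur):
    """Crude aliasing metric: how much the samples change under a tiny
    sub-pixel shift (the frame-to-frame crawling you see in motion)."""
    a = sample_texture(0.0, blur)
    b = sample_texture(0.013, blur)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# sharp filtering (blur=0) shimmers more than blurry filtering (blur=0.8)
```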
 
I just think it's absolutely stupid how you people want a drawback to continue just because ATI's current hardware does it. That is idiocy.

I never implied that it's OK at this 45-degree angle. I would wish that it worked in this case too. But I know that in the real world there are always design tradeoffs.

I also told you in another thread over at nV News that I have only once seen this issue show up in a real game, and that was one level, one map, one room. Yes, it's an issue, but since it's happened only once in all of my FPS games, I don't see why it's not an acceptable condition. I agree that if I were a flight sim nut, I would not want it. But I am not. Gimme FPS please :)


Just because you can live with a problem today doesn't mean you should have to with future hardware.

Just like the many versions of TC bugs in the GF2, GF3, GF4, etc. Here's hoping that they get it right next time :) BTW, I am just kidding here.


And multisampling is definitely better than supersampling fundamentally, because the performance hit for the same image quality will be superior. If you want less texture aliasing, you should be looking for better texture filtering, not SSAA.

Again, it's all a matter of tradeoffs. No one method is always correct in EVERY situation. Besides, the fact that you can get away with less texture filtering with a supersampling approach means you almost get two for the price of one. Just that one big price :)
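The "two for the price of one" point can be sketched like this (a toy model I wrote for illustration, not real rasterizer code): resolving an n x n supersampled pixel box-averages n*n independent texture lookups, which is itself extra texture filtering you get as a side effect of the AA.

```python
# Toy SSAA resolve -- illustrative only.

def ssaa_resolve(texture_lookup, px, py, n=2):
    """Average an n x n grid of sub-pixel texture samples (box filter)."""
    total = 0.0
    for sy in range(n):
        for sx in range(n):
            # sub-pixel sample positions inside pixel (px, py)
            u = px + (sx + 0.5) / n
            v = py + (sy + 0.5) / n
            total += texture_lookup(u, v)
    return total / (n * n)

# A checkerboard "texture": a single point sample hits one square and
# aliases, while the 2x2 resolve averages toward the true mean of 0.5.
checker = lambda u, v: float((int(u * 2) + int(v * 2)) % 2)
```

With one sample per pixel the checkerboard resolves to 0.0 (it misses half the detail entirely); with a 2x2 grid it resolves to the filtered average 0.5.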
 
Just wondering how people would be bitchin' if Nvidia were using their MSAA while also offering ATI's way of doing aniso. Would they be doing the right thing, offering a good performance/quality tradeoff, or would people mainly complain that both MSAA and aniso are just unfair hacks and tricks to make the competition look bad performance-wise at the cost of IQ?

Personally I think it would be an awesome combination to have right now (the future can and will bring even better possibilities though).

PS: Regarding IQ, I stumbled across this while reading some old (20th century :) ) B3D articles, funny how things change: "Speed is not everything. Image quality and features are equally important and 3dfx's claims are very nice, but NVIDIA (with the TNT 2) now proves that you can get good speed and good image quality together."
Today this could be: "Speed is not everything. Image quality and features are equally important and Nvidia's claims are very nice, but ATI (with their Anisotropic filtering) now proves that you can get good speed and good image quality together." :LOL:
 