FUDie said:
Thanks for agreeing with what I said: NVIDIA's only "free" trilinear is bilinear. Use brilinear aggressively (as NVIDIA appears to be doing) and you get mostly bilinear.

Chalnoth said:
Then you're not understanding what I'm saying. Trilinear filtering is nothing more than two bilinear samples averaged together. This is true whether or not anisotropic filtering is enabled. An architecture that could do single-cycle trilinear could also do single-cycle 2-degree anisotropic filtering. But enable trilinear filtering and 2-degree anisotropic, and you'll still get a performance hit from trilinear with this architecture.
So what you're really asking for isn't single-cycle trilinear filtering, but rather hardware that is capable of more bilinear texture samples per cycle. But that won't solve the issues with trilinear filtering.
In any event, I wasn't "asking" for anything. I was merely pointing out why NVIDIA is using brilinear.
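For anyone following the jargon, here's roughly what's being argued about, as a minimal C sketch. This is purely illustrative: the Texture2D scaffolding, the sample_* names, and the 0.15 blend band are made-up placeholders, not NVIDIA's or ATI's actual hardware or driver behavior. Trilinear is just two bilinear taps on adjacent mip levels blended by the fractional LOD; "brilinear" narrows the blend band so most pixels fall back to a single bilinear tap.

[code]
#include <math.h>

typedef struct { float r, g, b, a; } Color;

/* Hypothetical texture abstraction: a square mip chain, level 0 = base.
 * Everything here is illustrative scaffolding, not a real API. */
typedef struct {
    const Color *mip[16];   /* texel data for each mip level            */
    int          size[16];  /* width == height of each level, in texels */
    int          levels;    /* number of valid mip levels               */
} Texture2D;

/* Fetch one texel with clamp-to-edge addressing. */
static Color texel(const Texture2D *t, int level, int x, int y)
{
    int s = t->size[level];
    if (x < 0) x = 0; else if (x >= s) x = s - 1;
    if (y < 0) y = 0; else if (y >= s) y = s - 1;
    return t->mip[level][y * s + x];
}

static Color lerp_color(Color a, Color b, float f)
{
    Color c = { a.r + (b.r - a.r) * f, a.g + (b.g - a.g) * f,
                a.b + (b.b - a.b) * f, a.a + (b.a - a.a) * f };
    return c;
}

/* One bilinear tap: four texels, three lerps. This is the unit of work
 * being counted per cycle in the discussion above. */
static Color sample_bilinear(const Texture2D *t, int level, float u, float v)
{
    float x = u * (float)t->size[level] - 0.5f;
    float y = v * (float)t->size[level] - 0.5f;
    int   x0 = (int)floorf(x), y0 = (int)floorf(y);
    float fx = x - (float)x0,  fy = y - (float)y0;
    Color top = lerp_color(texel(t, level, x0, y0),     texel(t, level, x0 + 1, y0),     fx);
    Color bot = lerp_color(texel(t, level, x0, y0 + 1), texel(t, level, x0 + 1, y0 + 1), fx);
    return lerp_color(top, bot, fy);
}

/* Trilinear: two bilinear taps on adjacent mip levels, blended by the
 * fractional LOD -- i.e. twice the bilinear work (or two cycles on one
 * bilinear unit). 2:1 anisotropic is the same arithmetic: two bilinear
 * taps, just taken along the axis of anisotropy within one level. */
static Color sample_trilinear(const Texture2D *t, float u, float v, float lod)
{
    int l0 = (int)floorf(lod);
    if (l0 < 0) l0 = 0;
    if (l0 >= t->levels - 1) return sample_bilinear(t, t->levels - 1, u, v);
    float f = lod - (float)l0;
    return lerp_color(sample_bilinear(t, l0,     u, v),
                      sample_bilinear(t, l0 + 1, u, v), f);
}

/* "Brilinear": only blend inside a narrow band around the point where
 * plain bilinear would switch mip levels (LOD fraction 0.5); outside it,
 * snap to a single level, so most pixels need just one bilinear tap.
 * BAND is an arbitrary illustrative half-width, not a real driver value. */
static Color sample_brilinear(const Texture2D *t, float u, float v, float lod)
{
    const float BAND = 0.15f;
    int l0 = (int)floorf(lod);
    if (l0 < 0) l0 = 0;
    if (l0 >= t->levels - 1) return sample_bilinear(t, t->levels - 1, u, v);
    float f = lod - (float)l0;

    if (f <= 0.5f - BAND) return sample_bilinear(t, l0,     u, v);
    if (f >= 0.5f + BAND) return sample_bilinear(t, l0 + 1, u, v);

    f = (f - (0.5f - BAND)) / (2.0f * BAND);  /* remap blend band to 0..1 */
    return lerp_color(sample_bilinear(t, l0,     u, v),
                      sample_bilinear(t, l0 + 1, u, v), f);
}
[/code]

Squeeze that band aggressively and you're left with mostly bilinear, which is exactly the complaint above.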
Chalnoth said:
Side comment: don't forget that it was ATI that started doing brilinear (albeit less aggressively), and their first implementation of anisotropic filtering didn't even support trilinear.

Newp. Brilinear first appeared on the NV30. And wasn't this original discussion about shimmering on NV40 and G70?
Chalnoth said:
Old games are high-performing anyway, so you can just turn the optimizations off and have no problems.

Now you've gone and painted yourself into a corner. According to you, old games don't need optimizations, and new games need more ALU power and therefore don't need texture optimizations either. So tell us: why is NVIDIA going through all this trouble?
Oh wait, all of this is just a bug with the app profiles! Which means that NVIDIA is forcing heavy brilinear on applications and it can't be turned off. But it's just a bug. Then again, that's what NVIDIA always says.
-FUDie