...Shadow Warrior dev writes; my source calls it differently (and vehemently), with the #LazyDevs theory, which is not my opinion BTW.
Always the best and most productive answer.
2 samples is 2xAA. 4 samples is 4xAA. 128 samples is 128xAA. 1 sample is... 0xAA?
With only one sample, there's no filtering happening. Like 0xAA, there's no antialiasing in effect with one sample.
In the same way that if you run an audio signal through a filter with a gain of 1, there's "no amplification happening." But we don't call that a gain of 0, because that would imply the signal got stamped out entirely. If you measure volume, it's 1. If you measure amplification, it's 0.
Because one is measuring how much antialiasing there is, whereas the other is measuring how many samples are being used.
Maybe the issue here is that we also call it "no AA," and since "no" can often mean "0", we're substituting the latter. But the "no" in "no AA" isn't being used in the same way as the "X" in "XxAA" notation, so the substitution doesn't align right.
Indeed, but everyone's happy with it.
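To make the sample-counting point concrete, here is a tiny illustrative sketch (mine, not from the thread; resolve_pixel and the edge example are made-up names). Averaging N coverage samples per pixel only smooths anything for N >= 2; with N = 1 the code still "takes a sample," but the result is just the raw, unfiltered pixel, which is the sense in which 1 sample means no AA:

// Toy supersampling resolve: average N coverage samples per pixel.
// Illustrative only; resolve_pixel and shade are made-up names for this sketch.
#include <cstdio>
#include <functional>

float resolve_pixel(int n_samples, const std::function<float(int)>& shade)
{
    float sum = 0.0f;
    for (int s = 0; s < n_samples; ++s)
        sum += shade(s);                        // one coverage/shading sample per iteration
    return sum / static_cast<float>(n_samples); // with n_samples == 1 this is just the raw sample
}

int main()
{
    // Hypothetical pixel on a geometry edge: half the sub-samples hit the
    // triangle (1.0), half hit the background (0.0).
    auto edge = [](int s) { return (s % 2 == 0) ? 1.0f : 0.0f; };

    std::printf("1 sample  -> %.2f  (hard edge, i.e. no AA)\n", resolve_pixel(1, edge));
    std::printf("4 samples -> %.2f  (blended edge, 4xAA)\n", resolve_pixel(4, edge));
}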
But purposefully choose 0xAF instead of 2xAF or 4xAF, when the cost of going from 0x to 2x (or 4x) is negligible on a capped game with solid fps, and brings a lot of IQ improvement?
Those 16x surfaces are still plentiful. If not, there wouldn't be any visible difference when switching to 16x. Instead we see ground textures markedly improved, and that's a decent percentage of the screen there. Add in a few oblique building walls and there's definitely enough of them to make 16x both worthwhile and genuinely more expensive.
We know that anisotropic filtering isn't in core OpenGL for a reason: IP issues. We also know that this is slightly ridiculous as hardware support is ubiquitous...
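For reference, this is roughly how that looks on PC today: AF has never been promoted to core OpenGL, so you check for the EXT_texture_filter_anisotropic extension and set the anisotropy level per texture object. A minimal sketch, assuming a valid GL context and GLEW as the extension loader (both assumptions on my part):

// Minimal sketch: per-texture AF through EXT_texture_filter_anisotropic.
// Assumes a valid GL context and GLEW as the extension loader.
#include <GL/glew.h>

void enable_af(GLuint texture, GLfloat requested)        // e.g. 16.0f for 16xAF
{
    if (!GLEW_EXT_texture_filter_anisotropic)
        return;                                          // no AF support exposed

    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);

    GLfloat level = requested < max_supported ? requested : max_supported;

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, level);
}

Direct3D, by contrast, exposes anisotropy directly in its core sampler state, which is part of why the OpenGL situation looks ridiculous given how ubiquitous hardware support is.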
And I do find it odd that there is concern over the performance impact of AF on consoles (especially PS4) when a 7850 on PC can do 16xAF practically for free in every game.
You mean lazy, right?
At 4K I understand there can be a somewhat heavy cost, but at 1080p it is mostly negligible.
Also they seem to be toggling AF in the driver, which would enable max AF for every surface. That is silly.
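To spell out the difference: a driver-side toggle forces one anisotropy level onto everything, whereas an engine would normally pick the level per surface. A hypothetical sketch of that per-surface decision (SurfaceKind and the levels below are invented purely for illustration):

// Hypothetical per-surface anisotropy choice, the kind of decision an engine
// (rather than a blanket driver override) would make.
enum class SurfaceKind { GroundOrFloor, ObliqueWall, ScreenAlignedUi, Generic };

float pick_anisotropy(SurfaceKind kind)
{
    switch (kind) {
    case SurfaceKind::GroundOrFloor:   return 16.0f; // grazing angles, biggest visible win
    case SurfaceKind::ObliqueWall:     return 8.0f;  // still clearly improved
    case SurfaceKind::ScreenAlignedUi: return 1.0f;  // AF buys nothing here
    default:                           return 4.0f;  // cheap middle ground
    }
}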
Occam's Razor suggests that the PS4 dev tools default to no AF for all surfaces whereas the Xbone tools don't. There is no logical reason why a dev would purposefully leave AF off on a PS4 version when it is on for the Xbone version. The GPUs are practically identical, except that one is quite a bit bigger...
There are ports of indie games like Unfinished Swan from the PS3 where AF was taken out of the PS4 version. It doesn't seem like Sony had an IP problem before.
If that were the issue, games wouldn't be patching it in. Unless we believe they are paying MS a royalty, but no-one's saying as much, instead blaming implementation 'difficulties'.