The Great PS4 Missing AF Mystery *spawn

With only one sample, there's no filtering happening. Like 0xAA, there's no antialiasing in effect with one sample.
 
With only one sample, there's no filtering happening.
In the same way that if you run an audio signal through a filter with a gain of 1, there's "no amplification happening." But we don't call that a gain of 0, because that would imply the signal got stamped out entirely.

Like 0xAA, there's no antialiasing in effect with one sample.
:(

2 samples is 2xAA. 4 samples is 4xAA. 128 samples is 128xAA. 1 sample is... 0xAA?

Maybe the issue here is that we also call it "no AA," and since "no" can often mean "0", we're substituting the latter. But the "no" in "no AA" isn't being used in the same way as the "X" in "XxAA" notation, so the substitution doesn't align right.
 
In the same way that if you run an audio signal through a filter with a gain of 1, there's "no amplification happening." But we don't call that a gain of 0, because that would imply the signal got stamped out entirely.
If you measure volume, it's 1. If you measure amplification, it's 0.

2 samples is 2xAA. 4 samples is 4xAA. 128 samples is 128xAA. 1 sample is... 0xAA?
Because one is measuring how much antialiasing there is, whereas the other is measuring how many samples are being used.

Maybe the issue here is that we also call it "no AA," and since "no" can often mean "0", we're substituting the latter. But the "no" in "no AA" isn't being used in the same way as the "X" in "XxAA" notation, so the substitution doesn't align right.
Indeed, but everyone's happy with it. ;) Basically the 0x reference is inconsistent with what the rest of the nomenclature measures, but it works in context to describe the effect. NxAA tells you how many samples are used via N, except in the zero-AA case, where there's one sample. But it'd be wrong to call that 1xAA as well, because there's no AA in effect. So it makes sense to stick with zero to represent none of the technique being present (where of course there's one sample taking place or there'd be no image) and increasing numbers to represent sample count.
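To spell the convention out in code form (just an illustrative sketch; the function name is made up):

#include <stdio.h>

/* Illustrative only: map a per-pixel sample count to the usual label.
   One sample still produces an image, but it's labelled "no AA" / 0xAA because
   none of the technique is in effect; from 2 samples up, N maps straight to NxAA. */
static void print_aa_label(int samples)
{
    if (samples <= 1)
        printf("%d sample  -> no AA (0xAA)\n", samples);
    else
        printf("%d samples -> %dxAA\n", samples, samples);
}

int main(void)
{
    int counts[] = { 1, 2, 4, 128 };
    for (int i = 0; i < 4; i++)
        print_aa_label(counts[i]);
    return 0;
}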
 
But why purposefully choose 0xAF instead of 2xAF or 4xAF when the cost of going from 0x to 2x (or 4x) is negligible on a capped game with solid fps, and it brings a lot of IQ improvement?

Shouldn't that be the other way around?

I thought AF is dynamic in nature, where ?xAF implies the highest level of AF that the hardware will use. 16xAF employs 2x, 4x, 8x and 16x, where the number of samples taken depends on the severity of the angle of the texture, and 16x is only used on the most oblique angles.

Meaning that the cost of going from 8xAF to 16xAF is negligible in comparison to going from no AF to 2x/4xAF.
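Roughly how I understand the adaptive part working, as a simplified sketch (everything here is made up for illustration and just mirrors the usual ceil(Pmax/Pmin) description of anisotropic filtering, not any console's actual hardware): the sampler estimates how stretched the pixel's texture footprint is and only takes extra taps when the ratio demands it, capped at the requested maximum.

#include <math.h>
#include <stdio.h>

/* Simplified sketch of adaptive anisotropic filtering (not real hardware code).
   px and py are the pixel footprint lengths along the two texture axes, in
   practice derived from the screen-space derivatives of the UVs. */
static int af_taps(float px, float py, int max_af)
{
    float pmax = fmaxf(px, py);
    float pmin = fminf(px, py);
    float ratio = pmax / fmaxf(pmin, 1e-6f);   /* ~1 head-on, large at grazing angles */
    int taps = (int)ceilf(ratio);
    if (taps < 1) taps = 1;
    if (taps > max_af) taps = max_af;          /* 16xAF only caps the worst case */
    return taps;
}

int main(void)
{
    printf("head-on floor tile : %d taps\n", af_taps(1.0f, 1.0f, 16));
    printf("moderately oblique : %d taps\n", af_taps(3.5f, 1.0f, 16));
    printf("grazing-angle road : %d taps\n", af_taps(30.0f, 1.0f, 16));
    return 0;
}

So a surface seen head-on costs the same at 2xAF and 16xAF; only the most oblique surfaces ever pay for the full 16 taps, which is why the jump from 8x to 16x tends to be cheap.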
 
Those 16x surfaces are still plentiful. If not, there wouldn't be any visible difference when switching to 16x. Instead we see ground textures markedly improved, and that's a decent percentage of the screen there. Add in a few oblique building walls and there's definitely enough to make 16x worthwhile and more expensive.
 
Those 16x surfaces are still plentiful. If not, there wouldn't be any visible difference when switching to 16x. Instead we see ground textures markedly improved, and that's a decent percentage of the screen there. Add in a few oblique building walls and there's definitely enough to make 16x worthwhile and more expensive.

Thanks.
 
I can scarcely tell the difference between 8x and 16x AF when gaming. If I stop and really look, it's there, but on console it's hard to justify >8xAF IMO unless you have nothing else to spend the milliseconds on.

4x to 8x on the other hand is a tremendous improvement.

And I do find it odd that there is concern over the performance impact of AF on consoles (especially PS4) when a 7850 on PC can do 16xAF practically for free in every game.
 
Has anybody considered the prospect that AF is actually IP-encumbered? I say this because searching to see if that's the case led to this post:

https://www.opengl.org/discussion_boards/showthread.php/184261-Anisotropic-Filtering

We know that anisotropic filtering isn't in core OpenGL for a reason: IP issues. We also know that this is slightly ridiculous as hardware support is ubiquitous...

There seem to be a number of patents related to AF, owned by a number of different parties including Microsoft.

Could it be a case where Sony uses some alternative solution that's not as performance-friendly? Or one that's not simply plug & play and requires some effort to incorporate?
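For reference, on the PC/OpenGL side the quote above is talking about the EXT_texture_filter_anisotropic extension, and using it is about this much work (a sketch; enable_af is my own helper name, and it assumes the driver advertises the extension, which essentially all desktop hardware does):

#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_MAX_ANISOTROPY_EXT, GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT */

/* Request anisotropic filtering on an existing, mipmapped texture. */
static void enable_af(GLuint tex, GLfloat requested)
{
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);   /* driver cap, typically 16.0 */
    if (requested > max_aniso)
        requested = max_aniso;

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}

Which is why "hardware support is ubiquitous but it's still not in core" reads as an IP oddity rather than a technical one.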
 
There are ports of indie games like Unfinished Swan from the PS3 where AF was taken out of the PS4 version. It doesn't seem like Sony had an IP problem before.
 
And I do find it odd that there is concern over the performance impact of AF on consoles (especially PS4) when a 7850 on PC can do 16xAF practically for free in every game.

In the parallel GAF thread they decided to do some testing instead of just saying AF is free on PC over and over and found that it's actually not free, so there's that. Starting here and continuing throughout the thread: http://www.neogaf.com/forum/showpost.php?p=182861597&postcount=177

Their results seem to vary wildly by hardware and game.
 
At 4K I understand there can be a somewhat heavy cost, but at 1080p it is mostly negligible.

Also, they seem to be toggling AF in the driver, which would enable max AF for every surface. That is silly.
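For contrast, in an engine AF is normally per-texture/per-sampler state, so a dev can pay for it only where it helps; something like this (a sketch, GL again, and the texture handles are hypothetical):

#include <GL/gl.h>
#include <GL/glext.h>

/* Sketch of per-surface AF, as opposed to a driver override that forces the
   maximum on every texture in the game. */
static void set_af_per_surface(GLuint ground_albedo_tex, GLuint ui_atlas_tex)
{
    glBindTexture(GL_TEXTURE_2D, ground_albedo_tex);   /* oblique ground texture: worth paying for */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);

    glBindTexture(GL_TEXTURE_2D, ui_atlas_tex);        /* screen-aligned UI: leave it at the default */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 1.0f);   /* 1.0 = no anisotropy */
}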
 
Occam's Razor suggests that the PS4 dev tools default to no AF for all surfaces whereas the Xbone tools don't. There is no logical reason why a dev would purposefully leave AF off on a PS4 version when it is on for the Xbone version. The GPUs are practically identical, except that one is quite a bit bigger...
 
Occam's Razor suggests that the PS4 dev tools default to no AF for all surfaces whereas the Xbone tools don't. There is no logical reason why a dev would purposefully leave AF off on a PS4 version when it is on for the Xbone version. The GPUs are practically identical, except that one is quite a bit bigger...

If that were true, then the inverse would follow: the Xbox One's performance disadvantage could be partly blamed on devs who are too lazy to turn AF on for any surface on the PS4 also failing to turn it off for any surface on the Xbox One.
 
This suggests to me that some devs don't ever give even a passing thought to AF before shipping a game. There are even some PC games where AF is not in the graphics options - usually console ports.
 
In my opinion, PC comparisons are mostly irrelevant. In the graphics pipeline everything happens in parallel, which means that the speed of the whole is determined by the speed of the slowest link in the pipeline.

On console, since the hardware is fixed, devs can optimize everything to extract every bit of performance, so the pipeline is mostly balanced. Now if you increase the load on one part of the pipeline, everything slows down.

On PC, on the other hand, devs cannot optimize their code as much, so there are already bottlenecks that vary from GPU to GPU depending on their specs. So if you're already shader (ALU) bound, setup bound or ROP bound, the increased load on the texture units will be hidden and it will *seem* that anisotropic filtering is free, while in fact its cost is just covered by another part of the pipeline. But nothing is free in 3D, especially not anisotropic filtering, which must combine several texture fetches; it takes a toll on the texture cache and the memory bandwidth.
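As a back-of-the-envelope way to picture that (a toy model, every number invented; it just encodes the "slowest stage wins" idea from above):

#include <stdio.h>

/* Toy model of why AF can look free: the frame takes as long as its slowest
   stage, so extra texture-unit cost is invisible until it becomes the bottleneck. */
static float frame_ms(float alu_ms, float tex_ms, float rop_ms)
{
    float t = alu_ms;
    if (tex_ms > t) t = tex_ms;
    if (rop_ms > t) t = rop_ms;
    return t;
}

int main(void)
{
    /* ALU-bound PC GPU: AF adds texture cost, but the frame time doesn't move. */
    printf("PC, no AF      : %.1f ms\n", frame_ms(14.0f,  8.0f, 6.0f));
    printf("PC, 16xAF      : %.1f ms\n", frame_ms(14.0f, 11.0f, 6.0f));

    /* Well-balanced console pipeline: the same extra texture cost shows up directly. */
    printf("Console, no AF : %.1f ms\n", frame_ms(15.5f, 15.0f, 15.0f));
    printf("Console, 16xAF : %.1f ms\n", frame_ms(15.5f, 18.0f, 15.0f));
    return 0;
}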

So I think that at the end of development, during the optimization phase, if the devs have trouble hitting their frame budget on consoles, cutting anisotropic filtering is one of the easiest things that will gain the few milliseconds they are missing. On PC it's basically the user who does their own balancing by enabling different graphics options according to their GPU.

Of course this is just a theory, and I may be completely wrong (probably I am); moreover, it doesn't explain why the PS4 seems to have more problems with anisotropic filtering than the Xbox One.
 
There are ports of indie games like Unfinished Swan from the PS3 where AF was taken out of the PS4 version. It doesn't seem like Sony had an IP problem before.

Well, the PS3's GPU is Nvidia-based, and at least at one time Nvidia and AMD used somewhat different solutions for AF. Not sure about now.

A quick and superficial search of AF patents through Google Patents, looking at citations and references, suggests that MS and especially Nvidia dominate the patents surrounding AF. Intel seems to have a notable one that is regularly cited by newer patents, but all I can find is one old ATI patent.
 
If that were the issue, games wouldn't be patching it in. Unless we believe they are paying MS a royalty, but no-one's saying as much, instead blaming implementation 'difficulties'.
 
If that were the issue, games wouldn't be patching it in. Unless we believe they are paying MS a royalty, but no-one's saying as much, instead blaming implementation 'difficulties'.

The patents don't cover AF itself but rather the techniques and optimizations used. Couldn't the "patching it in" or "implementation difficulties" come from using a workaround or alternative solution?
 