Encyclopedia Brown & The Mysterious Case of the PS4 and the Missing Anisotropic Filtering

What settings should I use to push bandwidth usage to the max while lowering the load on the CPU and GPU? High res with everything dialed down? Basically I want to be sure the setting is bandwidth-limited, so changing the AF setting will hopefully have a tangible effect on the average FPS.

MSAA ought to push bandwidth consumption without increasing shader load.
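To put rough numbers on that, here's a back-of-the-envelope sketch (a hedged estimate: it assumes 32-bit colour plus 32-bit depth per sample and no framebuffer compression, so it's an upper bound; real GPUs compress MSAA surfaces heavily):

```python
# Rough upper bound on framebuffer traffic per frame at 1080p.
# Assumes 32-bit colour + 32-bit depth per sample and NO colour/depth
# compression, so real hardware moves considerably less than this.
width, height = 1920, 1080
bytes_per_sample = 4 + 4  # colour + depth

def frame_mib(msaa_samples):
    return width * height * msaa_samples * bytes_per_sample / 2**20

no_aa = frame_mib(1)   # ~15.8 MiB per frame
msaa4 = frame_mib(4)   # ~63.3 MiB per frame

print(f"no AA : {no_aa:.1f} MiB/frame")
print(f"4xMSAA: {msaa4:.1f} MiB/frame")
```

So even as a naive upper bound, 4xMSAA quadruples the framebuffer traffic without adding any shading work, which is why it's a reasonable knob for making a test bandwidth-limited.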

But yeah, I'm not particularly convinced performance is the issue if XO can pull it off with DDR3.
 
How many lay people know what AF is, or would notice it? It's like 900p vs 1080p, but an even smaller difference. Maybe that's why these devs cut corners.

Although it makes no sense in cases like DMC where PC and X1 both sport working AF.

However, X1 is making a case for the superior port quite a few times over this little issue, which is a disaster given the GPU differences, so it could grow to be quite PR-damaging to PS4. I doubt it could ever approach the scale of what failing to reach 1080p did to X1, especially in the early days (sub-1080p on X1 seems more accepted and less of a scandal each time now, but I think the psychological damage was done long ago and is reflected in the sales figures).
 
Time to tweet Yoshida and ask him what's up with all these crappy ports. Tell him you'll buy the Xbone version of every multiplatform game if this doesn't get sorted out, so they know they're not getting your money.
 
To close this endless cycle and end the debate... the answer to the whole missing Arsenic Fingering debacle is simple...

... it's Lazy Devs™

If it's a case of RTFM, then yeah, that trademark is entirely appropriate.
If it's a case of waiting on a UE3 patch from Epic until it was too late instead of fixing it yourself, then, well, is procrastination = laziness fair?
I would say those two scenarios are the most likely at this point. Certainly more likely than some unexplained performance issue.
 
Driveclub has low aniso in game because it's a GNMX port of Unreal Engine 3. And also a bug.
It's basically no better than Night Driver on the Atari 2600

[Image: Night Driver (Atari, 1976)]


Which I note also lacks anisotropic filtering because the devs were so damn lazy :yep2:
 
So I tested various AF levels using Tomb Raider's benchmark mode on my Kaveri rig. Unfortunately it doesn't have MSAA (only SSAA, which is overkill on my rig), so I had to settle for FXAA. The result is that the minimum FPS isn't really affected from trilinear up to 8xAF. At 16xAF, the minimum dropped by 1 FPS (from 31 to 30). The max FPS difference from trilinear to 8xAF is around 1 to 2 FPS (from 46 to 48); the 16xAF number is weird and needs triple-testing. The average FPS difference, trilinear vs 8xAF, is around 1 FPS (37 to 38). That said, there was a lot of variation when I re-ran the test, as much as 1-3 FPS at trilinear, 4xAF, and 8xAF, although I can see a trend: trilinear is a bit faster (around 1 to 2 FPS) than 4xAF and 8xAF, and 4xAF is usually 1 FPS faster than 8xAF. All of these had the same minimum FPS (there the game is probably bottlenecked by something else, probably shaders). Bilinear filtering adds 1 FPS to the minimum (to 32 FPS), with the average rising to 40 and the max to 50, though I should warn that on one of my benchmark runs the 4xAF test also hit 50 FPS max.
So I think in the worst case (at least in Tomb Raider), going from trilinear to 8xAF costs 2 to 3 FPS, and 4xAF costs around 1 to 2 FPS. I don't think there is any reason not to use at least 4xAF unless you are right on the edge of your target FPS and very strict about hitting it.
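Taking the worst-case numbers above at face value, the relative cost is easy to put in percentage terms (this just restates the figures from the runs, which are admittedly noisy):

```python
# Worst-case cost of 8xAF vs trilinear from the Tomb Raider runs above:
# roughly a 3 FPS drop against an average of about 37-38 FPS.
avg_fps = 37.5     # midpoint of the reported averages
worst_drop = 3.0   # worst-case FPS loss going trilinear -> 8xAF

cost_pct = worst_drop / avg_fps * 100
print(f"~{cost_pct:.0f}% worst-case frame-rate cost for 8xAF")  # ~8%
```

Roughly 8% in the worst case, and closer to 3-5% for the typical 1-2 FPS deltas. Small, but not zero.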
If the FPS in a game is the same on both consoles, then there might be a performance reason for not turning on AF in the PS4 version. But if the PS4 version has better FPS, then there should be no reason not to enable (or use higher) AF, or to disable/lower AF on X1 to match PS4 performance. I don't think the cost of enabling AF on PS4 is any different from X1 or a PC GCN GPU.
 
Been trying to push DF to do some testing with APUs & bandwidth issues like AF & DDR3 speeds, but even so you have to keep in mind the driver/API.

I need to do some more testing, but at a glance, 0xAF to 4xAF produced no noticeable hit (I think I was using Tomb Raider). My system is a Kaveri with 2133 RAM. Mind you, the system might be bottlenecked somewhere. What settings should I use to push bandwidth usage to the max while lowering the load on the CPU and GPU? High res with everything dialed down? Basically I want to be sure the setting is bandwidth-limited, so changing the AF setting will hopefully have a tangible effect on the average FPS.

Yes, APU testing would be coolbeans. Perhaps Mantle could get around the API issue for now? Also it might be interesting to test:

- Rendering performance as CPU memory access increases; perhaps dropping res to increase frame rate, and/or run a CPU bench on one core that does lots of random memory accesses
- Ramping up memory latency to mimic the kind of access times that the PS4's memory has. Could probably work this out from the 220+ cycles of CPU latency and the clockspeed.
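That conversion is straightforward if you take the commonly quoted 1.6 GHz clock for the PS4's Jaguar cores:

```python
# Convert the 220+ cycle CPU-to-GDDR5 latency figure into nanoseconds,
# assuming the publicly quoted 1.6 GHz PS4 CPU clock.
cpu_clock_hz = 1.6e9
latency_cycles = 220

latency_ns = latency_cycles / cpu_clock_hz * 1e9
print(f"{latency_ns:.1f} ns")  # 137.5 ns
```

For comparison, desktop DDR3 memory latency is usually quoted in the 50-70 ns range, so "220+" cycles would put the PS4's CPU-side memory latency roughly 2x higher, which is the kind of gap you'd want to mimic in a test.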

That said, the memory controller in the PC APUs is based on DDR3 and may be more evolved and better at handling contention (lower penalties) than the GPU style GDDR5 in the PS4.

It occurred to me that the PS4 has 4 high-latency memory channels to feed two CPU clusters, 8 colour and 8 depth blocks, and 72 TMUs.

Kaveri has two lower latency memory channels to feed two CPU modules, 2 depth and 2 colour blocks, and 32 TMUs. Looking beyond pure bandwidth for a moment, it looks like the number of memory channels and the latency of memory accesses may be better 'balanced' (there's that word!) for an APU.

Xbox One has 4 slightly lower-latency (than PS4) memory channels to feed two CPU clusters, 4 colour and 4 depth blocks, and 48 TMUs. And with a bit of luck, most of the colour and depth unit accesses will go to the eSRAM, leaving the four DDR3 channels to service the two CPU clusters and the 48 TMUs.

... or something. I'm out of my depth here, I just think that the memory setup may lead to variance in the way the different APUs respond to CPU access and the hit from reading multiple samples per fragment.
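One crude way to put a number on that 'balance' is TMUs per memory channel, using the unit counts from the post above (and ignoring that the XO's eSRAM takes colour/depth traffic off the DDR3 channels, which flatters it further):

```python
# TMUs per memory channel, using the unit counts quoted in the post above.
systems = {
    "PS4":    {"tmus": 72, "channels": 4},
    "Kaveri": {"tmus": 32, "channels": 2},
    "XO":     {"tmus": 48, "channels": 4},
}

for name, s in systems.items():
    per_channel = s["tmus"] / s["channels"]
    print(f"{name}: {per_channel:.0f} TMUs per channel")
# PS4: 18, Kaveri: 16, XO: 12 -- on this crude metric the PS4's
# channels are the most contended, and its latency is also the highest.
```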
 
How many lay people know what AF is or would notice it?
It can be pretty jarring when the ground is a blur. I think plenty of people notice that even if they don't know the cause.

It's basically no better than Night Driver on the Atari 2600

Which I note also lacks anisotropic filtering because the devs were so damn lazy :yep2:
You're kidding, right? 16x AF on that black striped black road texture. It was an amazing accomplishment, completely under-appreciated.
 
So I tested various AF levels using Tomb Raider's benchmark mode on my Kaveri rig. Unfortunately it doesn't have MSAA (only SSAA, which is overkill on my rig), so I had to settle for FXAA. The result is that the minimum FPS isn't really affected from trilinear up to 8xAF. At 16xAF, the minimum dropped by 1 FPS (from 31 to 30). The max FPS difference from trilinear to 8xAF is around 1 to 2 FPS (from 46 to 48); the 16xAF number is weird and needs triple-testing. The average FPS difference, trilinear vs 8xAF, is around 1 FPS (37 to 38). That said, there was a lot of variation when I re-ran the test, as much as 1-3 FPS at trilinear, 4xAF, and 8xAF, although I can see a trend: trilinear is a bit faster (around 1 to 2 FPS) than 4xAF and 8xAF, and 4xAF is usually 1 FPS faster than 8xAF. All of these had the same minimum FPS (there the game is probably bottlenecked by something else, probably shaders). Bilinear filtering adds 1 FPS to the minimum (to 32 FPS), with the average rising to 40 and the max to 50, though I should warn that on one of my benchmark runs the 4xAF test also hit 50 FPS max.
So I think in the worst case (at least in Tomb Raider), going from trilinear to 8xAF costs 2 to 3 FPS, and 4xAF costs around 1 to 2 FPS. I don't think there is any reason not to use at least 4xAF unless you are right on the edge of your target FPS and very strict about hitting it.
If the FPS in a game is the same on both consoles, then there might be a performance reason for not turning on AF in the PS4 version. But if the PS4 version has better FPS, then there should be no reason not to enable (or use higher) AF, or to disable/lower AF on X1 to match PS4 performance. I don't think the cost of enabling AF on PS4 is any different from X1 or a PC GCN GPU.

Thanks for taking the time to do that testing!

One thing I think it would be interesting to test is whether there's any noticeable impact on CPU load, or rather CPU utilisation, as AF increases. My thought here is that the issue might be more than pure bandwidth use, and also related to the number of individual accesses the GPU is demanding of the memory controller and its four channels to RAM.

Perhaps CPU efficiency is dropping (and therefore utilisation rises) as higher levels of AF are used. In this case, we might expect to see performance drop as the CPU becomes the limiting factor. While with a discrete GPU, a CPU bottleneck would be an excuse to crank up the AF and MSAA, on an APU it might work in reverse.

If you do notice CPU efficiency dropping as you hit high levels of aniso, it would be fun to see if bumping memory latency right up made CPU efficiency drop even further ...

To see where I'm going with this ... Resident Evil Revelations 2 runs like shit on the PS4. Far worse than you'd expect from the GPU, and worse even than the difference in CPUs would seem able to explain (10-20% or whatever). What if it's not just a bug, the CPU is the bottleneck for this old-ass engine, and the 16x aniso is a factor making it worse?

You could speculate all day! Frame rate drops in TLoU remastered? The reason for lowering AF while in game in Drive Club with all the simulation going on? Unreal Engine 3?

And hey, at least it's new ground for this thread, and not just reposting tweets ... :eek:
 
You're kidding, right? 16x AF on that black striped black road texture. It was an amazing accomplishment, completely under-appreciated.

Yep, the road stays sharp even into the distance. They must have bypassed GNMX and gone to the metal. If that's 60 fps then they reached Wizzard.

Also, bonus Battlestar Galactica at the bottom of the screen!
 
Driveclub has low aniso in game because it's a GNMX port of Unreal Engine 3. And also a bug.

There's a difference between no anisotropic filtering and a little AF, like in Drive Club, and between Drive Club and Strider or The Unfinished Swan. I am sure the first pushes the PS4 much further than the last two.

In DMC, The Unfinished Swan, and Strider, the PS4 has less aniso than the PS3. That is relevant to the debate...

I think exclusives are irrelevant to the debate. Exclusive developers make compromises and decide what they want to do. On Xbox One, Forza Horizon 2 has low AF and Strider has good AF. It means nothing... And I am sure Strider could run with AF without problems on current-gen consoles... It is not a title that pushes the console...

TLOUR is 16xAF and not running like shit, unlike RE Revelations 2 with frame drops to 30 fps; KZ:SF is 8xAF for the ground texture; The Order has a little AF... Like I said, exclusives are not relevant to the debate...
 
There's a difference between no anisotropic filtering and a little AF, like in Drive Club, and between Drive Club and Strider or The Unfinished Swan. I am sure the first pushes the PS4 much further than the last two.

In DMC, The Unfinished Swan, and Strider, the PS4 has less aniso than the PS3. That is relevant to the debate...

I think exclusives are irrelevant to the debate. On Xbox One, Forza Horizon 2 has low AF and Strider has good AF. It means nothing...

TLOUR is 16xAF and not running like shit, unlike RE Revelations 2 with frame drops to 30 fps; KZ:SF is 8xAF for the ground texture; The Order has a little AF... Like I said, exclusives are not relevant to the debate...

You are wasting your time.
 
Doing tests on a PC is not the same thing. Not the same API, not the same OS. No CPU/GPU has the same specs as the PS4. Only the developers know why there is no AF in some PS4 games.
 
There's a difference between no anisotropic filtering and a little AF, like in Drive Club, and between Drive Club and Strider or The Unfinished Swan. I am sure the first pushes the PS4 much further than the last two.

In DMC, The Unfinished Swan, and Strider, the PS4 has less aniso than the PS3. That is relevant to the debate...

I think exclusives are irrelevant to the debate. On Xbox One, Forza Horizon 2 has low AF and Strider has good AF. It means nothing...

TLOUR is 16xAF and KZ:SF is 8xAF for the ground texture; The Order has a little AF... Like I said, exclusives are not relevant to the debate...

You're mistaken.

Exclusives are not relevant to any "versus" discussion, but are relevant to the "does AF cost" debate. Which is actually the debate that's interesting.

For example, Drive Club using reduced/low AF in-game but high AF in photo mode shows that high AF is not free. It is not even effectively free. And this is for a big-budget, single-platform, long-in-development, first-party technical showcase.

There's an insane amount of effort going into not acknowledging this.
 