Encyclopedia Brown & The Mysterious Case of the PS4 and the Missing Anisotropic Filtering

All Unreal Engine 3 (and CryEngine 3) games have a reduced AF level on PS4 (it's unclear whether it's 0xAF or 2xAF, see previous posts), without exception. If, as I strongly suspect, the problem really comes from an API / engine-port issue, then all future UE3 and CE3 games on PS4 will unsurprisingly display the same anomaly.

Don't these devs have the engine source, so they could just fix the bug themselves if need be?
 
Maybe Sony should drop the API that allows easy porting from DX. If it runs the game it might be good enough for some publishers and old engines, but it requires more sacrifices.

I'm pretty sure that is why Resident Evil runs like shit. UE3 game ports might also only use that API because it's not officially ported to PS4 by Epic.
 
If you take a UE3 game and port it to PS4, you still have to modify the source code for the renderer to use the GNM API. I don't really understand why the engine itself would not support AF on PS4. It is a feature of the engine, so I don't know why there'd be a PS4 render path that didn't support it. The only thing I can think of is that the implementation in GNM is non-trivial for some reason. I don't know if I'd call that an issue with UE so much as an issue with the API; texture filtering shouldn't be that difficult. Does UE3 even support PS4? Are these developers porting the renderer on their own? I'd imagine some of the bigger titles made modifications to the renderer anyway.
 
If you take a UE3 game and port it to PS4, you still have to modify the source code for the renderer to use the GNM API.
I'm sure that's included in UE, though I don't know if UE3 was ported to PS4. They can't really sell the engine as cross-platform if you have to rewrite its core renderer!
 
I'm sure that's included in UE, though I don't know if UE3 was ported to PS4. They can't really sell the engine as cross-platform if you have to rewrite its core renderer!

That's what the last bit of my post was about. I don't know if UE3 even supports the PS4, and I'm genuinely curious to know whether the devs are porting to GNM themselves. If they are, it's kind of weird that none of them have been able to successfully port texture filtering, which seems like a pretty basic thing. It would almost make more sense if UE3 did support PS4 and it was broken in the engine, so all of the licensees had the same issue.
 
How much effort is it to port from UE3 to UE4? Unity 5 is very nearly immediately compatible with Unity 4 as I understand it, and I'd expect the same from Unreal.
 
Why do you assume they would use GNM and not just GNMX? http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4

"Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11. We started with the high-level one but eventually we moved to the low-level API because it suits our uses a little better," says O'Connor, explaining that while GNMX is a lot simpler to work with, it removes much of the custom access to the PS4 GPU, and also incurs a significant CPU hit.

DMC is surely ported from the PC version.
 
You'd hope so. But even if it wasn't, why would so many engines behind AAA titles be using GNMX?

On a different note ... does anyone have any idea how interrupting GPU access to feed the CPU would affect it?

I'm not sure that average BW figures tell the whole story. GPUs can handle high-latency memory access, and indeed need to in order to allow them to schedule accesses into more ideal access patterns (iirc). But if the GPU had to wait an additional 100+ cycles (on top of normal latency), then wouldn't that hurt?

What if during sampling texels for a given fragment several such interruptions occurred - couldn't the GPU effectively stall? There are only so many other operations you can move onto before the cache is empty and units go idle. Taken over a very short period of time, it's possible that the GPU could have all the BW to itself, or perhaps very little ... or perhaps even none?

Would be nice to get my hands on an APU to do some crude testing of the effect of CPU and GPU contention with various render settings!
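In the meantime, a crude approximation of the CPU side of that contention is just to stream through a big buffer from a few threads while the GPU workload runs, and see what happens to frame rate. A minimal sketch, assuming plain C++ and an arbitrary 512 MiB buffer / 4 threads (not tuned for any particular APU):

```cpp
// Crude CPU-side bandwidth hog: stream through a buffer much larger than
// the caches from several threads, so most accesses hit main memory and
// compete with the GPU for the shared memory controller.
#include <atomic>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const std::size_t kBytes   = 512ull * 1024 * 1024;  // 512 MiB, well past the caches
    const unsigned    kThreads = 4;                      // assumption: a few CPU cores free
    std::vector<std::uint64_t> buf(kBytes / sizeof(std::uint64_t), 1);

    std::atomic<bool> stop{false};
    std::atomic<std::uint64_t> sink{0};  // keeps the reads from being optimised away

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < kThreads; ++t) {
        workers.emplace_back([&, t] {
            std::uint64_t acc = 0;
            const std::size_t chunk = buf.size() / kThreads;
            while (!stop.load(std::memory_order_relaxed)) {
                for (std::size_t i = t * chunk; i < (t + 1) * chunk; ++i)
                    acc += buf[i];  // streaming reads, mostly DRAM traffic
            }
            sink += acc;
        });
    }

    std::puts("Hogging memory bandwidth; run the GPU benchmark now, then press Enter.");
    std::getchar();
    stop = true;
    for (auto& w : workers) w.join();
    std::printf("checksum %llu\n", static_cast<unsigned long long>(sink.load()));
    return 0;
}
```

Comparing average FPS with and without this running, at a couple of AF levels, should at least hint at whether CPU traffic on the shared memory controller makes the AF cost more visible.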
 
Would be nice to get my hands on an APU to do some crude testing of the effect of CPU and GPU contention with various render settings!

Been trying to push DF to do some testing with APUs & bandwidth issues like AF & DDR3 speeds, but even so you have to keep in mind the driver/API.
 
UE3 isn't supported on PS4 by Epic at all.

Does it make a difference? Surely GNMX supports texture filtering.

With GNMX you don't have access to the nitty-gritty of the GPU and its full power. Maybe AF just destroys performance on it with some engines, because you can't modify how it's handled.

Resident Evil on PS4 has 16xAF and has performance drops down to 30fps.
 
I wouldn't have thought enabling AF would require too much effort, especially with a high-level API. I could be wrong. I'm not sure what the API could be doing that would destroy performance.
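For comparison, on a D3D11-style high-level API, "enabling AF" is normally nothing more than a sampler-state setting, along the lines of the sketch below (standard D3D11 calls; whether GNMX exposes an equivalent sampler object is an assumption on my part, since that API isn't public):

```cpp
// D3D11: anisotropic filtering is just a property of the sampler state.
#include <d3d11.h>

HRESULT CreateAnisoSampler(ID3D11Device* device, ID3D11SamplerState** outSampler)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;   // instead of D3D11_FILTER_MIN_MAG_MIP_LINEAR
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = 16;                          // 1..16
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;
    return device->CreateSamplerState(&desc, outSampler);
}

// Bind it for the pixel shader as usual:
//   context->PSSetSamplers(0, 1, &sampler);
```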
 
"Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11. We started with the high-level one but eventually we moved to the low-level API because it suits our uses a little better," says O'Connor, explaining that while GNMX is a lot simpler to work with, it removes much of the custom access to the PS4 GPU, and also incurs a significant CPU hit.

That interview is very interesting in retrospect. A quick port of all those older engines (UE3, CE3, Gamebryo, Chrome Engine 6) would probably mean using the easier GNMX (as they did initially for The Crew). You only 'port' an engine because you don't have the time (or resources) to rebuild it entirely on the harder GNM API, so it would make sense to reach for the easier GNMX when doing those, and I quote, 'engine ports [that] are shit'.

As usual, if I'm making unreasonable assumptions (as I often do here, I admit), please correct them.

Wait, why didn't they use GNMX in the final game? Well, we know why, because:

"It removes much of the custom access to the PS4 GPU". What kind of custom access?

Finally, it's interesting to note that Sony shipped an API with the PS4 that doesn't provide full access to its GPU... Why didn't they emulate it in DirectX-like 'custom' functions?
 
I'd say that since there are games that perform equal or better on PS4 than on Xbox One, and with the same level of AF, it must not be a hardware issue.
Every game is different though, even titles that run on the same engine. The issue in question is bandwidth, not necessarily hardware. Though I think the fault likely lies in the incompatibility of engines that were designed for older APIs.
 
ROFL, shit. I just looked up what Encyclopedia Brown was, hahaha.
That's like the Hardy Boys in North America, I think.

Awesome, just awesome.

Without visibility into the SDK there's no way to know. In the Xbox SDK I did read a part saying AF can have a performance hit when it comes to fetching, since it effectively repeats the bilinear fetch multiple times. I'll have to go home and find that part, but I'm not sure if it would apply here.
Yup, nice title. Something related to the Hollisters would also be good. I mean The Happy Hollisters.
 
That interview is very interesting in retrospect. A quick port of all those older engines (UE3, CE3, Gamebryo, Chrome Engine 6) would probably mean using the easier GNMX (as they did initially for The Crew). You only 'port' an engine because you don't have the time (or resources) to rebuild it entirely on the harder GNM API, so it would make sense to reach for the easier GNMX when doing those, and I quote, 'engine ports [that] are shit'.

As usual, if I'm making unreasonable assumptions (as I often do here, I admit), please correct them.

Wait, why didn't they use GNMX in the final game? Well, we know why, because:

"It removes much of the custom access to the PS4 GPU". What kind of custom access?

Finally, it's interesting to note that Sony shipped an API with the PS4 that doesn't provide full access to its GPU... Why didn't they emulate it in DirectX-like 'custom' functions?

Well, Globalisateur, I hate to burst your bubble, but we're going through this thread... for a third time.
Your argument already happened, and it began here, with the tweet and all: https://forum.beyond3d.com/posts/1825080/
I had to endure some awesome replies such as 'You're telling me Sony Santa Monica Studios doesn't have the talent to code around problems?' (paraphrased, in reference to Unfinished Swan)... anyway, I'm not bitter or anything about it. But those folks were ADAMANT that it could only be an SDK issue, and that I was off my rocker.

So there are a total of three people confirming AF is not an issue.
1. ICE Team
2. That tweet we all posted earlier.
3. Another rendering programmer on GAF, who I assume has been vetted since he hasn't been banned, and who wrote this:
http://www.neogaf.com/forum/showpost.php?p=155345197&postcount=779

Jux: Goddamnit people, stop it already. There aren't any hardware or software issues with AF on PS4...

>>But there is
Jux: No, there aren't. Read the thread. I work with a PS4 every day of the week, so I think I know what is and isn't the case better than people only guessing from anecdotal evidence and various websites full of misinformation.

Jux: I'm a rendering programmer.

Jux: Because the fact that AF has no performance impact is a misconception? Re-read my previous posts, I've given plenty of information about how AF can impact performance.
Now, this does not explain why AF is not present in some of the games listed here, but it can be AN answer. Obviously, only the devs of said games can give the real answer.

Originally Posted by Monty Mole
So why would AF have less of an impact on Xbox One?

It doesn't.
Different ports, by different teams, at different resolutions, choosing to cut corners in different places, an artist mistakenly switching AF off on a given texture...
There are plenty of reasons that can explain what's seen here. None of them have anything to do with a hypothetical hardware or software issue.

You might be onto something, but don't write it out like I did. Man, did I have a lot of people jump on me. Still to this day, yeah... anyway, I'm leaving this thread again. Too many bad memories. I assure you that shortly after this post I'll get a reply telling me I'm an idiot because AF doesn't affect the FPS in their game.

I'm not sure if this will apply to PS4, but this is from the leaked Xbox One SDK:
Image fetches and buffer fetches have different performance characteristics. Image fetches are generally bound by the speed of the texture pipeline and operate at a peak rate of 4 texels per clock. Buffer fetches are generally bound by the write bandwidth into the destination registers and operate at a peak rate of 16 GPRs (General Purpose Registers) per clock. In the typical case of an 8-bit four-channel texture, these two rates are identical. In other cases, such as a 32-bit one-channel texture, buffer fetch can be up to 4 times faster.

Many factors can reduce effective fetch rate. For instance, bilinear filtering, trilinear filtering, anisotropic filtering, and fetches from volume maps all translate internally to iterations over multiple bilinear fetches. Bilinear filtering of data formats wider than 32 bits per texel also operates at a reduced rate. Floating point formats that have more than three channels operate at half rate. Use of per-pixel gradients causes fetches to operate at quarter rate.

By contrast, fetches from sRGB textures are full rate. Gamma conversion internally uses a modified 7e4 floating-point representation. This format is large enough to be bitwise exact according to DX10 spec, yet still small enough to fit through a single filtering pipe.
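To put rough numbers on the "multiple bilinear fetches" part: trilinear is two bilinear fetches (one per mip level), and N-x anisotropic can take up to N of those samples along the axis of anisotropy, so the worst case stacks up quickly. A back-of-the-envelope sketch, using the 4 texels/clock peak from the quote above and the usual textbook worst-case multipliers (the multipliers are my assumption, not something stated in the SDK):

```cpp
// Back-of-the-envelope worst-case texture fetch cost.
// Assumptions: bilinear = 1 fetch iteration, trilinear = 2 (two mip levels),
// N-x anisotropic = up to N trilinear samples; peak rate 4 texels/clock per the SDK quote.
#include <cstdio>

int main() {
    const double peakTexelsPerClock = 4.0;
    struct Mode { const char* name; int bilinearIterations; };
    const Mode modes[] = {
        {"bilinear",          1},
        {"trilinear",         2},
        {"4x aniso (worst)",  2 * 4},
        {"16x aniso (worst)", 2 * 16},
    };
    for (const Mode& m : modes) {
        // Each extra bilinear iteration divides the effective peak fetch rate.
        std::printf("%-18s ~%5.2f texels/clock effective (%d iterations)\n",
                    m.name, peakTexelsPerClock / m.bilinearIterations, m.bilinearIterations);
    }
    return 0;
}
```

In practice the hardware only takes as many samples as the texture footprint actually needs, so most pixels land nowhere near the worst case, which is presumably part of why measured AF costs are usually much smaller than these numbers suggest.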
 
Been trying to push DF to do some testing with APUs & bandwidth issues like AF & DDR3 speeds, but even so you have to keep in mind the driver/API.
I need to do some more testing, but at a glance, 0xAF to 4xAF produced no noticeable hit (I think I was using Tomb Raider). My system is a Kaveri with 2133 RAM. Mind you, the system might be bottlenecked somewhere. What settings should I use to push bandwidth usage to the max while lowering the load on the CPU and GPU? High res with everything dialed down? Basically I want to be sure the setting is bandwidth-limited, so changing the AF setting will hopefully have a tangible effect on the average FPS.
 