Encyclopedia Brown & The Mysterious Case of the PS4 and the Missing Anisotropic Filtering

Status
Not open for further replies.
Can you compile DirectX 11 code (maybe with some minor modifications) directly into PS4 binary code with a tool (compiler) provided by the PS4 SDK?
You should read the Eurogamer interview with The Crew's developer, as they explain how they ported the game to PS4. By that account, no, you cannot compile DirectX code to run on the PS4; you need to port those functions to one of the two PS4 APIs.
 
Can you compile DirectX 11 code (maybe with some minor modifications) directly into PS4 binary code with a tool (compiler) provided by the PS4 SDK?

That information can't be NDA protected, right?

Someone could make a wrapper that basically translates the Direct3D calls into GNM(X) calls. It would still mean compiling/linking GNM(X) against the SDK libraries.
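To illustrate the idea only: such a wrapper keeps the Direct3D-shaped call sites and forwards each one to the native API. Everything below is invented for the sketch (the real GNM/GNMX interfaces are under NDA), so the "native" backend is purely a stand-in:

```python
# Hypothetical translation-layer sketch. The D3D11-style method name
# (DrawIndexed) follows real Direct3D conventions; the NativeBackend is a
# made-up stand-in for whatever platform API the wrapper would target.

class NativeBackend:
    """Stand-in for the platform's own graphics API (invented for this sketch)."""
    def __init__(self):
        self.commands = []

    def draw_indexed(self, index_count, first_index):
        # In a real backend this would write into a command buffer.
        self.commands.append(("draw_indexed", index_count, first_index))

class WrappedDeviceContext:
    """Exposes a D3D11-shaped interface; emits native commands."""
    def __init__(self, backend):
        self.backend = backend

    def DrawIndexed(self, index_count, start_index, base_vertex):
        # Translation happens here: map D3D semantics onto the native call.
        # Base-vertex handling and state validation are omitted in this sketch.
        self.backend.draw_indexed(index_count, start_index)

backend = NativeBackend()
ctx = WrappedDeviceContext(backend)
ctx.DrawIndexed(36, 0, 0)   # a cube's worth of indices
print(backend.commands[0])  # ('draw_indexed', 36, 0)
```

The point The Crew's developers made still stands: even with a wrapper like this, every call pays the translation cost on top of whatever the native API itself costs.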
 
Funny that The Crew team hit this with shader constants on PS4: they replaced the GNMX constant-management system with their own optimized shader-constant system.
GNMX, like many APIs, comes with CPU overheads, which is the point really: the API is doing work so you don't have to. But API functions like this are typically designed for many usage scenarios, so they will be suboptimal compared to something purpose-written for a specific requirement.
 
Another suggestion is a simple, and unfortunately quite common, bug. Evolve is riddled with shadow bugs, especially on the Xbox One, as you can read on the forums. A patch is on its way to fix those on Xbox. So if the Xbox has shadow problems that other versions do not, other versions may also have bugs that the Xbox does not.
I don't disagree. When they release notes saying it is a bug that requires fixing, I'm on board. As of this moment they haven't, so this is my stance in the meantime.

Unfinished Swan is made by Sony Santa Monica, and unlike Driveclub it has no reasons not to support 16xAF on the PS4 since it supports it on the PS3.
Like I stated earlier, AF is enabled in that image. It is just not 16x.
Studios can head up more than 1 project at a time as well. See Child of Light for Ubisoft.
Developers may not all be able to operate at a low-level API; there are not many developers who can work at that level. GNM allows deep, low-level, full control over the hardware, some would say lower still than what DX12 can offer. GNMX acts as a higher-level API, like DX11. If GNMX is no good and it's being used a lot, it's easy to point the finger in that direction.

I keep writing that Cerny has indicated there is no hardware issue or SDK issue restricting AF. So he is lying here? If you think so, please outright say it.

Good theories don't use only cherry-picked data; they have to work for all the evidence. When you cherry-pick to support a theory based on bias, you are acting like global-warming deniers or creationists.

The AF data is all over the place. The data cannot simply support a performance trade-off; there is too much data from games that have no such trade-off to make. The Swan game is likely internally rendering at over 100fps, and I'm guessing Strider also has huge headroom. A theory has to account for this evidence or it is a bad theory.

You have empirical data suggesting it is?

I find this quote highly offensive for 2 reasons: (a) I am an environmentalist with an extremely low CO2 footprint, and (b) I believe in both science and evolution. You might find my definition of performance mildly offensive. Do you agree that, in a face-off, a novice tennis player with the best racquet in the world would lose against a pro tennis player with the worst racquet in the world? Of course you would; this is obvious. It is a tool. The PS4 is a tool, like the PS3 was, like everything before it. Developers require skill in leveraging that tool, and if they cannot make performance happen, downgrades happen in order to ship a title. Developers are very much a part of the definition of performance. If you do not agree with this definition of performance, that's unfortunate, but that is what performance is.

Developers have been developing for a long time, and one thing I know has not changed across the evolution of console hardware is performance tuning. Generation to generation, developers tune their games for performance; that means removing things, that means optimization. I have every single console game in existence as my proof of this, and that is a lot of data to work with. This is also my argument: developers have been doing performance tuning for generations, and if they had had the room to work with a higher precision of AF in the time frame they were given, I assure you it would be there. There are also some things that are impossible to achieve on weaker hardware, but I'm not suggesting that is the case with AF.

Your data set unfortunately only includes the PS4 titles released since October of 2013. Correlating PC games to the performance of a PS4 is folly, and it is also a straw man: PCs do not have the unified memory architecture of the PS4, and thus your data set becomes only PS4 titles. Of the PS4 titles released, we see AF support in all ranges, from none (Strider) to the max 16x AF (TLOU Remastered), for instance. This tells me that AF works in their SDK.

This isn't a biased point of view; that is just the way games have been made for a long time. You disagree with that statement, however, and call me biased because I'm pro-Xbox. No problem. Let me present the argument in a different light. The argument used to position the PS4 as superior to the Xbox One is that they use essentially the same hardware: the same CPU cores and the same GPU architecture, but the PS4 has more of it. The memory architecture is different, however; ESRAM + DDR3 is nothing more than a big Kinect mistake and a budgetary saving, while GDDR5 is the real deal. Therefore, by these simplistic, generic arguments, the PS4 will forever be the superior device. It should be easier to code for as well. Yet developers are able to code AF for X1 and not PS4? Developers are unable to implement AF on this seemingly easier-to-code hardware? This is folly. They are the same hardware, are they not? X1 is harder to code for, is it not? You need to make a choice. I'm sure you'll agree that they are the same piece of hardware. From a resource and asset perspective they should be the same: you won't need new textures going from PS4 to X1. PS3 to PS4 may not be the case, however. Let me continue.

Xbox One games have been compromised for a long time, yet no one has raised any outrage about the lack of AF in Xbox One games. In lead games like AC Unity, the PS4 has better AF, and I'm sure there are other instances as well. Yet no one will say anything about including just a bit more AF on X1; but when the lead powerhouse console cannot afford more AF (when X1 has higher AF), it's a bug, it's an SDK issue. AF is free.

The irony is that, with exactly the same hardware as its weaker competitor, the X1 is capable of higher-precision AF in a handful of titles over the PS4. It's easy to combat such a statement: the PS4 is running a higher resolution in all those instances, it's running more AO, better AA, better this or that. We've never seen a game to date where the X1 has run exactly the same settings as the PS4 but higher, and I'd agree: X1 hasn't. But in the situations where AF is cut, the X1 is running a lower resolution, lower this, lower that; it made cuts too, and cutting large things allowed AF back into the mix. Yet for some reason, when it comes to AF missing on the PS4, the thought process is that the PS4 magically has more headroom to support it. Wouldn't you call that bias?

Mainly because you believe AF is free; it's not. But also because of a truth you don't want to hear: the PS4 is not as powerful as you believe it to be. The PS4, like any other piece of hardware, will undergo performance tuning. AF is out if they think the game looks better overall without it; if AF stays in, something else will go.
 
This is logical, except that even in the Strider scenario that players continually post as an indication of an SDK bug, even the Xbox does not have as high a precision of AF as the PC.
Are we sure about that? In the comparison image, the PC version seems to have been rendered at a higher resolution (look at the aliasing on the hilt, for instance), which for similar AF settings would result in the textures being resolved better.
 
We have discussed this subject to death.
There are 2 possible explanations for the lack of high AF settings in 3rd-party titles on the PS4. Reason 1: titles built with DX11 as the base API cause some sort of issue with implementing high levels of AF in a handful of engines on the PS4 (something is being lost in translation).
Reason 2: these titles use a lower resolution in the Xbox One version, leaving the extra bandwidth needed to implement higher AF. AF is not free, and its cost almost always lies in bandwidth.
Both consoles have comparable theoretical bandwidth. The X1 versions of titles have their framebuffers shrunk to fit comfortably into ESRAM. This leaves the possibility that extra bandwidth is available from the console's main memory, which could be used for higher AF settings. It is impossible for us to find the truth without speaking to one of the developers of the titles in question.
I don't understand how this issue is turning into an argument about which console is superior. They both have their strengths and weaknesses, so please stop taking this personally.
 
Wait. Would you say it's enabled at... 2x?

And what about the other games, like Thief, Evolve or Dying Light? Do they all also have AF enabled at only 2x?
Only because 0x AF can barely resolve at all; 0x AF looks like the image below. I guess that means bilinear and trilinear filtering are still on the table, however, so I should be careful about saying 0x AF. Anyway, we don't see the behaviour below with PS4 games, so it has more than nothing. Not counting Strider, of course.
[Image: Anisotropic.gif, texture filtering comparison]
 
Are we sure about that? In the comparison image, the PC version seems to have been rendered at a higher resolution (look at the aliasing on the hilt, for instance), which for similar AF settings would result in the textures being resolved better.
This is a good question, but I thought the comparison was 1080p across the board. I know that the PC is running higher AA, though, so maybe those comparison images are folly. Re-reading DF, they brute-forced the AA with supersampling and ran it at a much higher resolution. So you are correct in this regard.

I'm okay with chalking Strider up as a bug in the game engine; they could not resolve that problem for whatever reason, and I do not know why. I am not comfortable applying this theory to every other title where the PS4 has less-than-ideal AF precision.
 
Wait. Would you say it's enabled at... 2x?

And what about the other games, like Thief, Evolve or Dying Light? Do they all also have AF enabled at only 2x?
One method to confirm your theory is to grab the PC edition of such a game and replicate a scene from DF, slowly increasing the AF from 0x until you match the console's level of quality; then we'll know for sure whether it's a DX-port problem, as per your theory.

I might as well go a step further: can anyone help Globalisateur? He's been very kind in pixel counting for all of us; if you have the PC version of one of the above titles, can you assist him with the AF check?
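The comparison step above could even be automated: capture the same scene at each AF level on PC, then see which capture most closely matches the console screenshot. A minimal sketch of the matching logic, with toy pixel lists standing in for real screenshots (in practice you'd load the actual captures with an image library):

```python
# Sketch: find which AF-level screenshot best matches a reference capture.
# Images are represented as flat lists of grayscale pixel values; the toy
# data below is invented purely to demonstrate the comparison.

def mean_squared_error(img_a, img_b):
    """Average squared per-pixel difference between two equal-size images."""
    assert len(img_a) == len(img_b)
    return sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)

def closest_af_level(reference, candidates):
    """candidates: dict mapping an AF level (0, 2, 4, 8, 16) to its capture.
    Returns the AF level whose capture differs least from the reference."""
    return min(candidates, key=lambda lvl: mean_squared_error(reference, candidates[lvl]))

# Toy captures standing in for real screenshots:
console_shot = [10, 20, 30, 40]
pc_captures = {2: [11, 19, 31, 39], 16: [50, 60, 70, 80]}
print(closest_af_level(console_shot, pc_captures))  # 2: the 2x capture matches best
```

Real screenshots would need identical framing, resolution and post-processing for the per-pixel comparison to mean anything, which is why the DF-style fixed-scene replication matters.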
 
I keep writing that Cerny has indicated there is no hardware issue or SDK issue restricting AF. So he is lying here? If you think so, please outright say it.

Cerny also said that? I do remember Cort Stratton saying that, but not Cerny!

And in Cort's case, his exact words were:

"No hardware/SDK issues that I'm aware of."

Not saying you are not correct, but the truth is: we do not know or understand the reasons why this is happening. But we do know that Strider on PC requests a GeForce 740 as its recommended spec, a 720 for adjusted spec, and almost any current card as minimum. And the 740 is way less powerful and has way less bandwidth (80 GB/s on the GDDR5 version) than the PS4 GPU. And yet, 16xAF is there!

We are seeing games that should not have problems having them, yet the hardware is capable! That is not logical at all!

Developers may not all be able to operate at a low-level API; there are not many developers who can work at that level. GNM allows deep, low-level, full control over the hardware, some would say lower still than what DX12 can offer. GNMX acts as a higher-level API, like DX11. If GNMX is no good and it's being used a lot, it's easy to point the finger in that direction.

Even Sony Santa Monica on Unfinished Swan? The team who made God of War at 720p... on the PS2? And who used libGCM on the PS3 for an entire generation? It is very hard to believe they do not know how to handle a low-level API when they have been working with libGCM for so long.

Something is strange here. That's all I can say! The reason? I just don't know; I just know there is no apparent logic in this!
 
Cerny also said that? I do remember Cort Stratton saying that, but not Cerny!

And in Cort's case, his exact words were:

"No hardware/SDK issues that I'm aware of."

Not saying you are not correct, but the truth is: we do not know or understand the reasons why this is happening. But we do know that Strider on PC requests a GeForce 740 as its recommended spec, a 720 for adjusted spec, and almost any current card as minimum. And the 740 is way less powerful and has way less bandwidth (80 GB/s on the GDDR5 version) than the PS4 GPU. And yet, 16xAF is there!

We are seeing games that should not have problems having them, yet the hardware is capable! That is not logical at all!



Even Sony Santa Monica on Unfinished Swan? The team who made God of War at 720p... on the PS2? And who used libGCM on the PS3 for an entire generation? It is very hard to believe they do not know how to handle a low-level API when they have been working with libGCM for so long.

Something is strange here. That's all I can say! The reason? I just don't know; I just know there is no apparent logic in this!

I get your concern and your perspective. Don't get me wrong, it's baffling. But it's more important to focus on the group of games and not the outliers, which in this case are 2 of many, many games. And this seemingly tends to happen with cross-generational games (2 titles so far), so something to consider there. Strider is a bug, I'll concede that, but it's not right to apply that to all cases where AF isn't ideal.

The majority of games on PS4 have AF.

As for Cerny, yes that is the correct quote, I misquoted the ICE programmer for Cerny. You are correct.

But let's consider the reality that Sony has had 1.25 years since launch to address this. No games have been patched to have better AF, either. The ICE team hearing about it but not doing anything about it? Just not likely.
 
Don't get me wrong, it's baffling.

Although AF can be very heavy on bandwidth, it is baffling!

Please correct me if wrong on the following:

Unfinished Swan runs on about 25 GB/s of bandwidth on PS3; the PS4 has 176 GB/s! AMD uses adaptive AF, which means it can apply AF only to sloped surfaces. So, if we are talking about a screen difference between the PS3 and PS4 versions of, let's say, 1,843,200 additional pixels where AF is used (2x the 720p pixel count), a 30 fps difference, 16xAF with 64 samples per pixel, 4 bytes per texel, and 4:1 texture compression, we would get:

1,843,200 × 64 × 4 ÷ 4 × 30 ≈ 3.54 × 10^9 bytes/s, or about an additional 3.5 GB/s.

Given the 151 GB/s difference, I'm pretty sure Unfinished Swan could have spared this! Even more!
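That estimate is easy to sanity-check. The sketch below just re-runs the post's own arithmetic; every input mirrors the post's assumptions (64 taps per pixel at 16xAF, 4 bytes per texel, 4:1 compression), none of them are measured figures:

```python
# Back-of-the-envelope check of the AF bandwidth estimate above.
pixels = 2 * 1280 * 720  # two 720p screens' worth of AF-affected pixels
taps   = 64              # assumed worst-case samples per pixel at 16xAF
bpt    = 4               # bytes per texel fetched
ratio  = 4               # assumed 4:1 texture compression
fps    = 30              # extra frames per second vs. the PS3 version

bytes_per_frame = pixels * taps * bpt / ratio
gbs = bytes_per_frame * fps / 1e9
print(f"{gbs:.2f} GB/s")  # 3.54 GB/s, matching the post's ~3.5 GB/s
```

The real cost would be lower still, since texture caches absorb most of those 64 taps; this is an upper bound, which only strengthens the point that the bandwidth headroom exists.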

The ICE team hearing about it but not doing anything about it? Just not likely.

Also true, and not logical. And Sony has two teams on it: the ICE Team and the Razor Team (SN Systems). That is why, adding A + B + the fact that the hardware is capable, it is baffling!
 
Although AF can be very heavy on bandwidth, it is baffling!

Please correct me if wrong on the following:

Unfinished Swan runs on about 25 GB/s of bandwidth on PS3; the PS4 has 176 GB/s! AMD uses adaptive AF, which means it can apply AF only to sloped surfaces. So, if we are talking about a screen difference between the PS3 and PS4 versions of, let's say, 1,843,200 additional pixels where AF is used (2x the 720p pixel count), a 30 fps difference, 16xAF with 64 samples per pixel, 4 bytes per texel, and 4:1 texture compression, we would get:

1,843,200 × 64 × 4 ÷ 4 × 30 ≈ 3.54 × 10^9 bytes/s, or about an additional 3.5 GB/s.

Given the 151 GB/s difference, I'm pretty sure Unfinished Swan could have spared this! Even more!



Also true, and not logical. And Sony has two teams on it: the ICE Team and the Razor Team (SN Systems). That is why, adding A + B + the fact that the hardware is capable, it is baffling!

Well, unfortunately, you don't know how things are set up. If the lead platform was PS3 but at the last minute they decided to port it to PS4, while still trying to make the same deadline or a tight one, there are a ton of things that can go wrong in that port.

It's not as simple, nor has it ever been as simple, as just looking at the theoretical numbers. We are all guilty of it, though; I'm a terrible person for that type of thing. But looking back at my own experience, sometimes you code games to be stylistic in one way or another. For instance, for my game we didn't have a graphics artist, so everything had to be simple shapes, and I used the CPU to generate graphical effects on the screen. So far so good, except PSN forces you to code in Lua, the library gets more and more limited, and your idea may not fit the available tools handed to you.

So do you rewrite the game to fit the tools? Or do you just scale back where you can? It can happen like that on ports. It probably happens like that a lot when you don't work on two platforms simultaneously.
 
Is it worth mentioning (by way of reverse-engineering this 'issue') that while 16xAF is present in TLOU, why not use a lesser AF and lock the game at 60fps?

I just don't believe that performance is the issue. Is there a list of games without AF anywhere?
 
Also, are we expected to believe that every third-party developer has come to the same conclusion, drop AF, even when there are no apparent performance issues?
 
The argument used to position the PS4 as superior to the Xbox One is that they use essentially the same hardware: the same CPU cores and the same GPU architecture, but the PS4 has more of it. The memory architecture is different, however; ESRAM + DDR3 is nothing more than a big Kinect mistake and a budgetary saving, while GDDR5 is the real deal. Therefore, by these simplistic, generic arguments, the PS4 will forever be the superior device. It should be easier to code for as well. Yet developers are able to code AF for X1 and not PS4? Developers are unable to implement AF on this seemingly easier-to-code hardware? This is folly. They are the same hardware, are they not? X1 is harder to code for, is it not? You need to make a choice. I'm sure you'll agree that they are the same piece of hardware. From a resource and asset perspective they should be the same: you won't need new textures going from PS4 to X1. PS3 to PS4 may not be the case, however. Let me continue.

Xbox One games have been compromised for a long time, yet no one has raised any outrage about the lack of AF in Xbox One games. In lead games like AC Unity, the PS4 has better AF, and I'm sure there are other instances as well. Yet no one will say anything about including just a bit more AF on X1; but when the lead powerhouse console cannot afford more AF (when X1 has higher AF), it's a bug, it's an SDK issue. AF is free.

The irony is that, with exactly the same hardware as its weaker competitor, the X1 is capable of higher-precision AF in a handful of titles over the PS4. It's easy to combat such a statement: the PS4 is running a higher resolution in all those instances, it's running more AO, better AA, better this or that. We've never seen a game to date where the X1 has run exactly the same settings as the PS4 but higher, and I'd agree: X1 hasn't. But in the situations where AF is cut, the X1 is running a lower resolution, lower this, lower that; it made cuts too, and cutting large things allowed AF back into the mix. Yet for some reason, when it comes to AF missing on the PS4, the thought process is that the PS4 magically has more headroom to support it. Wouldn't you call that bias?

Mainly because you believe AF is free; it's not. But also because of a truth you don't want to hear: the PS4 is not as powerful as you believe it to be. The PS4, like any other piece of hardware, will undergo performance tuning. AF is out if they think the game looks better overall without it; if AF stays in, something else will go.

So your point is: "there are games where the PS4 drops AF where the XB1 doesn't, so the XB1 isn't weaker as everybody suggests."

Glad to see your agenda. You're basically arguing that the PS4 dropped AF on Evolve to hit 1080p over 900p because the PS4 can't handle it and is weaker. It must have also dropped AF to hit a near-perfect 30fps (according to DF).

Surprise: 1080p's performance hit over 900p is higher than 16x AF's. Let's see how that works in reality.

Oh, I also see how you're ignoring the host of other games that apply AF perfectly and meanwhile achieve better resolution and/or better framerate on the PS4.
 
Also, are we expected to believe that every third-party developer has come to the same conclusion, drop AF, even when there are no apparent performance issues?
It is as easy as taking the focus off the PS4 and looking at its nearest competitor: why doesn't the Xbox have 16xAF in all of its titles either? If there are no "apparent" performance issues, why is no one raising a flag here?

Why is it that in lead Xbox games like Call of Duty AW and AC Unity, the PS4 has equivalent or better AF?

Your gut feeling is baseless, unfortunately. You don't know how much AF affects each game; every game is different, and this is the first time we have a tablet-class processor working with completely unified memory. You cannot use how the PC performs in this area as a comparison for these machines; it is not a proper baseline.
 