The Great PS4 Missing AF Mystery *spawn

Thinking about the whole framerate vs. resolution thing, I hope I'm not going to upset anyone here, but I realised that the Halo 5 beta had the game running at 1280x720 (is this the lowest resolution in the final game?) and at 60fps. So even if 343i wanted to, they likely couldn't get the game up to 1080p at 30fps, as the pixel throughput is actually higher (55M vs. 62M pixels per second). So the idea definitely wouldn't be viable for Halo.
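As a quick sanity check on those throughput numbers (a minimal sketch; the helper function is just for illustration):

```python
# Pixels per second pushed at 720p60 vs 1080p30.

def pixels_per_second(width, height, fps):
    """Raw pixel throughput for a given resolution and framerate."""
    return width * height * fps

pps_720p60 = pixels_per_second(1280, 720, 60)    # 55,296,000
pps_1080p30 = pixels_per_second(1920, 1080, 30)  # 62,208,000

print(f"720p60:  {pps_720p60 / 1e6:.1f}M px/s")   # 55.3M px/s
print(f"1080p30: {pps_1080p30 / 1e6:.1f}M px/s")  # 62.2M px/s
```

So halving the framerate while moving to 1080p would still need about 12.5% more raw pixel throughput, before even considering per-pixel shading cost.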

I also remembered that there's another game that trades resolution for framerate: Killzone: Shadow Fall. Interestingly, it's another game that would have had to go the more complicated route from 30 to 60fps. I guess the multiplayer sections are less taxing on the CPU, similar to the new Rainbow Six.
 
I have a radical idea, whilst I know gaming journalism doesn't exist in any legitimate state.
Perhaps someone could email the devs of Tony Hawk and ask them why there is no AF on PS4?
Radical, huh?

Yeah, great use of their time to inquire about a missing feature almost nobody notices or gives a single shit about, when the game is a complete shit show on just about every level that actually counts.
 
Then that means enabling AF on the PS4 would lower the framerate, because AF isn't free.
So you think PS4, with more texture RAM BW than XB1, more ROPs, more CUs and the same GPU architecture, will have a greater penalty from AF (more than several frames) than XB1 has? Further, why would the devs care to turn off AF on PS4 to avoid this framerate penalty, but not care to disable it on XB1 (which has the inferior framerate, with more drops) or drop the res on XB1?
 
^ My thoughts exactly. The problem is not just the absence of AF on PS4, but its inclusion in almost every X1 multiplat, no matter whether it performs the same, better or worse. And this is why people are bringing this up.
 
So you think PS4, with more texture RAM BW than XB1, more ROPs, more CUs and the same GPU architecture, will have a greater penalty from AF (more than several frames) than XB1 has? Further, why would the devs care to turn off AF on PS4 to avoid this framerate penalty, but not care to disable it on XB1 (which has the inferior framerate, with more drops) or drop the res on XB1?

I think he was joking...
 
Radical, huh?

Well, probably a bit more radical than that would be to read the leaked SDKs for XB1 and PS4 and, if you are somewhat versed in development and CG, form your own idea about why.
However, that would be illegal, as you never signed an NDA with MS or Sony (and if you did, then it would be illegal to talk about it).
 
So you think PS4, with more texture RAM BW than XB1, more ROPs, more CUs and the same GPU architecture, will have a greater penalty from AF (more than several frames) than XB1 has? Further, why would the devs care to turn off AF on PS4 to avoid this framerate penalty, but not care to disable it on XB1 (which has the inferior framerate, with more drops) or drop the res on XB1?
Well, if you don't optimize your memory-access patterns, you get less bandwidth out of the PS4 because of memory contention. XB1, on the other hand, forces the developers to optimize for the second memory pool.
But maybe MS just has an "always on until the application overrides it" option for AF, as you already mentioned.
 
Well, if you don't optimize your memory-access patterns, you get less bandwidth out of the PS4 because of memory contention.
There's more BW available from GDDR even factoring in contention (which XB1 has too). There's no BW advantage for XB1 when it comes to AF. At least there shouldn't be. It's exactly the same as two similar AMD APUs in a PC, one with something like 50% more BW available, running an undemanding game: AF will work from the texture caches in the GPU, so we shouldn't be seeing any great benefit from ESRAM BW. Which still isn't enough to justify the difference, especially when we see titles getting AF added without a performance penalty. Enabling AF on a ground texture isn't the difference between 60 and 30 fps on any GPU!
 
The issue with this discussion is that it began sometime in February 2015. It's been nearly a year now with these issues. Why hasn't Sony resolved the issue, or why hasn't the middleware/engine resolved it by now? They've had more than enough time and coverage on the topic. We've been through who knows how many threads about how SDKs will improve over time for X console and Y console, so why hasn't something as simple as this fix for a basic feature been realized by now?

I don't mind being open to new evidence; there is a lot of evidence that suggests there are bugs in the SDK, or developer mistakes, or whatnot. But I also ask that you do not ignore the sweeping amount of evidence from developers who have also said there are no issues.

This post from NeoGAF summarizes a dev talking about AF as discreetly as they can.
http://www.neogaf.com/forum/showpost.php?p=177437849&postcount=1969

So perhaps the answer is that amateur teams are suffering more. Relatively speaking, we also expect amateur teams to use higher-level APIs like GNMX. Maybe that's where we need to head, I don't know. But this notion that developers don't know how to enable AF is really confusing.

More worrying to me is that, if this were a 'bug' issue, Sony hasn't flagged it as part of the certification process after a year of this exposed problem. They're great at providing me notes about how shitty my game was running, and cited several frame rate issues. I'd expect them to offer up a simple statement that says: yo, your AF is off; you should probably do X and Y to fix that.
 
None of it makes sense. The hardware is very capable. Devs, contrary to their reputation, are not lazy. The GNM(X) APIs are spoken highly of. It's been a long time for a bug or implementation quirk in the SDK to persist.
 
It's a conundrum. A mystery. An enigma. A phenomenon. A thing that makes no fucking sense.
It's one of those things where it's clear that, if we do not know, we shouldn't pretend to know. Period. It's like seeing a light in the sky and saying: I don't know what that is... oh, I know, it must be a UFO.
We're at the same place for the 20th time. It's likely that our technical knowledge is working against us.

Let me bring up a point that most people don't agree with. For instance, if I say it's performance, we immediately bring up the two being the same, with PS4 having more bandwidth (single direction) and XBO and PS4 being architecturally the same. I'm willing to bet you're 100% correct.
But that's not how developers see it.
When you need to ship the product, you disable features to make the frame rate. Then of course you'd say: well, that doesn't make any sense, because PS4 is more powerful, so by X and Y it's impossible that Xbox could be faster.

This is our fixed way of thinking.

Instead of focusing on AF, perhaps we should focus on just performance. We see that for Xbox games they remove AF as well when performance is a problem. So let's just follow that line of thinking for a second: when performance sucks, the first feature to go is AF, because most people don't notice it, and maybe it's more taxing than we understand.

At this point you'd say: well, if that's true, then Xbox's AF should be off too when PS4's is.
You would be right, but you didn't make that call by seeing the data graphs, which is what developers see.

You didn't see the breakdown of each part of the frame and how long it took. What if there are parts of the pipeline that are intrinsically faster on XBO than on PS4, like post processing, particle effects and alpha blending? And while XBO is behind or tied in other categories (like AF), in those particular parts it's faster; that's how it catches up with PS4 at times. If everything scaled linearly the same for both PS4 and XBO you would be correct, but we know because of the memory subsystems that this cannot be true. So what if these are the categories PS4 is slower at because of memory contention, or that XBO is exceptional at because ESRAM is a powerhouse when it comes to rewrites? The removal of AF on PS4 may be nothing more than a remedy to another symptom. It may not be the symptom itself.

The same thing applies to Xbox. As much as we have determined Xbox One to be the weaker system of the two, we have never _ever_ truly determined the reason why it's running 900p or lower for X and Y titles, or why it struggles to achieve 1080p. It's just assumed that because it's weaker, therefore lower resolution. But we've never determined the actual culprit. Is it the lack of ROPs? The lack of CUs? ESRAM? Is it a combination? The lower resolution may have always been a remedy to a particular symptom, and not the symptom itself.

I want you to keep in mind that we just take it for granted that it's weaker, and no one even flinches when it shows up weaker, but we're all up in arms when it does something right. We have some very hard-anchored biases that need to be removed if we are to have a discussion about how this is possible.
 
Why is this discussion taking place???

PS4 has AF conversion problems... it's a known fact! Several games had this problem before.
And when they were out, all of these same arguments were made!
But then:

Dying Light: 8xAF implemented on top of a myriad of tangible visual improvements, no impact on framerate.
USF4: 8-16xAF implemented, plus a very good quality AA solution, absolutely no impact on framerate.
DMC4:SE: 8xAF implemented with absolutely no impact on the framerate.

I believe this is just another needless discussion!
 
The same thing applies to Xbox. As much as we have determined Xbox One to be the weaker system of the two, we have never _ever_ truly determined the reason why it's running 900p or lower for X and Y titles, or why it struggles to achieve 1080p. It's just assumed that because it's weaker, therefore lower resolution. But we've never determined the actual culprit. Is it the lack of ROPs? The lack of CUs? ESRAM? Is it a combination?

Looks like you've answered your own question. Are you suggesting a lack of AF in PS4 games justifies the consoles being considered proportional in power?

We definitely need to drop our biases when discussing the hardware. It is also clear from specifications and previously released games that the difference in power is directly proportional to the difference in resolution or framerate. Sometimes more so.
 
Looks like you've answered your own question. Are you suggesting a lack of AF in PS4 games justifies the consoles being considered proportional in power?

We definitely need to drop our biases when discussing the hardware. It is also clear from specifications and previously released games that the difference in power is directly proportional to the difference in resolution or framerate. Sometimes more so.
No, I'm not saying AF makes up the difference in resolution. That is not it; if I implied it, that was not my intent. When you drop resolution, _everything_ drops in terms of workload. When you disable a single feature, you are only dropping part of the workload.

What I'm suggesting is that every game be looked at case by case. Without data, we cannot determine why AF was disabled or not. Sweeping generalizations are not working, as we have gone back and forth on this topic forever. We've identified low AF as a symptom; I'm asking people to consider that it could be a remedy. Both resolution drops and AF drops are viewed as remedies for Xbox; I don't see why PS4 should be the exception to this rule.

I keep reading that this game was patched and that game was patched: AF was never in, and now it's patched in with no performance difference. Okay. That's a really generic way of showing that performance was not affected. Or perhaps with that patch they just have a better optimized game? If disabling AF buys 5% more performance, is it unreasonable to believe that developers could optimize the game to gain that 5% back and enable AF at a later time?
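To put that in frame-time terms (the 5% figure is purely a hypothetical from the argument above, not a measured AF cost):

```python
# Hypothetical numbers: a 5% cost at 60 fps is well under a
# millisecond of frame time, which optimization could plausibly buy back.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given target framerate."""
    return 1000.0 / fps

budget = frame_budget_ms(60)   # ~16.67 ms per frame at 60 fps
af_cost = budget * 0.05        # assumed 5% AF cost -> ~0.83 ms

print(f"60 fps budget: {budget:.2f} ms; 5% of that: {af_cost:.2f} ms")
```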

I'm going to leave it at that. Without real data we do not know. It's all assumptions and we're all over the place.
 
When you need to ship the product, you disable features to make the frame rate. Then of course you'd say: well, that doesn't make any sense, because PS4 is more powerful, so by X and Y it's impossible that Xbox could be faster.

This is our fixed way of thinking.

Instead of focusing on AF, perhaps we should focus on just performance. We see that for Xbox games they remove AF as well when performance is a problem...
I've avoided the repeats of this discussion because they've covered nothing new and I've no insight. But Tony Hawk does present a new piece of info. We have the game being identical on both boxes. We have the same resolution and framerate. We have the XB1 performing slightly under the target framerate.

I want you to keep that in mind, that we just take it for granted it's weaker and no one even flinches when it shows up to be weaker. But we're all up in arms when it does something right. We have some very hard anchored biases that need to be removed if we are to have a discussion about how this is possible.
All your considerations are valid in other games and the wider discussion, but this game is a dog! Post processing, AA, particles: there's nothing going on where XB1 can have an advantage due to ESRAM. Certainly DF doesn't recognise XB1 as having an advantage there (there's no "framerate suffers with particles, but XB1 handles these situations a little better"). We're not comparing a game with post AA and higher res added on PS4 and AF added on XB1. Nor is the PS4 pushing higher res shadows or a longer draw distance. It's doing absolutely nothing different on the most basic of renderers. If XB1 is able to render the game at 1080p60, PS4 is doing that with room to spare.

I'm going to leave it at that. Without real data we do not know. It's all assumptions and we're all over the place.
You can work out missing data when you have a suitable test case dealing with one variable only.

In TH5, all the variables are removed, leaving us with this single difference: AF on PS4, the clearly more capable machine in this game (it maintains a higher framerate, the game is untaxing, and we know how AF works), is off. AF on XB1 is on. TH eliminates all the other variables of different games and different devs, and focuses on a single point. When we answer the TH question, we'll get some real insight into the AF question.

So why does a studio that's clearly not trying very hard, and not tuning the game to each platform, enable AF on the XB1 but not the PS4? There are only two answers to my mind.
1) Performance. PS4 suffers such a massive impact from enabling AF in this game that the framerate drops substantially below XB1's framerate to the point the devs felt it impacted the experience too much. Therefore the devs disabled it.
2) Implementation. Although the PS4 is just as capable of using AF as XB1, the devs didn't do what was necessary to enable it.

And thus we can discuss the likelihood of these two theories. I find 1) far, far less probable than 2) for obvious reasons.

Having just looked it up, this game is using UE3. Do we have other UE3 games with the AF discrepancy and have any been patched?

Edit: Bit of research. Borderlands: The Handsome Collection is UE3, far more demanding than TH5, and runs 16xAF on both consoles. It did so at launch, without AF being patched into the PS4 version later.
 
Dying Light: 8xAF implemented on top of a myriad of tangible visual improvements, no impact on framerate.
USF4: 8-16xAF implemented, plus a very good quality AA solution, absolutely no impact on framerate.
DMC4:SE: 8xAF implemented with absolutely no impact on the framerate.

I believe this is just another needless discussion!

We've had a developer in this thread state unambiguously that AF can have a significant impact on performance, and that it is best deployed (often sparingly) on a material-by-material or texture-by-texture basis.

Apparently, this discussion is still very much needed.
 