The Great PS4 Missing AF Mystery *spawn

There's no way that CPU is limiting performance to that degree in those games. It must be something else. Maybe it's simply a matter of balance. Console devs have the luxury of allocating the available bandwidth very precisely, so adding in AF could indeed have a big impact, while PC devs don't know how much bandwidth a system will have relative to its GPU resources, so they're unable to balance so carefully. Thus there's often a greater surplus of bandwidth on the PC side that can be allocated to "free" AF. Obviously XBO has the esram, which may help in that regard vs PS4.
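The balance argument can be put in back-of-envelope terms: if a frame's memory traffic already sits near the bandwidth budget, AF's extra texture fetches push it over, while with headroom they get absorbed "for free". Every number below is a made-up illustrative figure, not measured data from either platform.

```python
# Hedged sketch: AF is "free" only if the frame's texture traffic plus
# the AF overhead still fits under the bandwidth budget. All figures
# here are hypothetical, purely to illustrate the balance argument.

def af_is_free(frame_bw_gb, af_overhead_pct, budget_gb):
    """Return True if adding AF keeps the frame within its bandwidth budget."""
    with_af = frame_bw_gb * (1 + af_overhead_pct / 100)
    return with_af <= budget_gb

# A console dev tuning to a known budget might sit right at the limit,
# while a PC build targeting unknown hardware leaves headroom:
print(af_is_free(frame_bw_gb=170, af_overhead_pct=10, budget_gb=176))  # console near limit -> False
print(af_is_free(frame_bw_gb=140, af_overhead_pct=10, budget_gb=176))  # PC with headroom  -> True
```

The same 10% AF overhead is "free" in one case and a frame-rate cost in the other, which is the asymmetry the balance theory is describing.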
 

You might be surprised just how bad PC is. I installed a game I worked on, on an R9 280x at home, and got 20 fps less than PS4 at equivalent resolutions. Updating my driver got me up to a consistent 60-70, which is about where the PS4 version runs without vsync. Based on the hardware difference between the PS4 and the R9 280x I'd expect at least another 10 fps, but it's sucked up by the driver. Bad drivers turn PCs into bad, bad platforms, and people barely even realize how much of their hardware is left on the table.
 

If you're using a completely inappropriate, ancient driver, then perhaps. But given DF will almost certainly be using the latest drivers then I'm afraid that's really not true at all. There's virtually no game out there that performs worse on a 280x than on the PS4 at the same settings with a sufficiently powerful CPU and the latest drivers (and the one or two that highly arguably may perform worse are widely acknowledged to be terrible ports). If you really did get those results, then your testing methodology was flawed (mismatched settings/broken PC/ancient drivers). A cursory glance at Digital Foundry's face-off section is evidence enough of that. I personally have a 670 (coupled with a pretty old 2500K) which, especially in modern games, is fairly consistently slower than the 280x. And yet in every recent AAA cross-platform game (virtually all of which I own) I can comfortably exceed PS4 settings at the same level of performance, using Digital Foundry's settings comparisons and frame rate videos as the basis.
 

Maybe DF's comparisons aren't very accurate? They might be getting the true console settings wrong?
 
If you're using a completely inappropriate, ancient driver, then perhaps. But given DF will almost certainly be using the latest drivers then I'm afraid that's really not true at all. There's virtually no game out there that performs worse on a 280x than on the PS4 at the same settings with a sufficiently powerful CPU and the latest drivers (and the one or two that highly arguably may perform worse are widely acknowledged to be terrible ports).

So firstly, it's just an anecdote and I don't have the setup (nor do I believe the infrastructure really exists in any useful form) to do performance analysis on PC at the level I can on a console. So there are certainly possible flaws in my observation, but comparing theoretical HW specs vs PS4 in dev environments it seems to me PC HW often punches below its weight. There are a few different reasons for this... some DX12 will address and some it won't. But it's crazy how easy it is to leave performance on the table on PC. When people talk about bad ports what they really mean is "this game did not work around all of the performance-sucking idiosyncrasies of the driver and API" which is an insane expectation to have across the board.
 
Drivers simulate consumed bandwidth for disabled features? Ok.. :???:
I'm not entirely sure how it's supposed to work, haha. I'm not going to pretend I know. I'm just thinking out loud that it could be along the lines that DX11 takes so many extra steps that adding in AF may not carry an additional bandwidth cost.
 
Even if it's hard to implement or costly it's almost always worth it. AF is just as important as good AA and Res for good IQ.
 
Too bad they conveniently forgot to talk about Sniper Elite 2, which has both better AF and better fps on PS4.

How are they going to explain this case?

And what about PS3 and PS4 multiplats like The Unfinished Swan or Strider that also have better AF on PS3?

Selective memory? Or most probably those cases didn't fit with their general message (XB1 can do better AF than PS4), which was the real purpose of this article.
 
PS4 has a faster 'GPU'. It's entirely possible that a particular game could be taking a proportionately greater hit from AF on the PS4 but still be performing significantly better.

Different games will place different demands on the hardware. For example, CPU contention - a very real issue - will affect different games differently. Or on Xbone, main memory BW may be a greater or lesser limiting factor depending on how much heavy BW tasks can be squeezed into esram.

One game may take a proportionately bigger hit on PS4 from AF, another might take a bigger hit on Xbone.
 
Too bad they conveniently forgot to talk about Sniper Elite 2, which has both better AF and better fps on PS4.

How are they going to explain this case?

And what about PS3 and PS4 multiplats like The Unfinished Swan or Strider that also have better AF on PS3?

Selective memory? Or most probably those cases didn't fit with their general message (XB1 can do better AF than PS4), which was the real purpose of this article.

Or, like most of the DF coverage since the main DF guy stopped visiting here: it's incomplete and a bit lacking overall?

What's that saying about malice and incompetence? :p I feel it fits here. DF articles outside of the interview have gone downhill for over a year and it makes me sad.
 
So, like Watch_Dogs? (900p vs 792p)

Yeah, something like that. Assuming that performance will scale roughly linearly with the number of pixels rendered, then 792p should give about a 29% performance boost. And that should, on the face of things, leave the latest AC at a pretty stable 30 fps, similar to the PS4 version of the game.
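The arithmetic behind that 29% figure is easy to check, assuming 900p and 792p mean the usual 16:9 frame sizes of 1600x900 and 1408x792:

```python
# Quick check of the claim above: dropping from 900p to 792p cuts the
# pixel count, and assuming performance scales roughly linearly with
# pixels rendered, frame rate should rise by about the same proportion.

def pixels(width, height):
    return width * height

p900 = pixels(1600, 900)   # 1,440,000 pixels
p792 = pixels(1408, 792)   # 1,115,136 pixels

boost = p900 / p792 - 1    # fewer pixels -> proportionally more fps
print(f"{boost:.0%}")      # -> 29%
```

So a game hovering in the low-to-mid 20s at 900p could plausibly land around a stable 30 fps at 792p, other bottlenecks permitting.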

Maybe both using a dynamic resolution solution? Scale to the highest possible resolution during nonstressful scenes, and rubber band back where needed during more stressful events.

That might work. It looks like there should be areas where the PS4 could render at higher resolutions while maintaining 30 fps, so it could gain too.

Dynamic res is a bit of a brave new world. I thought it worked really well in Rage. Will be interesting to see how people react to Halo 5, and to see just how far and how fast you can push resolution changes before they become offputting.
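The rubber-band idea described above can be sketched as a tiny feedback controller: shrink the render scale quickly when a frame blows its budget, and creep back up when there's slack. The thresholds, step sizes, and the 33.3 ms (30 fps) budget here are all illustrative assumptions, not how Rage, Halo 5, or any shipped engine actually tunes it.

```python
# Hedged sketch of a dynamic-resolution controller: react fast to a
# missed frame budget, recover slowly when there's headroom. All
# constants are made-up illustrative values.

def adjust_scale(scale, frame_ms, budget_ms=33.3,
                 step_down=0.05, step_up=0.01,
                 min_scale=0.5, max_scale=1.0):
    """Return the render scale to use for the next frame."""
    if frame_ms > budget_ms:            # missed 30 fps: rubber-band down fast
        scale -= step_down
    elif frame_ms < budget_ms * 0.9:    # comfortable headroom: creep back up
        scale += step_up
    return max(min_scale, min(max_scale, scale))

# Example: one stressful frame, then an easy one.
s = adjust_scale(1.0, 40.0)   # over budget  -> scale drops
s = adjust_scale(s, 25.0)     # headroom     -> scale recovers a little
print(round(s, 2))
```

The asymmetry (big steps down, small steps up) is the usual reason these systems feel stable: dropping resolution for one frame is less noticeable than oscillating every frame.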
 
Too bad they conveniently forgot to talk about Sniper Elite 2, which has both better AF and better fps on PS4.

How are they going to explain this case?

And what about PS3 and PS4 multiplats like The Unfinished Swan or Strider that also have better AF on PS3?

Selective memory? Or most probably those cases didn't fit with their general message (XB1 can do better AF than PS4), which was the real purpose of this article.
Weird. Okay. My personal takeaway was that it was about why consoles get hit hard by AF and PCs don't. But if you saw XBO vs PS4 in there, then I guess that's what you read. There were slides they used about memory contention on PS4, but I think they used those as a possible avenue for the source of the problem. Memory contention exists on both PS4 and XBO, so that doesn't exclude Xbox. And they found that with the APU they used they could not accurately reproduce what was happening on consoles.

And that's where the article stopped, at least for me. The question DF was trying to answer is why AF has such a big impact on console but not on PC.
 
Surely the only option to get a real understanding is to speak to those developers who released games without AF, then patched them to include it seemingly "for free". Only they would know why AF wasn't there in the first place, and why they were able to add it without affecting performance.

Can we please take this to the appropriate thread? Clearly the AF discussion is bound to go round and round yet again for days.
 
Surely the only option to get a real understanding is to speak to those developers who released games without AF, then patched them to include it seemingly "for free".
WTF! If people go around only asking experts for informed opinions then the entire internet will just fall apart. Are you trying to destroy the internet?

Are you? ARE YOU? :mad:
 
So firstly, it's just an anecdote and I don't have the setup (nor do I believe the infrastructure really exists in any useful form) to do performance analysis on PC at the level I can on a console. So there are certainly possible flaws in my observation, but comparing theoretical HW specs vs PS4 in dev environments it seems to me PC HW often punches below its weight. There are a few different reasons for this... some DX12 will address and some it won't. But it's crazy how easy it is to leave performance on the table on PC. When people talk about bad ports what they really mean is "this game did not work around all of the performance-sucking idiosyncrasies of the driver and API" which is an insane expectation to have across the board.

I'm certainly not denying the theory of this, it's absolutely what I would expect. The surprising aspect for me is that it doesn't seem to play out in actual benchmarks. Can you tell us what game it was you saw this result on and did you ensure settings parity before making the comparison?
 
Maybe DF's CPU or GPU isn't being worked hard enough?
From my experience fiddling with Skyrim, the lower the fps, the bigger the AF hit.

So for example, at 30 fps in Skyrim, turning off AF gives 1 or 2 more fps. But at 20 fps, turning off AF gives 5 more fps.

Maybe AF cost is tied to the "time to render"? The longer it is, the heavier the AF.

I'm very uninformed on this kind of topic though. Feel free to ignore :D
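One way to read those Skyrim observations is to convert the fps deltas into per-frame milliseconds: the same feature turns out to cost far more ms in the heavier scene, which is what "the longer the time to render, the heavier the AF" amounts to. A rough sketch using the approximate figures from the post above (30 fps gaining ~2, 20 fps gaining ~5):

```python
# Convert an fps-with / fps-without pair into the feature's per-frame
# cost in milliseconds. The input fps values are rough figures quoted
# in the post above, not precise measurements.

def ms_cost(fps_with, fps_without):
    """Per-frame cost in ms of a feature, from fps with and without it."""
    return 1000 / fps_with - 1000 / fps_without

print(round(ms_cost(30, 32), 1))  # ~2 fps gained near 30 fps -> ~2.1 ms
print(round(ms_cost(20, 25), 1))  # ~5 fps gained near 20 fps -> 10.0 ms
```

A fixed per-frame AF cost would actually produce the opposite pattern (a fixed ms cost eats fewer fps at low frame rates), so these numbers, taken at face value, suggest AF's cost grows with scene load rather than staying constant.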
 