Encyclopedia Brown & The Mysterious Case of the PS4 and the Missing Anisotropic Filtering

Status
Not open for further replies.
I know this is a PS4 thread, BUT I can't imagine the API would be too different from what the XB1 devs use ... and there are several pages devoted to how devs can implement anisotropy in their games in the leaked XDK.

Here is one section talking about the "cost" of using it:

These are peak performance numbers.

Minimal cost in the shaders
Buffer single-channel read - 4 clocks
Buffer multichannel read - 16 clocks
Buffer atomic - 16 clocks
Texture point-sampled fetch - 16 clocks any format
Bilinear texture fetch - 16 clocks = 32bpp, 32 clocks 64bpp, 64 clocks 128bpp
Trilinear texture fetch - Bilinear texture fetch × 2
Anisotropic xN texture fetch - (Cost of a bilinear texture fetch or of a trilinear texture fetch) × N
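Taken at face value, those figures give a very simple peak-cost model. A minimal Python sketch (the clock counts are copied straight from the table above; treat them as peak numbers only, since real-world cost depends heavily on cache behaviour):

```python
# Peak texture-fetch cost model built from the leaked XDK figures above.
# All values are clock cycles per fetch; these are upper-bound "peak
# performance" numbers, not measured real-world costs.

BILINEAR_CLOCKS = {32: 16, 64: 32, 128: 64}  # bits-per-pixel -> clocks

def fetch_cost(bpp, filtering="bilinear", aniso_n=1):
    """Estimate peak clocks for one texture fetch.

    filtering: "point", "bilinear", or "trilinear"
    aniso_n:   anisotropy degree N (1 = no anisotropic filtering)
    """
    if filtering == "point":
        base = 16                        # point-sampled: 16 clocks, any format
    elif filtering == "bilinear":
        base = BILINEAR_CLOCKS[bpp]      # 16/32/64 clocks by bpp
    elif filtering == "trilinear":
        base = BILINEAR_CLOCKS[bpp] * 2  # trilinear = bilinear x 2
    else:
        raise ValueError(f"unknown filtering mode: {filtering}")
    return base * aniso_n                # aniso xN multiplies the base cost

# Example: 16x aniso over trilinear on a 32bpp texture
print(fetch_cost(32, "trilinear", 16))   # 16 * 2 * 16 = 512 clocks peak
```

Which illustrates the point being argued: the per-fetch multiplier for high aniso degrees is large on paper, but only applies to fetches that actually need the full N samples.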
 
At the very least I imagine resolution plays a factor there after the fetch (in terms of clock cycles, the higher the resolution the more taxing the procedure), though I'm not sure whether texture size does. But this is good, and yes, I don't imagine GNM being that different either.

I wish I'd bought a modern game on my PC with uber-large textures, like AC Unity or something - something that obliterates VRAM - to see how AF affects those games at varying resolutions.
 
It's always nice to bring the CPU into things, especially where it might not belong.

We know that contention hurts CPU performance: despite the CPU having priority, if the memory bus is busy, access times increase. So more accesses for texture fetches will have some knock-on effect for the CPU. Whether it's significant as part of the bigger picture I have no idea.

It probably can be on Xbox One - 1080p games with buffers that spill into main memory and suck up main memory BW sometimes have their aniso cut back or disabled (e.g. Forza or Sniper Elite) in order to improve performance. And if they're short of main memory BW we know that the CPU would be hurting too - direct from the Xbox SDK.

Reducing aniso on both these consoles may, in certain circumstances, improve both GPU and CPU performance. In fact, I'd go so far as to say that I'd bet real money that it can on Xbone.
 
I guess we're escaping the real question here!
All of us are aware that tradeoffs may occur in some games, be it in aniso or any other effect.
But that doesn't explain why there are cases on the PS4 where aniso is lacking or not 16x, yet you have improved AO, resolution and framerate! If it's a tradeoff, how come you make it but improve other, equally demanding stuff? And it explains even less why games that aren't so taxing on bandwidth and resources also lack AF.

And these are the real questions we would like to see answered.
 

There isn't a lot to say - we have two tweets from an ICE developer and one tweet from another developer indicating that there is no AF issue. Not a single developer has stepped up and said there is an AF problem either.

I mean, if you outright reject their claims/tweets, then you're going to have to search for evidence of the lie. If you do believe their claims, you start looking into what went wrong and why they couldn't ship with AF. It's clear that AF is doable - it's the same hardware as any other GCN part. But perhaps they handled AF differently on PS4. Like the PC RGB spectrum on Xbox causing crushed blacks, for example (it works, but it's operating on a non-standard gamma curve or something, IIRC), maybe their AF isn't as straightforward: they handled the output differently, and only the few developers in the know, who can code their own variants of AF, can get around the SDK's screwy version of it.

It seems the really bad cases have something to do with cross-generational engine ports, or perhaps games specifically meant for last gen and ported up. These cases appear to be outliers, but 2 outliers isn't even enough to call it a trend - you'd need at least 30 to be statistically valid.

What if the 2 outliers hadn't happened? Would everyone still be of the mentality that there is an SDK bug? Xbox doesn't have any outliers, but it doesn't have a library full of 16x AF titles either, suggesting that AF is not a priority feature.

If it's a business decision, that might be something else altogether. How weird that would be, of course - in my eyes more AF has always looked better than less, but who knows; I don't do A/B testing with the general population.
 
I think my main issue is seeing The Unfinished Swan looking the way it does, compared to the PS3 version. Forget the X1 discussions. That's just weird.
 
I wonder if to some extent there's a generational thing in the concerns about aniso.

I started out on 8-bit consoles where things like sprite count, resolution and colour palettes mattered a great deal. Then I went on to the 3D stuff like the Saturn and N64, where again sprite/texture quality mattered, and then the Dreamcast, whose defining trait above even poly count and res was texture quality. Even with my Voodoo 2 graphics card I'd spend time tweaking so I could go from bilinear with mipmaps to trilinear on the textures, and I easily noticed where even incredible games like RalliSport Challenge 2 on the OG Xbox lacked anything higher than trilinear.

But perhaps it's not something most people look for now. Resident Evil 5 on the 360, for example, didn't even have trilinear - it was fucking bilinear with mip transitions and what looked like low aniso. Dithered mip transitions - something I'd tried to avoid on Voodoo 2 - and now I was chasing a line of shimmering ants across Africa. And in Daytona on the 360 - frikking mip transition lines.

Bastard.

TL;DR - perhaps the performance hit (be it big or small) is being amplified by the fact that people are hardened to bad texture filtering these days, and simply don't give a shit. 1080p, though ... even if they can't see it, they understand it.
 
^ Well sure, it plays to PR. Remember when people started buying 1080i TVs while they still had no HD service? They'd call you and talk about how they could see every blade of grass on the football field! They were repeating PR talk to a certain degree. I was still on 1024x768 projectors, but the normal TV feed ran through DScaler so the picture was much better. People who came over would ask why it was so crisp and what it was; I told them: 480i being de-interlaced by my PC.

So I am going to stick to the theory that 1080p is needed for marketing, and if they need to drop things to make it stable they do so before looking to drop resolution. AC Unity is the oddball, where it needed the drop to 900p to compete properly. Heck, they were caught advertising that as 1080p on many occasions anyway - the official Sony site, etc. So maybe the general consumer saw that but never saw the 900p mentioned in forums/sites. It is a major part of how they advertise things, so my theory is based on that and nothing more.

Edit: That still would not explain Unfinished Swan, but could that be down to the porting?
 
There is no such need for marketing. The developers pick the resolution; there are several non-1080p games on the PS4. Unless there is a Sony mandate that is a perfectly kept secret, that seems like a poor theory.
 
It is an especially impolite theory as function always implies (about 5 times a day) that if someone prefers 1080p, he is a dumbass anyway.

If the devs preferred 1080p and set it as a priority, they would have chosen it on X1 as well and cut even more effects in that version to maintain the framerate.

They didn't do it; hence, not a terribly good argument about 'stupid' bullet points.
 
There's rather more to my comments than that. I can see I'm going to have to continue repeating myself (maybe six times a day?).
 
Unless they thought it looked better, which is hard to account for.

If you guys take the stance that the SDK is the problem, you need to help Global. He needs data on AF for all PS4 games. You won't solve this problem talking about it here. If the data shows a steady pattern - trilinear, or nothing greater than 2x - that's a trend worth investigating; but if you see AF values all over the place, that would suggest it is not the SDK.

I do believe Sony and the ICE team so that limits my view of things.

Good luck.
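If anyone does gather that data, checking for the suggested pattern is trivial to automate. A hypothetical Python sketch (the game names and AF values below are invented purely to illustrate the tally, not real measurements):

```python
from collections import Counter

# Hypothetical survey: game -> observed max AF level.
# Values are made up for illustration only.
af_survey = {
    "Game A": 2, "Game B": 16, "Game C": 2,
    "Game D": 4, "Game E": 2, "Game F": 16,
}

tally = Counter(af_survey.values())

# A spike at one low value (e.g. mostly 2x) would hint at a systemic
# SDK default; a wide spread would point to per-title choices instead.
most_common_af, count = tally.most_common(1)[0]
print(f"{most_common_af}x AF appears in {count} of {len(af_survey)} titles")
```

With real data, the shape of that tally is exactly the trilinear/2x trend the post above asks about.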
 
And it would not explain Evolve either. Why would you cut AF, and then improve AO? Release bandwidth from AF to spend on another bandwidth-intensive effect like AO? This makes no sense!
Darn, yep, you are right - a bit of an interesting thing, this. It does not make sense that the hardware would be limited, as this family of GPUs has little issue here. Sony said the hardware was capable, did they not? But they did not mention the SDK?

Edit: Nevermind, I have no clue why we are seeing this.
 

Everyone ( :p) knows the XBO can't do 1080p in games, so they don't expect it or demand it. Developers are free to choose whatever resolution they see fit, although MS would like it to be as close to 1080p as possible.

Everyone ( :p) expects 1080p in PS4 and if it isn't 1080p then questions start getting asked. Developers are under pressure to release at 1080p if at all possible. Not necessarily by Sony, but certainly by many consumers.

There's a bit of tongue in cheek there but also a bit of truth.

I believe AO is far more compute intensive and less bandwidth intensive - especially the better AO algorithms. And the PS4 has compute to spare.

Regards,
SB
 
Lol!

Honest question: do you guys really believe this? I mean, you know that 'everyone' refers to a handful of B3D posters and a dozen NeoGAFers. You know that there is also the real world... right?

I think that there are about 30 million consumers this gen who don't(!) care at all about 1080p... not 'everyone'.

If you have gaming friends who are not B3D posters (you should!)... ask them.


And by the way, honestly, name me a single dev that does what gamers want!? They follow their own (or their publisher's) vision and that's about it, imo :)

If they'd listened to 'everyone', we'd have 60Hz games only :mrgreen:
 

You'd think they'd drop the AF on XB1 if they really cared about locking the game at 30fps (which the PS4 version almost managed to do, and I doubt it did so by dropping AF).

But hey, believe what you like, even if it means thinking that devs are dropping stuff so they can hit bullet points for PS4.
 
Ok, I still think a list might help (am I flogging a dead horse!?). Looking at the GAF thread, this seems to be the total of affected titles (non-exclusives only) - am I missing any? Is it really just 6 or so titles?

Evolve
Dying Light
Saints Row
Thief
Murdered: Soul Suspect
Strider

Once a list is agreed we can maybe look at the finer details of each title...just a thought...
 