Horizon: Zero Dawn - Graphics de-Gushing

It's to save performance, but the texture filtering isn't worse than in other games. Console games in general tend to have mediocre texture filtering.


Many PS4 and Xbox One games have low anisotropic filtering...
 
It's to save performance, but the texture filtering isn't worse than in other games. Console games in general tend to have mediocre texture filtering.



That's what I don't get. On PC, 16x AF has been basically "free" for years and years now, even on old GPUs without a lot of raw power and bandwidth. Why is this so hard to have on consoles? Especially in a big first-party game like this one.
 
That's what I don't get. On PC, 16x AF has been basically "free" for years and years now, even on old GPUs without a lot of raw power and bandwidth. Why is this so hard to have on consoles? Especially in a big first-party game like this one.

Because it is not free... On PC it is "brute force"; few AAA PC games push the envelope, and most of them are tailored for consoles...

Maybe the problem will show up with a true PC exclusive like Star Citizen...
 
That's what I don't get. On PC, 16x AF has been basically "free" for years and years now, even on old GPUs without a lot of raw power and bandwidth. Why is this so hard to have on consoles? Especially in a big first-party game like this one.

It's clearly not free on console, because almost no games have high AF unless you count the less impressive ones, such as remasters.
 
It's clearly not free on console, because almost no games have high AF unless you count the less impressive ones, such as remasters.
It's free because GPU caches are large enough to localize many texture fetches with AF; it's only not free on crippled mobile chips with very low bandwidth and crippled caches.
 
Then why isn't it implemented? If it's free, surely it'd be on by default as a valuable IQ improvement with no downsides?
 
One of AF's drawbacks with PBS (physically based shading) is shader aliasing on distant textures caused by small, high-frequency texture detail (close to 1:1 texel-to-pixel ratios); this can introduce more temporal aliasing, which is obviously undesirable. I think higher resolution is the main reason for the better AF on PS4 Pro in games such as Horizon or FFXV.
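
For anyone curious how AF ends up near those 1:1 texel-to-pixel ratios, here's a minimal sketch of the mip selection formula from the EXT_texture_filter_anisotropic spec, with made-up gradient values for illustration:

```python
import math

def lod(p_max, p_min, max_aniso=1):
    """Mip LOD per EXT_texture_filter_anisotropic: taking up to
    max_aniso taps along the stretched axis lets the hardware pick
    a log2(N) finer (sharper) mip level."""
    n = min(math.ceil(p_max / p_min), max_aniso)
    return math.log2(p_max / n)

# Hypothetical oblique ground plane in the distance:
# 32 texels per pixel along the view direction, 2 across it.
print(lod(32.0, 2.0, max_aniso=1))   # isotropic: LOD 5.0 (blurry mip)
print(lod(32.0, 2.0, max_aniso=16))  # 16x AF:    LOD 1.0 (near 1:1 texels)
```

That finer mip is exactly where high-frequency normal/roughness detail starts to shimmer under PBS.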
 
That's what I don't get. On PC, 16x AF has been basically "free" for years and years now, even on old GPUs without a lot of raw power and bandwidth. Why is this so hard to have on consoles? Especially in a big first-party game like this one.
Your PC GPU has memory bandwidth to spare. Set your game to 0x AF and try downclocking your VRAM to just past the point where you start getting fps slowdowns, then activate 16x AF and see if the framerate drops.
 
It's free because GPU caches are large enough to localize many texture fetches with AF; it's only not free on crippled mobile chips with very low bandwidth and crippled caches.
What the PS4 and higher-end 7000 series cards have is 512 KB of L2 cache. How is that sufficient to store every texture/mipmap in a scene that needs to be sampled 16 times for 16x AF? Additionally, the cache may not be used exclusively for textures.
 
How is that sufficient to store every texture/mipmap in a scene that needs to be sampled for AF?
You don't need to store every texture/mipmap in cache; that would be stupid. With each draw call, the GPU fetches only a small bunch of textures. With 4:1 DXT compression, a 128×128×24bpp texture fragment takes just 12 kilobytes of L2 cache, so you can store 18 different fragments for each CU and still have 296 KB of L2 left for geometry attributes and other stuff.
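
To put numbers on that, a quick back-of-envelope sketch of the same arithmetic (the 512 KB L2 figure and the 4:1 DXT ratio come from the posts above, not from any official spec sheet):

```python
# Back-of-envelope for the cache math above; figures come from this
# thread's discussion, not from a spec sheet.

L2_BYTES = 512 * 1024  # GPU L2 size quoted earlier in the thread

def fragment_bytes(width, height, bpp=24, compression=4):
    """Footprint of a DXT-compressed texture fragment in bytes."""
    uncompressed = width * height * bpp // 8
    return uncompressed // compression

frag = fragment_bytes(128, 128)   # 49152 / 4 = 12288 bytes = 12 KB
used = 18 * frag                  # 18 fragments -> 216 KB
left = L2_BYTES - used            # 512 KB - 216 KB = 296 KB

print(f"fragment: {frag // 1024} KB, 18 fragments: {used // 1024} KB, "
      f"L2 left over: {left // 1024} KB")
```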
 
You don't need to store every texture/mipmap in cache; that would be stupid. With each draw call, the GPU fetches only a small bunch of textures. With 4:1 DXT compression, a 128×128×24bpp texture fragment takes just 12 kilobytes of L2 cache, so you can store 18 different fragments for each CU and still have 296 KB of L2 left for geometry attributes and other stuff.
Thanks for the clarification.
 
One of AF's drawbacks with PBS (physically based shading) is shader aliasing on distant textures caused by small, high-frequency texture detail (close to 1:1 texel-to-pixel ratios); this can introduce more temporal aliasing, which is obviously undesirable. I think higher resolution is the main reason for the better AF on PS4 Pro in games such as Horizon or FFXV.

But if you compare a game like MGS5, the PC version doesn't seem to get more aliasing with 16x AF. So why does the console version still get lower AF?
 
But if you compare a game like MGS5, the PC version doesn't seem to get more aliasing with 16x AF. So why does the console version still get lower AF?

Because the differences are completely unnoticeable in motion from 6+ feet away on an average-sized TV screen?
 
Consoles use a unified memory pool, so bandwidth to RAM has to be shared between the CPU and GPU. Devs probably don't want to compromise CPU and GPU performance with bandwidth-hungry AF.
 
You don't need to store every texture/mipmap in cache; that would be stupid. With each draw call, the GPU fetches only a small bunch of textures. With 4:1 DXT compression, a 128×128×24bpp texture fragment takes just 12 kilobytes of L2 cache, so you can store 18 different fragments for each CU and still have 296 KB of L2 left for geometry attributes and other stuff.
So maybe it's not a VRAM bandwidth issue, and perhaps the simple act of taking 16 samples for 16x AF puts extra load on the shaders. Or perhaps it is a VRAM bandwidth issue, and the occasional cache miss for texture fragments uses that bit more bandwidth, which is already strained by being shared between the console's CPU and GPU. The CPUs only have 512 KB of L2 cache per core; PCs typically have double or quadruple that, making the juggling of bandwidth between CPU and GPU just that much tighter.
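
For a sense of scale on the bandwidth side, a rough sketch with openly assumed rates (the miss rate and per-miss fill size below are guesses for illustration; only the PS4's 176 GB/s total is a real figure):

```python
# Hypothetical estimate of extra DRAM traffic from AF-related L2 misses.
# MISS_RATE and LINE_BYTES are assumptions, not measured values.
PIXELS     = 1920 * 1080   # 1080p frame
FPS        = 30
TAPS       = 16            # worst-case taps per pixel at 16x AF
LINE_BYTES = 64            # assumed cache-line fill per miss
MISS_RATE  = 0.10          # assumed fraction of taps missing L2

extra_gbps = PIXELS * FPS * TAPS * LINE_BYTES * MISS_RATE / 1e9
print(f"~{extra_gbps:.1f} GB/s extra DRAM traffic")  # ~6.4 GB/s
# Small next to the PS4's 176 GB/s total, but it comes off a bus
# the CPU is also contending for.
```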
 
He visited Epic Games, and as I said a few months ago, theirs was the first engine he decided to ditch; he explained in an interview with the PS Blog that he wants to customise the tools too, not only the renderer.

He visited all the first-party studios, and DICE too...

Besides, he is now independent: if one day he is not happy with his collaboration with Sony, he can decide after Death Stranding to use UE4, for example, or more probably create a new engine... Or if he wants to do something new, he is free to...

Edit: I doubt he will be unhappy with Sony; it is a relationship built upon years of trust, and Sony continues to offer him the best support for Death Stranding.
Crytek would be a great visit
 
Crytek would be a great visit

He said in the interview: no commercial engine, because he wants to modify the renderer and the tools. With a commercial engine, he said, the tools are a "black box", and because of this it is not interesting for him.

This is why, when he began doing realtime trailers, he used two first-party engines, probably Sucker Punch's for the E3 2016 trailer...
 
So maybe it's not a VRAM bandwidth issue, and perhaps the simple act of taking 16 samples for 16x AF puts extra load on the shaders. Or perhaps it is a VRAM bandwidth issue, and the occasional cache miss for texture fragments uses that bit more bandwidth, which is already strained by being shared between the console's CPU and GPU. The CPUs only have 512 KB of L2 cache per core; PCs typically have double or quadruple that, making the juggling of bandwidth between CPU and GPU just that much tighter.

Since AF is done by utilizing N clock cycles, I wonder if sharing I/O requests with the CPU might be a non-trivial factor in the general case, i.e. latency for the CPU just goes up, so it's not just a resource problem. In the split-memory case it's a non-issue, and the bottleneck would simply be bandwidth and fillrate for the GPU alone (bearing in mind that GPUs are already designed to handle high latency).
 
Crytek would be a great visit

This was not the first time Kojima went on a tech tour. He did it after MGS4 too, before setting out to develop the FOX engine, which was meant to trump all other engines he saw. Crytek was one of the studios he visited then.
 