PS4 Pro vs PS4 Graphical Comparison Thread

Proper HDR ranges (10k+ lux for white outdoor walls hit by direct sunlight, 0.1 lux for dark interiors) and physically plausible GGX specular result in very high brightness ranges. Old LDR games were muted in comparison; they didn't even need eye adaptation.
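For a sense of scale, a quick back-of-the-envelope Python calculation of the range those two luminance figures span (numbers taken from the post, with lux used loosely as a stand-in for scene luminance):

```python
import math

sunlit_wall = 10_000.0   # lux, white wall in direct sunlight
dark_indoors = 0.1       # lux, dark interior

contrast_ratio = sunlit_wall / dark_indoors     # 100,000 : 1
stops = math.log2(contrast_ratio)               # ~16.6 stops

print(f"contrast ratio {contrast_ratio:,.0f}:1, ~{stops:.1f} stops")
# An 8-bit LDR pipeline cannot represent anywhere near that span, which is
# why old LDR games looked muted and never really needed eye adaptation.
```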

This video is a good example of how bad GGX specular looks on industrial surfaces (DOOM-like) when there's no specular anti-aliasing applied:

(FF over the marketing bullshit in the beginning)

Modern PBR rendering needs either temporal AA or some specific specular AA solution (Toksvig mapping, LEAN mapping, etc.). MSAA does not adequately remove specular aliasing.
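To make the Toksvig option concrete, here's a rough Python sketch of the idea (illustrative only, not any particular engine's code; the function name and test values are mine): the shortened average normal of a mip texel is used to lower the specular exponent, so detail that has been filtered away shows up as a wider, dimmer highlight instead of shimmering.

```python
import numpy as np

def toksvig_power(avg_normal, specular_power):
    """Toksvig-style exponent adjustment (sketch).

    avg_normal: unnormalized average of the normals covered by a texel at
    some mip level; its length drops below 1 as the normals diverge.
    specular_power: the authored Blinn-Phong-style exponent.
    """
    length = np.linalg.norm(avg_normal)
    ft = length / (length + specular_power * (1.0 - length))  # Toksvig factor
    return ft * specular_power

# Flat texel: normals agree, exponent untouched.
print(toksvig_power(np.array([0.0, 0.0, 1.0]), 256.0))   # -> 256
# Noisy texel: averaged normal is short, exponent collapses.
print(toksvig_power(np.array([0.1, 0.1, 0.7]), 256.0))   # -> ~2.5
```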

One of the first titles where I noticed a lot of specular aliasing was Crysis 3. According to Crytek it had some PBR pipelines but no PBR materials.
TXAA could not help against the specular aliasing. Yes, developers have to integrate something (specular anti-aliasing) into the TAA to reduce shimmering.
 
Temporal AA is very good against specular aliasing. But you need to perform the TAA filter before bloom and in a tonemapped color space, and obviously have a small enough multiplier in the exponential sum. Brian Karis's 2014 SIGGRAPH presentation has some comparisons and videos.

Slides: http://advances.realtimerendering.com/s2014/epic/TemporalAA.pptx
Video: http://advances.realtimerendering.com/s2014/epic/TemporalAA_Compare.mov
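A minimal Python sketch of the kind of exponential blend being described, performed on tonemapped values (my own simplification, not UE4's actual filter; the Reinhard-style curve and the 0.1 weight are assumptions):

```python
def tonemap(c):
    """Simple reversible Reinhard-style curve, used only for blending."""
    return c / (1.0 + c)

def inverse_tonemap(c):
    return c / (1.0 - c)

def taa_accumulate(history_hdr, current_hdr, alpha=0.1):
    """One step of an exponential moving average in tonemapped space.

    history_hdr: reprojected HDR history color.
    current_hdr: this frame's jittered HDR sample.
    alpha: weight of the new sample; small values favour temporal stability.
    """
    blended = (1.0 - alpha) * tonemap(history_hdr) + alpha * tonemap(current_hdr)
    return inverse_tonemap(blended)

# A single specular "firefly" (100.0) against a stable dark history (0.02):
print(taa_accumulate(0.02, 100.0))   # ~0.13
# A plain linear blend would give ~10.0, i.e. one bright sample would blow
# the pixel out for many frames; blending tonemapped values keeps it bounded.
```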

I don't know exactly how TXAA works (as it is an Nvidia proprietary solution). But I know that pure spatial, single-sample post-process techniques such as MLAA and FXAA can in the worst case even increase temporal aliasing (specular shimmering + edge shimmering in motion). As I said above, temporal AA needs to be integrated in the middle of the post-processing pipeline, and everything needs to go through the temporal filter. TXAA itself, or the way it is integrated into CryEngine, might make it less effective against specular aliasing.
 
But does specular aliasing cause that many bright circles on these industrial surfaces? It looks like small suns flickering on there! I had no idea this was the result of no AA; I always thought it was deliberate!
That can get complicated. There are of course lots of things that shimmer in the real world; a good example is anything made up of smooth facets that are roughly macro-scale to the viewer (such as the extremely self-descriptive example of glitter).

Or how about water in sunlight with tiny, choppy waves?
Up very close, you see ordinary glossy specular reflections off a curvy surface, with the surface changing rapidly.
Back off a ways, so that the individual waves are tiny in your vision and can't easily be distinguished directly. Now the water looks a lot like a flat surface, but one covered in smooth macrofacets that pop rapidly in and out of existence. Some of them are oriented so that sunlight gets directed at you, causing glittering behavior.
Hop in an airplane and view the lake from many miles away. Now, the glitters are so small and dense that their intensity averages out over a given point in your vision, and rather than glitter, the lake looks like a flat surface made rough by microfacets.

Now find a game that replicates this behavior correctly at all scales and transitions between them convincingly :yes:
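The closest thing games have to that scale transition is folding filtered-away normal detail into roughness when building mip chains (the same family of tricks as the Toksvig/LEAN approaches above). A toy Python sketch of the idea, with a made-up variance-to-roughness mapping:

```python
import numpy as np

def roughness_for_mip(normals, base_roughness):
    """Fold the spread of a texel's normals into its roughness (sketch).

    normals: (N, 3) unit normals covered by one texel at this mip level.
    base_roughness: averaged authored roughness for the same texel.
    """
    avg = normals.mean(axis=0)
    spread = 1.0 - np.linalg.norm(avg)   # 0 when all normals agree
    return float(np.clip(np.sqrt(base_roughness**2 + spread), 0.0, 1.0))

# Up close, one glossy facet per texel: stays mirror-like and glittery.
print(roughness_for_mip(np.array([[0.0, 0.0, 1.0]]), 0.05))   # ~0.05

# From the airplane, hundreds of divergent wave facets per texel: the
# surface reads as uniformly rough instead of sparkling.
rng = np.random.default_rng(0)
waves = rng.normal([0.0, 0.0, 1.0], 0.4, size=(256, 3))
waves /= np.linalg.norm(waves, axis=1, keepdims=True)
print(roughness_for_mip(waves, 0.05))                         # noticeably higher
```

The awkward middle distance, where individual glints are still visible, is exactly what this kind of prefiltering can't capture, which is why the transition rarely looks convincing.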
 
Overwatch has been patched with much better AF on Pro, probably 16x now:

[Image: Overwatch-Training.png]


http://powerupgaming.co.uk/2016/11/...80p-comparison-screens-showcase-improvements/
 
Still boggles the mind how a 1.84 TF console can't do 8x AF when I used to rock 8x AF on my 6600 GT playing Far Cry 1 at 1650 x 1020. Does it scale linearly with the complexity of the general graphics?
 
Your 6600 GT had memory bandwidth to spare for Far Cry @ 8x AF averaging ~30 fps. However, at the worst moments it may have dipped into the mid-to-low twenties. That is something Overwatch cannot tolerate.
[Image: AAfarcry.png]

Overwatch prioritizes framerate and was designed to maintain a solid 60 fps even at the most intense multiplayer moments. When you double the framerate you significantly increase memory bandwidth requirements. Multiplayer also increases the CPU's memory bandwidth requirements. Overwatch's textures are also larger and use less lossy texture compression than Far Cry's, so applying AF to those textures increases memory bandwidth more than doing so to Far Cry's. Overwatch also has more complex, higher-poly models, more bandwidth-intensive visual effects, a bit more for SMAA, etc.

Then there is the PS4/X1 hardware itself. Having a unified memory architecture, it has to share its memory bandwidth with the CPU, and CPU memory contention with the GPU will be high in this game given it's a multiplayer title. Your 6600 GT didn't have to split memory bandwidth or deal with memory contention.

Perhaps the most important fact on top of all this is that, in all but enthusiast cards, memory bandwidth has only gone up ~10-12x between 2004, when the 6600 GT was released, and 2013-2016. And in the PS4's case, when the CPU is accessing memory at ~5 GB/s, contention leaves only ~115 GB/s of bandwidth available to the GPU. That means the PS4's GPU memory bandwidth has gone up ~7.5x over the 6600 GT, while its computational performance has gone up ~35x.
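Rough numbers behind that comparison (the PS4 figures are from the post above; the 6600 GT's ~16 GB/s, 128-bit GDDR3 at 500 MHz, is my own approximate figure):

```python
gt6600_bandwidth = 16.0       # GB/s, approximate
ps4_gpu_bandwidth = 115.0     # GB/s left to the GPU under ~5 GB/s of CPU traffic
ps4_gflops = 1840.0           # 1.84 TF

bandwidth_ratio = ps4_gpu_bandwidth / gt6600_bandwidth   # ~7.2x, close to the ~7.5x above
compute_ratio = 35.0                                     # the post's figure
implied_gt6600_gflops = ps4_gflops / compute_ratio       # ~53 GFLOPS implied for the 6600 GT

print(f"bandwidth up ~{bandwidth_ratio:.1f}x, compute up ~{compute_ratio:.0f}x, "
      f"so ~{compute_ratio / bandwidth_ratio:.1f}x more FLOPs per byte of bandwidth")
# Texture filtering is almost pure bandwidth, so it is the part of the frame
# that benefits least from that compute growth.
```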
 
Thanks for breaking it down, Pixel. I guess Cerny either overlooked the weakness of the APU's shared memory structure or made a conscious compromise for the overall performance/budget of the PS4 architecture. I guess next gen, with 3D-stacked memory tech, the issue of low AF will be gone once and for all.
 
I don't agree with this memory contention explanation. There is the theory and there are the facts. I think almost all the great-looking Sony exclusives featuring a good or very good amount of anisotropic filtering proved that this contention problem, as the cause of low AF in many (but not all) multiplatform games, was being blown out of proportion.

You can't use some multiplatform games to judge a specific hardware problem. But if you take some of the best-looking games this gen, like Infamous, Bloodborne, Killzone and even a quickly done remaster like TLOUR, you realize they all have good or maximum AF applied while being graphically among the best.

The most impressive of these is Infamous: Second Son, IMO. Open world, a high polygon count (>10 million), lots of high-res assets and textures, with a very good amount (usually 8x) of anisotropic filtering applied. And that's nearly a launch game, meaning the hardware is perfectly capable of high AF plus great graphics and performance without much tweaking of the hardware and software.

I think they use low anisotropic filtering in many multiplatform games because those games have to run on Xbox One hardware. Indeed, that hardware showed us with some of its early exclusives (notably Halo 5 and MCC) that it had serious trouble rendering a decent amount of AF even when developed specifically for it, while two comparable PS4 exclusives (Killzone SF and TLOUR) looked much better graphically and had a higher level of AF.

Overwatch had to run at 1080p on XB1, so they probably had to cut the AF notably (and use lower-resolution textures too) in order to do that (and the game still has trouble keeping its 1080p/60 fps target on XB1). But if you look at, for instance, The Witcher 3, you realize this game has a decent amount of AF applied to most textures (usually 4x or 8x, and the textures are higher quality) while being graphically one of the best-looking multiplatform games with a very stable framerate. Well, the game runs at only 900p on XB1...

My point being: yes, memory contention is a problem on all consoles, but out of the box, simply using the available tools, the PS4 is perfectly able to render great graphics + a good framerate + a decent level of texture filtering. Most exclusives and many multiplatform games showed us that it's a well-balanced piece of hardware, provided you are correctly using all the cores of the CPU, obviously.
 
Would posting a frame rate difference video count as a graphical comparison? Arkham City's got a hidden PS4 Pro mode that runs faster.
 
Those same games have higher AF on Xbox One hardware, so that excuse doesn't hold up.

Is aniso higher in The Witcher 3 on PS4, or is it that the PS4 version is running at a higher resolution, possibly with mip transitions happening a little further back (more samples = transitions further back to maintain the same ratio)?
 

PS4 is 1080p; X1 is using dynamic res AFAIK (up to 1080p).

Ps4: http://images.eurogamer.net/2013/ar.../1/PS4_004.bmp.jpg/EG11/quality/90/format/jpg
X1: http://images.eurogamer.net/2013/articles//a/1/7/5/6/6/2/1/XO_004.bmp.jpg/EG11/quality/90/format/jpg

Pretty clear AF difference
 

The X1 version is a constant 900p. The PS4 version is performing 44% more texture sampling operations for the same area of screen even if the aniso level is the same, and that will show on screen.

Or put differently, if you render at 900p and upscale to 1080p then the image - and all the textures on screen - will look softer than a native 1080p image. This does not necessarily mean a lower level of aniso filtering.

I'm not convinced that TW3 is using a lower level of aniso on X1. I think the effect may be due to a lower resolution and the naturally lower number of texture samples for the final upscaled image.
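The 44% number is just the pixel-count ratio, and the "softer mips" effect falls straight out of standard LOD selection from UV derivatives. A small Python sketch (standard texture LOD math, not the game's shader; the 2048-texel texture is a made-up example):

```python
import math

# 1080p vs 900p pixel counts: the "44% more texture sampling" figure.
pixel_ratio = (1920 * 1080) / (1600 * 900)        # = 1.44

def texture_lod(texture_size, uv_step_per_pixel):
    """Isotropic mip selection: lod = log2(texels stepped per screen pixel)."""
    return math.log2(texture_size * uv_step_per_pixel)

tex = 2048  # texels across a surface that spans the full screen height
print(f"pixel ratio: {pixel_ratio:.2f}")
print(f"LOD at  900p: {texture_lod(tex, 1.0 / 900):.2f}")    # ~1.19
print(f"LOD at 1080p: {texture_lod(tex, 1.0 / 1080):.2f}")   # ~0.92
# The 1080p image sits ~0.26 mip levels sharper everywhere, before any
# anisotropic filtering is considered, which can easily read as "better AF"
# in a screenshot comparison.
```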
 
Those same games have higher AF on Xbox One hardware, so that excuse doesn't hold up.
Which games? The average games ported from Unreal Engine 3 to PS4? Didn't we already get to the bottom of that?

Also, we do have XB1 games that have less AF than their PS4 counterparts (like Sniper Elite 3, a 1080p/60 fps game on both consoles, with better AF and better performance on PS4). Not one of those XB1 games was later patched with more AF, contrary to many Unreal Engine 3 games that were subsequently patched on PS4 with zero performance penalty, like Dying Light.


The dynamic res was PR. It was a thing back then but it didn't last long: only the start screen was 1080p, the rest of the game was 900p. And I think the XB1 is using the same AF setting here; what you see is the resolution difference combined with a slight chromatic aberration effect that impacts the lower resolution more.
 