Image Quality and Framebuffer Analysis for Available/release build Games *Read the first post*

What bothers me is when games like Uncharted 4/LL use native 1440p instead of checkerboard 4K. I realize it's easier to implement, but Naughty Dog aren't usually known for doing things the easy way.
The one catch with reconstruction in general is that if the game wasn't built with it in mind, it's far from trivial to rework the whole game to implement it. That's arguably the largest advantage X1X has over 4Pro: in cases where the developer needs to go back and make a 4K version, you only need to ramp up to a higher resolution. It's a colossal waste of power, but the job gets done with higher IQ settings and makes people happy. The resolution marketing will go south once all/many games start using reconstruction from the start.

It's hard to blame ND on this one; the game had already shipped and was well into development before the 4Pro launched. They didn't know how much power the 4Pro would have, so in my mind they did the best they could to meet their deadlines.
 
Did you read the article you linked to?...

Which is precisely my point... the game runs at a higher resolution on XB1 (close to 1080p) and native 4K is still too expensive for the PS4Pro.

So, once again, if we exclude sports games, remasters and indie games, I'm not expecting native 4K games on Pro.
 
The one catch with reconstruction in general is that if the game wasn't built with it in mind, it's far from trivial to rework the whole game to implement it. That's arguably the largest advantage X1X has over 4Pro: in cases where the developer needs to go back and make a 4K version, you only need to ramp up to a higher resolution. It's a colossal waste of power, but the job gets done with higher IQ settings and makes people happy. The resolution marketing will go south once all/many games start using reconstruction from the start.

It's hard to blame ND on this one; the game had already shipped and was well into development before the 4Pro launched. They didn't know how much power the 4Pro would have, so in my mind they did the best they could to meet their deadlines.
Okay, but what about Lost Legacy?

Sent from my OnePlus One using Tapatalk
 
Okay, but what about Lost Legacy?

Sent from my OnePlus One using Tapatalk
Same engine/tools? It got released the year after, right? Edit: wait, isn't it 4K? nvm: 1440p

TLOU2 is a better candidate schedule-wise. They will have more time to rework the pipeline to meet those demands, if that is the cause of the lack of reconstruction.
 
Last edited:
Usually 1620p or higher res on Pro, which is pretty much what is expected. 1620p is 2.25x the pixels of 1080p, and the Pro has a GPU 2.3x more potent than the PS4's.

Based on that, the average resolution on XBX would be around 1950p.
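
For what it's worth, here's that back-of-the-envelope scaling as a quick sketch. It assumes pixel count scales linearly with GPU compute, and the TFLOP figures (1.84 for PS4, 4.2 for Pro, 6.0 for XBX) are the commonly quoted ones, not something from this thread:

```python
# Back-of-the-envelope resolution scaling, assuming pixel count scales
# linearly with GPU compute (a big assumption; see the bottleneck
# discussion below). TFLOP figures are the commonly quoted ones.
import math

def scaled_height(base_width, base_height, compute_ratio, aspect=16 / 9):
    """Height of a frame with `compute_ratio` times the base pixel count."""
    pixels = base_width * base_height * compute_ratio
    return math.sqrt(pixels / aspect)

base = (1920, 1080)                                 # base PS4 target
print(round(scaled_height(*base, 4.20 / 1.84)))     # ~1632 -> roughly the 1620p seen on Pro
print(round(scaled_height(*base, 6.00 / 1.84)))     # ~1950 -> the ~1950p guess for XBX
```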
Not sure if those guesses are very strong. From the article:
However, even testing this on less demanding scenes, we've yet to see Shadow of War truly hit 2160p - in fact, staring at the sky on PS4 Pro only gets the machine to a maximum of 1830p. That's with resolution mode selected too
When DF writes this, compute is probably not the bottleneck for the performance of the game. You'd suspect that if it was compute, looking into the sky would have resulted in 4K.

There are likely other bottlenecks at play here for SoW; possibly memory footprint.
 
Not sure if those guesses are very strong. From the article:

When DF writes this, compute is probably not the bottleneck for the performance of the game. You'd suspect that if it was compute, looking into the sky would have resulted in 4K.

There are likely other bottlenecks at play here for SoW; possibly memory footprint.
You can look at the sky and the game could still be computing the whole level. Looking at the sky will more likely show a rasterization bottleneck (or lack thereof), not compute.
 
You can look at the sky and the game could still be computing the whole level. Looking at the sky will more likely show a rasterization bottleneck (or lack thereof), not compute.
Hmm, well equally I suspect that looking at the sky there's little to shade, therefore compute is going to be way less of a factor there. I have my doubts about rasterization, as 4Pro has the same # of ROPs as 1X, just with less available bandwidth. If that's the case, your calculation would be wrong for 1X, as there is over 100 GB/s more on 1X; and it would be wrong for the PS4-to-4Pro difference as well, since their bandwidth difference is about 40 GB/s, not 2.3x.

Deferred rendering has its advantages and disadvantages, one of them being footprint. Several buffers at 4K plus textures and whatever else you need, and space becomes an issue. Yes, I'm suggesting it's a coincidence that the 4Pro's compute advantage matches the resolution increase exactly, but both consoles are also stuck with 8 GB of memory. Had the 4Pro had 12 GB, it's entirely possible we could be seeing native 4K in more instances, I think.
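
To put a rough number on that footprint, here's a quick sketch of a hypothetical deferred G-buffer at 4K; the layout (four RGBA8 colour targets plus a 32-bit depth buffer) is purely my own illustrative assumption, not any particular engine's:

```python
# Rough G-buffer footprint at 4K for a made-up but typical deferred layout:
# four RGBA8 render targets (albedo, normals, material params, misc)
# plus a 32-bit depth/stencil buffer. Illustrative numbers only.
width, height = 3840, 2160
bytes_per_pixel = 4                      # RGBA8 or D32, 4 bytes either way
targets = 4 + 1                          # 4 colour RTs + depth
footprint = width * height * bytes_per_pixel * targets
print(footprint / 2**20, "MiB")          # ~158 MiB before textures, shadow maps, etc.
```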
 
Hmm, well equally I suspect that looking at the sky there's little to shade, therefore compute is going to be way less of a factor there.
If that were the case, then devs wouldn't need to invent countless strategies in order to compute and shade only what the player sees.
0RlreKB.gifv

ucoln8kedwfglsrlxvm5.gif

I have my doubts about rasterization, as 4Pro has the same # of ROPs as 1X, just with less available bandwidth. If that's the case, your calculation would be wrong for 1X, as there is over 100 GB/s more on 1X; and it would be wrong for the PS4-to-4Pro difference as well, since their bandwidth difference is about 40 GB/s, not 2.3x.

Deferred rendering has its advantages and disadvantages, one of them being footprint. Several buffers at 4K plus textures and whatever else you need, and space becomes an issue.
Who says that? There are no official sources about the number of ROPs on Pro, only assumptions. If the Pro had 64 ROPs, then theoretically, even with the reduced memory bandwidth and lower GPU clocks, the Pro would have a small advantage in pixel output. But obviously this would be just a small advantage compared to what we actually know the XBX has more of: teraflops, bandwidth and memory.
Yes, I'm suggesting it's a coincidence that the 4Pro's compute advantage matches the resolution increase exactly,
Or it's not a coincidence, and the Pro is a balanced machine with just enough power to produce a 4K CBR image (with some optimization) when the base PS4 outputs at 1080p. Which already seems to be the case in some noteworthy games.
but both consoles are also stuck with 8 GB of memory. Had the 4Pro had 12 GB, it's entirely possible we could be seeing native 4K in more instances, I think.

Yes, the XBX will have better textures thanks to more available memory. But it won't necessarily be '4K' textures (not 4x higher res). For instance, Tomb Raider on XBX just has the PC texture setting above Pro (from memory I believe PC Ultra instead of High). In most cases devs will just use the best textures available on PC.

There are likely other bottlenecks at play here for SoW; possibly memory footprint.
I don't see how more memory could help framerate or average resolution in those third-party games, though (if that's what you implied).
 
If that were the case, then devs wouldn't need to invent countless strategies in order to compute and shade only what the player sees.
Yeah, with regards to frustum culling, the blue is what the player can see, and outside of that is being rendered even though the player can't see it. What happens when you point the frustum directly into the sky? Are you implying that you need to render 360° around the player all of a sudden because they can end up looking anywhere? I don't understand where you are going with this point. With regards to frustum culling, most of the render work is still within the blue, so if your blue zone is pointing at a skybox, how much outside of that should the engine really be rendering?
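
For reference, here's a minimal frustum-culling sketch along those lines; the plane representation, the sphere test and the skybox comment are generic illustrations, not taken from any specific engine:

```python
# Minimal frustum-culling sketch: an object (bounding sphere) is submitted
# for rendering only if it is not entirely behind any of the six frustum
# planes. Planes are (nx, ny, nz, d) with normals pointing into the frustum.
from dataclasses import dataclass

@dataclass
class Plane:
    nx: float
    ny: float
    nz: float
    d: float            # plane equation: nx*x + ny*y + nz*z + d >= 0 is "inside"

def sphere_visible(center, radius, frustum_planes):
    cx, cy, cz = center
    for p in frustum_planes:
        if p.nx * cx + p.ny * cy + p.nz * cz + p.d < -radius:
            return False    # fully outside one plane -> culled
    return True             # inside or intersecting -> draw it

# Point the frustum at the sky and almost every object fails this test,
# so very little geometry is left to rasterize and shade.
```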

Who says that? There are no official sources about the number of ROPs on Pro, only assumptions. If the Pro had 64 ROPs, then theoretically,
Pretty sure we calculated as a group that the 1X has 32 ROPs.
even with the reduced memory bandwidth and lower GPU clocks, the Pro would have a small advantage in pixel output.
I'm not sure what you're referring to with this point. The ROP calculation is bytes per pixel * frequency * ROPs, compared against memory bandwidth to determine whether you're ROP bound or bandwidth bound.
For 4Pro:
RGBA8: 911 MHz * 32 * 4 bytes (ROP bound)
RGBA16F: 911 MHz * 32 * 8 bytes (bandwidth bound)

For PS4:
RGBA8: 800 MHz * 32 * 4 bytes (ROP bound)
RGBA16F: 800 MHz * 32 * 8 bytes (bandwidth bound)

For 1X:
RGBA8: 1172 MHz * 32 * 4 bytes (ROP bound)
RGBA16F: 1172 MHz * 32 * 8 bytes (ROP bound)
^^
I don't think 1X will be bottlenecked before 4Pro when it comes to rasterization, unless there are 64 ROPs on PS4Pro. That would be a different scenario.
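
Same comparison as a small script; the 32-ROP counts follow the assumption above, and the bandwidth figures (176/218/326 GB/s) are the commonly quoted ones, so treat it as a sketch:

```python
# ROP fill rate (GB/s of pixel writes) vs. available memory bandwidth.
# Clocks in MHz; 32 ROPs assumed for all three machines.
consoles = {
    "PS4":  {"clock_mhz": 800,  "rops": 32, "bandwidth_gbs": 176},
    "4Pro": {"clock_mhz": 911,  "rops": 32, "bandwidth_gbs": 218},
    "1X":   {"clock_mhz": 1172, "rops": 32, "bandwidth_gbs": 326},
}

for name, c in consoles.items():
    for fmt, bytes_per_pixel in (("RGBA8", 4), ("RGBA16F", 8)):
        fill = c["clock_mhz"] * 1e6 * c["rops"] * bytes_per_pixel / 1e9
        bound = "ROP bound" if fill < c["bandwidth_gbs"] else "bandwidth bound"
        print(f'{name:5} {fmt:8} {fill:6.1f} GB/s vs {c["bandwidth_gbs"]} GB/s -> {bound}')
```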

Or it's not a coincidence, and the Pro is a balanced machine with just enough power to produce a 4K CBR image (with some optimization) when the base PS4 outputs at 1080p. Which already seems to be the case in some noteworthy games.
I'm just discussing SoW. SoW doesn't use checkerboarding. The note by DF indicated that even in Resolution mode it could not hit 4K when looking into the sky. I'm suggesting to you it's likely not a compute problem; I don't know of a scenario in which a compute problem would stop this device from hitting max resolution when it's required to render as little as possible.

Yes, the XBX will have better textures thanks to more available memory. But it won't necessarily be '4K' textures (not 4x higher res). For instance, Tomb Raider on XBX just has the PC texture setting above Pro (from memory I believe PC Ultra instead of High). In most cases devs will just use the best textures available on PC.
This seems tangential to what I'm discussing. My point is that you can't take your calculations for PS4 -> PS4Pro resolution scaling and expect the same formula to work with Xbox One X. They have different hardware and you don't know where the bottlenecks are. I targeted your compute argument as the flaw in logic, since looking at a skybox and not having max resolution should not be a compute problem.
I don't see how more memory could help framerate or average resolution in those third-party games, though (if that's what you implied).
It's about how the engine is designed, not how the hardware is designed. I'm speaking strictly about SoW and the bottlenecks it may be hitting on PS4 Pro but may not actually hit on 1X because of the difference in RAM size.

Xbox One had this problem. We've found on many occasions that deferred engines' render targets cannot fit on XBO; the 32 MB of ESRAM plus the slow DDR3 makes it a strong candidate for 900p.
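
As a rough illustration of why 900p keeps coming up on XBO, assuming a generic layout of four RGBA8 targets plus 32-bit depth (purely illustrative, not a known engine configuration):

```python
# A four-target RGBA8 G-buffer plus depth no longer fits in 32 MiB of
# ESRAM at 1080p, but squeezes in at 900p. Layout is made up for illustration.
ESRAM_MIB = 32

def gbuffer_mib(width, height, targets=5, bytes_per_pixel=4):
    return width * height * targets * bytes_per_pixel / 2**20

for w, h in ((1920, 1080), (1600, 900)):
    size = gbuffer_mib(w, h)
    verdict = "fits" if size <= ESRAM_MIB else "does not fit"
    print(f"{w}x{h}: {size:.1f} MiB -> {verdict} in ESRAM")
```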
 
Last edited:
I don't know if someone can pixel count from this:
Gxfg453.jpg


WKYnTCN.jpg




In a YouTube video from cobraTV they were saying that the dev told them the game would run at native 4K on Pro, so can you tell from these pics if it's native or not?
 
I don't know if someone can pixel count from this:
Gxfg453.jpg


WKYnTCN.jpg




In a YouTube video from cobraTV they were saying that the dev told them the game would run at native 4K on Pro, so can you tell from these pics if it's native or not?
Quick reply: not native. I don't have the time now to pixel count, but it's not native:
upload_2017-10-22_13-54-25.png
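
If anyone wants a very crude automated sanity check (nothing like the proper manual pixel counting on edges), a round-trip resampling comparison can hint at the internal resolution; "screenshot.png" and the candidate list below are placeholders:

```python
# Crude heuristic: downscale the 2160p capture to each candidate height
# and back up, then see which round trip changes the image the least.
# This is only a hint, not a substitute for counting stair steps on edges.
from PIL import Image
import numpy as np

img = Image.open("screenshot.png").convert("L")     # placeholder filename
full = np.asarray(img, dtype=np.float32)

for height in (2160, 1800, 1620, 1440, 1080):
    width = height * 16 // 9
    small = img.resize((width, height), Image.BILINEAR)
    back = np.asarray(small.resize(img.size, Image.BILINEAR), dtype=np.float32)
    mse = float(np.mean((full - back) ** 2))
    print(f"{height}p round-trip MSE: {mse:.2f}")    # a knee in the error curve hints at the source res
```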
 
Yeah, it sure looks like some weird shit's going on; I noticed it the first time, but it doesn't look native to me. Is that it, or what's happening?
 
Assassin's Creed Origins on Pro running at 4K, from a Gamersyde video.

I find native 2560x1440 (1440p upscaled to 4K obviously) and ~1500p on another pic. They don't use checkerboard rendering, but I can notice some slight TAA artifacts very similar to the ones seen in Far Cry 4.

It's a notable resolution boost (>2.56x) from the 900p seen in the previous AC on base PS4.

http://images.gamersyde.com/image_assassin_s_creed_origins-36938-3880_0001.jpg
http://images.gamersyde.com/image_assassin_s_creed_origins-36938-3880_0003.jpg

The game is quite beautiful BTW.

The Pro game supports downsampling on a 1080p screen:
http://images.gamersyde.com/image_assassin_s_creed_origins-36936-3880_0005.jpg
 
Last edited:
Assassin's Creed Origins on Pro running at 4K, from a Gamersyde video.

I find on 2 pics native 2560x1440 (1440p upscaled to 4K obviously). They don't use checkerboard rendering, but I can notice some slight TAA artifacts very similar to the ones seen in Far Cry 4.

It's a notable resolution boost (2.56x) from the 900p seen in the previous AC on base PS4.

http://images.gamersyde.com/image_assassin_s_creed_origins-36938-3880_0001.jpg
http://images.gamersyde.com/image_assassin_s_creed_origins-36938-3880_0003.jpg

The game is quite beautiful BTW.

That's disappointing considering Syndicate ran at 1620p on the Pro I believe.

Sent from my OnePlus One using Tapatalk
 
That's disappointing considering Syndicate ran at 1620p on the Pro I believe.

Sent from my OnePlus One using Tapatalk
This game looks more demanding than Syndicate, with more detail and bigger vistas. Also the TAA seems to work very well here.

One pic looks ~1500p. Most probably a dynamic res.
 
Last edited: