Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Another PNG that looks upscaled horizontally here:

FFXV_Duscae_Stills_FINAL_NA20.png

Wait, the previous batch of images are JPGs (not PNGs) downscaled to 1280x720. That makes things harder, but downscaling an image isn't enough to make a rough pixel count impossible :yep2:

This one is ~1080p, maybe native vertically (a minimum of ~1000 lines):
screenshots__3_.jpg

But this one strongly suggests (because of the specular aliasing) a non-native horizontal resolution (maybe ~1400 instead of 1920):
screenshots__8_.jpg

The PNGs (at least some of them, featuring gameplay) look to be upscaled both vertically and horizontally, maybe ~1400x800-ish.
The JPGs (also some of them) seem to be running at ~1400x1080, maybe native vertically and upscaled horizontally.
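
For anyone curious how these counts are done, here is a minimal sketch of the idea, assuming a 1080p capture and a crop containing one clean, hard-aliased, nearly-horizontal edge (the file name and the crop are hypothetical, not taken from these shots):

```python
# Rough pixel-counting sketch: count the staircase "treads" along an aliased
# edge. An upscaled image cannot contain more steps than the native raster had,
# so treads-per-span reveals the pre-upscale resolution.
import numpy as np
from PIL import Image

CAPTURE_HEIGHT = 1080  # the screenshots are 1920x1080 captures

crop = np.asarray(Image.open("edge_crop.png").convert("L"), dtype=float)

# For each column, locate the edge as the row with the strongest vertical gradient.
edge_rows = np.argmax(np.abs(np.diff(crop, axis=0)), axis=0)

# Each distinct tread corresponds to one native scanline.
treads = np.count_nonzero(np.diff(edge_rows)) + 1

# Vertical span of the edge, in capture pixels.
span = int(edge_rows.max() - edge_rows.min()) + 1

native_height = treads * CAPTURE_HEIGHT / span
print(f"Estimated native vertical resolution: ~{native_height:.0f} lines")
```

In practice you average this over several edges, since post-AA and scaler filtering smear the steps and make a single-edge count unreliable.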
 
True, but all of the content from the PS4 demo so far has been 1080p. Maybe they mean the Xbox version, or the fact that they didn't get to a locked 30 fps.
 
The PS4 version will be discerned soon enough this month, but it would be weird to see the Bone's version at 800-900p while the Quad doesn't even reach 1080p for the demo.
 
Also, 0 AF in those screenshots above? :p (I think I'm starting to learn what that is ;) )
 
This one is presumably from the XB1 version (because it comes from the batch of PNGs instead of JPGs):

I (still) find ~800p (maximum) and no AF; it really looks like some basic trilinear filtering... Remember that those are PNGs, so we can't blame shitty compression for the blur on the ground textures.

FFXV_Duscae_Stills_FINAL_NA02.png


So maybe PS4 ~900p, XB1 ~800p and no AF on both versions.

Another PNG showing lack of AF:

FFXV_Duscae_Stills_FINAL_NA03.png
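
Since the "no AF, basic trilinear" call keeps coming up: here is a small sketch of why a ground plane at a grazing angle goes blurry without AF. The formula is a simplified version of the standard anisotropic LOD selection from the graphics specs, not anything from Luminous Engine:

```python
import math

def lod(du_dx, dv_dy, max_aniso=1):
    # Texture footprint of one screen pixel: du_dx texels across the short
    # axis, dv_dy texels along the long axis (large at grazing angles).
    p_max, p_min = max(du_dx, dv_dy), min(du_dx, dv_dy)
    n = min(math.ceil(p_max / p_min), max_aniso)  # number of anisotropic taps
    return math.log2(max(p_max / n, 1.0))         # mip level actually sampled

# A floor texture seen nearly edge-on: 1 texel/pixel across, 16 texels/pixel away.
print("trilinear (no AF) mip:", lod(1, 16, max_aniso=1))   # 4.0 -> 16x coarser mip
print("16x anisotropic  mip:", lod(1, 16, max_aniso=16))   # 0.0 -> full-detail mip
```

With plain trilinear the sampler has to drop four mip levels to avoid shimmering along the long axis, which is exactly the kind of ground-texture blur being pointed out in these shots.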
 
Seems pretty reasonable. I wonder if they went with 792p (11/15) & 900p like certain other titles (Watch Dogs).
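
For reference, the fractions behind those numbers (792 is 11/15 of 1080, 900 is 5/6 of 1080); a quick sketch, assuming the horizontal axis uses the same scale factor, which may not hold given the anisotropic-looking scaling discussed above:

```python
from fractions import Fraction

# The common sub-1080p framebuffers as exact fractions of 1080 lines.
for lines in (792, 900, 1080):
    scale = Fraction(lines, 1080)             # e.g. 792/1080 reduces to 11/15
    width = int(1920 * scale)                 # assumes the same scale on both axes
    pixel_share = (width * lines) / (1920 * 1080)
    print(f"{lines}p = {scale} of 1080p -> {width}x{lines} "
          f"({pixel_share:.0%} of a full 1080p frame)")
```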
 
Sounds about right to me as well. I would love to know what the engine is chewing through such that lowering the resolution is what it takes to make the game perform better.
 
I thought they were using a fat G-buffer. I forget.

DF may be doing a performance eval soon. *ahem*
 
Heh, maybe on Xbox a fat G-buffer would smoke the ESRAM. But the PS4 shouldn't be affected by something like that, right? With Fox Engine, the 1080p/720p split we see makes the fat G-buffer a reasonable guess. But when I see the PS4 dip in resolution, I feel like the reason is elsewhere. Both versions seem to be affected about equally, so it's something they share in common; I'm wondering if clock speed / not enough shaders is playing a factor here.
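
To put numbers on the "fat G-buffer would smoke the ESRAM" idea, here is a back-of-the-envelope sketch. The render-target layout below is hypothetical (the thread never states Luminous Engine's actual layout); it just shows how quickly a heavy deferred setup outgrows 32 MB and why dropping resolution helps on that front:

```python
# Back-of-the-envelope G-buffer footprint vs the XB1's 32 MB of ESRAM.
ESRAM_BYTES = 32 * 1024 * 1024

# (name, bytes per pixel): a made-up but plausible deferred layout.
targets = [
    ("albedo RGBA8",        4),
    ("normals RGBA16F",     8),
    ("material RGBA8",      4),
    ("motion/extra RG16F",  4),
    ("depth/stencil D24S8", 4),
]
bytes_per_pixel = sum(bpp for _, bpp in targets)

for w, h in [(1920, 1080), (1600, 900), (1408, 792), (1280, 720)]:
    total = bytes_per_pixel * w * h
    fits = "fits" if total <= ESRAM_BYTES else "does NOT fit"
    print(f"{w}x{h}: {total / 2**20:5.1f} MB G-buffer -> {fits} in 32 MB ESRAM")
```

Which lines up with the point above: an ESRAM-only explanation can't cover the PS4 dipping below 1080p too, since there is no small fast-memory pool to squeeze into on that side.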
 
Gotta be the hair.

:p

It's certainly not those terrain textures!

I mean, what exactly was holding back Watch Dogs as well :?:
 
Here is a ~900p JPG (900p being the maximum; strangely, slightly sub-900p is possible here; anyway, presumably from the PS4 version) in a very similar spot (middle of the road) to the ~800p PNG.

The texture filtering in both versions is virtually identical:

~900p JPG
bRlrLp.jpg


~800p PNG
FFXV_Duscae_Stills_FINAL_NA02.png
 
Gotta be the hair.

:p

It's certainly not those terrain textures!

I mean, what exactly was holding back Watch Dogs as well :?:
I wish we knew. I'm guessing just lack of FLOPs/unused FLOPs for the type of code they were running.
I think guessing what the issue is will only get harder as console developers continue to improve and bypass bottlenecks.

With older-generation games it was easier to point a finger at the problem: everything was fairly standardized and we had an idea of the performance based on the numbers. But games in 2017 with DX12/Mantle/Vulkan and beyond are going to start leveraging more and more of that peak shader throughput, and the perspective on what 1.84 and 1.3 TFLOPS mean will change.
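
For what it's worth, those two headline numbers are just ALU count times clock times two FLOPs per lane (one fused multiply-add per cycle on GCN); a quick sketch using the public CU counts and clocks:

```python
# Peak GCN shader throughput: compute units x 64 lanes x clock x 2 FLOPs (FMA).
def peak_tflops(compute_units, clock_mhz, lanes_per_cu=64, flops_per_lane=2):
    return compute_units * lanes_per_cu * clock_mhz * 1e6 * flops_per_lane / 1e12

print(f"PS4 : 18 CUs @ 800 MHz -> {peak_tflops(18, 800):.2f} TFLOPS")
print(f"XB1 : 12 CUs @ 853 MHz -> {peak_tflops(12, 853):.2f} TFLOPS")
```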
 