Image Quality and Framebuffer Analysis for Available/release build Games

Yeah, it will produce artifacts, mostly on parallaxing objects and fast or erratically moving objects, but for the majority of the screen it produces very high quality results, often identical to a true 1080p image (and, more often than not, better than a deinterlaced 1080i image). That's because the render is essentially still a full 1080p image in total; it's just split into fields, and instead of simply deinterlacing those fields, motion is properly reprojected using motion vectors from the past few frames. The Quantum Break renders, by contrast, are all 720p. Overlaying multiple 720p images can produce an image sharper than 720p (similar methods are sometimes referred to as super resolution), but you will never get near the quality of a true 1080p image. It's a similar technique in terms of reprojection, but the original sampling can't reconstruct a 1080p image, unlike Killzone's. Overall, I think Killzone's technique is an excellent way to optimize a game designed around 30fps to run at 60fps.
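A toy sketch of how I read that field-based approach (my own simplification, not Guerrilla's actual code; the 8-row "frame", names, and row-only motion vectors are all mine): each frame renders only half the rows, and the missing rows are filled by reprojecting the previous full frame along motion vectors. With zero motion the merged result is pixel-identical to a full-resolution render, which is why it goes unnoticed on static shots.

```python
HEIGHT = 8  # stand-in for 1080 rows

def render_field(frame_index, scene):
    """Render only even rows on even frames, odd rows on odd frames."""
    parity = frame_index % 2
    return {y: scene[y] for y in range(parity, HEIGHT, 2)}

def reconstruct(field, prev_frame, motion):
    """Merge the freshly rendered field with rows reprojected from the
    previous full frame along per-row motion vectors."""
    frame = {}
    for y in range(HEIGHT):
        if y in field:
            frame[y] = field[y]                        # fresh half of the image
        else:
            src = min(max(y - motion[y], 0), HEIGHT - 1)
            frame[y] = prev_frame[src]                 # history, reprojected
    return frame

scene = {y: f"row{y}" for y in range(HEIGHT)}
motion = {y: 0 for y in range(HEIGHT)}      # nothing moved this frame
field = render_field(1, scene)              # only the odd rows are rendered
full = reconstruct(field, dict(scene), motion)
# With zero motion, the merged frame matches a true full-resolution render;
# wrong motion vectors (parallax, erratic movement) are where artifacts appear.
```

Each frame only half the rows are actually shaded, which is where the 60fps headroom comes from.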
 
It resulted in artifacts. That's how it was figured out...
If the artifacts were that bad, it would have been figured out much earlier. It went unnoticed for the most part until some artifacts were spotted. That contrasts with QB's technique, which is always noticeable in the softness of the image. It's not massively different from dynamic res in terms of results. "If player moving, RenderBuffer.res = 1280x720, else RenderBuffer.res = Lerp(1280x720, 1600x900, framecount/4)" :p
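That joke heuristic, made concrete (purely illustrative; no real engine's resolution controller is this simple, and the 4-frame ramp is taken straight from the quip above): drop to 720p while the camera moves, then lerp back toward 900p over a few still frames.

```python
LOW, HIGH = (1280, 720), (1600, 900)  # the two resolutions from the joke

def lerp(a, b, t):
    return a + (b - a) * t

def render_resolution(moving, frames_still):
    """Pick this frame's render resolution from movement state."""
    if moving:
        return LOW
    t = min(frames_still / 4, 1.0)    # fully sharp after 4 static frames
    return (round(lerp(LOW[0], HIGH[0], t)),
            round(lerp(LOW[1], HIGH[1], t)))

# Moving: always 720p. Two frames after stopping: halfway to 900p.
res_moving = render_resolution(True, 10)   # (1280, 720)
res_settle = render_resolution(False, 2)   # (1440, 810)
res_still = render_resolution(False, 8)    # (1600, 900)
```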
 
I'm not saying it's using adaptive resolution. It's just that the results seem conceptually similar enough that I wonder about the choices either way. That is, why did QB go this route instead of adaptive scaling? Pros/cons?
 
The obvious pro is rendering at 720p, saving a lot of performance to hit consistent 33.3ms frame delivery. For Remedy to have gone that way, I guess rendering at 720p and accumulating frames is less taxing on the hardware than rendering at native 900p or 1080p. Plus they also get to do some cool tricks with it, like using ghosting from previous frames for a visual effect associated with time manipulation, such as this one (the time-dodge effect):

1q9uoz.jpg
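A minimal sketch of frame accumulation as I understand the idea (the blend weight and names are assumptions, not Remedy's actual resolve): each new 720p frame is blended into a persistent history buffer. Over static frames, jittered samples average out into a sharper-than-720p result; on movement, the stale history lingers as ghosting, which is exactly what a time-dodge trail can lean on.

```python
ALPHA = 0.25  # weight of the newest frame; an assumed value, not Remedy's

def accumulate(history, current):
    """Exponential moving average: blend the new frame into the history."""
    return [h * (1 - ALPHA) + c * ALPHA for h, c in zip(history, current)]

# A pixel that just turned bright: the history trails behind for several
# frames -- the same lag that shows up as ghosting (or a deliberate trail).
hist = [0.0]
for _ in range(3):
    hist = accumulate(hist, [1.0])   # three frames in, still only ~58% there

# Left converging long enough (i.e. standing still), it settles on the
# true value -- the "works when you are standing still" behaviour.
settled = [0.0]
for _ in range(100):
    settled = accumulate(settled, [1.0])
```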
 
Oh okay. 900p rendering might be prohibitively expensive even when stationary, so adaptive scaling above 720p isn't possible. So the trick is to always render 720p and 'fake' (reconstruct, for those who take umbrage at the connotations) a higher resolution when possible at no added rendering cost.
 
The game probably could have been 900p if they'd used a post-process AA technique rather than 4xAA. Most of the time I think 4xAA looks better, but there are exceptions, like Uncharted 4, which has a really good TAA effect, to the point where I almost never noticed aliasing at all when playing the beta, and that was at 900p (compared to the 1080p single-player).
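One reason a good TAA (as praised here) avoids heavy ghosting is history rectification: before blending, the history sample is clamped to the min/max of the current pixel's neighborhood, so stale values get rejected. This is a generic 1D sketch of that idea, not Naughty Dog's implementation; all names and the blend weight are mine.

```python
def taa_resolve(history, current, neighborhood, alpha=0.1):
    """Blend history with the current sample, after clamping the history
    to the current neighborhood's value range (rejects stale ghosts)."""
    lo, hi = min(neighborhood), max(neighborhood)
    clamped = min(max(history, lo), hi)        # history rectification
    return clamped * (1 - alpha) + current * alpha

# A stale bright history value (0.9) over a now-dark region: the clamp
# pulls it into the neighborhood's range before blending, killing the ghost.
ghost_suppressed = taa_resolve(0.9, 0.12, [0.1, 0.2, 0.15])
```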
 
Uncharted 4 also used a rather strong sharpening filter, which I wasn't a big fan of, to mask the sub-1080p resolution. I hope they give us a sharpening option like The Division for the final game; I'd like to tone that down a bit.
 
(IMO) The reason why Quantum Break is baseline 720P:
SSAO buffer too taxing above 720P
SSGI buffer too taxing above 720P
SSR buffer too taxing above 240P (lol, so true)
and some others.

any of these things going above 720P could Break the 33ms frametime.

After all this they do have some spare cycles. So that's why they thought of the weird, blurry, vaseline-filtered, sub-900P-feeling upscale algorithm, which works... sometimes... when you are standing still... for a few frames. It was the least they could do, obviously.

My theory is that they could have easily fitted the geometry buffer at 1080P within their frame times, but all the above effects would again Break the image, making it feel weird: you would have sharp detail, but pixelated AO pasted on top.
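Some back-of-the-envelope arithmetic behind that list: screen-space passes like SSAO, SSGI, and SSR scale roughly with pixel count, so stepping those buffers up from 720p gets expensive fast (the exact per-pass costs are the poster's speculation; only the pixel ratios below are hard numbers).

```python
def pixels(w, h):
    """Pixel count, a rough proxy for screen-space pass cost."""
    return w * h

cost_720 = pixels(1280, 720)     # 921,600 px
cost_900 = pixels(1600, 900)     # 1,440,000 px -> 1.5625x the work
cost_1080 = pixels(1920, 1080)   # 2,073,600 px -> 2.25x the work
```

So every screen-space effect more than doubles in cost going from 720p to 1080p, which is the pressure on that 33ms budget.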
 
I'm not even sure they're running every screen-space effect at 720p. And effects aren't tied to geometry buffer resolution, so there's no reason for Remedy to stick to 720p other than performance optimization, AFAIK.
 
Well, their whole lighting system is 720p, no? Maybe they figured uniformly soft was a better compromise than having a disparity between the lighting and the rest of the image.
 
Uncharted 4 also used a rather strong sharpening filter, which I wasn't a big fan of, to mask the sub-1080p resolution. I hope they give us a sharpening option like The Division for the final game; I'd like to tone that down a bit.

Well, the single-player campaign will be 1080p, not 900p like the multiplayer. I personally didn't notice a sharpening filter on the multiplayer myself, but I think an option would be a good idea to include. Perhaps the sharpening was just a natural result of the PS4's built-in upscaler?
 
Sharpening is most easily noticeable in textures; two example shots:
23030366934_3e8e6b9154_o.png

23031362803_cfbe63bfdd_o.png


It's a tad too strong for my taste.
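The kind of tunable sharpening slider being asked for is typically an unsharp mask: subtract a blurred copy of the image and add the difference back, scaled by a strength setting. A 1D toy version (my generic sketch, not The Division's or Naughty Dog's filter) also shows why "too strong" looks bad: the result overshoots at edges, producing halos.

```python
def box_blur(row):
    """3-tap box blur with edge clamping."""
    out = []
    for i in range(len(row)):
        lo, hi = max(i - 1, 0), min(i + 1, len(row) - 1)
        win = row[lo:hi + 1]
        out.append(sum(win) / len(win))
    return out

def unsharp(row, strength):
    """Unsharp mask: original + strength * (original - blurred)."""
    blurred = box_blur(row)
    return [p + strength * (p - b) for p, b in zip(row, blurred)]

# A hard edge [0, 0, 1, 1]: at strength 1.0 the output dips below 0 and
# overshoots above 1 around the edge -- the halo that reads as "too strong".
edge = [0.0, 0.0, 1.0, 1.0]
sharpened = unsharp(edge, 1.0)
untouched = unsharp(edge, 0.0)   # strength 0 leaves the image unchanged
```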
 
Btw, as for image quality: compare the Uncharted multiplayer screenshots to ANY Quantum Break screenshot where people say "wow, it looks almost 900P!"
If Quantum Break looks like 900P, then Uncharted multiplayer looks like 8xMSAA 1080P.
 