Trend of independent render resolution

orangpelupa

Hello,
since the last-gen era, games have increasingly been able to render independently of the native display resolution. The game renders the 3D scene at a lower resolution, upscales it, and then overlays the UI at native resolution.
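
To make that concrete, here is a minimal CPU-side sketch of the idea (the resolutions, pixel format, and nearest-neighbour filter are placeholders I picked for illustration; real engines do this on the GPU, usually with bilinear or better filtering):

// Minimal sketch: the scene is written into a smaller buffer, stretched to
// the display size, and the UI is then drawn 1:1 on top so HUD and text
// stay at native resolution.
#include <cstdint>
#include <vector>

struct Buffer {
    int w, h;
    std::vector<uint32_t> px;           // 0xAARRGGBB
    Buffer(int w_, int h_) : w(w_), h(h_), px(size_t(w_) * h_, 0) {}
    uint32_t at(int x, int y) const { return px[size_t(y) * w + x]; }
    void set(int x, int y, uint32_t c) { px[size_t(y) * w + x] = c; }
};

// Nearest-neighbour upscale of the low-res scene into the native-res target.
void upscale(const Buffer& scene, Buffer& native) {
    for (int y = 0; y < native.h; ++y)
        for (int x = 0; x < native.w; ++x) {
            int sx = x * scene.w / native.w;
            int sy = y * scene.h / native.h;
            native.set(x, y, scene.at(sx, sy));
        }
}

// Composite the native-resolution UI layer over the upscaled scene.
// Any UI pixel with non-zero alpha simply replaces the scene pixel.
void overlayUI(const Buffer& ui, Buffer& native) {
    for (int y = 0; y < native.h; ++y)
        for (int x = 0; x < native.w; ++x)
            if (ui.at(x, y) >> 24)              // UI pixel is not transparent
                native.set(x, y, ui.at(x, y));
}

int main() {
    Buffer scene(1280, 720);    // gameplay rendered below native res
    Buffer ui(1920, 1080);      // HUD/text rendered at native res
    Buffer frame(1920, 1080);   // what actually goes to the display

    // ... the game would draw into `scene` and `ui` here ...

    upscale(scene, frame);      // stretch the 3D image
    overlayUI(ui, frame);       // lay the sharp UI on top
    return 0;
}

In practice the upscale and the UI composite are just two extra fullscreen steps at the end of the frame, which is presumably part of why the option is cheap to expose on PC once the engine already supports it on consoles.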

Usually this used to be limited to consoles, but more and more games are now exposing the option on PC as well.

Nowadays, a few big-budget games that use this are:
BF4
Ryse
CoD Advanced Warfare
Mordor

Any reason why this is only becoming a "trend" lately?
Is it simply because developers need their games to run on a wide range of PCs?

While PCs often have rather fast CPUs, they are frequently paired with slow GPUs. On the other hand, the PS4 and X1 have fairly high-end GPUs (should I call them mid-range enthusiast level?) with lots of VRAM but rather slow CPUs.

*According to the Steam hardware survey, most PC users have:
a 2.3 to 2.6 GHz Intel or a 3.3 to 3.69 GHz AMD CPU
1 GB of VRAM
a native 1080p monitor
 
A range of new antialiasing methods has turned up in the last few years that do a decent job on scene content but struggle with UI and text layers. There's also an increasing number of post-processing steps in modern games, including things done in compute shaders.

A lot of investigation went into these methods, and many of the issues vanished once the UI was pushed into a separate pass. Once you've taken the hit of doing that, the rest is obvious.
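
As a rough sketch of the ordering being described (the pass names are made up and the passes are stubbed out), the point is that the post-process AA filter only ever touches the scene target, never the UI layer:

// Stubbed sketch of the pass ordering (names invented for illustration):
// AA and other post-process filters only ever see the scene target, and
// the UI is drawn in its own pass afterwards, so glyph and HUD edges are
// never softened by the filters.
struct RenderTarget { /* colour/depth attachments would live here */ };

void drawScene(RenderTarget&) {}      // geometry, lighting, shadows, ...
void postProcessAA(RenderTarget&) {}  // e.g. an FXAA/SMAA-style filter
void upscale(const RenderTarget&, RenderTarget&) {}  // low-res -> native
void drawUI(RenderTarget&) {}         // HUD and text at native resolution

void renderFrame(RenderTarget& sceneRT, RenderTarget& backbuffer) {
    drawScene(sceneRT);               // 3D scene into the (possibly smaller) target
    postProcessAA(sceneRT);           // filters run before the UI exists
    upscale(sceneRT, backbuffer);     // stretch to the display resolution
    drawUI(backbuffer);               // UI composited last, untouched by AA
}

int main() {
    RenderTarget sceneRT, backbuffer;
    renderFrame(sceneRT, backbuffer);
    return 0;
}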
 
Right, so in a sense you could say that the trend on PC has been to render the HUD at native resolution irrespective of the chosen gameplay resolution? I mean, the prevalence of LCDs with a fixed native resolution could be driving this change as well.
 
Judging from what developers said in the 1080p thread started by fellow forum member Shortbread, that's a smart thing to do: instead of brute-forcing everything to 1080p, rendering some parts of the picture at different resolutions might give excellent results.

I wonder if games will ever be as good as things like Blu-ray video at reusing "old information" in the framebuffer to avoid spending extra computational resources on repetitive parts of a scene.
 
Yeah, LCDs really look bad when not at native resolution. It seems a natural thing to do now, given the prevalence of LCDs and the wide range of hardware performance.
 
I wonder if games will ever be as good as things like Blu-ray video at reusing "old information" in the framebuffer to avoid spending extra computational resources on repetitive parts of a scene.

It's not quite the same thing, but I guess what Killzone is doing is close to temporal compression. It draws half a frame and blends it with the previous frame in a kind of vertical interlace.
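
A toy version of that kind of reuse could look like the sketch below. This is my simplification: the missing columns are just kept from the previous output, which is only correct for static pixels, whereas an actual implementation would reproject the previous frame to account for motion.

// Toy interlaced temporal reuse: each frame only half the columns are
// rendered; the other half are carried over from the previous
// reconstructed frame.
#include <cstdint>
#include <vector>

struct Image {
    int w, h;
    std::vector<uint32_t> px;
    Image(int w_, int h_) : w(w_), h(h_), px(size_t(w_) * h_, 0) {}
    uint32_t& at(int x, int y) { return px[size_t(y) * w + x]; }
};

// `half` holds only every other column (w/2 wide) rendered this frame.
// Columns whose parity matches the frame index are refreshed from `half`;
// the remaining columns keep whatever the previous frame wrote.
void reconstruct(const Image& half, Image& full, int frameIndex) {
    int parity = frameIndex & 1;
    for (int y = 0; y < full.h; ++y)
        for (int x = 0; x < full.w; ++x)
            if ((x & 1) == parity)
                full.at(x, y) = half.px[size_t(y) * half.w + x / 2];
            // else: keep the column written on an earlier frame
}

int main() {
    Image full(1920, 1080);          // reconstructed output, persists across frames
    for (int frame = 0; frame < 4; ++frame) {
        Image half(960, 1080);       // what is actually rendered this frame
        // ... render the odd or even columns of the scene into `half` ...
        reconstruct(half, full, frame);
    }
    return 0;
}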
 