Dynamic framebuffer resolution

Alucardx23
Hello guys,

Can someone here explain why this is not something that is being done at a hardware/OS level on consoles? After reading the Digital Foundry article on Halo 5's dynamic resolution, it really seems like a great way to make sure a stable framerate is provided at all times. I'm personally a lot more sensitive to an unstable framerate than to a dynamic resolution. On PC, if a game cannot maintain 60fps, I prefer to lock it to 30fps, even when the average is 50/55fps. So in my case, momentarily lowering the resolution (1080p) by 10/15% to reach a stable 60fps would be imperceptible in most cases. I can understand how it wouldn't be so attractive if your top resolution is 720p, but current-gen consoles can now go a lot higher than that.

Related articles:

What works and what doesn't in Halo 5: Guardians
http://www.eurogamer.net/articles/d...works-and-what-doesnt-in-halo-5-tech-analysis

Face-Off: Rage
http://www.eurogamer.net/articles/digitalfoundry-rage-face-off

Face-Off: Wolfenstein: The New Order
http://www.eurogamer.net/articles/digitalfoundry-2014-wolfenstein-the-new-order-face-off

WipEout HD's 1080p Sleight of Hand
http://www.eurogamer.net/articles/wipeout-hds-1080p-sleight-of-hand

Dynamic Resolution Rendering on OpenGL* ES 2.0
https://software.intel.com/en-us/articles/dynamic-resolution-rendering-on-opengl-es-2

Games that support dynamic resolution (a rough sketch of the general control loop follows the list):

1- WipEout HD


"Basically WipEout HD is the first game I've come across that seems to be operating with a dynamic framebuffer. Resolution can alter on a frame-by-frame basis. Rather than introduce dropped frames, slow down or other unsavoury effects, the number of pixels being rendered drops and the PS3's horizontal hardware scaler is invoked to make up the difference. It's an intriguing solution that works with limited impact on the overall look of the game (the tearing has far more of an impact on image quality - I'm assuming that kicks in when the framebuffer can't scale any lower).

The actual amount of horizontal resolution being dropped can change on a frame by frame basis: 1728×1080, 1645×1080, 1600×1080, 1440×1080. All have been seen in the Digital Foundry TrueHD captures. The shots above appear to be 1500×1080. The dynamic framebuffer is really quite an innovative solution to the perennial 1080p problem. Even though we're seeing major differences in resolution, the human eye really will have trouble realising the difference when the detail level is changing so rapidly in such a fast moving game.
In short, it's making an advanced-looking game like WipEout HD work at 1080p60 and that's pretty damn awesome."

2- Rage

"Rage on console runs with a dynamic framebuffer that adjusts resolution according to engine load, but looks to run the game at as close to the full 1280x720 wherever possible. As you might expect from a game that operates at 60Hz, GPU time is at an absolute premium so neither version of the game features any kind of anti-aliasing, which is a bit of a shame as some of the more high contrast areas do lend themselves to noticeable "jaggies"."

"Load balancing is the key to Rage's enviable performance profile. The developers have put frame-rate first to the point where the game code physically downscales the visuals to make it run at 60Hz. We understand the principle behind it and we can see the way it works simply by looking at the captures we've taken, but there remains an element of mystery in what actually causes the scaler to kick in, which we'll come to in a moment. In terms of the basic mechanics, Rage operates an internal monitor that measures engine load and when the renderer is in danger of missing a 60Hz refresh, it downscales the next frame. A 720p vertical resolution is always maintained, but the horizontal can be adjusted dynamically - anything from 640x720 (cutting image definition in half, basically) up to full-fat 1280x720."

3- Tekken Tag Tournament 2

"Rather than render at a fixed sub-HD resolution, Tekken Tag Tournament 2 operates using a dynamic framebuffer, where the resolution is adjusted on the fly depending on the rendering load. For performance reasons, anti-aliasing has been also been completely omitted to free up GPU resources for Namco's impressive object blur effect. Both versions kick off running natively at 720p when there are just two characters on screen. As things heat up and combat intensifies the engine tends to drop the resolution down to a more manageable 1024x720 for extended periods of time: image quality is only slightly impacted, and we still get the feeling that we are looking at a near-HD presentation. It's certainly a step up from the fixed 1024x576 resolution we saw in the demo bundled with Tekken Hybrid.

Switching between characters and performing tag-team combos results in more severe compromises: according to a tech presentation given by Namco in Japan, the resolution is adjusted in various steps, with 900x720, 800x720, and 720x720 all being seen at various points. Resolution is pared back to its maximum extent when there are all four characters on screen in scenes which really tax the engine, and also in 3D mode where the rendering load is doubled up to generate distinct views for each eye."

4- Wolfenstein: The New Order

"After first isolating an obvious example of the tech at work on Xbox One, a more detailed look at the captures revealed that both versions of the game achieve their locked 60Hz update by adjusting the amount of pixels rendered at any given point, in effect balancing engine load in order to put consistent refresh and controller response first.

Having now completed our analysis, it's clear that the PS4 gains an advantage with smaller drops in resolution that occur less frequently than they do on Xbox One. Metrics in the area of 1760x1080 are found on PS4, while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game."

5- Halo 5

"343 has taken an approach similar to that of Rage, Wolfenstein The New Order, and WipEout HD - dynamic resolution scaling. By adjusting the resolution according to GPU load, Halo 5 is able to consistently reach its 16.67ms frame-time objective. It's a very impressive implementation too, bearing in mind how infrequent dropped frames are, to the point that they are essentially unnoticeable during play. Previous Halo 5 builds have utilised an adaptive v-sync, where torn frames are visible when the engine can't sustain its target frame-rate. In our latest captures we could only spot one solitary torn frame.

What this technology means for the player is a constantly changing resolution during gameplay, ranging from something in the region of 1152x810 all the way up to a full 1080p. Fortunately, based on what we've seen so far in this build, campaign mode gameplay tends to hang around 1344x1080, during which it's fair to say full HD is fleetingly attained during big set-piece battles, such as encounters with the giant Kraken in a later mission. Interior areas inevitably hold up better in this sense, with less strain put on the engine when the level design funnels the player in one direction. The pay-off is clear though; 60fps is locked at practically all times, while image quality shifts up and down the scale to ensure this consistency."

6- The Division

"Wherever you see heavy volumetric effects and lots of geometry on-screen, there's a strong possibility that The Division's frame buffer is switching to a lower pixel count on Xbox One. It's a clever trick - and the exact moment of the switch can be hard to catch by eye. In practise, it does start to blur spots in the image in direct comparison to PS4 and PC, particularly across long distances.

So what exactly is the resolution? Well, the short answer is, it varies depending on the rendering scenario. In the opening shot, Xbox One drops to 1792x1008 as we look at the recovery camp ahead, as especially noticeable on a rooftop structure to its right side. This is still much higher than the 900p output we see in other titles, and it ultimately sits around 87 per cent of a full HD image overall. In other words, it isn't always a vast difference, but it's notable in explaining why we see certain details losing some definition.

The logic behind this dynamic resolution is easy to explain too. Inevitably, looking up to the sky in the same initial area - even slightly - to force the demanding elements of this scene out of view, the Snowdrop engine switches itself to a full 1080p. But as you'd expect, lowering the camera again (bringing all objects and fog effects into view) gets us back to 1792x1008.

The end result is an occasional softening of the image, and tends to occur around outdoor scenes as opposed to interior traversal and combat. In terms of the actual breadth of resolution changes, we've not had time to assess the full range utilised by the Xbox One beta - though the lowest figure we've logged comes in at 1728x972 (or 81 per cent of a full HD resolution)."
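
All of these implementations differ in the details, but the basic feedback loop is the same. Here is the rough sketch promised above. It is only an illustration: the engine hooks (gpuFrameTimeMs, renderScene, presentScaled) are hypothetical placeholders, and the budget thresholds and step sizes are made up.

Code:
#include <algorithm>

// Rough sketch of a CPU-side dynamic resolution controller in the
// spirit of the games above: render into a sub-rectangle of a fixed
// 1080p-high target and let the hardware scaler stretch it out.
const float kBudgetMs  = 16.67f; // 60Hz frame budget
const int   kFullWidth = 1920;
const int   kMinWidth  = 960;    // lowest acceptable horizontal res
const int   kHeight    = 1080;   // vertical resolution stays fixed

float gpuFrameTimeMs();                     // last measured GPU time (hypothetical)
void  renderScene(int width, int height);   // hypothetical engine hook
void  presentScaled(int width, int height); // scaler stretches to full screen

int renderWidth = kFullWidth;

void frame()
{
    float t = gpuFrameTimeMs();

    // Back off quickly when close to blowing the budget; recover
    // slowly so the resolution doesn't visibly oscillate.
    if (t > kBudgetMs * 0.95f)
        renderWidth -= 128;
    else if (t < kBudgetMs * 0.80f)
        renderWidth += 32;

    renderWidth = std::clamp(renderWidth, kMinWidth, kFullWidth);

    renderScene(renderWidth, kHeight);
    presentScaled(renderWidth, kHeight);
}

The asymmetry is deliberate: dropping resolution fast avoids a missed vsync, while climbing back gradually keeps the image from pumping up and down every frame.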
 
Most probably because it's not easy to implement. Keep in mind even H5 drops a few frames (although rarely), so it's still not perfect.
 
Most probably because it's not easy to implement. Keep in mind even H5 drops a few frames (although rarely), so it's still not perfect.

I'm OK with 99% perfect ;). In my example I was referring more to the possibility of something the manufacturer would include at an OS/hardware level, so the developer wouldn't need to worry about it and could just turn it on if needed.
 
Can someone here explain why this is not something that is being done at a hardware/OS level on consoles?

It's just not something that's feasible for the OS to do, since the game's software is in control of almost everything on the GPU (particularly for console games, which may have arbitrary access to memory and command buffers). It could potentially require integration into every part of the rendering pipeline, depending on how the engine is set up and how the dynamic resolution is implemented.
 
Most probably because it's not easy to implement. Keep in mind even H5 drops a few frames (although rarely), so it's still not perfect.

Those could be CPU hiccups where a resolution change won't help? I/O glitches or similar?

Edit: lines for me

must read all replies before replying.
must read all replies before replying.
must rea........
 
It wouldn't make sense at an OS/hardware level.
Every game engine is different, and the reason dynamic resolution exists in the games you listed is that the engine scales down its resolution to match its rendering budget.

At an OS/hardware level the manufacturer wouldn't be able to predict the workload of games and scale accordingly. It's up to the developer in this case.

Perhaps this could be something in the SDK, if there were an API or code developers could use to implement it in their games if they wanted.
 
Dynamic resolution is tricky to implement properly (especially on PC) since the CPU and GPU are running asynchronously. It is impossible to predict perfectly whether the current frame is going over budget, and if the frame time overflows the budget, you get that information (on the CPU) 1-3 frames later. Thus CPU-side timing is not a perfect fit for dynamic resolution, or any other system requiring low latency response. You need to be overly conservative in order to avoid missing vsync spuriously.

AMD GCN has a GPU timer, see timeAMD extension below:
https://www.opengl.org/registry/specs/AMD/gcn_shader.txt

GPU timer allows the GPU itself to notice that time budget is overflowing, even during the same frame, and react immediately (zero latency). For example after the G-buffer rendering, you could check the timer (in a compute shader) and choose the lighting & post processing resolution based on the remaining frame budget (use indirect draws/dispatches). I would like to see GPU timer hardware exposed for all the PC GPUs (by DirectX 12 and Vulkan). This would make dynamic resolution more common (also on PC).
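
To make the idea concrete, here is a rough sketch of what that budget-checking compute shader could look like. This assumes GL_AMD_gcn_shader plus 64-bit integer support; the buffer layout, the tick-to-millisecond conversion and the tile counts are all invented for illustration, and a one-thread dispatch at the top of the frame is assumed to have stored the starting timestamp.

Code:
// GLSL compute shader (embedded as a C++ string constant) that runs
// after the G-buffer pass, checks elapsed GPU time via timeAMD(),
// and writes the group counts for the lighting pass into an
// indirect-args buffer.
static const char* kBudgetCheckCS = R"(
#version 450
#extension GL_AMD_gcn_shader : require
#extension GL_ARB_gpu_shader_int64 : require
layout(local_size_x = 1) in;

layout(std430, binding = 0) buffer FrameClock {
    uint64_t frameStartTicks;   // written by a tiny dispatch at frame start
};
layout(std430, binding = 1) buffer LightingArgs {
    uint groupsX, groupsY, groupsZ; // consumed by glDispatchComputeIndirect
};

const float TICKS_PER_MS = 1000000.0; // assumed clock rate; not defined by the spec
const float BUDGET_MS    = 16.67;

void main() {
    float elapsedMs = float(timeAMD() - frameStartTicks) / TICKS_PER_MS;

    // Already past ~70% of the budget? Light at half resolution.
    bool overBudget = elapsedMs > BUDGET_MS * 0.7;
    groupsX = overBudget ? 60u : 120u;  // 16-pixel tiles: 960 vs 1920 wide
    groupsY = overBudget ? 34u : 68u;   // 540 vs 1080 tall
    groupsZ = 1u;
}
)";

// The lighting pass is then launched with whatever counts the shader
// wrote, without the CPU ever seeing the timing data:
//   glBindBuffer(GL_DISPATCH_INDIRECT_BUFFER, lightingArgsBuffer);
//   glDispatchComputeIndirect(0);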
 
Dynamic resolution is tricky to implement properly (especially on PC) since the CPU and GPU are running asynchronously. It is impossible to predict perfectly whether the current frame is going over budget, and if the frame time overflows the budget, you get that information (on the CPU) 1-3 frames later. Thus CPU-side timing is not a perfect fit for dynamic resolution, or any other system requiring low latency response. You need to be overly conservative in order to avoid missing vsync spuriously.

AMD GCN has a GPU timer, see timeAMD extension below:
https://www.opengl.org/registry/specs/AMD/gcn_shader.txt

GPU timer allows the GPU itself to notice that time budget is overflowing, even during the same frame, and react immediately (zero latency). For example after the G-buffer rendering, you could check the timer (in a compute shader) and choose the lighting & post processing resolution based on the remaining frame budget (use indirect draws/dispatches). I would like to see GPU timer hardware exposed for all the PC GPUs (by DirectX 12 and Vulkan). This would make dynamic resolution more common (also on PC).

Thank you for the excellent answer sebbi. Maybe I didn't present my question in the correct way, but it was more about why this is not included by the console manufacturer as something that is simply there for the developer to use, just like the included upscaler. Your answer gives me a good idea why. Thanks.
 
Dynamic resolution is tricky to implement properly (especially on PC) since the CPU and GPU are running asynchronously. It is impossible to predict perfectly whether the current frame is going over budget, and if the frame time overflows the budget, you get that information (on the CPU) 1-3 frames later. Thus CPU-side timing is not a perfect fit for dynamic resolution, or any other system requiring low latency response. You need to be overly conservative in order to avoid missing vsync spuriously.

AMD GCN has a GPU timer, see timeAMD extension below:
https://www.opengl.org/registry/specs/AMD/gcn_shader.txt

GPU timer allows the GPU itself to notice that time budget is overflowing, even during the same frame, and react immediately (zero latency). For example after the G-buffer rendering, you could check the timer (in a compute shader) and choose the lighting & post processing resolution based on the remaining frame budget (use indirect draws/dispatches). I would like to see GPU timer hardware exposed for all the PC GPUs (by DirectX 12 and Vulkan). This would make dynamic resolution more common (also on PC).


This nonsense isn't needed on PC; that's why you buy a PC :D Keep it away!
 
Thank you for the excellent answer sebbi. Maybe I didn't present my question in the correct way, but it was more about why this is not included by the console manufacturer as something that is simply there for the developer to use, just like the included upscaler. Your answer gives me a good idea why. Thanks.

If I'm not mistaken, the Xbox One's GPU fully supports dynamic resolution scaling. It was talked about in the Digital Foundry Xbox One architects interview. The PS4 probably has support built into its GPU as well, but the way the interview put it makes me think the Xbox supports it on a software level as well.
 
Well, at least the scaler hardware supports dynamic res. I might be wrong, but for the GPU this shouldn't be that hard... well, at least the res switch, not the analysis of what res the next frame should be.
 
If I'm not mistaken, the Xbox One's GPU fully supports dynamic resolution scaling. It was talked about in the Digital Foundry Xbox One architects interview. The PS4 probably has support built into its GPU as well, but the way the interview put it makes me think the Xbox supports it on a software level as well.

They do talk about this during the interview, but not as something that is included in the GPU.

"What we're seeing in titles is adopting the notion of dynamic resolution scaling to avoid glitching frame-rate. As they start getting into an area where they're starting to hit on the margin there where they could potentially go over their frame budget, they could start dynamically scaling back on resolution and they can keep their HUD in terms of true resolution and the 3D content is squeezing. Again, from my aspect as a gamer I'd rather have a consistent frame-rate and some squeezing on the number of pixels than have those frame-rate glitches."

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
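
Incidentally, the "keep their HUD in terms of true resolution and the 3D content is squeezing" part of that quote maps to a pretty simple frame layout. A minimal OpenGL sketch of the idea, assuming a pre-created full-size offscreen target and hypothetical drawScene/drawHud helpers:

Code:
// Draw the 3D scene into a sub-viewport of an offscreen target,
// stretch it over the backbuffer, then composite the HUD at native
// resolution so text and markers stay sharp.
void renderFrame(GLuint sceneFbo, int renderW, int renderH)
{
    // 1. Render the scene into the lower-left renderW x renderH
    //    corner of the full-size offscreen target.
    glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
    glViewport(0, 0, renderW, renderH);
    drawScene();

    // 2. Stretch that region over the full 1920x1080 backbuffer.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, sceneFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, renderW, renderH,
                      0, 0, 1920, 1080,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);

    // 3. HUD goes on top, after the upscale, at true 1080p.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glViewport(0, 0, 1920, 1080);
    drawHud();
}

Because the HUD is composited after the upscale, it stays pixel-sharp even on frames where the 3D content drops well below 1080p (on the consoles themselves the final stretch would be done by the hardware scaler rather than a blit, as the WipEout HD article describes).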
 
Considering how well it seems to be implemented in Halo, if you could get the same on PC, you could push your settings and potentially get imperceptible momentary resolution drops instead of noticeable stutters from dropped frames. I know which I'd choose.
 
Would you really prefer to lower the graphics settings for the whole game when your hardware is not capable of that perfect 60fps (e.g. 56/57fps), instead of just having a dynamic resolution that kicks in when needed?
Definitely, but PC GPUs are already loads faster than consoles and so long as you're running at 1080p it isn't hard to maintain 60fps.

Hell, I'm currently running a 1600x900 LCD with my GTX 970, which is a complete joke :)
 
...even when the average is 50/55fps.
30fps gives better pacing in your default case of syncing to 60Hz while only achieving 50fps: at a steady 50fps on a 60Hz display, every fifth frame has to be held for two refresh intervals (33.3ms instead of 16.7ms), which reads as judder. So be smart: sync to 50Hz and fully enjoy smooth updates at your achievable average.

Dynamic resolution has to be better understood in terms of the overhead needed for the control logic.
 
30fps gives better pacing in your default case of syncing to 60Hz while only achieving 50fps: at a steady 50fps on a 60Hz display, every fifth frame has to be held for two refresh intervals (33.3ms instead of 16.7ms), which reads as judder. So be smart: sync to 50Hz and fully enjoy smooth updates at your achievable average.

Dynamic resolution has to be better understood in terms of the overhead needed for the control logic.

Locking the refresh rate to 50Hz works great for a monitor, but it's not an option for consoles, as most TVs are locked at 60Hz. It could be dynamic resolution or some other method, but as VR becomes more and more popular, maintaining a locked framerate will become a priority.
 
Locking the refresh rate to 50Hz works great for a monitor, but it's not an option for consoles, as most TVs are locked at 60Hz.

Do you have any data to support this? AFAIK most newer TVs support 24, 30, 50 and 60Hz. But you might know something I don't.
 
Do you have any data to support this? AFAIK most newer TVs support 24, 30, 50 and 60Hz. But you might know something I don't.

Pretty sure all EU HDTVs do 50Hz, as all broadcasters here use 50Hz for HD content.

The OS can lock to 50Hz so snapped TV runs smoothly; it would be nice for games to have the ability to target 50Hz if asked by the OS, and to put those extra cycles to good use.
 