Dynamic resolution on XB1

I linked to the article so people could see where I was quoting from.
I'm assuming you meant dynamic resolution and not framerate as that is something else.

How many games were starting to use dynamic res on the X360?
They seemed to make changes to the scaler to help facilitate it, yet I've not seen or heard of its use in current or future XB1 games.
You're pretty much saying what I said; the main difference is I'm curious as to why we haven't seen any indication that it's going to be used any time soon.
I'm fully aware we're at the very start of this generation, but we're starting to see games that were developed for this gen and not simply previous-gen engines ported up.
Is it as simple as what a dev at Naughty Dog said?
WipEout did that iirc, they had this horizontal scaling where rendering width decreased on the fly in frantic moments.
But tbh racing games are special; it would probably look bad in most other game genres : )
http://www.dualshockers.com/2014/07...-work-graphics-coders-freak-out-then-succeed/
Yet, as mentioned in this thread, it was used in Killzone Mercenary.
I would think the higher the resolution, the less noticeable a slight drop here and there would be during gameplay.

I've got a feeling I'm just going to have to stay curious and wait until it's used :LOL: (Maybe the next Forza Motorsport game if that ND dev was correct)
 
My bad, yeah, I meant dynamic resolution. There are two games that have dynamic res
on the X1: Wolfenstein: The New Order and Tomb Raider DE. Tomb Raider gameplay is 1080p, but the real-time cinematics will drop to 900p when needed.
Like you said, it's too early in the gen to determine whether or not dynamic framebuffers will become commonplace. There are challenges with going this route, which are explained in posts above by other members. You need to keep in mind, though, that dynamic res isn't solely going to be used on the X1.
 
Oh yeah, I'm fully aware this isn't an XB1-specific thing.
I was mainly talking about the XB1 because of their talk about their scaler and how it would help facilitate this etc.
I wasn't aware that it was used in Wolfenstein, as iroboto also mentioned. Thanks.
http://www.eurogamer.net/articles/digitalfoundry-2014-wolfenstein-new-order-performance-analysis
UPDATE 23/5/14 10:05: We've analysed more assets and now believe that a dynamic framebuffer is indeed in effect - on both Xbox One and PlayStation 4 versions. MachineGames does appear to be using the technique used in Rage, where horizontal resolution is scaled according to engine load in order to sustain 60fps.
It'll be interesting to hear whether they use the added facilities in the scaler or not.
I just found it strange that MS had to be the one to change the scaling sharpening, even though the devs are supposed to have access to the scaler.

It just occurred to me that even if a game in development was going to use dynamic res, they might not say so, given the <1080p discussions it would elicit lol.
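
For reference, the Rage-style approach Eurogamer describes is roughly: keep the vertical resolution fixed, shrink the horizontal render width when the GPU falls behind, and let the scaler stretch the output back to full width. A minimal sketch of the idea (not MachineGames' actual code; the budget and clamp values are invented):

[CODE]
// Sketch of Rage-style horizontal-only dynamic resolution: height stays
// fixed, width shrinks under load, and the hardware scaler stretches the
// result back out to 1920 at scan-out.
#include <algorithm>
#include <cstdint>

struct DynamicWidth {
    static constexpr uint32_t kFullWidth = 1920;
    static constexpr uint32_t kMinWidth  = 1152;   // ~60% floor, invented
    static constexpr float    kBudgetMs  = 16.6f;  // 60 fps GPU budget

    uint32_t renderWidth = kFullWidth;

    // Feed in the measured GPU time of the previous frame.
    void update(float lastGpuMs)
    {
        // Scale the width in proportion to how far off budget we were.
        float w = renderWidth * (kBudgetMs / lastGpuMs);
        renderWidth = std::clamp(static_cast<uint32_t>(w), kMinWidth, kFullWidth);
        renderWidth &= ~7u;  // keep the width a multiple of 8 for alignment
    }
};
[/CODE]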
 
I think people make way too big a deal about resolution sometimes.
Yeah, it's great to have 1080p games, but on a console I think consistent framerates are more important.
Dynamic res can help games get there. I hope devs use it more.
 
How about designing your games to have a consistent framerate and a consistent resolution instead?

I really hate what dynamic res does to games. Ninja Gaiden 3 looks absolute trash IQ-wise compared to NGS2 because it's always dropping down to some absurd resolution, turning everything muddy and blurry.

Same with TTT2: it's much worse than Tekken Revolution IQ-wise, because they dropped the dynamic res and it was a better-looking game for it. With TTT2, you could not even see the rendered assets of the stages, even with motion blur turned off.
 
The problem with a consistent framerate and res is that you've got unused resources when there's nothing happening. For me, I want at least a little scaling in quality when things get hectic, to get that consistent framerate while having better IQ when things aren't as hectic. It doesn't have to be dynamic resolution only (which for me is fine as long as the drop is not too much; 1080p to 900p is ok), but it can also be LOD based on engine stress, and effects scaling too. The key thing is that the transition must be smooth, or at least not too obvious.
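
A minimal sketch of that kind of multi-knob scaling, with invented thresholds and a smoothed stress value so the transitions stay gradual:

[CODE]
// Scale more than just resolution: a smoothed "stress" value drives
// resolution, LOD bias and effect quality together, so no single knob
// jumps visibly from one frame to the next. All thresholds invented.
#include <algorithm>

struct QualityState {
    float resolutionScale = 1.0f;  // 1.0 = 1080p, ~0.83 = 900p
    float lodBias         = 0.0f;  // positive = coarser meshes/mips
    int   particleBudget  = 100;   // percent of authored particle counts
};

QualityState adapt(float lastGpuMs, float& smoothedStress)
{
    const float budgetMs = 16.6f;
    // Exponential smoothing: single-frame spikes barely move the dials,
    // only sustained load does, which keeps transitions gradual.
    smoothedStress = 0.9f * smoothedStress + 0.1f * (lastGpuMs / budgetMs);

    float over = std::max(0.0f, smoothedStress - 1.0f);  // how far over budget

    QualityState q;
    q.resolutionScale = std::clamp(1.0f - 0.5f * over, 0.83f, 1.0f);
    q.lodBias         = std::min(1.0f, 2.0f * over);
    q.particleBudget  = 100 - static_cast<int>(std::min(50.0f, 100.0f * over));
    return q;
}
[/CODE]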
 
How about designing your games to have a consistent framerate and a consistent resolution instead?

Sure. Just design them to run on a NES or Atari 2600. On those platforms, predicting just how much time will be used to build a new frame is easy.

On platforms with out-of-order processors, caches, and GPUs that dynamically schedule memory accesses, it's not quite that simple.
 
Even 8-bit consoles had slowdown and flicker ... flicker could be a (literal) killer in sprite-heavy action games!

Variable frame rates wouldn't be so bad if monitors were capable of variable refresh rates, with sync for each frame based on a sync signal from the video output. It would eliminate any need for tearing, would greatly reduce judder, and give a window of acceptable frame rates for a game depending on the type of gameplay (e.g. 45~60 for shootbangers, 25+ for graphically awesome turn-based RPGs).

As it is, consistent frame rate should be a priority, with "1080p" or fixed res being massively further down the list.
 
Sure. Just design them to run on a NES or Atari 2600. On those platforms, predicting just how much time will be used to build a new frame is easy.
You can certainly get a constant 60 fps on consoles by targeting well less than 100% utilisation (eg. make a Pong game a la Atari 2600). On PC it may be trickier because there's always something happening in the background to glitch the framerate. If you're sufficiently under-demanding and complete your game code and render in a few ms, there should be enough time left to fit in some other task until the next frame.
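
A minimal sketch of that headroom-first frame loop, with the engine hooks stubbed out as hypothetical placeholders:

[CODE]
// Finish the game + render work early, then spend whatever is left of the
// 16.6 ms frame on interruptible background jobs until just before vsync.
#include <chrono>

using Clock = std::chrono::steady_clock;

// Hypothetical engine hooks, stubbed so the sketch stands alone.
void updateGame() {}
void submitRender() {}
bool runOneBackgroundJob() { return false; }  // false = job queue empty
void waitForVsync() {}

void frame(Clock::time_point frameStart)
{
    constexpr auto kFrame  = std::chrono::microseconds(16667);
    constexpr auto kMargin = std::chrono::microseconds(500);  // safety pad

    updateGame();     // finishes in a few ms by design
    submitRender();

    // Fill the leftover budget with background work (streaming,
    // decompression) until just before the next vsync.
    while (Clock::now() - frameStart < kFrame - kMargin) {
        if (!runOneBackgroundJob())
            break;
    }
    waitForVsync();
}
[/CODE]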
 
I've been playing Metro on PS4... and let me tell you, this console can handle good-enough graphics at 1080p and 60fps, basically locked. Just manage your game budget appropriately.

Dead or Alive 5: Last Round is gonna look great and run at 1080p locked on both consoles. Why? Because it's not stressing the hardware at all. That's what developers should be aiming for: designing games with lower requirements than the actual base hardware, leaving headroom for performance and IQ.
 
Sure, that's a totally valid and reasonable way: not stressing the hardware, managing budget, leaving a big headroom.
It's not the only way, though.

I would prefer to stress the hardware and leave as little headroom as possible.
There are times when a framerate or res drop wouldn't be noticeable unless you were specifically looking for it.
Your way would probably look better on the stats in DF breakdowns, though.

This isn't some sort of magic bullet.
It's just another approach to engine/game design, one that I personally would prefer in games it would work for.
It also doesn't mean that you wouldn't have to manage your game budget.
 
I agree.
 
You can certainly get a constant 60 fps on consoles by targeting well less than 100% utilisation (eg. make a Pong game a la Atari 2600). On PC it may be trickier because there's always something happening in the background to glitch the framerate.

AMEN, BROTHER!

Windows always seems to schedule something stupid just in time to stomp on some time-critical code, which gets pushed out of the cache and then has to be reloaded, creating huge delays.

Four cores help because at least a few cores' caches are spared the thrashing.
 
I have programmed many locked 60 fps console games.

If you want to reach a locked 60 fps (target: never drop a single frame), your average frame rate would hover somewhere above 70 (if the fps weren't locked). Never dropping more than 10 fps below your average fps (=16% drop) requires a huge amount of quality assurance and testing.

It's worth noting that since the unlocked average would be 70 fps, and the low is obviously 60 fps, reaching 80 fps is as common as dropping to 60 fps. In this particular case (the 80 fps situation), the 60 fps lock causes the GPU to idle for 25% of the frame time. Obviously you could make the game 25% heavier to render (25% "better" graphics), but then the minimum frame rate of 60 fps would also become a minimum frame rate of 45 fps... It would judder and tear like crazy :(
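
Checking the arithmetic on the 80 fps case (a 16.7 ms frame budget at 60 fps):

\[
t_{80} = \frac{1000}{80} = 12.5\ \mathrm{ms}, \qquad
\text{idle fraction} = \frac{16.7 - 12.5}{16.7} \approx 25\%
\]

And with rendering made 25% heavier, the old 16.7 ms worst case grows to about 16.7 × 1.25 ≈ 20.8 ms per frame, i.e. a minimum in the mid-to-high 40s fps, in line with the figure above.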

Dynamic image quality scaling is a good way to solve this problem. Many developers have been researching methods to achieve nice-looking seamless scaling, allowing the GPU to be fully utilized at all times (better-looking graphics) while never dropping frames from your target. I hope these kinds of techniques become more common in the future.
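
A minimal sketch of such a controller (no particular studio's implementation): the previous frame's measured GPU time drives a per-axis resolution scale, targeting slightly under the 16.6 ms budget so the lock survives small spikes. The headroom, damping and floor values are invented:

[CODE]
#include <algorithm>
#include <cmath>
#include <cstdint>

struct DynamicResController {
    float scale = 1.0f;  // per-axis fraction of 1920x1080

    uint32_t width()  const { return static_cast<uint32_t>(1920 * scale); }
    uint32_t height() const { return static_cast<uint32_t>(1080 * scale); }

    void update(float lastGpuMs)
    {
        const float targetMs = 16.6f * 0.92f;  // ~8% safety headroom, invented
        // GPU time roughly tracks pixel count, and pixel count tracks
        // scale^2, so correct the per-axis scale by the square root.
        float correction = std::sqrt(targetMs / lastGpuMs);
        // Damp the step so quality ramps smoothly instead of oscillating.
        scale *= 1.0f + 0.25f * (correction - 1.0f);
        scale  = std::clamp(scale, 0.75f, 1.0f);  // 810p floor, invented
    }
};
[/CODE]

The damping factor trades reaction speed for stability: too aggressive and the image visibly "breathes" as the scale oscillates around the target.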
 
Rather than simply dropping resolution, couldn't you do something like drop AA, reflections, or tessellation? I can imagine a scenario where a large explosion happens near the player, causing a framerate drop. In that situation, wouldn't it be better to spend the rendering budget on making the explosion look incredible, since the player won't actually be paying attention to the ancillary details?

Art suggests that your eyes are drawn to certain aspects of a painting or a photograph, and that you should frame the image accordingly. Couldn't something similar happen with a game, so that you're drawn to the important part of the image, and a cut-down version of the less important parts is rendered at that particular moment? Then fade the other aspects back in afterwards.
 
Rather than simply dropping resolution, couldn't you do something like drop AA, reflections, or tessellation? I can imagine a scenario where a large explosion happens near the player, causing a framerate drop. In that situation, wouldn't it be better to spend the rendering budget on making the explosion look incredible, since the player won't actually be paying attention to the ancillary details?
Firstly, you're mistaken. If all the glass windows suddenly lost their reflections, you'd absolutely notice it. So would a shrinking draw distance or shadows suddenly stopping for a few frames. Brains are sensitive to change, and a dramatic change even for a frame will flash for attention. You could drop AA without making a huge visual difference, but that's basically the same as reducing resolution: reducing the number of data points sampled to construct the frame. It'd be the same as a marginal resolution drop in terms of the performance it releases back to the system.

You could use some art to hide the difference, such as blurring out the background (render at lower res and blur) and compositing the big explosion or whatever effect on top, but that'll be applicable in limited circumstances for limited games. A typical slow-down moment will have lots and lots happening all over and you won't know where the player is looking.

Secondly, you still have the same issue as adaptive resolution: if you can't predict when the framerate is going to take a hit, you'll drop a frame before adapting your workload to render subsequent frames faster. If you want to run a game at 60 fps with no frame drops whatsoever, you either need a 100% predictable renderer, so you know when demanding rendering is coming and can adapt ahead of time (perhaps possible with fixed hardware and some workloads), or you design your game to never exceed the 60 fps frame budget.

The lookahead method with dynamic resolution sounds interesting. I wonder if, during development, profiling of the zones/events that cause slowdown could make those situations predictable? I wouldn't have thought so for most engines running several frames of processing concurrently.
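
If that kind of profiling did pan out, a hypothetical version might tag scripted events with their captured GPU cost at dev time and pull quality down a few frames before they fire, instead of reacting one frame late. All names and numbers here are invented for illustration:

[CODE]
#include <cmath>
#include <string>
#include <unordered_map>

struct CostHint {
    float expectedGpuMs;  // worst case captured during dev-time profiling
    int   leadFrames;     // how early to start scaling down
};

// Invented example data; a real table would come from profiling captures.
const std::unordered_map<std::string, CostHint> kProfiledEvents = {
    {"bridge_explosion", {19.5f, 3}},  // known to blow the 16.6 ms budget
    {"boss_intro_fx",    {17.8f, 2}},
};

// Called when gameplay code schedules a scripted event: returns the
// per-axis resolution scale to apply before the event actually hits.
float preemptiveScaleFor(const std::string& eventId)
{
    auto it = kProfiledEvents.find(eventId);
    if (it == kProfiledEvents.end())
        return 1.0f;  // unknown event: fall back to reactive scaling
    float ratio = 16.6f / it->second.expectedGpuMs;
    return ratio < 1.0f ? std::sqrt(ratio) : 1.0f;
}
[/CODE]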
 