What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


From DF:
"However, even basic ports which barely use any of the Series X's new features are delivering impressive results. The Coalition's Mike Rayner and Colin Penty showed us a Series X conversion of Gears 5, produced in just two weeks. The developers worked with Epic Games in getting UE4 operating on Series X, then simply upped all of the internal quality presets to the equivalent of PC's ultra, adding improved contact shadows and UE4's brand-new (software-based) ray traced screen-space global illumination. On top of that, Gears 5's cutscenes - running at 30fps on Xbox One X - were upped to a flawless 60fps. We'll be covering more on this soon, but there was one startling takeaway - we were shown benchmark results that, on this two-week-old, unoptimised port, already deliver very, very similar performance to an RTX 2080."

The 2080 as a benchmark is not relevant to this discussion. I just need to know that it's running at 60fps ultra+, and that gives me a fairly good idea of how well it's going to perform when optimized.

So I'm wondering if that's

SX @ Ultra w/ SSGI + improved AO

vs

2080 Ultra vanilla ice-cream
 
Similar performance to RTX 2080 + your benchmark = 49.8 FPS. Of course the information about how the 2080 performs is relevant; it's the only thing that provides a fair comparison.

[Image: Gears 5 4K benchmark chart (gears-5-3840-2160.png)]


You are using the 60fps number and then throwing it into a PC gaming benchmark to say the XBSX is 50% faster. It's not an honest comparison and defies basic logic: either the 2080 runs Gears 5 at 4K with ultra settings at 60fps, or it runs it at 49fps. Both cannot be true.
Or DF just assumed a 2080 was sufficient to hit 60fps at ultra, which is why they said similar performance to a 2080. And if not, it's still running higher settings than ultra. Much higher.

The future is clear with respect to this title: Gears 5 will be patched to 60fps with ultra-equivalent settings. That's a clear goal for them.

The biggest difference is that we are measuring two like architectures for performance, yet we are given only one number for the XSX. We are doing an apples-to-apples comparison here, or perhaps a Royal Gala to a Fuji.

Comparing it to the Turing architecture is the definition of an apples-to-oranges comparison.
 
So I'm wondering if that's

SX @ Ultra w/ SSGI + improved AO

vs

2080 Ultra vanilla ice-cream
Indeed. And we know the cost of Lumen was the greatest performance hit in the UE5 demo.

Perhaps a pseudo-precursor to Lumen.
 
Indeed. And we know the cost of Lumen was the greatest performance hit in the UE5 demo.
Epic's new business model will be to improve Lumen performance on your personal console by buying V-Bucks.
 
Wow, I'm getting called out while people here say that games will run at 1440p@30 on PS5 and at 4K@30 on XBSX.
The resolution difference is speculation - may be right, may be wrong. Your 10% difference is factually incorrect, unless you can present some clever maths to show otherwise. So yes, you're called out for false facts on B3D. ;)

By all means dispute the claims that PS5 will have to render at half the resolution, but do so with facts and legitimate arguments. You could have (and should have) used a 20% figure for the difference in broad graphics power.

Edit: 20% less GPU power equating to 20% lower resolution would be something near 1900p for PS5, vs 2160p (4K is really non-descriptive and I think we shouldn't use it in resolution discussions as it doesn't provide an at-a-glance comparison)
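
For anyone who wants to sanity-check that figure, here's the back-of-envelope version (assuming the ~20% applies to pixel count and the aspect ratio stays 16:9):

Code:
# 20% less GPU power -> ~20% fewer pixels at the same framerate, 16:9 aspect.
full_pixels = 3840 * 2160            # 8,294,400 pixels at 2160p
scaled_pixels = full_pixels * 0.8    # 20% fewer pixels
height = (scaled_pixels * 9 / 16) ** 0.5
print(round(height))                 # ~1932, i.e. roughly "1900p"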
 
The resolution difference is speculation - may be right, may be wrong. Your 10% difference is factually incorrect, unless you can present some clever maths to show otherwise. So yes, you're called out for false facts on B3D. ;)

By all means dispute the claims that PS5 will have to render at half the resolution, but do so with facts and legitimate arguments. You could have (and should have) used a 20% figure for the difference in broad graphics power.

Edit: 20% less GPU power equating to 20% lower resolution would be something near 1900p for PS5, vs 2160p (4K is really non-descriptive and I think we shouldn't use it in resolution discussions as it doesn't provide an at-a-glance comparison)

So a 10% difference deserves a callout, but saying that one console will render the same game at less than half the resolution of the other is speculation (hint hint: more than a 50% difference), which, in addition, is based on some cherry-picked data points and benchmarks that are completely misleading.

Or DF just assumed a 2080 was sufficient to hit 60fps at ultra, which is why they said similar performance to a 2080. And if not, it's still running higher settings than ultra. Much higher.

The future is clear with respect to this title: Gears 5 will be patched to 60fps with ultra-equivalent settings. That's a clear goal for them.

The biggest difference is that we are measuring two like architectures for performance, yet we are given only one number for the XSX. We are doing an apples-to-apples comparison here, or perhaps a Royal Gala to a Fuji.

Comparing it to the Turing architecture is the definition of an apples-to-oranges comparison.

What are you talking about? The benchmark with the RTX 2080 was done by Microsoft.


"For the benchmark they ran it at full ultra mode with no extra settings and THEY compared directly with a PC with an RTX 2080 and a 2950X"
"What that benchmark shows us is XboX Series X produce nearly identical results"
"The PC has some advantages but the performance is on par with an RTX 2080"

It's very simple:

1- Did MS achieve 60fps with the 2080 at 4K with ultra settings? Yes.
2- Is the 2080S better than the 2080? Yes
3- Would it make sense for the 2080 to outperform a 2080S in a benchmark? No
4- Is it misleading to say that the XBSX has a 50% advantage using a benchmark where the 2080S can barely reach 50fps? Yes.

Do you understand why you can't use the 60fps figure in that benchmark? Unless you want to tell us that the XBSX performs better than the 2080TI. Apparently that is "speculation" here. Lol.
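
Just to put numbers on point 4, taking the ~50fps chart figure and the 60fps claim both at face value (which is exactly the problem):

Code:
# Mixing the two numbers that shouldn't be mixed:
chart_fps = 49.8     # the 4K ultra figure posted above for a 2080-class card
claimed_fps = 60     # the locked 60fps being attributed to Series X
print(round(claimed_fps / chart_fps, 2))   # ~1.2 -> ~20% faster than that card,
# which contradicts "performance on par with an RTX 2080" and pushes the
# result into 2080 Ti territory, hence the objection.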
 
*Damage Control

I think it was very telling that Gears 5 ran at a higher framerate on a 2080 than on Series X. AMD provides a great solution for both Sony and MS and, while they have made a lot of progress on the CPU front, they don't seem to be able to close the gap with NVIDIA on the GPU side. They can compete in the mid-tier market, which may be enough for them, but they can't catch 2080 or even 2070 Super performance (the 5700 XT is very close, though). Will RDNA2 close the gap? My bet is that a 3060 (even with less theoretical performance) paired with a 3600X will offer better performance in multiplat games than both consoles.

The PC used for the demo comparison had a better CPU. I expect the XSX to top out at near 2080 level. It will be utterly outclassed once Ampere is out.
 
So a 10% difference deserves a callout, but saying that one console will render the same game at less than half the resolution of the other is speculation (hint hint: more than a 50% difference), which, in addition, is based on some cherry-picked data points and benchmarks that are completely misleading.

"For the benchmark they ran it at full ultra mode with no extra settings and THEY compared directly with a PC with an RTX 2080 and a 2950X"
"What that benchmark shows us is XboX Series X produce nearly identical results"
"The PC has some advantages but the performance is on par with an RTX 2080"
Well, I apologize, I didn't watch this particular video; I was running off information from the original article and interpreted it poorly. But this is good info.

I was wrong about the benchmark, but that still doesn't change the nature of my argument.
You are largely fixated on the wrong argument: I'm not saying XSX will run 4K60 and therefore PS5 will run 4K30.

I'm saying that if a game is optimized exactly for 4K60 on XSX, then by default PS5 will run below 4K60. Since the next closest framerate is 30fps, the options are to clip or to scale resolution. If your engine does not support scalable resolution, you will clip.

I'm not saying there is a power differential of that amount between the two GPUs. I'm saying that in the real world, where console games run at fixed frame rates, these types of scenarios can happen.
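
To put rough numbers on the clip-or-scale scenario above (using the paper TF figures purely as a stand-in for relative GPU throughput, which is obviously a simplification):

Code:
# Hypothetical: a title tuned to exactly 4K60 on XSX, ported as-is to PS5.
xsx_tf, ps5_tf = 12.15, 10.28        # paper specs, PS5 at its maximum clock
ratio = ps5_tf / xsx_tf              # ~0.85
print(round(60 * ratio, 1))          # ~50.8fps -> misses the 60fps target
# Option A: no resolution scaling in the engine -> clip to the next vsync step, 30fps.
# Option B: scale resolution to hold 60fps:
print(round(2160 * ratio ** 0.5))    # ~1987 -> render around "2000p" instead of 2160p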

Performance between these two GPUs can be determined through formulation.
But when it comes to retail releases and specifications, formulation doesn't apply; we move on to probability and distributions.
If everything were as simple as formulation, we wouldn't have as many outliers as we do in comparing PS4 Pro and X1X, with some titles running 4K on X1X and 1080p on 4Pro. Does formulation explain that? No, it doesn't. But that was the release. We have median values, and the 4Pro for the most part performed at the median value with respect to performance vs X1X. But that doesn't remove the outliers, which has been my argument all this time. I am "speculating" that there will be some titles that fall into this category, because we've seen it happen before.

There were at least 2000 titles released last generation, and we are likely to have another 2000-odd or so. Are you so confident that there is no way outliers could occur in which XSX will double the frame rate, or the resolution will be doubled? I wouldn't bet on that, surely not based on the data we have from this current mid-generation. In the same vein, I wouldn't bet against PS5 and XSX running the same settings either, despite the 18-20% advantage XSX has over PS5.
 
Unless XSX is a 20.6TF console, then yeah, it most definitely will be true; pure math here.
And yeah, sure, you get more motion clarity with 60fps, but at the end of the day 30fps is clear enough for the majority of folks; the sheer amount of graphics afforded would be way more noticeable than being smoother in motion. But I guess you can call it however you want: the most eye-pleasing, the smoothest, etc.


Remains to be seen indeed. If both are aiming for 30fps for their respective first-party titles, then the extra TFLOPS should offer either better ray tracing, a slightly higher res (15% higher?) or a few higher settings for shadows, volumetric lighting, etc., while the 2x-faster SSD could potentially stream in more high-res assets or higher-res assets, better LOD, maybe even more advanced level design?
But what's certain is that if it's 60fps vs 30fps, you could easily tell which one looks better right away, unless XSX magically attains 20.6TF at launch.
Interesting times ahead.
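
(For reference, the 20.6TF figure is just double the PS5's paper number; doubling framerate at identical settings needs roughly double the GPU throughput, assuming a purely GPU-bound game:)

Code:
ps5_tf, xsx_tf = 10.28, 12.15
print(ps5_tf * 2)                 # 20.56 ~ the "20.6TF" needed for 60fps vs 30fps
print(round(xsx_tf / ps5_tf, 2))  # ~1.18 -> the actual paper-spec gap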

The GPU is what renders the image, not the SSD.
 
I'm saying that if a game is optimized exactly for 4K60 on XSX, then by default PS5 will run below 4K60. Since the next closest framerate is 30fps, the options are to clip or to scale resolution. If your engine does not support scalable resolution, you will clip.

Mind you, it seems less and less likely that devs wouldn't be using dynamic resolution these days, especially with multiplatform where there are built-in solutions for COD, Frostbite, Ubisoft (w/e the name of their engines are), Unity and Unreal Engines. It's just becoming more prevalent.

Naughty Dog, in recent memory, seems rather atypical for whatever reasons - perhaps they rigorously and meticulously craft their levels in such a way that the variance is minimized around the 33ms target, which most developers may be less inclined to do for the purposes of development time.
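
For what it's worth, the core of most of those built-in dynamic resolution solutions is just a feedback loop on GPU frame time. A minimal sketch of the general idea (names and thresholds are mine, not any particular engine's):

Code:
# Minimal dynamic-resolution controller: nudge the render scale up or down
# depending on how the last GPU frame time compares to the target budget.
TARGET_MS = 16.6                  # 60fps budget (use 33.3 for a 30fps target)
MIN_SCALE, MAX_SCALE = 0.6, 1.0
STEP = 0.05

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    if gpu_frame_ms > TARGET_MS * 0.95:      # running hot -> drop resolution
        scale -= STEP
    elif gpu_frame_ms < TARGET_MS * 0.80:    # lots of headroom -> raise it
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# e.g. a frame that took 18ms against the 16.6ms budget:
print(update_render_scale(1.0, 18.0))        # 0.95 -> render at 95% of native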

If everything were as simple as formulation, we wouldn't have as many outliers as we do in comparing PS4 Pro and X1X, with some titles running 4K on X1X and 1080p on 4Pro. Does formulation explain that? No, it doesn't. But that was the release. We have median values, and the 4Pro for the most part performed at the median value with respect to performance vs X1X. But that doesn't remove the outliers, which has been my argument all this time. I am "speculating" that there will be some titles that fall into this category, because we've seen it happen before.

Mmm... well, we don't know if that's mainly due to a lack of RAM in the extreme cases; I thought it was more common to see 1440p vs 4K? Anyway, if framebuffers take up a few hundred MB at 1080p (including intermediate buffers for post-processing, shadows, etc.), then they'll need a lot more than the extra 512MB to hit higher resolutions (1440p is ~1.8x the space of 1080p).
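
Rough pixel/buffer math for context; the bytes-per-pixel figure below is just an illustrative assumption for a deferred-style setup, not any specific title:

Code:
BYTES_PER_PIXEL = 40   # hypothetical G-buffer + depth + post-processing targets
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("2160p", 3840, 2160)]:
    px = w * h
    print(name, px, round(px * BYTES_PER_PIXEL / 2**20), "MB")
# 1080p: ~2.07M px (~79 MB); 1440p: ~3.69M px (~141 MB, ~1.8x 1080p); 2160p: ~8.29M px (~316 MB, 4x).
# Render targets alone grow by a couple hundred MB at 4K, before counting higher-res textures etc.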
 
Well, I apologize, I didn't watch this particular video; I was running off information from the original article and interpreted it poorly. But this is good info.

I was wrong about the benchmark, but that still doesn't change the nature of my argument.
You are largely fixated on the wrong argument: I'm not saying XSX will run 4K60 and therefore PS5 will run 4K30.

I'm saying that if a game is optimized exactly for 4K60 on XSX, then by default PS5 will run below 4K60. Since the next closest framerate is 30fps, the options are to clip or to scale resolution. If your engine does not support scalable resolution, you will clip.

PS5 isn't going to make up a 20fps / 50% performance increase just by clocking a nudge higher.

then there will be cases where PS5 is 1440p30 and XSX is 4K30

I'm a bit tired of this conversation, but you were claiming differences in performance of 50% (speculation, they call it) (using that benchmark) and even beyond that. If the difference in real performance is 20%, a game running at 60fps on XBSX with no headroom would mean that the same game on PS5 would run at 48fps. I'm pretty sure they can optimize the PS5 version to reach 60fps by lowering some settings or using a slightly lower res.
 
Mind you, it seems less and less likely that devs wouldn't be using dynamic resolution these days, especially with multiplatform where there are built-in solutions for COD, Frostbite, Unity and Unreal Engines. It's just becoming more prevalent.

Naughty Dog, in recent memory, seems rather atypical for whatever reasons - perhaps they rigorously and meticulously craft their levels in such a way that the variance is minimized around the 33ms target, which most developers may be less inclined to do for the purposes of development time.



Mmm... well, we don't know if that's mainly due to a lack of RAM in the extreme cases; I thought it was more common to see 1440p vs 4K? Anyway, if framebuffers take up a few hundred MB at 1080p (including intermediate buffers for post-processing, shadows, etc.), then they'll need a lot more than the extra 512MB to hit higher resolutions (1440p is ~1.8x the space of 1080p).
Indeed, and its bandwidth was lower, so despite its massive ROP advantage it was unable to take advantage of it.

Dynamic resolution is steadily becoming a normal feature, but it may not play well with everything, VRS for example.
 
I'm a bit tired of this conversation, but you were claiming differences in performance of 50% (speculation, they call it) (using that benchmark) and even beyond that. If the difference in real performance is 20%, a game running at 60fps on XBSX with no headroom would mean that the same game on PS5 would run at 48fps. I'm pretty sure they can optimize the PS5 version to reach 60fps by lowering some settings or using a slightly lower res.
As you've successfully proven, you can chuck that benchmark argument out. I was looking for an example of something running at 60 and what was below it.
Yes, the correct formulation would be 60 vs 48 instead.

I don't have any doubts that reducing resolution or graphics settings would enable them to meet frame rate or resolution goals. I'm just looking at a probability distribution and saying it's unlikely to work out that way.

I'm more than willing to make a donation bet that there will be at least one outlier in the coming generation in which the XSX version of a game doubles the pixel output of the PS5.
 
Maybe the fact that in the demo you mentioned your “playable character” is a marble rolling around in a very pretty and very static environment should give us an idea that expecting that level of realism and RTRT in games with all sorts of other technologies, animations, dynamic objects and environments, is not going to be easily attainable on consoles or even super duper Nvidia hardware. Hence, Minecraft RTX.

This is not the point. Spencer was running his mouth about how we have now reached the limit of photorealism. We now know that this is BS. We are not even close to photorealism without full-scene ray tracing. That marble demo, as flawed as it is, shows us the full possibility of advanced hardware ray tracing and is a clear step up from the current gen. These next-gen machines are simply not strong enough to provide it. While I understand the technical and economic reasons for why this is so, there is no need to push the BS that this is how far we can actually go.
 
I'm a bit tired of this conversation, but you were claiming differences in performance of 50% (speculation, they call it) (using that benchmark) and even beyond that. If the difference in real performance is 20%, a game running at 60fps on XBSX with no headroom would mean that the same game on PS5 would run at 48fps. I'm pretty sure they can optimize the PS5 version to reach 60fps by lowering some settings or using a slightly lower res.

No one knows that. In fact, going by official specs, the only thing one can say is that the performance delta is at least 17%. The PS5's GPU performance is constrained by a common power envelope shared with the CPU; the XSX GPU is not.
And that's just going by the raw performance metric. But we do know that MS has designed its system to exploit a whole host of performance tools/hacks: VRS, mesh shaders, stateless rendering (I'm currently reading some very interesting patents regarding GPU command buffers and GPU-driven primitive culling) and upscaling via DirectML. Sony has confirmed none of those features, and yet they are core features for the expected efficiency gains of the RDNA 2 architecture. VRS alone can result in a 10% performance gain on some titles, and Sony has been dodging DF's questions about it for months.
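
Going by the published numbers alone, and leaving the variable-clock question aside for a moment:

Code:
# Paper-spec GPU compute at each console's maximum quoted clock.
xsx_tf = 52 * 64 * 2 * 1.825 / 1000   # 52 CUs @ 1825MHz -> ~12.15 TF
ps5_tf = 36 * 64 * 2 * 2.23  / 1000   # 36 CUs @ up to 2230MHz -> ~10.28 TF
print(round(xsx_tf, 2), round(ps5_tf, 2))     # 12.15 10.28
print(round(xsx_tf / ps5_tf - 1, 3))          # ~0.182 -> ~18% at PS5's best case;
# since the PS5 clock can drop under load, the real-world delta is "at least" that.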
 
So a 10% difference deserves a callout, but saying that one console will render the same game at less than half the resolution of the other is speculation.
Yes. Inaccurate arguments need to be discussed; factually incorrect data needs to be called out (or rather, corrected). You're absolutely right to contest the view that PS5 will render at 1440p vs 2160p, roughly half the pixels, as a completely unrealistic expectation without a solid argument behind it, but you have to make that counterpoint with factually correct data.
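
(For reference, the pixel math behind the "less than half" wording:)

Code:
# 1440p vs 2160p in raw pixel terms.
p1440 = 2560 * 1440    # 3,686,400
p2160 = 3840 * 2160    # 8,294,400
print(round(p1440 / p2160, 3))   # ~0.444 -> 1440p is roughly 44% of 2160p's pixels,
                                 # i.e. "less than half the resolution"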
 