What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


Although VRR throws a bit of a spanner in the works. Couple that with dynamic resolution, and it seems quite likely that the difference in performance (whether that's framerate or resolution will vary from game to game) will be near enough the difference in TFLOPS.

Either way, the XSX won't dip as much, but for a 4K60 vs 4K30 scenario, you'd be looking at the kind of shoddy optimisation that one of the first CoDs got on the Xbox One.

Indeed. From a user standpoint it's going to be hard to see. But since we're in a technical discussion I just wanted to point out that there will likely be exceptions.

Though if your dynamic resolution is holding 1440p on average while another system is doing 4K, then it's not really all that dynamic. It's only dynamic when loads well exceed its threshold.
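A dynamic resolution scaler can be sketched as a toy feedback loop. All the numbers below (frame costs, step size, budget) are illustrative assumptions, not measured console behaviour:

```python
# Toy dynamic-resolution controller: nudge the render scale so the
# (simulated) GPU frame time stays inside a 16.6 ms (60 fps) budget.
# BASE_MS_AT_4K is an assumed cost, not a real measurement.

TARGET_MS = 1000 / 60          # frame-time budget for 60 fps
BASE_MS_AT_4K = 22.0           # assumed cost of a full-4K frame on this GPU

def frame_time_ms(scale: float) -> float:
    """Assume GPU cost scales roughly linearly with pixel count (scale**2)."""
    return BASE_MS_AT_4K * scale * scale

def step(scale: float) -> float:
    """Move the resolution scale toward the highest value that fits the budget."""
    if frame_time_ms(scale) > TARGET_MS:
        return max(0.5, scale - 0.05)   # over budget: drop resolution
    return min(1.0, scale + 0.05)       # headroom: claw resolution back

scale = 1.0
for _ in range(20):
    scale = step(scale)

print(round(scale, 2))   # settles below 1.0, i.e. below native 4K
```

When the assumed 4K cost is persistently over budget, the controller parks near a lower scale rather than oscillating around native, which is exactly the "not really dynamic" case.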

VRR would reveal the true difference
 
I would LOVE for there to be 'performance' and 'quality' modes in every major release, and to finally, at the end of the generation, get statistics on whether gamers chose 30 fps pretties or 60 fps smoothness (and how that breaks down per genre).
The pretty graphics vs fps was the case for a lot of this gen, but it's changing.
With temporal injection/reconstruction etc., framerate will have an even bigger impact on visual prettiness than ever before.
Do agree with your post though.

I think most (AAA) games will come with modes now, a legacy of mid-gen consoles and cross-gen work.
 
Given the gap in PC CPUs vs what we're getting, most devs probably should be targeting 60fps anyway.
 
hmmm...
I wouldn't be so confident about that.
If a console is stuck in between 30 and 60, it will be capped back to 30, right?
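That capping behaviour can be sketched as a toy model; the function and thresholds below are illustrative, not any engine's actual logic:

```python
# Toy model of v-sync frame capping: a console that can't sustain 60 fps
# gets capped to the next divisor of the display refresh, typically 30.
# Real engines make this call per title; this is just the arithmetic.

def vsync_cap(sustained_fps: float, refresh: int = 60) -> int:
    """Pick the highest refresh divisor the GPU can sustain."""
    for cap in (60, 30, 20):
        if sustained_fps >= cap and refresh % cap == 0:
            return cap
    return 20  # floor for this toy model

print(vsync_cap(45))   # a 45 fps GPU gets capped to 30
print(vsync_cap(61))   # comfortably over 60, so a 60 cap holds
```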

Well, we're talking about situations in which there's enough there for XSX to make it to 4K60 but not quite enough for PS5 to hold 40-50fps, and you have your 4K30 vs 4K60 scenario.

It's not that hard to see happening.

We already know from DF reports that XSX is holding Gears 5 at ultra settings at 60fps with little to no optimization.

So...
https://www.overclock3d.net/reviews/software/gears_5_pc_performance_review_optimisation_guide/9

We have a ballpark idea of where an overclocked 5700 XT will land as well. Hint: its average is 40fps. PS5 isn't going to make up a 20fps / 50% performance increase just by clocking a nudge higher.

Nah, you are clearly mixing contradictory information and omitting important facts.

DF said MS showed a benchmark where XBSX was on par with a 2080 and both hitting 60fps, with the 2080 providing a little better performance.

In that benchmark not even the 2080S (which is 10% faster than the 2080, by the way) can reach 60fps. So, either MS lied about the benchmark, or overclock3D hit some weird bottleneck that the PC MS used for the benchmark didn't. In any case, the XBSX is not matching the 2080s in performance, so in that benchmark it would score below 50fps and much closer to the 5700XT.

Also, that 5700XT is RDNA1 and has lower clocks than the PS5. So there is that as well, your 50% difference pretty much gone.

There will be titles that land in this type of setup, and there will be cases where PS5 is 1440p30 and XSX is 4K30...

LOL. XSX pushing 125% more pixels with a 10% theoretical advantage in raw performance. Good luck with that.

edit: It may be 18%, but I'm speculating that maybe Cerny has added a GeForce 256 to the PS5 to close the gap.
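For reference, the pixel-count arithmetic in this exchange, sketched in Python (the TF figures are the commonly cited maximums for each console, not measured throughput):

```python
# Pixel-count gap between native 4K and 1440p vs the quoted TFLOPS gap.
pixels_4k    = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440      # 3,686,400 pixels

extra_pixels = pixels_4k / pixels_1440p - 1
print(f"{extra_pixels:.0%}")    # 125% more pixels at 4K

# Commonly cited peak-compute figures: XSX 12.15 TF, PS5 10.28 TF.
tf_gap = 12.15 / 10.28 - 1
print(f"{tf_gap:.0%}")          # roughly an 18% raw-compute advantage
```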
 
What I would like is improved motion resolution, even if game play and most of the image would be rendered in 30fps.
Moving image at 30 fps is simply quite blurry on sample-and-hold displays, and I would love to see motion interpolation methods which wouldn't add an additional ~80ms of latency like on TVs. (And when done in-engine they should allow better quality as well.)
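As a toy illustration of the idea: unlike a TV, the engine already has exact motion data for its objects, so it can synthesize an in-between image from two simulation ticks directly rather than estimating motion from pixels. The function below is a bare-bones sketch, not a real reprojection scheme:

```python
# In-engine interpolation sketch: given an object's position at two
# simulated 30 fps ticks, synthesize the midpoint "frame" so motion is
# presented at 60 fps. Purely illustrative.

def midpoint(prev: tuple, curr: tuple) -> tuple:
    """Linear interpolation at t = 0.5 between two known positions."""
    return tuple((a + b) / 2 for a, b in zip(prev, curr))

pos_t0 = (0.0, 0.0)     # object position at 30 fps tick N
pos_t1 = (4.0, 2.0)     # position at tick N+1
print(midpoint(pos_t0, pos_t1))   # prints (2.0, 1.0)
```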
 
It's 18% at a minimum for the GPU.

And that's just for the GPU; there's a CPU differential and a rather large bandwidth difference too. All assuming max boosted clocks, of course.

Remains to be seen if either of the consoles will have the VRS advantage.
 
What I would like is improved motion resolution, even if game play and most of the image would be rendered in 30fps.
Moving image at 30 fps is simply quite blurry on sample-and-hold displays, and I would love to see motion interpolation methods which wouldn't add an additional ~80ms of latency like on TVs. (And when done in-engine they should allow better quality as well.)
hmmm... like SSR and SSAO, this would work much better with two framebuffers, one for the static world and one for dynamic objects. Adding motion interpolation, it could finally be worth doing?
 
DF said MS showed a benchmark where XBSX was on par with a 2080 and both hitting 60fps, with the 2080 providing a little better performance.
Not followed conversation, so could be missing the point.
But remember the xsx was doing things that aren't in the pc version even at ultra settings.
It was just to say look already at 2080 performance without optimization and running with settings that pc doesn't have access to yet.

Fact is we don't know how that build of the game would run on 2080 either.
 
Not followed conversation, so could be missing the point.
But remember the xsx was doing things that aren't in the pc version even at ultra settings.
It was just to say look already at 2080 performance without optimization and running with settings that pc doesn't have access to yet.

Fact is we don't know how that build of the game would run on 2080 either.

Irrelevant; for the test they used the same settings. If it's on par with a 2080, then in the benchmark from overclock3D the XBSX would sit between the 2080S and 2070S, below 50fps.
 
Nah, you are clearly mixing contradictory information and omitting important facts.
From DF:
"However, even basic ports which barely use any of the Series X's new features are delivering impressive results. The Coalition's Mike Rayner and Colin Penty showed us a Series X conversion of Gears 5, produced in just two weeks. The developers worked with Epic Games in getting UE4 operating on Series X, then simply upped all of the internal quality presets to the equivalent of PC's ultra, adding improved contact shadows and UE4's brand-new (software-based) ray traced screen-space global illumination. On top of that, Gears 5's cutscenes - running at 30fps on Xbox One X - were upped to a flawless 60fps. We'll be covering more on this soon, but there was one startling takeaway - we were shown benchmark results that, on this two-week-old, unoptimised port, already deliver very, very similar performance to an RTX 2080."

The 2080 as benchmark is not relevant in this discussion. I just need to know that it's running 60fps ultra+ and that gives me a fairly good idea of how well it's going to perform when optimized.

Also, that 5700XT is RDNA1 and has lower clocks than the PS5. So there is that as well, your 50% difference pretty much gone.
That 5700XT is a Red Devil; it runs at 2010MHz. That's 10% less than PS5's clock, but it also has 11% more CUs at a full 40. A 10% clock bump isn't going to give a 50% performance boost.
From RDNA1 to RDNA2, there's been no indication we should expect more performance for the same TF. The 50% performance-per-watt improvement will enable higher clocks, but the TF count is nearly the same here.

40CU at 2000MHz vs 36CU at 2230MHz. I'm not saying this is going to be a straight expectation of performance, but any clustering/sorting algorithm would group PS5 and a 2000MHz 5700XT together.
From a vector perspective, these are very equivalent GPUs.
  • They are both RDNA
  • They are both 10.24 TF
  • They are both 448 GB/s bandwidth
  • They have the same bus
  • They have nearly the same CUs
  • They have nearly the same Mhz

RDNA 2 would have to be one hell of an architecture to throw out all RDNA 1 results at equivalent settings.
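That equivalence can be checked with the standard RDNA peak-FLOPS formula (CUs × 64 ALUs × 2 ops per clock × clock):

```python
# Peak compute from CU count and clock: CUs x 64 shader ALUs x 2 ops (FMA) x MHz.
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * mhz / 1e6

rx5700xt_oc = tflops(40, 2010)   # Red Devil 5700 XT at its boost clock
ps5         = tflops(36, 2230)   # PS5 at its maximum boost clock

print(round(rx5700xt_oc, 2))     # prints 10.29
print(round(ps5, 2))             # prints 10.28
```

The two land within a fraction of a percent of each other, which is the basis for treating the overclocked 5700 XT as a rough stand-in here.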

Here's another benchmark. Overclocked 40CU to 2100 Mhz, OC memory to 484 GB/s so even higher.
https://www.techpowerup.com/review/xfx-radeon-rx-5700-xt-thicc-iii-ultra/16.html
Average 44 fps on ultra @ 4K

Under these circumstances, PS5 would not be able to achieve 4K60.
XSX will likely be there with optimization.

A final caveat: I'm not saying this will be the expected norm. I'm just indicating the realistic probability that there will be exceptions to the rule that there should only be an 18-20% performance differential. Tech talk aside, it doesn't matter, and most people won't notice it. But if we are sticking to shop talk, then we must be open to the realities here.
 
And that's just for the GPU; there's a CPU differential and a rather large bandwidth difference too.
One would expect the performance delta in resolution to be determined by the more constrained of the GPU and BW resources. If both deltas are 20%, the total performance difference won't be 40% less, or 36% less (0.8 × 0.8 = 0.64; 1.0 − 0.64 = 0.36), but 20% less, because the resources scale uniformly. Similarly, a 20% GPU delta and a 15% BW delta should result in a 20% total impact, because the BW available to PS5's GPU would be proportionally higher.
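The argument can be sketched with a toy bottleneck model, where throughput is capped by the scarcer resource (purely illustrative):

```python
# Toy bottleneck model: performance is limited by the scarcer of two
# uniformly scaled resources, so deficits don't multiply.

def throughput(gpu: float, bw: float) -> float:
    """Performance capped by the more constrained resource."""
    return min(gpu, bw)

baseline = throughput(1.0, 1.0)

# Both compute and bandwidth down 20%: total deficit is 20%, not 36%.
both_down = 1 - throughput(0.8, 0.8) / baseline
print(round(both_down, 2))   # prints 0.2

# GPU down 20%, bandwidth down only 15%: GPU remains the limiter,
# so the total deficit is still 20%.
gpu_limited = 1 - throughput(0.8, 0.85) / baseline
print(round(gpu_limited, 2))   # prints 0.2
```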

This is probably a daft discussion though. The major point is the delta is far higher than the 10% jayco mentioned.
 
From DF:
"...very similar performance to an RTX 2080."


Similar performance to an RTX 2080 + your benchmark = 49.8 FPS. Of course the information about how the 2080 performs is relevant; it's the only thing that provides a fair comparison.

[Chart: Gears 5 4K benchmark results (gears-5-3840-2160.png)]


You are taking the 60fps number and then dropping it into a PC gaming benchmark to say the XBSX is 50% faster. It's not an honest comparison and defies basic logic: either the 2080 runs Gears 5 at 4K with Ultra settings at 60fps or at 49fps. Both cannot be true.

One would expect the performance delta in resolution to be determined by the more constrained of the GPU and BW resources. If both deltas are 20%, the total performance difference won't be 40% less, or 36% less (0.8 × 0.8 = 0.64; 1.0 − 0.64 = 0.36), but 20% less, because the resources scale uniformly. Similarly, a 20% GPU delta and a 15% BW delta should result in a 20% total impact, because the BW available to PS5's GPU would be proportionally higher.

This is probably a daft discussion though. The major point is the delta is far higher than the 10% jayco mentioned.

Wow, I'm getting called out while people here say that games will run at 1440p@30 on PS5 and at 4K@30 on XBSX.
 