What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


Indeed, and he chose the wrong set of examples to prove his point.

I didn't say it's daft to prefer lower frame rate over higher frame rate. I said it's daft to claim to see no difference between frame rates of the same thing, so it's not quite an apples-to-apples statement I'm making here. Your example isn't 30 vs 60 fps; it's 720p@60 vs 1080p@30.
Yes, but you said you must be daft to be unable to see the visual difference between 30 and 60 fps. My friend couldn't perceive it. You're stating he's stupid as a result.

That's akin to saying the Himba tribe are daft for not being able to perceive blue, or we're daft for not being able to perceive the differences in greens they can. Intelligence/wisdom doesn't come into it.

Our brains' ability to interpolate direction, and thus react to it, is based on frame rate: the higher the frame rate, the easier a time we have of it.
That's not the only aspect of enjoying a game. And indeed, if you're happy for people to want a higher framerate, you should be okay with them wanting more visual complexity without feeling they're stupid for it.
 
You're not daft for being colour blind, or daft if you physically cannot see it. This is like calling a blind person daft for being unable to see, or someone legally blind daft for being unable to see something so obviously in front of your face.

But if you want to just compare frame rates - you should ask your friend if he can see the differences here:
https://www.testufo.com

He may not know where to look; this test forces you to see the difference. And there is no artificial motion blur applied here, which otherwise causes confusion about the fluidity of motion.
Most people think they cannot see a difference north of 60fps. But with a 144Hz monitor it's very obvious in the UFO test how much sharper 144fps is than 60fps. I'm sure if I had a 240Hz monitor I'd see even more visual clarity in motion.

The challenge for most people is seeing a difference in the amount of pixel movement: the more pixels an object moves per frame, the more obvious the blur from the frame rate becomes.

It's not necessarily a fair test to your friend to look at 2 frame rates and 2 resolutions and declare a winner without isolating where to look and what to look for, and then assume he can't see it either. 30fps and 60fps look pretty close at pixel movement up to 240 pixels per second, even more so at 120 px/second. If you showed him a game where barely anything moves faster than 120-240 px/second, then I would absolutely agree with his assessment.
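To put rough numbers on the pixel-movement point, here's a back-of-the-envelope sketch (the speeds chosen are illustrative assumptions, not measurements from any game): an object's per-frame step is simply its on-screen speed divided by the frame rate.

```python
# Back-of-the-envelope: per-frame pixel step at different frame rates.
# An object moving at `speed` px/s jumps speed/fps pixels between
# consecutive frames; larger jumps read as blur/judder.
# The speeds below are illustrative assumptions only.

def step_per_frame(speed_px_per_s: float, fps: float) -> float:
    """Pixels an object travels between two consecutive frames."""
    return speed_px_per_s / fps

for speed in (120, 240, 960):              # slow pan ... fast flick
    for fps in (30, 60, 144):
        print(f"{speed:4d} px/s @ {fps:3d} fps -> "
              f"{step_per_frame(speed, fps):5.1f} px/frame")
```

At 240 px/s the gap between 30fps and 60fps is 8 px versus 4 px per frame, which is easy to overlook; at 960 px/s it's 32 px versus 16 px, which is hard to miss.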

I am OK with people wanting more visual complexity. I'm OK with someone looking at low and high settings and saying "I prefer high." I'm OK with someone looking at low and ultra settings with DXR and saying "I prefer the one with DXR," and vice versa, with someone wanting to lower some settings to obtain double the frame rate.

Without a real slider here, most people don't know how much actually needs to be reduced to achieve it. But when people see 2x the framerate, they assume the gulf in graphical quality is so vast that it's no longer anywhere near comparable.

I'm not okay with someone saying "I prefer high settings" but using 2 separate titles to do it. You say: I tried low with HFR, I tried high with LFR, and I prefer high settings.

If someone compared some lower-quality hair against double the frame rate, as long as they could actually compare the two, that's fine; I accept that. But most of the time you don't even get the chance to see what an HFR version looks like, yet people have already put stakes in the ground.
 
You're not daft for being colour blind, or daft if you physically cannot see it.
Have you read the link? It shows how people without biological limitations on their colour perception can still fail to perceive colour differences. They're not biologically colour blind; they just can't notice the difference between blue and green. You could blame the blue thing on them being colour blind, but the green thing is us being blind to colour differences that are obvious to the Himba. That's not because we are biologically blind to green; we just haven't learnt to perceive those shades the same way.

But if you want to just compare frame rates - you should ask your friend if he can see the differences here:
https://www.testufo.com
That's a test contrived to showcase the differences. We're talking about games here, and you're calling someone daft who can see 60 fps in games but can't really notice a difference between 60 fps and 30 fps. I stressed to him, "Can you not see the difference?!" and he was like, "I guess it's a bit smoother, maybe." Unlike you and me, who see 60 fps and perceive it as much better, he didn't, and it's not because he's mentally deficient.

This is like calling a blind person daft for being unable to see, or someone legally blind for being unable to see something so obviously in front of your face.
No, because those people can't see, rather than can't perceive. It would instead be more like you and your Significant Other looking at this colour chart:

(attached image: colour chart)

And your SO saying they only really liked 3, you forgetting, grabbing 7 at the store because they all look the same to you, painting the wall, and then your SO calling you daft for using the wrong colour.
It's not necessarily a fair test to your friend to look at 2 frame rates and 2 resolutions and declare a winner without isolating where to look and what to look for, and then assume he can't see it either. 30fps and 60fps look pretty close at pixel movement up to 240 pixels per second.
This entire thread is about exactly that though! Given a game with a choice between resolution, quality, and framerate settings, which would you prefer? In this case, given a choice between higher fidelity (which some people will say they can't notice; that doesn't make them daft) and higher framerate (which some people will say they can't notice; that doesn't make them daft), he chose resolution because it was immediately the better-quality experience for him, and I chose higher framerate because it was immediately the better-quality experience for me. Neither of us is mentally deficient because our brains value different aspects of the moving image.

Indeed, that side-by-side, in-isolation comparison has the same result for every rendering aspect.
"Here's GTAV at 240p, 480p and 1080p - which do you prefer?"
"Here's GTAV with flat lighting+no shadows, flat lighting+shadows, and GI + shadows - which do you prefer?"

Everyone can see everything when shown in a white, silent room with just two or three variations to look at. ;)

Your statement and argument seem to treat human perception as if it were a string of mechanical and logistical operations. It's far more complicated than that, and you shouldn't assume that what you perceive (not see) is the same as what everyone else does.
 
Without a real slider here, most people don't know how much actually needs to be reduced to achieve it. But when people see 2x the framerate, they assume the gulf in graphical quality is so vast that it's no longer anywhere near comparable.

Well, on console it kind of is obvious: there are big AAA games that are 30fps and ones that are 60fps, and there is quite a big difference in quality.

And to you, every game, as long as it's Sony's, is the best-looking thing ever made. You're generally incapable of any form of objective view on any topic as long as Sony games are involved.

This I feel is uncalled for, just as much as you disregarding other people's opinions. I can't speak for anyone else, but for myself, I have been playing games since Pitfall on my grandad's Atari when I was 4 years old, and I know what I prefer: in slow-paced games like The Last of Us, I prefer the better fidelity a 30fps game can bring over the sacrifices needed to get it running at 60fps.

Also, using that Pragmata demo from the PS5 thingie, I can say without doubt I would want it without ray tracing if it meant I had to stare at that god-awful hair on the little girl (I don't think that will be in the final game; I expect it will have better hair and ray tracing in the final product).

I fully understand the advantages of a higher frame rate, and I was hopeful that next-gen games might run at a lower resolution but a higher frame rate, because that would help with temporal upscaling. You would kind of get the best of both worlds: saving resources with a lower resolution while gaining all the advantages of a better frame rate.
 
Your statement and argument seem to treat human perception as if it were a string of mechanical and logistical operations. It's far more complicated than that, and you shouldn't assume that what you perceive (not see) is the same as what everyone else does.
I concede.
After reading your response, I understand that what I wrote was unfair and needlessly cruel as a generalisation.
 
This I feel is uncalled for, just as much as you disregarding other people's opinions. I can't speak for anyone else, but for myself, I have been playing games since Pitfall on my grandad's Atari when I was 4 years old, and I know what I prefer: in slow-paced games like The Last of Us, I prefer the better fidelity a 30fps game can bring over the sacrifices needed to get it running at 60fps.
Yeah, but you don't go proving it by comparing 2 separate console exclusives to do so. This is what I'm calling him out on: we've had years of a battery of benchmarks on PC that compare low and high settings. This is how we compare settings in exchange for frame rate.

The largest differences between games are production value and artistic design.

If you want to compare frame rate and visual elements, you need to do so with the same game.
 
we've had years of a battery of benchmarks on PC that compare low and high settings. This is how we compare settings in exchange for frame rate.

That's not necessarily a good comparison either. There have been ridiculous ultra settings that completely tank the frame rate while making barely any perceptible difference, and there have even been low settings that worsen the frame rate depending on your hardware configuration.

I feel it's much better to compare high-budget console games running on the same console. There is a big difference between 60fps games and 30fps games in that space. But like I said in an earlier post, it's dependent on the type of game for me; I will take a graphical hit for a better frame rate in games like Apex or Warzone, or any competitive multiplayer game really.
 
I want my fighting games at 60fps, always.
As for first-person shooters, I must say that I was delighted with Doom's 60fps and high quality visuals.
But I never demanded it of third-person adventure games and RPGs; there I certainly want as much detail as possible.
 
Yeah, but you don't go proving it by comparing 2 separate console exclusives to do so. This is what I'm calling him out on: we've had years of a battery of benchmarks on PC that compare low and high settings.
I don't think that's fair as they don't target 30fps (or lower framerate). I don't think model detail is ever adjusted in PC settings - you just get more stuff and features. Game complexity is set at whatever and then settings are dialled up with greater or lesser efficiency.

I think the only realistic comparison is a broad-scale stochastic sampling across lots and lots of games across the past generations. The wider the sampling, the more averaging out we'll get between different budgets, techniques, etc. to get a general sense of what designing for 30fps on console gets you different to what designing for 60fps on console gets you.

It would seem at the moment, contributors are presenting stuff to support their view. It should just be about accumulating data without an agenda IMO, just to see what the differences are. Heck, if we did a good enough job, I'm sure Iroboto could create a neural net that can take 60fps titles and up-quality them to 30fps quality. :p
 
Lol. You can't, I guess. What Xbat says has some truth. I mean, if you want to compare like-for-like content, then you need to find games targeting like-for-like things. Shadow of the Tomb Raider, UC4, and TLOU2 are similar in that respect. But then the issue just comes down to production value, and most studios are not willing to match what Sony's first-party studios do in that space.

For me the question was whether there could have been a 60fps mode and whether it would have been preferred: dropping the 1440p down, for instance, and just giving people the option to decide. But it's not there, so I guess we will never know. Which moves the argument away from pre-conceived decisions on what we think is best, toward just asking developers to provide the option of HFR in addition to LFR and letting us decide.

This generation might have been far too hamstrung by the CPU to do it, which is why, even if we dropped graphical settings, we may still not hit that 60fps target.
 
This generation might have been far too hamstrung by the CPU to do it, which is why, even if we dropped graphical settings, we may still not hit that 60fps target.

Yup another reason I'm more hopeful about getting 60fps games next gen. Although Ray tracing is probably going to be a big driver of 30fps games.
 
Do people (not here) really think every game that is 30fps is only so because of binary choices like resolution and graphics?

That's not how development on console works. There will always be more things you can do with more CPU and GPU power at 30fps than at 60. 60fps is itself a sacrifice made to achieve higher fluidity. There is always a tradeoff somewhere.

Games with high overhead on PS5, like cross-gen PS4 games and remasters of old games like Demon's Souls, that have perf modes won't be the standard unless devs account for 60fps at the start of development, as has been the case every gen dating back to the advent of 3D graphics, when such tradeoffs became commonplace, and even before.

Games like Nioh aimed for 60fps as a standard during development, hence why performance modes even exist in that game.
 
Do people (not here) really think every game that is 30fps is only so because of binary choices like resolution and graphics?
On PC that would largely be true, simply because of the imbalance between CPU and GPU workloads; we've not had games that tax a high-powered CPU so hard that 60fps is unattainable. Numerous draw calls would be the likely culprit, where a graphics workload crushes the CPU. Outside of that, it's hard to say.

That leaves largely the GPU side, or some sort of memory-bandwidth constraint: items scalable through graphics settings and resolution.
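The CPU-limited point can be sketched with a toy frame-time model (all millisecond figures here are invented for illustration, not profiled from any game): if CPU and GPU work overlap, the frame takes roughly as long as the slower of the two, so cutting graphics settings shrinks only the GPU share and can't rescue a 60fps target when the CPU alone blows the 16.7 ms budget.

```python
# Toy bottleneck model: frame time ~= max(cpu_ms, gpu_ms) when CPU
# and GPU work overlap. Lowering graphics settings scales gpu_ms
# down but leaves cpu_ms untouched, so a CPU-bound game cannot
# reach 60fps that way. All timings are hypothetical.

def achievable_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate implied by the slower of the CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 25.0, 30.0            # a hypothetical 30fps-class game
print(f"high settings    : {achievable_fps(cpu_ms, gpu_ms):.1f} fps")
print(f"half the GPU work: {achievable_fps(cpu_ms, gpu_ms / 2):.1f} fps")
print(f"half of both     : {achievable_fps(cpu_ms / 2, gpu_ms / 2):.1f} fps")
```

In this made-up example, halving only the GPU work lifts the game from about 33 fps to just 40 fps, because the 25 ms CPU frame becomes the wall; only shrinking the CPU frame as well gets it past 60.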
 
Good job: to prove frame rate doesn't matter, you provided examples of game stills where nothing is moving. And in your example, Gears 5 is outputting the equivalent of 4x the number of pixels of TLOU2.
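As a sanity check on that 4x figure (the specific resolutions are my assumption; a 4x pixel ratio is what a 4K-versus-1080p comparison implies):

```python
# 4K (3840x2160) has exactly four times the pixels of 1080p
# (1920x1080). Resolutions assumed for illustration only.

def pixel_count(width: int, height: int) -> int:
    return width * height

ratio = pixel_count(3840, 2160) / pixel_count(1920, 1080)
print(ratio)  # 4.0
```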

I'll put this succinctly: it seems a lot of people can't notice resolution differences, or graphical differences, unless DF is there to point out to them where 2 things differ side by side.

But you've got to be daft to not see the visual difference between 30 and 60 fps, and you immediately see and feel the difference without needing to do any form of side by side.
When the hell did I say I couldn't see the difference? I think you're confusing things here. I've posted repeatedly how 60fps gives a smoother experience due to better motion clarity, but that doesn't gain you better graphics: lighting, geometry, shaders, shadows, animation, simulation, SFX, particles, you know, everything that has to do with graphics. 60 fps doesn't give you better graphics; 30 does, if using the same console as the basis.
Also, using PC Ultra vs Low is not a good example at all. The power utilization on Ultra is absolutely wasteful because you're still constrained by the Low settings in your game design, level size, scope, asset quality, gameplay, etc. Sure, you can increase the LOD and density slightly and go 6K-8K at 120fps, but that's a far cry from using those extra Tflops on something fundamentally different. Look at Assassin's Creed Odyssey at 4K/60fps on a 2080 Ti and compare it to Horizon 2 at 4K/30: guess which one would look better to the majority of people?
Your constant attempts to belittle the graphics afforded by a 30fps-targeted title are utterly disrespectful to the devs who try to maximize visuals through a 30fps pipeline. Sure, by all means, if you can convince yourself that 60fps makes for the most pleasing overall experience to your eyes, then all the power to you. But don't act like the majority of folks can't appreciate a smooth 30fps experience either, with much more graphics shown on screen. You know, if 30fps is so blurry and unplayable, I guess the reviewers and gamers alike must be blind and utterly insane to vote those 30fps titles GOTY winners, or DF must be smoking some heavy weed to crown 30fps titles in all their Best Looking Games picks.
 
My simple answer would be: it depends on the genre and mode. Halo's and Gears' single-player, for example, are something I would not mind going 30FPS with visual fidelity pushed to the maximum, assuming of course we're talking a consistent, tear-free 30FPS.

I find myself kind of eye-rolling at some of ultragpu's posts (sorry man, lol) because at times they do come off as hyperbolic, something from hot-take island. But where I will agree with him is that with big-budget, narrative-driven single-player games, Pixar-level visuals definitely make me feel more immersed and glued to the screen. That's why I wouldn't mind if more developers offered both visual and performance modes. But I can empathize with devs if they feel they'd rather push one or the other full stop.

Multiplayer on the other hand should be 60FPS, period. No ifs, ands, or buts about it. :p
 
I must say that I was delighted with Doom's 60fps and high quality visuals.

To me it's more impressive graphically than any 30fps AAA game out there, because it looks just as good, if not better, and at 60fps. It's also a multiplatform game that scales well, and it probably had a smaller budget as well, with a shorter development time. This means the studio could put out more releases during the generation's lifetime.
 
Silly topic. 60 FPS should be standard come next gen. And yes, some slow-paced cinematic third-person games can be 30 FPS and still be somewhat enjoyable.

But any fast-paced games, like first-person shooters, should be 60 FPS minimum. I've been playing through quite a few games during this COVID-19 spring (most 60 FPS, some 30 FPS) and it is clear as day that 60 FPS games are just so much more enjoyable to play. If you prefer games with hardly any gameplay and hours and hours of cutscenes, then you may see things differently, but for me a minimum of 60 FPS is essential.

I love how Far Cry 5 (and its DLC sequel) looks, but their 30 FPS nature makes them harder to enjoy while playing. On the other hand, games like Titanfall 2, CoD:IW, CoD:WW2, CoD:MW, Wolfenstein: The New Colossus, and Doom Eternal play so well; popping headshots on the move is easier and provides lots of gratification and far less frustration.

And BTW, for fellow Sci-fi buffs, I can't praise Titanfall 2 and Call Of Duty Infinite Warfare enough. Brilliant games with fantastic game play.
 