Poll: Mid-gen console owners - do you pick performance or quality?

Given the option, do you choose higher framerates or higher resolution/graphics?


Total voters: 57
I'd wager that if most people were rocking a 4K set, the data would be even more skewed towards the graphics and resolution option. All in all, people are upgrading mainly for the pixels and effects; literally none of my friends, including some die-hard PC gamers, care much about high framerates.
I'd think many PC gamers don't care about high framerates because they already play at >60fps. The higher the framerate, the less perceptible the gains are. But the jump from 30fps to 60fps is a very big, perceptible difference for many. It depends on each person's perception and framerate tolerance.

Would you prefer playing at 15fps with double the graphical effects, or at 30fps with worse graphics?
 
15fps is way too low, obviously; it's slideshow territory, while 30fps is quite acceptable and relatively smooth for most people. As for PC folks, if Ultra settings were more than a next-to-nonexistent difference compared to High settings, I'm sure they'd embrace the quality settings more, even at 30fps. But that's unlikely, because the games are hamstrung by the base consoles, so the value of Ultra is easy to neglect.
 
But Insomniac Games said making 60fps was no good!

Photorealistic games with a heavy emphasis on characters look bad to me at 60fps. The framerate makes all the animation flaws really obvious. Pretty much everything else looks better at 60fps, especially Nintendo games.
 
That's because the animations aren't being handled correctly. This isn't a new problem, either. If you go back and play Quake 1 without a source port, you can see the enemies animate at something like 10-15 FPS. Quake 2 has its own issues with interpolation, where the textures shift. These aren't issues with modern games designed around high frame rates, only with games designed without scalability in mind.
 
No, I mean actual modern games. The uncanny valley is strong at 60fps for me.
 
If you have a 4K set, it's a no-brainer for me. If you're still on 1080p, performance mode has its merits. There are very few games that will give you a locked 60fps with performance mode.
 
"At 24fps I see a hobbit, at 48fps I see a guy dressed as a hobbit". The added realism can be counterproductive to me.

To me when I read this, it gets translated to...

At an unrealistic presentation speed, I see something unrealistic as being more realistic. At higher presentation speeds that get closer to reality, I see something unrealistic as being less realistic.

But, I understand what you mean.

People who watch a lot of 24/30 FPS video expect to see animation and presentation artifacts (juddering, stuttering, etc.), and thus the lines become more blurred with regard to what is real and what isn't. I.e., your mind is already making excuses for what it sees in an attempt to make it fit what it expects.

When they see something presented closer to what their eyes see in reality, it "feels" like something is off. At higher framerates the presentation gets that much closer to what your eyes expect when they see the world in motion (moving objects, or just moving your eyes around), so there are fewer things for your mind to accommodate, and thus unrealistic things have less reason to exist.

This is similar to the uncanny valley a lot of people experienced as HD content appeared and HD TVs started to replace SD TVs. Things were too sharp, too well defined; people could now see the makeup actors wore, see that the lighting in shows and film was overdone, etc. The same thing happened to a lot of people with the move from 4:3 TVs to 16:9 TVs. For many years a lot of people insisted on 4:3 content stretched to fit a 16:9 screen because it looked "wrong" to have black bars on the sides. I.e., a filled screen looked more "realistic" to them, even with the stretching, than a correct 4:3 presentation with black bars.

As I never watch anything at 24/30 FPS anymore if I can help it, 60 FPS is the new normal "unreality" for me. It's less divergent from how the world around me looks and operates, but still enough that it's not quite as smooth as reality. So there's still a bit of adjustment my mind makes to make it seem closer to reality, and that carries everything else along with it.

So now, when I look at anything 24/30 FPS, all I see is a horrible mess where almost nothing makes realistic sense. Hobbits included. :)

Unfortunately for me, this means that without some form of good interpolation (plenty of tools for this on PC that are much better than TV interpolation in general), 24/30 FPS film and television look downright wrong and nasty.

End result? I hardly ever watch film or television shows anymore, because they all just look "wrong." The ones I do watch, I try to find a digital source and then convert it to 60 FPS.

Regards,
SB
 
Spend a few days playing N64 games and after all that suffering you'll be cured! Maybe...
 
"At 24fps I see a hobbit, at 48fps I see a guy dressed as a hobbit". The added realism can be counterproductive to me.
I'm still bummed that I never had the chance to see the Hobbit films in HFR. Mainly because I'd rather stick pins in my eyes than see them (I caught them on TV here and there, and good lord, they're rubbish), but it means I can't really have an opinion on HFR movies. Has the technology been picked up by other directors? I haven't heard anything since that trilogy.
 
I thought the 48fps experience was awful. Sure, it's sharper, but at that framerate, nothing escapes your critical capacities. The Hobbit movies looked like 3-hour-long behind-the-scenes documentaries.
I don't think anyone has picked up the technology since then. The reception was pretty poor, and I'd imagine the strain on the effects studios was ridiculous as well. At 48fps in 3D, the rendering load is almost quadrupled over a standard 24fps presentation, after all.
I think Cameron said something about wanting to go 96fps for his Avatar sequels, but then he's also full of himself like no other director.
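The "almost quadrupled" figure is just simple arithmetic: doubling the frame rate doubles the frame count, and stereo 3D renders one image per eye. A quick sketch (in practice some work, like scene setup, is shared between eyes, hence "almost"):

```python
# Frames that must be rendered per second of footage.
standard_2d = 24 * 1  # 24fps, one image stream
hfr_3d = 48 * 2       # 48fps, one image per eye for stereo 3D

print(hfr_3d / standard_2d)  # 4.0 -> roughly four times the rendering work
```

By the same logic, Cameron's floated 96fps in 3D would be eight times a flat 24fps presentation.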
 
I thought HFR was great, and even more so to those who watched the movies in 3D.

96FPS sounds even better.

But what I really do prefer is for the movie to look good.
 

I agree with pretty much all of SB's post, except that I prefer 24fps film to TV interpolation. I think if the industry were able to experiment with it more, they could make nice-looking 60fps films that didn't look cheap or costumey. The financial barriers will probably prevent them from doing so. A 60fps Pixar movie would be really cool.
 