What framerate will next-gen target? *spawn

What framerates will next-gen games target?

  • VRR (Variable Refresh Rate)

    Votes: 15 30.6%
  • 30fps

    Votes: 23 46.9%
  • 60fps

    Votes: 26 53.1%
  • 90fps

    Votes: 1 2.0%
  • 120fps

    Votes: 1 2.0%

  • Total voters
    49
Status
Not open for further replies.
...
It was a hypothetical example for crying out loud. And the point is, whatever a theoretical TLoU 3 looks like on PS5 at 60 fps, it'll look a lot better at 30 fps. That'll be true until we can achieve photorealism at 60 fps. Unless you deny that screenshots and overall visual appeal will be better at 30 fps than at 60 fps, stop arguing against this point.

The pro-60 fps side argues about the benefits to gameplay and player experience. They don't try arguing that 30 fps games won't look better. If you want to argue that the media will start raving more about game fluidity than eye-candy, go present some examples. Hell, present any evidence in favour of your argument, please! Anything to show 60 fps games sell better, or get more media coverage on account of their higher framerates, etc.

I would argue that most games look better at 60fps in motion, and 30fps games only look better in screenshots. There was no mechanism for mass-distributed game trailers at 60Hz until 2014. I'd love to see data on performance modes on PS4 Pro and Xbox One X. It's a difficult sell for me that 30Hz looks better when it has such a negative impact on any camera movement, from blur and judder, and on any kind of animation, from characters to particles to animated textures. For some reason motion doesn't count when people evaluate image quality, and I believe that's mostly because people are unequipped to evaluate it, and it's easier to compare screenshots. Screenshots really have nothing to do with gaming because you don't play screenshots.

With the exception of people with very large tvs, or people who sit abnormally close to their tvs, there is already a diminishing return on resolution past 1080p. Returns on frame rate don't start to diminish until around 120Hz.
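The diminishing-returns point can be roughed out with a little visual-acuity arithmetic. A common benchmark for 20/20 vision is roughly 60 pixels per degree of visual angle; the sketch below (my own back-of-envelope, with a hypothetical 55" TV at an 8 ft couch distance) shows 1080p already sitting near that threshold while 4K sails well past it:

```python
import math

def pixels_per_degree(diagonal_in, width_px, height_px, distance_in):
    """Angular pixel density for a flat 16:9 display viewed head-on."""
    aspect = width_px / height_px
    screen_w = diagonal_in * aspect / math.hypot(aspect, 1)  # width in inches
    px_per_inch = width_px / screen_w
    # inches subtended by one degree of visual angle at this viewing distance
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# Hypothetical setup: 55" TV, 8 ft (96") away.
print(round(pixels_per_degree(55, 1920, 1080, 96)))  # → 67 px/deg (1080p)
print(round(pixels_per_degree(55, 3840, 2160, 96)))  # → 134 px/deg (4K)
```

At ~67 px/deg, 1080p is already at the ballpark acuity limit for that setup, so the extra 4K pixels are hard to resolve unless you sit much closer or buy a much bigger panel.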
 
But that takes us to the argument used for HFR movies, only more so with games: games (at least in screenshots) visually look better per pixel at lower framerates, whereas movies still look like real life at both higher and lower framerates.

'Looks better' is subjective. The question is what the mass market considers 'looks better', if the decision devs are facing is commercial success. Especially as regards the OP, which is talking about the entire platform rather than specific games or genres.
 
I probably ought to give my position rather than arguing ultragpu's. ;)

Framerate won't be mandated next gen.
There won't be a common framerate next gen.
Devs will choose based on what's best for their games.
AAA 'cinematic experiences' will go 30 fps for more eye-candy and get the internet saying, "OMG that looks amazing."
I agree with ultragpu that it's the maximum eye-candy games that get people noticing. There are far more 'OMG that looks amazing' comments for the 30 fps landmark titles like HZD and GOW than there are in 60 fps games like Fortnite or RAGE.
Of course there are exceptions and outliers.
Action games that win fans with gameplay will target 60 fps.
There's nothing special about next gen that makes 60 fps any more likely than it has been every other generation. We have sub-30fps games every gen and we'll get them next gen too.
...Except if there's some frequency dependent technique like reconstruction.
The ratio of 60fps:30fps might change, but no-one's talking about that nor presenting any sort of measures for that data such that we'd probably never know anyway. Let's be honest - there's probably 10x as many games released on these consoles than we've ever heard of, and all we discuss are the big-ticket items. ;)
 
I really don't believe you can compare movies to games. Movies are filmed around the limitations of the medium. Unless you're playing a game that has little to no interactivity, basically a CG movie, then I don't think it's fair to say the same arguments apply. I believe it's totally fair to say that with respect to gaming judder and panel-based(sample-hold) motion blur are objectively bad.
 
...
There's nothing special about next gen that makes 60 fps any more likely than it has been every other generation. ...

Removing the cpu draw-call bottleneck by dedicating a little more silicon to the cpu, which it sounds like both Microsoft and Sony are doing with a zen cpu.
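To put some toy numbers on the draw-call point: the CPU only gets a slice of each frame, and the frame shrinks from ~33 ms to ~17 ms going from 30 to 60 fps. The sketch below is a deliberately simplified model with hypothetical per-call costs (the 5 µs figure and the 50% CPU budget are illustrative, not measured), showing why cheaper draw submission matters so much more at 60 fps:

```python
def max_draw_calls(fps, cpu_us_per_call, cpu_budget_fraction=0.5):
    """Toy model: how many draw calls fit in the CPU's share of one frame."""
    frame_us = 1_000_000 / fps
    return int(frame_us * cpu_budget_fraction / cpu_us_per_call)

# Hypothetical numbers: doubling the framerate halves the call budget,
# but halving per-call CPU cost (faster cores, leaner API) wins it back.
print(max_draw_calls(30, cpu_us_per_call=5))    # → 3333
print(max_draw_calls(60, cpu_us_per_call=5))    # → 1666
print(max_draw_calls(60, cpu_us_per_call=2.5))  # → 3333
```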
 
I really don't believe you can compare movies to games. Movies are filmed around the limitations of the medium. Unless you're playing a game that has little to no interactivity, basically a CG movie, then I don't think it's fair to say the same arguments apply. I believe it's totally fair to say that with respect to gaming judder and panel-based(sample-hold) motion blur are objectively bad.
We're talking about what makes people go "oooooo" on the internet, and share links with friends. There's zero point trying to qualify or measure 'better looking'. If a dev has a choice between a game that's 30 fps with a different aesthetic to the same game at 60 fps, and they want to get as many oooooos as possible, they'll pick 30 fps probably. If you want to call that worse looking, great - they'll pick the worse looking option because it's the more popular one that generates the greatest visceral response from their audience. Unless that has changed, or changes, and people would be more enamoured with a plainer looking 60 fps HZD or GOW than they are with the ground-breaking, jaw-dropping visuals we have at 30 fps.
 
30 fps is still going to have its place in slow-paced cinematic games; there's so much that can still improve graphically that will have more wow factor. Better hair, clothes, deformable terrain, number of NPCs, and more I can't think of now.
 
Well, for one, if you were to try to make HZD or GOW run at 60fps on vanilla PS4, it'd probably have to be closer to 720p, and I don't think anyone would accept that. Next-gen games will probably be pushing somewhere between 1440p and 4k, most likely with temporal up/super-sampling, dynamic resolution. That will give a lot more headroom to vary resolution to achieve 60fps while maintaining a decent amount of resolution.
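The "headroom to vary resolution" idea is usually a feedback loop: GPU cost scales roughly with pixel count (i.e. with the square of the resolution scale), so the controller inverts that to pick next frame's scale. The snippet below is a toy controller of my own, not any engine's actual heuristic; the clamp range and smoothing factor are made-up illustrative values:

```python
def next_resolution_scale(scale, gpu_ms, target_ms=16.7, lo=0.6, hi=1.0):
    """One step of a toy dynamic-resolution controller.

    GPU time is assumed ~proportional to pixel count (scale**2), so we
    invert that quadratic cost to find the scale that would have hit the
    frame budget last frame, then smooth and clamp it.
    """
    ideal = scale * (target_ms / gpu_ms) ** 0.5
    smoothed = scale + 0.5 * (ideal - scale)  # damp to avoid oscillation
    return max(lo, min(hi, smoothed))

# Over budget at full res: drop the internal resolution a notch.
print(round(next_resolution_scale(1.0, gpu_ms=20.0), 3))
# Comfortably under budget at reduced res: climb back toward native.
print(round(next_resolution_scale(0.8, gpu_ms=10.0), 3))
```

Pair that with temporal upsampling back to the output resolution and you get the "decent resolution most of the time, 60fps all of the time" behaviour described above.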
 
Interesting that you say you'd have to reduce resolution but not reduce visual quality. There are three aspects - resolution, framerate, and quality - that need to be balanced. You can have a 4K60 game now on XB1S if you make it really simple. You could possibly have a near photo-real game at 480p24. The choice is always left to the devs, even when we once had a resolution mandated. And no one of those aspects is dominant, and no one has an objective value that we can set as a minimum. The only one approaching a realistic limit is the resolution.
 
I probably ought to give my position rather than arguing ultragpu's. ;)

Framerate won't be mandated next gen.
There won't be a common framerate next gen.
Devs will choose based on what's best for their games.
AAA 'cinematic experiences' will go 30 fps for more eye-candy and get the internet saying, "OMG that looks amazing."
I agree with ultragpu that it's the maximum eye-candy games that get people noticing. There are far more 'OMG that looks amazing' comments for the 30 fps landmark titles like HZD and GOW than there are in 60 fps games like Fortnite or RAGE.
Of course there are exceptions and outliers.
Action games that win fans with gameplay will target 60 fps.
There's nothing special about next gen that makes 60 fps any more likely than it has been every other generation. We have sub-30fps games every gen and we'll get them next gen too.
...Except if there's some frequency dependent technique like reconstruction.
The ratio of 60fps:30fps might change, but no-one's talking about that nor presenting any sort of measures for that data such that we'd probably never know anyway. Let's be honest - there's probably 10x as many games released on these consoles than we've ever heard of, and all we discuss are the big-ticket items. ;)

This.

Convincing the general public or gamer that 60fps (motion smoothness) should be the utmost standard for the next generation of systems, over maximum graphic complexity and shader-rich imagery... good luck with that. I'm not denying the importance of advancing framerates (60, 90, 120, and so on) and the additional image clarity that high refresh rates bring... but it's still a losing 'checkbox' sales point with the masses.

If I had a choice between SMS, Naughty Dog or Guerrilla's next masterpiece containing complex jaw-dropping scenes and shader-rich imagery at 30fps, or a game that's just one or two steps above their prior titles at 60fps... I will always choose the eye-candy (30fps) option without hesitation.
 
Would COD sell better if it was 30fps? Why did Halo switch from 30fps to 60fps if the former sells more?

Also, when ND said U4 was going to be 60fps, everyone cheered; nobody complained about reduced fidelity. When they downgraded it to 30fps, everyone continued to praise them. Sony exclusives automatically get praise no matter what, so I don't think they're a good sample to draw conclusions from.
 
Would COD sell better if it was 30fps? Why did Halo switch from 30fps to 60fps if the former sells more?
That's already been discussed to death. These shooters benefit most from higher framerate - it's a priority. They sell on gameplay rather than people going 'ooooooo' on the internet. Also, Halo reduced resolution rather than image quality.

Also, when ND said U4 was going to be 60fps everyone cheered, nobody complained about reduced fidelity.
They showed superior next-gen visuals at 60fps in their reveal. Then reality bit them in the butt and they had to downgrade. They were faced with a choice to downgrade the framerate and keep the visual splendor, or downgrade the visual splendor and create something not as far ahead of the previous UC3.
When they downgraded it to 30fps everyone continued to praise them. Sony exclusives automatically get praise no matter what so I don't think they're a good sample To draw conclusions from.
Now you're just trolling. There are many, many 30 fps and 60 fps games to choose from. The exclusives are mentioned here because they're best selling and in that 'AAA cinematic' bracket.

I just typed "Assassin's Creed 60fps" into Google and came up with this (first item in results mentioning consoles):

Assassin's Creed: Unity will run at 30fps on PS4 and Xbox One, and Ubisoft isn't interested in pushing that number higher because action-adventure games feel better below 60fps, Creative Director Alex Amancio told Techradar. Amancio said it's the same case with resolution (Unity runs at 900p).

"If the game looks gorgeous, who cares about the number?" he asked.

Level Design Director Nicolas Guérin shared the sentiment, saying that Ubisoft for a long time wanted to hit 60fps in its games, but "you don't gain that much" from it.
If you want to argue 60 fps is universally preferred or better, go find a quote from a 'AAA cinematic' 60 fps game, like the Witcher 3, or Skyrim, or GTA, or Quantum Break, or Bloodborne, or Destiny, or Far Cry, or Zelda:BotW, yada yada. Yes, 60 fps games exist, and 60 fps is better for certain gameplay, and some would argue it's better for clarity and 'better visuals' always, but for PR and for plenty of folk wanting something that looks ever more realistic, 30 fps with maximal eye-candy is the proven choice of developers and many of the best-selling titles. One can hardly argue that 30 fps is a PR negative or loses sales...
 
i hope they stick to 30fps for a lot of games.

it's sad that some people don't have the faculty to adapt to lower FPS, but that's life, like some can't play VR without getting nauseous.
 
Interesting that you say you'd have to reduce resolution but not reduce visual quality. There are three aspects - resolution, framerate, and quality - that need to be balanced. You can have a 4K60 game now on XB1S if you make it really simple. You could possibly have a near photo-real game at 480p24. The choice is always left to the devs, even when we once had a resolution mandated. And no one of those aspects is dominant, and no one has an objective value that we can set as a minimum. The only one approaching a realistic limit is the resolution.

A near photo-real game at 480p24? Maybe an interactive movie where you have no control over camera movement. Say hello to motion sickness and input lag. Even moving a mouse cursor around at 24fps would be a monstrosity. You can make real-time graphics that are 24fps, sure. But a game? You'd have a hard time playing civ at 24fps.

The only reason I picked resolution is because of diminishing returns on resolution. If you have a 4K game at 30fps, you can probably target something like 1440-1600p at 60Hz, if you have the CPU headroom. That logic applies, but pixel density becomes increasingly an issue the closer you get to 1080p, and especially below it. Obviously there are all kinds of other compromises you can make to visual quality; it was just an example. To be honest, I'm pretty much happy with anything that gets me to 60Hz.
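The 1440-1600p figure checks out on pure pixel throughput: holding pixels-per-second constant, 4K at 30fps buys you about 1527p at 60fps. A quick sanity check (assuming fill rate, not the CPU, is the limit, and a 16:9 frame):

```python
def equal_throughput_height(src_w, src_h, src_fps, dst_fps, aspect=16/9):
    """Vertical resolution that keeps pixels-per-second constant at a new
    framerate, assuming GPU fill rate is the binding constraint."""
    pixels_per_frame = src_w * src_h * src_fps / dst_fps
    return int((pixels_per_frame / aspect) ** 0.5)

print(equal_throughput_height(3840, 2160, 30, 60))  # → 1527
```

So doubling the framerate at equal throughput lands almost exactly in the 1440-1600p band mentioned above; temporal upsampling then closes the gap back to the output resolution.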

I suppose there are just a lot of people that only really play games that are 30fps, so they're conditioned to accept it as the bare minimum in terms of playability.
 
Lol. This is an absurd point of view.
I agree, but I find it no less absurd than your POV...
I suppose there are just a lot of people that only really play games that are 30fps, so they're conditioned to accept it as the bare minimum in terms of playability.
It's not conditioning, but things looking pretty. People like things that look pretty, and they'll put up with lots to have that, including framerates that can drop into the 20s and personalities that make one miserable.

I really wish both sides could appreciate the others just have different values instead of feeling/suggesting they're wrong in the head somehow.
 
I suppose there are just a lot of people that only really play games that are 30fps, so they're conditioned to accept it as the bare minimum in terms of playability.

Not really. Nintendo fans and Switch userbase prefer performance over image quality and resolution, for the most part. Switch first party games are mostly 60fps, but of course, at the cost of image fidelity and resolution. That's not to say Switch games are bad, because they're not... however, I wouldn't consider Nintendo engines super complex when compared to Sony's and Microsoft's own internal engines.

Anyhow, maybe Sony and Microsoft will start offering more 'PC like settings' for console gamers wanting more performance or higher framerates, over image quality and complexity.
 
That's already been discussed to death. These shooters benefit most from higher framerate - it's a priority. They sell on gameplay rather than people going 'ooooooo' on the internet. Also, Halo reduced resolution rather than image quality.

They showed superior next-gen visuals at 60fps in their reveal. Then reality bit them in the butt and they had to downgrade. They were faced with a choice to downgrade the framerate and keep the visual splendor, or downgrade the visual splendor and create something not as far ahead of the previous UC3.
Now you're just trolling. There are many, many 30 fps and 60 fps games to choose from. The exclusives are mentioned here because they're best selling and in that 'AAA cinematic' bracket.

I just typed "Assassin's Creed 60fps" into Google and came up with this (first item in results mentioning consoles):

Assassin's Creed: Unity will run at 30fps on PS4 and Xbox One, and Ubisoft isn't interested in pushing that number higher because action-adventure games feel better below 60fps, Creative Director Alex Amancio told Techradar. Amancio said it's the same case with resolution (Unity runs at 900p).

"If the game looks gorgeous, who cares about the number?" he asked.

Level Design Director Nicolas Guérin shared the sentiment, saying that Ubisoft for a long time wanted to hit 60fps in its games, but "you don't gain that much" from it.
If you want to argue 60 fps is universally preferred or better, go find a quote from a 'AAA cinematic' 60 fps game, like the Witcher 3, or Skyrim, or GTA, or Quantum Break, or Bloodborne, or Destiny, or Far Cry, or Zelda:BotW, yada yada. Yes, 60 fps games exist, and 60 fps is better for certain gameplay, and some would argue it's better for clarity and 'better visuals' always, but for PR and for plenty of folk wanting something that looks ever more realistic, 30 fps with maximal eye-candy is the proven choice of developers and many of the best-selling titles. One can hardly argue that 30 fps is a PR negative or loses sales...
Can you prove that targeting 60fps would actually lead to a loss of sales? No. Did Insomniac Games get more praise when they switched to 30fps? No. Does U4 even look as good as the original 60fps trailer? No.

"They were faced with a choice to downgrade the framerate and keep the visual splendor, or downgrade the visual splendor and create something not as far ahead of the previous UC3."

They downgraded the visual splendor by cutting the framerate in half. 60fps alone was already far ahead of UC3.

30fps developers don't praise 60fps. Imagine my shock. I mean, why wouldn't they talk poorly about their own products? Mhhh...

Whatever advantage they gain at 30fps is short lived since it only takes a few months before another game surpasses it. The choppiness of that framerate however remains forever.

Oh, but of course, that allows them to sell you the same product a generation later but with a decent framerate...

And if we're talking realism, 60fps does indeed look far more realistic than cinematic 30fps. Just take a look at real life 60fps vs 30fps footage.

30fps is nice for pretty screenshots but since we're talking VIDEOgames, framerate is ultimately far more important.

In fact, for pretty screenshots games now feature photomodes and can even render things better than the actual game. No need to sacrifice framerate for this.
 
Nowhere did I say all games MUST be 60 fps. It is most definitely preferable and objectively better.

I'm not sure how it's controversial, or subjective, to say:
-60 fps has less input lag, which leads to more responsive controls
-60 fps has less motion blur from retinal blur on a sample and hold display, and less perceived motion blur overall, leading to sharper image quality
-60 fps provides smoother animation
-60 fps has less judder when panning
-the lessened motion blur, judder and faster display updates allow players to identify and react to visual changes more quickly

To me, these are objective truths about frame rate, quantifiable improvements that can be had. I can't even understand why anyone would argue against them. Subjectively, people may say they don't care, and would trade any of those things for other visual differences, but 60fps is objectively better than 30fps.
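Two of those points are straightforward to quantify. Frame interval bounds the added input-to-display delay, and on a sample-and-hold display an eye tracking a pan smears each static frame across roughly (pan speed x frame time) pixels. The numbers below use a hypothetical pan speed of 1800 px/s for illustration:

```python
def frame_time_ms(fps):
    """Time one frame stays on screen."""
    return 1000 / fps

def sample_hold_blur_px(pan_px_per_s, fps):
    """Approximate retinal smear, in pixels, when the eye tracks a pan on
    a sample-and-hold display: pan speed times frame duration."""
    return pan_px_per_s / fps

# Doubling the framerate halves both the frame interval and the smear:
print(round(frame_time_ms(30), 1), round(frame_time_ms(60), 1))  # 33.3 16.7
print(sample_hold_blur_px(1800, 30))  # 60.0 px of smear during the pan
print(sample_hold_blur_px(1800, 60))  # 30.0 px
```

That halving of both latency and tracking blur is exactly the "quantifiable improvement" the list above is claiming, independent of whether any individual viewer values it.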
 