What framerate will next-gen target? *spawn

What framerates will next-gen games target?

  • VRR (Variable Refresh Rate)

    Votes: 15 30.6%
30fps

    Votes: 23 46.9%
60fps

    Votes: 26 53.1%
  • 90fps

    Votes: 1 2.0%
  • 120fps

    Votes: 1 2.0%

  • Total voters
    49
Then why do 60fps games like DOOM, Wolfenstein and Battlefront 2 have much faster and better motion blur than Uncharted 4 at sub-30fps, for example? Don't always hype your games and bad-mouth others like that. Don't confuse technique with artwork.
You can devote a huge chunk of resources to motion blur, and sure, I like that effect, but looking worse in most other areas is not how you want to convince me of your point. Just an example.
But the main point is that 30fps with much better visuals is the way to go, as is plainly evident from this E3's showing. Everybody goes gaga over Sony's big four and Cyberpunk, yet almost no one cared about BF5 or Rage 2. I'm not bashing those games, of course, and I mean absolutely no disrespect to the devs; I'm just pointing out how well their products are received by the public and press alike. I can't imagine the outcry if TLOU2 looked like UC4's MP, but hey, it's 60fps!
 
You can devote a huge chunk of resources to motion blur, and sure, I like that effect, but looking worse in most other areas is not how you want to convince me of your point. Just an example.
But the main point is that 30fps with much better visuals is the way to go, as is plainly evident from this E3's showing. Everybody goes gaga over Sony's big four and Cyberpunk, yet almost no one cared about BF5 or Rage 2. I'm not bashing those games, of course, and I mean absolutely no disrespect to the devs; I'm just pointing out how well their products are received by the public and press alike. I can't imagine the outcry if TLOU2 looked like UC4's MP, but hey, it's 60fps!

Oh please. Sony's big four and Cyberpunk 2077 are huge AAA single-player titles from critically acclaimed developers. BF5 is an MP-focused game from a publisher which became infamous for its greedy loot box strategies, and Rage 2 is the successor to a game which wasn't that popular, made by an AA developer which was responsible for the dull Mad Max game. Of course there is only mild enthusiasm for those two games. The fact that you are using such examples just shows that you don't have any strong arguments.

Next-gen will give developers a massive boost in CPU power, making 60fps viable for many games with bigger scopes and complexity. We saw it happening with shooters, which went from 30fps last-gen to 60fps in current-gen. And it will happen again, this time with open-world games. There will be more 60fps games than ever, and there is nothing that will stop this.
 
Those are just examples. Every example has other influencing factors, so it can't be considered in isolation. The totality of 60fps and 30fps games needs to be considered. Go draw up a list of best-selling games and see which ones are 30fps and which are 60fps. Better yet, in games that provide an option on mid-gen refreshes, which option are people choosing? Certainly Insomniac publicly gave up on 60fps for the reasons ultragpu states.

Most importantly for media/social interest, people talk about visuals regardless of anything else. Brand new IPs can gain lots of attention by looking amazing. If Rage 2 looked like the most amazing thing ever, people would talk about it.
There will be more 60fps games than ever, and there is nothing that will stop this.
That's a totally unqualified assertion. More 60fps games in total numbers? A higher proportion of 60fps games? Are you including indies? There may well be more 60fps options in games if they're easy to add, seeing as devs have started doing that already. Games targeting 60fps only: that means choosing a better gameplay experience at the cost of a worse marketing position overall; that's what's most likely to stop it.

Every single argument you can make for 60fps applied to every previous generation over its predecessor. "PS1 was limited in CPU. PS2 will allow 60 fps games. PS2 was limited - PS3 will have more 60 fps games. PS4 is more powerful than PS3, finally devs will be able to target 60fps as standard." Why should PS5 versus PS4 be any different?
 
Because frame rate is far better understood by the general consumer than it used to be, and there is demand for higher frame rates.

Developers have also at least expressed interest in adding 60fps modes to their games this gen, but the weak CPUs sometimes don't allow it. Last gen we didn't have high-framerate modes and high-resolution modes, so at the very least there will be more 60fps games next gen, because games can be built for these new options from the get-go.

When in this gen did we start seeing different rendering modes as options? I'm pretty sure it wasn't at the start.
 
Why should PS5 versus PS4 be any different?
I wondered if we're getting to the point where the budget for higher-fidelity assets is hitting a plateau in terms of risk and time, while the processing required for a leap in render quality would outpace the affordable tech we are expecting.

i.e. for the majority of devs/publishers, higher framerate is low hanging fruit to be exploited, especially for cross-gen.

----

Of course, we aren't seeing the "ultra" quality settings on consoles nor locked 4K, but I do find it interesting that a number of developers are introducing 60fps or lower-res/higher quality modes on the midgen twins as an option.

Certainly, it's pretty easy to just set 4K (as the other option being offered) and get really good utilization out of the GPU just because moar pixels per triangle.

I did wonder if devs are keeping track of folks who pick these modes right now as a way to gauge where they should put their efforts next gen as well, not unlike how MS had the ability to track what output resolution 360 users used.

/RandomEarlyMorningRamblings

hm...
 
Oh please. Sony's big four and Cyberpunk 2077 are huge AAA single-player titles from critically acclaimed developers. BF5 is an MP-focused game from a publisher which became infamous for its greedy loot box strategies, and Rage 2 is the successor to a game which wasn't that popular, made by an AA developer which was responsible for the dull Mad Max game. Of course there is only mild enthusiasm for those two games. The fact that you are using such examples just shows that you don't have any strong arguments.

Next-gen will give developers a massive boost in CPU power, making 60fps viable for many games with bigger scopes and complexity. We saw it happening with shooters, which went from 30fps last-gen to 60fps in current-gen. And it will happen again, this time with open-world games. There will be more 60fps games than ever, and there is nothing that will stop this.
Because frame rate is far better understood by the general consumer than it used to be, and there is demand for higher frame rates.

Developers have also at least expressed interest in adding 60fps modes to their games this gen, but the weak CPUs sometimes don't allow it. Last gen we didn't have high-framerate modes and high-resolution modes, so at the very least there will be more 60fps games next gen, because games can be built for these new options from the get-go.

When in this gen did we start seeing different rendering modes as options? I'm pretty sure it wasn't at the start.
I hope we don't get too far ahead of ourselves regarding CPU performance next gen. Sure, they may be Zen-class, but the overall workload will still be balanced in ratio to the more advanced graphics, simulation, AI and so on; devs will always extract as much CPU power out of them as possible. So in the end you will always wind up scraping for leftovers. In other words, power doesn't dictate fps, priorities do. It's almost arrogant to claim that because we're getting shiny new CPUs, our fps should magically be 60 now. Don't forget we're talking next gen, not current gen.
And of course a separate framerate mode is entirely feasible on the basis of sacrificing res and visuals.
 
i.e. for the majority of devs/publishers, higher framerate is low hanging fruit to be exploited, especially for cross-gen.
Agreed.
Higher fps, higher resolution, and if possible ray-traced lighting will be the low-hanging fruit if the hardware has the muscle to support it.
Ideally they move to GPU-side dispatch to be able to render lots of individual items instead of creating the labour of batching items together.

Certainly, it's pretty easy to just set 4K (as the other option being offered) and get really good utilization out of the GPU just because moar pixels per triangle.

I did wonder if devs are keeping track of folks who pick these modes right now as a way to gauge where they should put their efforts next gen as well, not unlike how MS had the ability to track what output resolution 360 users used.

Absolutely! With the capabilities within MS, they are leaving nothing on the table when it comes to data mining.
 
Well, Ultra settings are often extremely wasteful and only exist for future PC hardware, or just because, why not. Console resources are simply often better spent elsewhere. Unless the PC version is a 1:1 port of a console version with no benefits, you'll never see Ultra settings on consoles.
 
I also agree devs should research and conduct thorough surveys of the different modes used by mid-gen console owners. I have a feeling the resolution of the TV matters here too.
 
At 30fps, the difference should be convincing lighting and shading rather than more polygons. Exactly the same assets as now, but ramped up a level with lighting, will look next-gen I think. Then throw in more procedurally reactive stuff like deforming and clumping snow and mud. Costs needn't increase much at all to make use of more performance for better visuals.

Take something like Spider-Man. It looks very video-gamey at 1080p30. Same with TLoU2. 4K60 requires 8x the rendering power, so an 8x improvement in console power makes the game look the same in screenshots, only at 4K, and move better. 4K30 means you have twice as much shading power per pixel to make the world look more realistic.
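As a rough sanity check of that 8x figure, here's the back-of-envelope arithmetic, assuming rendering cost scales roughly linearly with pixels pushed per second (a simplification, since not all costs scale with resolution):

```python
# Back-of-envelope pixel-throughput comparison.
# Assumption: cost scales linearly with pixels per second (a simplification).
def pixels_per_second(width, height, fps):
    return width * height * fps

base  = pixels_per_second(1920, 1080, 30)  # 1080p30 baseline
uhd60 = pixels_per_second(3840, 2160, 60)  # 4K60
uhd30 = pixels_per_second(3840, 2160, 30)  # 4K30

print(uhd60 / base)  # 8.0 -> 4x the pixels * 2x the frames
print(uhd30 / base)  # 4.0 -> on an 8x faster GPU, that leaves ~2x the
                     #        shading budget per pixel compared to 1080p30
```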

Would TLoU3 be best looking like TLoU2 in 4K60, or looking a step closer towards photo-realism? I certainly know which choice would get the most attention on the Internet!
 
Well, if VR is going to be viable on console, they need to worry about having enough CPU to push very high frame rates.

There's no doubt that slow-paced games, or low-skill cinematic games have favoured 30 fps. People may be playing more for an atmosphere or "experience" rather than tight gameplay. Input lag and image clarity may not be the primary concern for a large segment of gamers. But if you look at the trend from, say, PS2 until now, games that can't maintain a stable 30fps are derided, and games like GTA Vice City and Shadow of the Colossus were monstrosities in terms of performance. And between PS360 and PS4/X1, pretty much all of the competitive games (shooters, sports, racing, fighting games) have opted for 60fps. I can't think of a remotely decent shooter that isn't 60Hz. There is a progression towards performance as diminishing returns for resolution kick in. On PC there is a growing audience of people that opt for 1080p144 over 4K60, or now 1440p144 over 4K60. People on console are becoming more aware of the benefits of 60Hz. Uncharted 4 and Gears 4 were both 60Hz in multiplayer, because they knew 30Hz is just not viable. So I don't see how this is controversial, but you'll probably see a growing number of games that take advantage of 60Hz, and a bit of a bump in CPU will facilitate that option.

I kind of wish the steam hardware survey tracked refresh rate.
 
At 30fps, the difference should be convincing lighting and shading rather than more polygons.
Right, so I'm saying that a generational leap in shading/dynamic GI is going to take a lot more than what we can afford for next gen while also increasing the base resolution.

Good lighting artists are expensive.

Exactly the same assets as now, but ramped up a level with lighting, will look next-gen I think. Then throw in more procedurally reactive stuff like deforming and clumping snow and mud. Costs needn't increase much at all to make use of more performance for better visuals.

Take something like Spider-Man. It looks very video-gamey at 1080p30. Same with TLoU2. 4K60 requires 8x the rendering power, so an 8x improvement in console power makes the game look the same in screenshots, only at 4K, and move better. 4K30 means you have twice as much shading power per pixel to make the world look more realistic.

Would TLoU3 be best looking like TLoU2 in 4K60, or looking a step closer towards photo-realism? I certainly know which choice would get the most attention on the Internet!

Right, so all this is just the low-hanging fruit, where increasing samples in the current pipeline isn't enough to justify the processing cost for the return on visual quality vs gameplay fluidity, especially when temporal supersampling methods are getting better.
 
You can devote a huge chunk of resources to motion blur, and sure, I like that effect, but looking worse in most other areas is not how you want to convince me of your point. Just an example.
But the main point is that 30fps with much better visuals is the way to go, as is plainly evident from this E3's showing. Everybody goes gaga over Sony's big four and Cyberpunk, yet almost no one cared about BF5 or Rage 2. I'm not bashing those games, of course, and I mean absolutely no disrespect to the devs; I'm just pointing out how well their products are received by the public and press alike. I can't imagine the outcry if TLOU2 looked like UC4's MP, but hey, it's 60fps!

Still, other developers manage to achieve the effect at 60fps with much better quality. And just look at TLoU II, where the motion blur still doesn't look good. Some 60fps games like Wolfenstein do the same in a 16ms render time that others need 33ms for.
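For reference, those 16ms and 33ms numbers are just the per-frame time budgets at 60fps and 30fps (rounded; the exact figures are 16.7ms and 33.3ms):

```python
# Per-frame time budget for a given target framerate.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(frame_budget_ms(60))  # ~16.7 ms per frame at 60fps
print(frame_budget_ms(30))  # ~33.3 ms per frame at 30fps
# Everything in a frame (motion blur included) has to fit in that window,
# so a 60fps renderer has roughly half the time per frame to spend.
```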

Why do these games get the most hits? Because they are different from other games, or because they are new? At first glance, Battlefield V and The Division 2, for example, look very similar, but those who read more about them will notice big differences. And even Battlefront II, which received much less attention than Battlefront I, sold over 10 million copies in the first few months. Far Cry 5 is one of the best-selling games of the year, and how many people watched its videos? I didn't notice much of a hype.
 
Those are just examples. Every example has other influencing factors, so it can't be considered in isolation.

Well, they are bad examples. It's just a failed attempt to create a narrative that 60fps games are automatically not as popular as 30fps games.

The totality of 60fps and 30fps games needs to be considered. Go draw up a list of best-selling games and see which ones are 30fps and which are 60fps.

I'm not really sure what you are trying to say here. Are you insinuating that there is a negative correlation between framerate and commercial success?

Better yet, in games that provide an option on mid-gen refreshes, which option are people choosing? Certainly Insomniac publicly gave up on 60fps for the reasons ultragpu states.

The statement from Insomniac was made 10 years ago. That's almost an eternity in this industry. James Stevenson from Insomniac recently commented on this:

https://www.resetera.com/threads/q-...-overdrive-spyro-and-more.33176/#post-6202285

Most importantly for media/social interest, people talk about visuals regardless of anything else. Brand new IPs can gain lots of attention by looking amazing. If Rage 2 looked like the most amazing thing ever, people would talk about it.

Rage 2 is a 30fps game on PS4/Xbox One. The muted reception to the Rage 2 reveal has nothing to do with framerate or the restriction to either 30 or 60fps. It just looks like a generic shooter from a developer whose previous games didn't exactly light the charts on fire. Have you followed why people are praising TLoU 2 right now? It's also because of the interesting story, the characters and the great animation... None of this would be negatively impacted by a higher framerate.

Would TLoU3 be best looking like TLoU2 in 4K60, or looking a step closer towards photo-realism? I certainly know which choice would get the most attention on the Internet!

TLoU 2 runs on a 4.2 TF console at 4K CB / 30fps. To run this game at 60fps you would only need an 8.4 TF console with a better CPU. But of course PS5 will be much more powerful than that (likely 12+ TF). So I'm not sure why you think there won't be a massive increase in effect quality in TLoU 3, even at 60fps. You are also ignoring the fact that framerate is a part of graphics too, and can massively contribute to making a game look more realistic. I'm pretty sure that nobody will look at a 60fps TLoU 3 that looks much better than the previous game, thanks to an additional 4 TF of GPU power (8.4 + 4 = 12.4 TF), and think "Oh, this looks disappointing!" In fact, I believe the reaction will be quite the opposite.
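Spelled out, the arithmetic being assumed here is just the following sketch (the TF figures come from the post above; the idea that GPU cost scales linearly with framerate is an assumption, real games rarely scale that cleanly):

```python
# Naive linear-scaling estimate used in the argument above.
# Assumptions: the same visuals cost 2x the GPU at 2x the framerate,
# and leftover teraflops translate into better effect quality.
current_tf   = 4.2               # base console running 4K CB / 30fps
needed_60fps = current_tf * 2    # 8.4 TF for the same visuals at 60fps
next_gen_tf  = 12.4              # hypothetical next-gen GPU (12+ TF)

headroom = next_gen_tf - needed_60fps
print(headroom)                  # 4.0 TF left over for better effects
```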
 
...
There's no doubt that slow-paced games, or low-skill cinematic games have favoured 30 fps...
In short, different game genres warrant different framerate choices. No change there going forwards. Mandated or targeted console-wide framerates are nonsense. Many games will still want high framerate. Many will still want maximal eye-candy. Nothing is changing next gen AFAICS to make higher framerate more preferable than it's ever been, such that genres that choose to be 30fps this gen will choose to be 60fps next.
 
Right, so I'm saying that a generational leap in shading/dynamic GI is going to take a lot more than what we can afford for next gen while also increasing the base resolution.

Good lighting artists are expensive.
A computational system won't need artists. You'll just give it the scene and it'll light it. If that model can't be run on next-gen hardware, then yes, they'll forgo that and stick with current-gen visuals at higher framerate and resolution. I'd be surprised if devs can't push the envelope though.

especially when temporal supersampling methods are getting better.
That's actually the best argument in favour of higher framerate. Reconstruction techniques that benefit from higher temporal resolution, not possible on previous generations, will encourage adoption of higher framerates more universally to facilitate their implementation.
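A minimal sketch of why higher temporal resolution helps reconstruction, assuming a simple exponential history blend (the core of TAA-style accumulation; real implementations also reproject the history with motion vectors and reject stale samples):

```python
import numpy as np

def accumulate(history, current, alpha=0.1):
    # Blend the new frame into the history buffer. More frames per second
    # means more samples per unit time, so the accumulated image converges
    # faster and ghosting from stale history is shorter-lived.
    return (1.0 - alpha) * history + alpha * current

# Toy usage: jittered/noisy samples of the same image converge over frames.
rng = np.random.default_rng(0)
truth = rng.random((4, 4))                 # the "correct" image
history = np.zeros_like(truth)
for _ in range(60):                        # one second's worth at 60fps
    sample = truth + rng.normal(0.0, 0.1, truth.shape)
    history = accumulate(history, sample)
print(np.abs(history - truth).mean())      # small residual error
```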
 
Well, they are bad examples. It's just a failed attempt to create a narrative that 60fps games are automatically not as popular as 30fps games.
Start posting less confrontationally. It's a fair argument to be debated - not an agenda or narrative or load of old nonsense.

I'm not really sure what you are trying to say here. Are you insinuating that there is a negative correlation between framerate and commercial success?
I don't know. The data should show as much.

The statement from Insomniac was made 10 years ago.
If you can find any big-name dev saying they find higher framerates lead to better sales, please present it. As your quote (about VR framerates) says, they had real data when they made that choice. Ultragpu reckons there's a correlation. And the biggest-selling titles of the year are 30fps. So at least he's presenting actual data and a correlation, unlike you, who's just dismissing that data as if it doesn't exist.

Rage 2 is a 30fps game on PS4/Xbox One. The muted reception to the Rage 2 reveal has nothing to do with framerate or the restriction to either 30 or 60fps. It just looks like a generic shooter from a developer whose previous games didn't exactly light the charts on fire. Have you followed why people are praising TLoU 2 right now? It's also because of the interesting story, the characters and the great animation... None of this would be negatively impacted by a higher framerate.
You're missing the argument completely. I already stated every game has a number of variables affecting it. I already said that if you want to properly compare interest in 30fps and 60fps titles, a large sample needs to be used across games rather than looking at specific games. But until someone has put together that sample, proof of the theory exists in the clear evidence that we have a bunch of highly acclaimed titles getting lots of attention for their visuals while being 30fps. Rather than dismissing them out of hand, present clear evidence to the contrary.

I can also point to Twitter. You get far more likes and retweets of pretty game WIPs than high-framerate WIPs.

TLoU 2 runs on a 4.2 TF console at 4K CB / 30fps. To run this game at 60fps you would only need an 8.4 TF console with a better CPU. But of course PS5 will be much more powerful than that (likely 12+ TF). So I'm not sure why you think there won't be a massive increase in effect quality in TLoU 3, even at 60fps.
It was a hypothetical example, for crying out loud. And the point is, whatever a theoretical TLoU 3 looks like on PS5 at 60fps, it'll look a lot better at 30fps. That'll be true until we can achieve photorealism at 60fps. Unless you deny that screenshots and overall visual appeal will be better at 30fps than at 60fps, stop arguing against this fact.

The pro-60fps side argues about the benefits to gameplay and player experience. They don't try to argue that 30fps games won't look better. If you want to argue that the media will start raving more about game fluidity than eye candy, go present some examples. Hell, present any evidence in favour of your argument, please! Anything to show 60fps games sell better, or get more media coverage on account of their higher framerates, etc.
 
In short, different game genres warrant different framerate choices. No change there going forwards. Mandated or targeted console-wide framerates are nonsense. Many games will still want high framerate. Many will still want maximal eye-candy. Nothing is changing next gen AFAICS to make higher framerate more preferable than it's ever been, such that genres that choose to be 30fps this gen will choose to be 60fps next.

Yes, but I'm suggesting there is a trend towards 60Hz as entire genres of games are switching, like first-person shooters, which is not a small market. On PS360, 30Hz was the norm, with the exception of COD. On X1 and PS4, 60Hz is the norm.
 