What framerate will next-gen target? *spawn

What framerates will next-gen games target?

  • VRR (Variable Refresh Rate): 15 votes (30.6%)
  • 30fps: 23 votes (46.9%)
  • 60fps: 26 votes (53.1%)
  • 90fps: 1 vote (2.0%)
  • 120fps: 1 vote (2.0%)
  • Total voters: 49
Can you prove that targeting 60fps would actually lead to a loss of sales? No
But at least I present meaningful evidence in support of my opinion...
They downgraded the visual splendor by cutting the framerate in half. 60fps alone was already far ahead of UC3.
Semantics. Obviously there's a major communication problem here, with the pro-60fps crowd taking offence at the idea that a downgrade in framerate isn't termed a reduction in 'visual splendor'. That's because we need different terms to talk about different things! We are separating the quality of pixels from the temporal frequency of pixels very clearly through obvious language.
 
But at least I present meaningful evidence in support of my opinion...
Semantics. Obviously there's a major communication problem here, with the pro-60fps crowd taking offence at the idea that a downgrade in framerate isn't termed a reduction in 'visual splendor'. That's because we need different terms to talk about different things! We are separating the quality of pixels from the temporal frequency of pixels very clearly through obvious language.
Just the opinions of some devs, not actual proof of framerate being tied to sales.

A higher framerate means better motion resolution. That directly impacts the perception of the quality of the pixels.

Especially when objects or the camera move fast, it's much easier to actually see what's happening on the screen and appreciate the graphics at higher framerates.
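
To put rough numbers on that, here's a back-of-the-envelope sketch (the screen width and pan speed are made-up illustrative values, not measurements from any game):

# Rough illustration: how far the image shifts between frames during
# a fast pan, at different framerates. Bigger per-frame jumps mean
# choppier motion and less legible detail while things are moving.
screen_width_px = 1920        # assumed 1080p display
pan_time_s = 1.0              # assumed: camera crosses the screen in 1 s
speed_px_per_s = screen_width_px / pan_time_s

for fps in (30, 60, 120):
    step_px = speed_px_per_s / fps
    print(f"{fps:3d} fps -> {step_px:5.1f} px jump per frame")

# Output:
#  30 fps ->  64.0 px jump per frame
#  60 fps ->  32.0 px jump per frame
# 120 fps ->  16.0 px jump per frame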
 
Not really. Nintendo fans and the Switch userbase prefer performance over image quality and resolution, for the most part. Switch first-party games are mostly 60fps, but of course at the cost of image fidelity and resolution. That's not to say Switch games are bad, because they're not... however, I wouldn't consider Nintendo's engines super complex when compared to Sony's and Microsoft's own internal engines.

Nobody buys the Switch for graphics anyway. Nintendo can't compete in the graphics field, so it's a smart move for them.

Also, their games tend to be far simpler. When they do make really ambitious games, they run at 30fps. See the last Zelda.

The choppiness of that framerate, however, remains forever.

Nobody cares about that... if people say in 2035 that GTA5 has a choppy framerate on console, RG still made a lot of money when the game was released.

In business, what matters is that your product has a high value at the right moment. Nobody cares if its value decreases when it becomes irrelevant anyway.

Who cares about what people will think about GTA5 in 2050? Certainly not RG...

30fps is nice for pretty screenshots but since we're talking VIDEOgames, framerate is ultimately far more important.

Fact is, the industry disagrees with you... you can argue as much as you want, 30fps games are still very popular. So your opinion isn't popular among gamers.
 
A near photo-real game at 480p24? Maybe as an interactive movie where you have no control over camera movement. Say hello to motion sickness and input lag. Even moving a mouse cursor around at 24fps would be a monstrosity. You can make real-time graphics that run at 24fps, sure. But a game? You'd have a hard time playing Civ at 24fps.

Zelda Ocarina of Time on N64 was 240p20 with occasional drops. In NTSC regions, that is. Its framerate was even worse on PAL.
Not that the framerate wasn't bad in that game, or in most of that gen for that matter. But it sure doesn't render a game unplayable.
 
Zelda Ocarina of Time on N64 was 240p20 with occasional drops. In NTSC regions, that is. Its framerate was even worse on PAL.
Not that the framerate wasn't bad in that game, or in most of that gen for that matter. But it sure doesn't render a game unplayable.


At least it was played on a CRT, so a lot of the display issues with LCD and OLED wouldn't be a factor. I'd hate to see it on a typical 60Hz TV. It's also a 20-year-old game. I don't think 20 fps is even remotely acceptable now, and I'm going to go out on a limb and say gamers would agree. Maybe some people would tolerate a game with a constant 24-26 fps. I don't know. For the most part, I think 30 fps is the bare minimum, or just slightly above it. It's the D- of frame rates. Not a fail, but barely a pass.
 
At least it was played on a CRT, so a lot of the display issues with LCD and OLED wouldn't be a factor. I'd hate to see it on a typical 60Hz TV. It's also a 20-year-old game. I don't think 20 fps is even remotely acceptable now, and I'm going to go out on a limb and say gamers would agree. Maybe some people would tolerate a game with a constant 24-26 fps. I don't know. For the most part, I think 30 fps is the bare minimum, or just slightly above it. It's the D- of frame rates. Not a fail, but barely a pass.
I don't disagree with any of that, but it's worth pointing out that it's not as impossible to play a game at a low fps as you initially made it out to be. It's still far from ideal, and awkward, but playable nonetheless.
 
I like having options for sure. I think, hypothetically, if I'm following this discussion correctly:
if PS5 skewed its silicon in favour of the GPU, and MS went for more CPU to get more games to 60fps, and more games had 60fps+ options next gen on MS, would it do worse?

I dunno. It's pretty sweet to have high frames. The developers do their best to make their vision come to life, but not having the option and only getting marginally better graphics seems worse than having the option. I would like to see more high frame rate games next gen. That being said, the question for me is how much silicon is required to really accomplish it. I frankly don't know.

Everyone said 4K60 wasn't possible on X1X, yet we keep seeing evidence to the contrary with Gears 5, Forza 7, FH4, and possibly other titles. We've seen some 3P titles get really close as well.
Everyone also said a next-gen CPU would enable mad sick animations, and I'm staring at TLOU 2 like, well, lol?

So I'd like to see the high frame rates, but I still think there's more to be done here. We're still seeing improvements in CPU utilization; I think the further you get away from cross-platform coding, the more CPU tricks they can leverage on the console.

That being said, as in the hyperthreaded vs. non-hyperthreaded discussion, having more CPU will make it easier across the board for all developers to have better frame management. Custom hardware features may enable better CPU saturation, but it will be difficult for all teams to pull off.
 
At least it was played on a CRT, so a lot of the display issues with LCD and OLED wouldn't be a factor. I'd hate to see it on a typical 60Hz TV. It's also a 20-year-old game. I don't think 20 fps is even remotely acceptable now, and I'm going to go out on a limb and say gamers would agree. Maybe some people would tolerate a game with a constant 24-26 fps. I don't know. For the most part, I think 30 fps is the bare minimum, or just slightly above it. It's the D- of frame rates. Not a fail, but barely a pass.
Also, the animations in OoT and Majora were simple enough not to really need the extra frames, at least animation-wise. I still play Majora on a CRT and it's not that bad.

These days, a low-latency 30fps game (as in, not Killzone 2) feels pretty snappy on a ~20ms LCD. But yeah, 30fps games on your average Walmart TV can feel pretty chunky.
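
For a rough sense of the numbers in play here, a toy sketch (the two-frame pipeline and the display delays are illustrative assumptions, not measurements of any actual game or TV):

# Toy input-to-photon latency estimate, illustrative numbers only.
def latency_ms(fps: float, pipeline_frames: int, display_ms: float) -> float:
    frame_ms = 1000.0 / fps
    # Input sampled at the start of a frame shows up pipeline_frames
    # later, then the display adds its own processing delay.
    return frame_ms * pipeline_frames + display_ms

print(latency_ms(30, 2, 20))  # 30fps on a fast ~20ms LCD: ~86.7 ms
print(latency_ms(30, 2, 50))  # 30fps on a slow ~50ms TV: ~116.7 ms
print(latency_ms(60, 2, 20))  # 60fps on the fast LCD:     ~53.3 ms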

@iroboto

Forza Horizon 4 will be 4K30; its 60fps mode surely won't be native 4K. Not that I care, I can't wait for that game at a smooth 60!
 
Nobody cares about that... if people say in 2035 that GTA5 has a choppy framerate on console, RG still made a lot of money when the game was released.

In business, what matters is that your product has a high value at the right moment. Nobody cares if its value decreases when it becomes irrelevant anyway.

Who cares about what people will think about GTA5 in 2050? Certainly not RG...

Fact is, the industry disagrees with you... you can argue as much as you want, 30fps games are still very popular. So your opinion isn't popular among gamers.
So we shouldn't care about the quality of the products because the developers are making money. OK.

The industry seems to disagree with me, but not really: they know 60fps is better, but they withhold it for the "remastered" version so people have to buy the game twice.
 
I like having options for sure. I think, hypothetically, if I'm following this discussion correctly:
if PS5 skewed its silicon in favour of the GPU, and MS went for more CPU to get more games to 60fps, and more games had 60fps+ options next gen on MS, would it do worse?

I dunno. It's pretty sweet to have high frames. The developers do their best to make their vision come to life, but not having the option and only getting marginally better graphics seems worse than having the option. I would like to see more high frame rate games next gen. That being said, the question for me is how much silicon is required to really accomplish it. I frankly don't know.

Everyone said 4K60 wasn't possible on X1X, yet we keep seeing evidence to the contrary with Gears 5, Forza 7, FH4, and possibly other titles. We've seen some 3P titles get really close as well.
Everyone also said a next-gen CPU would enable mad sick animations, and I'm staring at TLOU 2 like, well, lol?

So I'd like to see the high frame rates, but I still think there's more to be done here. We're still seeing improvements in CPU utilization; I think the further you get away from cross-platform coding, the more CPU tricks they can leverage on the console.

That being said, as in the hyperthreaded vs. non-hyperthreaded discussion, having more CPU will make it easier across the board for all developers to have better frame management. Custom hardware features may enable better CPU saturation, but it will be difficult for all teams to pull off.
They are aiming for it, and at no point did they say it's native 4K. What they showed at E3 was 30fps footage, and God knows what level of visual downgrade the 60fps mode would receive.
 
Just the opinions of some devs, not actual proof of framerate being tied to sales.

A higher framerate means better motion resolution. That directly impacts the perception of the quality of the pixels.

Especially when objects or the camera move fast, it's much easier to actually see what's happening on the screen and appreciate the graphics at higher framerates.
I think it's no strange phenomenon that hundreds of millions of gamers had no trouble adapting to 30fps motion clarity while still seeing things clearly :). You make it sound like people are incapable of appreciating the intricate details or effects at that framerate, which is not true. Sure, 60fps makes things clearer to see, I'm not denying that, but you also lose half of all that detail, so in the end you don't necessarily win anything. If everything is on par visually, then higher fps wins, but obviously the entire discussion here is about making sacrifices in the console space, isn't it? And that visual sacrifice does not favor marketing, PR, social media etc. one bit. You gain much more just being 30fps, as some major devs (Insomniac, Ubi) have already mentioned. Not to mention the majority of games this gen are 30fps, no less.
 
I think it's no strange phenomenon that hundreds of millions of gamers had no trouble adapting to 30fps motion clarity while still seeing things clearly :). You make it sound like people are incapable of appreciating the intricate details or effects at that framerate, which is not true. Sure, 60fps makes things clearer to see, I'm not denying that, but you also lose half of all that detail, so in the end you don't necessarily win anything.
Oh, but you gain in other aspects: your ability to judge the direction and speed of motion, and of course lower hand-to-screen latency.
These are game-playing aspects rather than game-watching ones, and of course games can be designed to make allowances for low frame rates, avoiding fast motion, having controls that don't allow fast angular movement and so on, just as film makers have shot film for a century in ways where motion artifacts don't become too jarring.
Make no mistake, it is a major limitation for films as well, and gameplay makes the problem much more difficult to handle.

If everything is on par visually, then higher fps wins, but obviously the entire discussion here is about making sacrifices in the console space, isn't it? And that visual sacrifice does not favor marketing, PR, social media etc. one bit.
And this is completely true. Since the marketing (screenshots or video) doesn’t allow interaction, most of the problems associated with low frame rates are hidden away until after purchase. Small wonder that much of console gaming is stuck at frame rates that are considered unacceptable in other areas.
It is about compromise, and what is considered an appropriate compromise depends on capabilities. How much benefit do further increases in resolution bring to people who play from their couch? Is it really a good idea to spend rendering resources on simulating the aberrations of old film cameras, when much of the audience is too young to have any relation to them at all, now that film grain, vignetting, lens flare and chromatic aberrations have been taken care of by the move to digital capture, modern lens design and digital post-processing? (And they are fundamentally alien to computer gaming.) How much do the minutiae of shadow rendering really matter to game play?
Photo realistic rendering is a straightforward goal to optimize towards, but at the end of the day playing a game is not the same activity as looking at a photograph.
Maybe at the next console node we will have sufficient rendering power that, despite marketing, the compromise can shift a bit, and we can set a goal of having frame rates that are at least equal to the US AC electrical power frequency as opposed to only half that.
Then again, maybe not.
 
I think it's no strange phenomenon that hundreds of millions of gamers had no trouble adapting to 30fps motion clarity while still seeing things clearly :). You make it sound like people are incapable of appreciating the intricate details or effects at that framerate, which is not true. Sure, 60fps makes things clearer to see, I'm not denying that, but you also lose half of all that detail, so in the end you don't necessarily win anything. If everything is on par visually, then higher fps wins, but obviously the entire discussion here is about making sacrifices in the console space, isn't it? And that visual sacrifice does not favor marketing, PR, social media etc. one bit. You gain much more just being 30fps, as some major devs (Insomniac, Ubi) have already mentioned. Not to mention the majority of games this gen are 30fps, no less.
Gamers also adapt to games with loot boxes so I guess loot boxes are good :LOL:

The detail you gain at 30fps is lost as soon as the screen moves. It's only good for mostly static scenes. And since in videogames there's motion most of the time, it's a net loss.

As for the devs that claim that 30fps is better, what else are they going to say? "The game would be better at 60fps but we want you to buy the game again next-gen" :LOL:

We'll have to wait until PS5 to properly appreciate the great animation work of TLoU2. Too bad.

Halo 5 sold a lot less than Halo 3.:mrgreen:
But was it because of the framerate? :p
 
Nowhere did I say all games MUST be 60 fps. It is most definitely preferable and objectively better.

I'm not sure how it's controversial, or subjective, to say:
Because you're only looking at part of what makes a game entertaining, the part that matters most to you because of your personal subjective values. Other people with other personal, subjective values, will find value in other aspects of game visuals. Dismissing these other values as objectively wrong is basically saying those who disagree with you are stupid or irrational.
"60 fps is better. It's science." That's untrue and insulting.

-30 fps gives more time for shading, meaning more characters that look human rather than like plastic or dolls. This helps drive emotional connection.
-30 fps has better lighting. One thing I hate most in games is unrealistic lighting where objects aren't properly grounded. 60 fps means less visual realism.
-30 fps gives more time for things like anisotropic filtering; some people find blurry grounds very annoying.
-30 fps could mean more scene diversity. I find repeating assets very noticeable and disagreeable.
-30 fps likely means better visual clarity in slower games and with draw distances etc. Pop-in sucks.
-30 fps gives more time for complex animation, so characters move believably and again help ground the world in reality. What's the point in smoother animation if it's robotic and hard to relate to?

There are shortcomings in games I dislike which are objectively made better by choosing 30 fps to be able to address them. There are also other shortcomings I dislike about games, which you list, that are made better by choosing 60 fps. I am neither arrogant enough to ignore the other half of the argument, nor short-sighted enough to be unable to see it.

AFAICS in this discussion, those who value 30 fps also see what 60 fps brings, but those who value 60 fps just point at the others and make crazy-people gestures while insinuating intellectual superiority. And a good deal of the pro-60 fps camp argue why evidence brought in favour of 30 fps should be dismissed, but don't bring any meaningful evidence of their own.
 
They are aiming for it, and at no point did they say it's native 4K. What they showed at E3 was 30fps footage, and God knows what level of visual downgrade the 60fps mode would receive.
Does it need to be native 4K to be valid? Why bother moving the goalposts? More is better, you'd agree, right?
More resolution, more frames per second. With reconstruction techniques at 4K the artefacts are smaller, and at higher frames per second the smaller delta between frames should also equate to better reconstruction quality.
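
One hedged way to see why the smaller delta helps: if reconstruction reuses pixels from the previous frame, the worst-case misalignment (absent perfect motion vectors) scales with how far things moved in one frame interval. The object speed below is an assumed illustrative figure:

# Worst-case pixel reuse error when borrowing from the previous frame.
object_speed_px_per_s = 960.0   # assumed: half a 1920px screen per second

for fps in (30, 60):
    frame_time_s = 1.0 / fps
    error_px = object_speed_px_per_s * frame_time_s
    print(f"{fps} fps: up to ~{error_px:.0f} px of misalignment to correct")

# 30 fps: up to ~32 px of misalignment to correct
# 60 fps: up to ~16 px of misalignment to correct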
 
@Shifty Geezer

Honestly, you're one of the most disingenuous posters on this forum. When did I ever argue that 30Hz did not have a longer frame time? When did I ever argue intellectual superiority, or make "crazy-people gestures"?

I made a list of things that are objectively true about 60 fps. Those are the benefits it provides. All else being equal, you will always choose 60 over 30. Of course, devs have to make choices, but does 60Hz always mean worse texture filtering? No. Does 60Hz always mean worse animation? No. Does it always mean worse lighting? No. Does it always mean worse visual clarity? No. Worse pop in? No. It can mean some, none or all of those things, because devs get to choose. The whole discussion is about a next-gen target that we're hypothesizing about, and all I'm suggesting is that it's worth investing some engineering into making it easier to provide 60Hz, not mandating it, because of the great benefits that 60Hz brings to gaming.

Please do not try to frame me in a way that is not reflective of my posts.

Let me put it this way: when a dev chooses 30Hz, they compromise many aspects of visual quality, including judder, visual clarity in motion (motion blur), responsiveness (input lag) and their users' reaction times (slower visual update). These compromises are always true. Subjectively, you may not care, but objectively they are compromises. Now, obviously the same is true of 60Hz. Objectively, you have a smaller frame time to fit everything into. Subjectively, someone may be happy with the results of that choice. Personally, I don't think 30Hz is good enough, and that's a subjective statement, but I would never think that 60Hz should be mandated. I do think it should become more of a priority, and a better CPU will help in that regard. Give me 60Hz or gtfo.
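
To make that frame-time compromise concrete, the raw budgets are just arithmetic (nothing game-specific assumed here):

# Per-frame time budget: all the time the CPU and GPU have to
# simulate and render a single frame at each target.
for fps in (30, 60, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {budget_ms:5.2f} ms per frame")

#  30 fps -> 33.33 ms per frame
#  60 fps -> 16.67 ms per frame
# 120 fps ->  8.33 ms per frame
# Going from 30 to 60 halves the budget: everything the frame does
# (shading, lighting, animation, AI) has to fit in half the time,
# which is exactly where the visual compromises come from.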
 
@Shifty Geezer
Honestly, you're one of the most disingenuous posters on this forum. When did I ever argue that 30Hz did not have a longer frame time?
You didn't. You just listed areas where 60 fps is better, as if it were a list of objective reasons why 60 fps is always the right choice, and ignored all the ways 30 fps can be better. I listed reasons why 30 fps can be better. That makes me disingenuous??

Let me put it this way, when a dev chooses 30Hz, they compromise many aspects of visual quality, including judder, visual clarity in motion (motion blur), responsiveness (input lag) and their users reaction times (slower visual update). These compromises are always true. Subjectively, you may not care, but objectively they are compromises.
Once again you're ignoring all the other aspects! Why is it objective to talk about responsiveness but not shader quality? Why is it objective to talk about motion clarity but not lighting realism or anisotropic filtering? Why would Quantum Break be objectively better at 60 fps with simpler visuals? It can only ever be subjectively better. You would prefer Quantum Break with simpler visuals at a higher framerate. Others would prefer QB as it is now. There's nothing objective about it! For every aspect where 60 fps brings a benefit, it also brings a cost in another aspect.
 
You didn't. You just listed areas where 60 fps is better, as if it were a list of objective reasons why 60 fps is always the right choice, and ignored all the ways 30 fps can be better. I listed reasons why 30 fps can be better. That makes me disingenuous??
There seems to be miscommunication here.
Objectively, higher frame rates, higher resolution, HDR, etc. are all better things.
If you could have 144fps, native 4K and HDR, it would be better; no one would prefer 30fps at native 4K, all things being equal.
I think he's not wrong in that context.

More is better objectively.

Then you are both talking about the realism aspect of it, where compromises must be made. In this case, you're willing to trade off frame rate for visual prowess.
That's fine; he's just looking for more silicon such that developers could also have a 60fps option, which would compromise the graphics in favour of frame rate.

His viewpoint is that if you lock the CPU to be significantly weaker than the GPU, no options exist at all, which is what he does not want; he understands developers have the choice over the matter. But developers should be able to make both 30 and 60fps variants of a game, and with too little CPU the latter doesn't exist. That's what I understand this debate between the two points of view to be about.
 
-30 fps gives more time for complex animation, so characters move believably and again help ground the world in reality. What's the point in smoother animation if it's robotic and hard to relate to?

Animation at 30fps is better than at 60fps. That's not even wrong.

Your argument is a false dichotomy: 30fps can have bad shading, bad lighting, bad texture filtering, repeating assets, poor visual clarity, pop-in and bad animation.

If anything, 60fps remastered versions of games have fewer of those problems, precisely because the devs focus on quality everywhere instead of only in the areas that look good in pretty screenshots.
 