What framerate will next-gen target? *spawn

What framerates will next-gen games target?

  • VRR (Variable Refresh Rate): 15 votes (30.6%)
  • 30fps: 23 votes (46.9%)
  • 60fps: 26 votes (53.1%)
  • 90fps: 1 vote (2.0%)
  • 120fps: 1 vote (2.0%)

  • Total voters: 49
Sure, I get this, and yes, it can get out of hand. But you can't really remove framerate from the discussion if you're talking about its application in games.
No-one was limiting it from the discussion.

But if we limit things to just what the developer has control over, then I don't see how you can dissociate graphics quality from the effect of motion. I at least don't play games without moving. :)
Yes, motion impacts graphics. But if we use the term 'graphics' to include everything regarding the images on screen, we no longer have a word for the pixel quality. So when talking about computer graphics, we divide it into 'resolution', 'framerate', and 'graphics', just for the sake of being able to talk about the thing. ;) Saying 'better graphics, worse framerate' just means higher pixel information, not 'better computer graphics' as if the framerate didn't affect the final total graphical package.

I.e., the computational power required to go from 30 to 60 Hz is (IMO) lower, and the resulting increase in graphics quality very noticeable while in motion, compared to the power required to bring a similarly noticeable improvement to stationary or slow-moving scenes. Once the game is in motion, the power needed at 30 Hz for an increase in graphical fidelity similar to what you'd get from going from 30 to 60 Hz becomes ridiculously high, IMO.
I agree with that. I've asked on this board whether 4K set owners find 1080p60 sharper than 4K30 because of motion fidelity and the issue with motion resolution on TVs. I'd take 1080p60 over 4K30 for sure. (I prefer 60 fps gaming by and large, and would happily take 120+ fps. In the past on PC I've generally reduced quality to get a decent framerate, but when that's not possible, I've tried to get the best quality at a stable 30 fps.)
 
Yah, I was right initially. The Chief Blur Buster answered my question. On a sample-and-hold display (pretty much all LCD and OLED TVs and monitors) you're going to get approximately 33ms of persistence at 30fps and 16ms of persistence at 60fps, regardless of the refresh rate. That's 33 pixels of motion blur at 1000 pixels/second of movement vs 16 pixels. Unfortunately, backlight strobing and black frame insertion have too many side effects at 30fps, or even 60fps, to be viable options for reducing motion blur. You pretty much need to get to 120 before they're useful, and in that case you have to make sure your fps never drops below the refresh rate of your display, or you'll get new artifacts. Maybe TV frame interpolation will get really good and work without artifacts and without increasing input lag, so we can take advantage of motion blur reduction. We never should have stopped using CRTs ;)

So, yes, in motion, 30fps is going to destroy a lot more detail than 60fps, but 60fps has massive room for improvement. It's very easy to see here, even at a very slow movement speed.
https://www.testufo.com/framerates#count=3&background=stars&pps=240
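
To put the arithmetic above in concrete form, here's a minimal sketch (Python; assuming an ideal sample-and-hold panel and perfect eye tracking, so real displays will differ somewhat) of the persistence-to-blur relationship:

```python
# Sample-and-hold blur: each frame is held for the full frame time, so a
# tracked moving object smears across roughly (speed x persistence) pixels.

def persistence_ms(fps: float) -> float:
    """Frame persistence on an ideal sample-and-hold display, in ms."""
    return 1000.0 / fps

def motion_blur_px(speed_pps: float, fps: float) -> float:
    """Approximate blur trail length, in pixels, for a tracked object."""
    return speed_pps * persistence_ms(fps) / 1000.0

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: {persistence_ms(fps):5.1f} ms persistence, "
          f"{motion_blur_px(1000, fps):4.1f} px blur at 1000 px/s")
#  30 fps:  33.3 ms persistence, 33.3 px blur at 1000 px/s
#  60 fps:  16.7 ms persistence, 16.7 px blur at 1000 px/s
# 120 fps:   8.3 ms persistence,  8.3 px blur at 1000 px/s
```

Halving the frame time halves the blur trail, which is exactly what the UFO test makes visible.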
Don't some sets handle 30 fps at 120 Hz and use BFI, just duplicating frames?
 
1) I didn't add more constraints; they wanted the game to run at 1080p...

2) Which is literally false. COD games are filled with far more basic action scenes.

4) They still chose to run the game at 30fps... so you're wrong on every possible level... I didn't make that choice, ND made that choice. People still buy 30fps games, etc., etc.

So you're arguing in a vacuum... if 30fps is shit, go tell the developers, not me... you're wasting your time...
1) And some people would like donuts raining from the sky. What's your point?

2) Nah.

3) I don't even know what you're arguing for or against anymore. Certainly not any points I made.

Correction.
People who don't use our terminology, the terminology used in the sector for as long as I can remember by everyone who discusses it.

The developers of Monster Hunter World don't call their rendering options "performance" and "graphics" because I told them to.

And putting my mod hat on, if you're not willing to embrace the language used in this conversation to facilitate effective conversation, I'm going to issue a reply ban. It's very simple - resolution, framerate, and graphics, with graphics being a quick, general, single-word reference to the content that makes up the framebuffer, resolution being the 2D array of pixel values that make up that framebuffer, and framerate being the rate at which that framebuffer is changed.

This is now the official terminology of this thread (OP updated) to aid discussion. Anyone arguing against it will receive a temp-ban for derailment. Let's have a real conversation about computer graphics now.
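
For concreteness, here's a toy encoding of those three terms (Python; the type and field names are my own, purely illustrative):

```python
# 'Resolution' is the shape of the framebuffer, 'graphics' is the pixel
# content that fills it, and 'framerate' is how often it is replaced.

from dataclasses import dataclass

@dataclass
class Framebuffer:
    width: int           # resolution: dimensions of the 2D pixel array
    height: int
    pixels: bytes        # graphics: the rendered content itself

@dataclass
class Presentation:
    framebuffer: Framebuffer
    framerate_hz: float  # framerate: how often the framebuffer changes
```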
Let me re-state my position then, sir: framerate is an intrinsic part of the visual presentation of a game, so 30fps leads to a significantly worse visual experience than 60fps. To have so many AAA console games targeting 30fps in 2018 is simply disgraceful.

Yah, I was right initially. The Chief Blur Buster answered my question. On a sample-and-hold display (pretty much all LCD and OLED TVs and monitors) you're going to get approximately 33ms of persistence at 30fps and 16ms of persistence at 60fps, regardless of the refresh rate. That's 33 pixels of motion blur at 1000 pixels/second of movement vs 16 pixels. Unfortunately, backlight strobing and black frame insertion have too many side effects at 30fps, or even 60fps, to be viable options for reducing motion blur. You pretty much need to get to 120 before they're useful, and in that case you have to make sure your fps never drops below the refresh rate of your display, or you'll get new artifacts. Maybe TV frame interpolation will get really good and work without artifacts and without increasing input lag, so we can take advantage of motion blur reduction. We never should have stopped using CRTs ;)

So, yes, in motion, 30fps is going to destroy a lot more detail than 60fps, but 60fps has massive room for improvement. It's very easy to see here, even at a very slow movement speed.
https://www.testufo.com/framerates#count=3&background=stars&pps=240
It's as if framerate directly impacts the perception of detail any time there's motion. Seems like more than just a performance metric. Mhh...
 
Don't some sets handle 30 fps at 120 Hz and use BFI, just duplicating frames?

Maybe? I know they definitely do at 60Hz.

Edit: actually, yes, for sure. My TV will do it, but not in game mode, and it probably has over 100ms of input lag, which is why it's only good for movies. Your display brightness is halved, too.

There may be newer sets that will do it in game mode. I'm pretty sure I've seen some on RTINGS, but I don't know how many.
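
As a toy illustration of the duplication idea (Python; the scheduling is my own sketch, not any specific TV's algorithm): a 120 Hz panel can show each 30 fps source frame for four refreshes, and BFI swaps some of those refreshes for black, cutting persistence at the cost of brightness.

```python
# Map 30 fps content onto a 120 Hz panel, optionally with black frame
# insertion (BFI). With BFI on, half the refreshes are black, so the
# effective persistence (and the brightness) is halved.

def refresh_schedule(source_fps: int, panel_hz: int = 120,
                     bfi: bool = False) -> list[str]:
    repeats = panel_hz // source_fps       # 120 // 30 = 4 refreshes/frame
    schedule = []
    for frame in ("A", "B"):               # two example source frames
        for i in range(repeats):
            visible = i < repeats // 2 if bfi else True
            schedule.append(frame if visible else "black")
    return schedule

print(refresh_schedule(30))
# ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B']
print(refresh_schedule(30, bfi=True))
# ['A', 'A', 'black', 'black', 'B', 'B', 'black', 'black']
```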
 
1) And some people would like donuts raining from the sky. What's your point?

2) Nah.

3) I don't even know what you're arguing for or against anymore. Certainly not any points I made.

Ok, this is my last answer because this discussion is boring...

1) You said that Nintendo showed anti-consumer behaviour with ZWW. I said that this claim is based on nothing except your fantasy. The Switch isn't powerful enough to run the game at 1080p/60fps. They preferred higher graphical quality and that's their choice... end of discussion.

2) No evidence to support your claim. As usual, you speak in a vacuum.

3) You said this: "What's the advantage of imperceptible higher polycounts on objects that pass you by in an instant?"

Yet ND disagrees with you, since, outside of remasters, all their games run at 30fps this gen. Once again, end of discussion. Your claims are crushed by facts. This gen, 30fps games are still very popular among developers and people still buy them.

Now you can argue about framerate for 100 years, nobody cares... only the market tells us what's relevant or not... if 30fps games sell well, then they are relevant. Case closed.

A small read for you anyway: https://www.eurogamer.net/articles/insomniac-60fps-no-more
 
Well, League of Legends and Fortnite have a combined 300-500 (?) million players, so I guess the market doesn't care about realism. Guess we shouldn't bother talking about it.
 
Well, League of Legends and Fortnite have a combined 300-500 (?) million players, so I guess the market doesn't care about realism. Guess we shouldn't bother talking about it.

Fortnite was popular even at 30fps. LoL is a PC game... meaning it can be played at any framerate... not to mention both are F2P...

And please, compare what's comparable. Do you have any evidence that non-multiplayer 60fps games sell better than 30fps games?

All the evidence we have tends to say otherwise... compare FH6 to FH3 or MG5 to ACO... also, the most hyped games at this E3 won't run at 60fps...

Personally, I'm not making any claim. To me, both framerates are relevant depending on the type of game. I'm not saying that one is strictly superior to the other. My opinion is really simple: it's up to developers to decide what's best for their game. And if they're wrong, they'll see it in their sales. As simple as that.
 
Fortnite was popular even at 30fps. LoL is a PC game... meaning it can be played at any framerate... not to mention both are F2P...

And please, compare what's comparable. Do you have any evidence that non-multiplayer 60fps games sell better than 30fps games?

All the evidence we have tends to say otherwise... compare FH6 to FH3 or MG5 to ACO... also, the most hyped games at this E3 won't run at 60fps...

Personally, I'm not making any claim. To me, both framerates are relevant depending on the type of game. I'm not saying that one is strictly superior to the other. My opinion is really simple: it's up to developers to decide what's best for their game. And if they're wrong, they'll see it in their sales. As simple as that.

I don't disagree that developers can decide to make their game however they want. Why should a next-gen console only be designed around non-multiplayer games?
 
Why should a next-gen console only be designed around non-multiplayer games?

Consoles aren't designed to hit a specific framerate. They are designed to get the best possible power/price ratio.

The Switch has an even weaker CPU than the PS4/XB1, yet many games run at 60fps.

If you see more 60fps games next-gen, it will not be because the new consoles have a better design, but only because developers feel that a 30fps target is not worth it anymore...

Developers make 60fps possible, not the hardware; that's why you had 60fps games on the PS1.
 
Let me re-state my position then, sir: framerate is an intrinsic part of the visual presentation of a game
Yes. No-one's ever suggested otherwise - it's an obvious point that generally doesn't need stating.
, so 30fps leads to a significantly worse visual experience than 60fps.
Subjective. For plenty, the visual experience of better graphics over better framerate is preferable, certainly in some genres.
 
Ok, this is my last answer because this discussion is boring...

1) You said that Nintendo showed anti-consumer behaviour with ZWW. I said that this claim is based on nothing except your fantasy. The Switch isn't powerful enough to run the game at 1080p/60fps. They preferred higher graphical quality and that's their choice... end of discussion.

2) No evidence to support your claim. As usual, you speak in a vacuum.

3) You said this: "What's the advantage of imperceptible higher polycounts on objects that pass you by in an instant?"

Yet ND disagrees with you, since, outside of remasters, all their games run at 30fps this gen. Once again, end of discussion. Your claims are crushed by facts. This gen, 30fps games are still very popular among developers and people still buy them.

Now you can argue about framerate for 100 years, nobody cares... only the market tells us what's relevant or not... if 30fps games sell well, then they are relevant. Case closed.

A small read for you anyway: https://www.eurogamer.net/articles/insomniac-60fps-no-more
1) Yes, that's what I said, it's their choice. If they had wanted to make 60fps the priority they could have chosen to do so.

2) You don't want to see the truth, that's up to you.

3) You didn't actually address my point. You're simply saying "Naughty Dog does it, so you're wrong".

4) Once again you're arguing against a strawman. Nobody said 30fps games don't sell or that they're irrelevant. What we're arguing against is the idea that 60fps causes worse sales/revenue, which is one of the excuses you gave for why so many AAA games are 30fps on consoles. That's been proven wrong, so of course you're dishonestly switching to fighting a strawman.

Subjective. For plenty, the visual experience of better graphics over better framerate is preferable, certainly in some genres.
You're confusing preference with fact. It is a fact that 4K produces a sharper picture than 1080p, but some people might prefer the softness of the latter, or of even lower resolutions. Same with framerate: it is a fact that a higher framerate not only produces a better sense of dynamics but also significantly enhances the perceived quality of the graphics™ during motion (so most of the time), as shown by @Scott_Arm. Still, some people prefer the enhanced quality of the graphics™ during mostly static scenes, or even like the cinematic feel of the lower framerate.
 
4) Once again you're arguing against a strawman. Nobody said 30fps games don't sell or that they're irrelevant. What we're arguing against is the idea that 60fps causes worse sales/revenue, which is one of the excuses you gave for why so many AAA games are 30fps on consoles. That's been proven wrong, so of course you're dishonestly switching to fighting a strawman.

Where did I give this excuse?

But you said: "The industry seems to disagree with me but not really, they know 60fps is better but they withhold it for the "remastered" version so people have to buy the game twice."

Which is pure bullshit... they know nothing, since they continue to make a lot of 30fps games. 60fps is only better in your mind...

For reasonable people like me, 60fps is far superior only when all else is equal, something that never happens on console. Case closed.

Also, based on the industry's behaviour, 60fps does indeed seem to cause worse sales in some fields and better sales in others.

Developers are rational people. Most multiplayer games tend to run at 60fps, so we can reasonably assume that a higher framerate is important and positive in this type of game. Conversely, there are very few 60fps single-player games, so we can reasonably assume that this impacts sales negatively and that better graphics have a greater influence on sales.

It's a logical and basic hypothesis based on empirical data...

1) Yes, that's what I said, it's their choice. If they had wanted to make 60fps the priority they could have chosen to do so.

And? Does that mean Nintendo is anti-consumer? Where's your proof of that? Stop speaking in a vacuum...
 
I agree with that. I've asked on this board whether 4K set owners find 1080p60 sharper than 4K30 because of motion fidelity and the issue with motion resolution on TVs. I'd take 1080p60 over 4K30 for sure. (I prefer 60 fps gaming by and large, and would happily take 120+ fps. In the past on PC I've generally reduced quality to get a decent framerate, but when that's not possible, I've tried to get the best quality at a stable 30 fps.)

I do, though this is due more to "feel" than look. Wouldn't have to make this choice if the PC GPU situation weren't so jacked up. :devilish:
 
Where did I give this excuse?

But you said: "The industry seems to disagree with me but not really, they know 60fps is better but they withhold it for the "remastered" version so people have to buy the game twice."

Which is pure bullshit... they know nothing, since they continue to make a lot of 30fps games. 60fps is only better in your mind...

For reasonable people like me, 60fps is far superior only when all else is equal, something that never happens on console. Case closed.

Also, based on the industry's behaviour, 60fps does indeed seem to cause worse sales in some fields and better sales in others.

Developers are rational people. Most multiplayer games tend to run at 60fps, so we can reasonably assume that a higher framerate is important and positive in this type of game. Conversely, there are very few 60fps single-player games, so we can reasonably assume that this impacts sales negatively and that better graphics have a greater influence on sales.

It's a logical and basic hypothesis based on empirical data...



And? Does that mean Nintendo is anti-consumer? Where's your proof of that? Stop speaking in a vacuum...
You don't even remember what you said in your previous message:

"OK, this is my last reply..."

:LOL:

They can continue to make 30fps games because 60fps is the low-hanging fruit they can use to get consumers to buy the game again a generation later ;)

"We can reasonable assume..."

There it is, no proof that a higher framerate hurts sales, just your fallacious assumptions ;)
 