What framerate will next-gen target? *spawn

What framerates will next-gen games target?

  • VRR (Variable Refresh Rate)

    Votes: 15 30.6%
  • 30fps

    Votes: 23 46.9%
  • 60fps

    Votes: 26 53.1%
  • 90fps

    Votes: 1 2.0%
  • 120fps

    Votes: 1 2.0%

  • Total voters
    49
The loss of temporal data gets significantly more obvious when you're playing at 144fps on a 144Hz monitor and then jump down to 30fps. Your eyes stay confused for quite a while until they adjust; 30fps becomes a blocky slide show.

I think for those that have only gamed between 30 and 60, perhaps the gap isn't wide enough. Hit 120+ and compare it to 30 and most people will "get it" immediately.
Ghosting is gone, colours and clarity pop, and the actual image quality of what's drawn by the display is head and shoulders better.
 
The loss of temporal data gets significantly more obvious when you're playing at 144fps on a 144Hz monitor and then jump down to 30fps. Your eyes stay confused for quite a while until they adjust; 30fps becomes a blocky slide show.

I think for those that have only gamed between 30 and 60, perhaps the gap isn't wide enough. Hit 120+ and compare it to 30 and most people will "get it" immediately.
Ghosting is gone, colours and clarity pop, and the actual image quality of what's drawn by the display is head and shoulders better.

Yeah, 144 Hz monitors scare me. :p I'm not gaming at my friend's house anymore, as it's already made gaming on my 60 Hz monitor less pleasant. Even at reduced detail settings compared to what I run (usually in a 1600p or 1800p window), it still looks so much better in motion. Not just the reduced judder and increased smoothness, but the far greater detail that is visually discernible while in motion. The image just looks far better in motion, even at reduced settings.

I don't have the option to move to 144 Hz yet (main display is a 4K 49" monitor). I'm considering moving to 120 Hz, however, once 120 Hz 4K TVs that accept a 120 Hz 4K input become more commonplace.

Regards,
SB
 
Not everyone agrees on what's better, then. To me it's definitely better to have better shadowing, models, shaders, post-processing effects and AA. It's not that we're talking about 480p/30 FPS vs 4K/60 FPS, you see. High-definition resolutions are pretty good at showing a fair amount of detail. I can perceive a lot of detail in 30 FPS games, as can the many gamers who are amazed at how pretty some 30 FPS games look.

At 30 fps you're probably going to have something like 32 ms of persistence on a typical LCD display, and about 16 ms at 60 fps. Each ms of persistence equals one pixel of blur at 1000 pixels/second of motion. That's fairly quick movement, and you'll have less blur the slower the movement, but 32 pixels is a lot to start from. How slow do you have to move the camera to avoid destroying more than a pixel or two worth of data every time you move it? Pretty slow. At 30 fps you'd have to rotate the camera about half as fast.

It's hard to find data on 30 Hz on a 60 Hz display because the people that research it never bother measuring 30 Hz. But on sample-and-hold displays, persistence typically halves as you double the refresh rate, so the implication is that you blur half as many pixels each time you double it.
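A minimal sketch of that rule of thumb (purely illustrative, assuming an ideal sample-and-hold display where persistence equals frame time, and using the "1 ms of persistence = 1 px of blur at 1000 px/s" approximation):

```python
# Rough motion-blur estimate for an ideal sample-and-hold display.
# Assumptions: persistence ~= frame time, and blur width in pixels
# ~= persistence (ms) * pan speed (px/s) / 1000.

def blur_width_px(fps: float, pan_speed_px_per_s: float) -> float:
    persistence_ms = 1000.0 / fps  # each frame is held on screen for the full frame time
    return persistence_ms * pan_speed_px_per_s / 1000.0

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps: ~{blur_width_px(fps, 1000):.0f} px of blur at 1000 px/s")

# -> ~33 px at 30 fps, ~17 px at 60 fps, ~8 px at 120 fps, ~4 px at 240 fps
```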

Typically 30 fps games have to hide judder so they’ll implement a camera-based motion blur which destroys most fine detail in motion anyway. Camera-based motion blur is not necessary at 60 FPS.

Edit: Now that I'm thinking about it more, I'm curious about frame rate vs persistence, because you have two refreshes per frame at 30fps on a 60Hz screen. So your image moves twice as far per frame change at the same rate of movement (more judder), but there is no change on every second refresh. So shouldn't the sample-and-hold effect matter less on every second refresh? Maybe it's the same MPRT, just with the visible blur in a different location? Maybe the Blur Busters people would be kind enough to answer some questions.
 
Then again maybe Nintendo IS indeed anti-consumer and wants the fans to buy the game more than twice since even its remasters don't run at 60fps.

And Sony is so anti-consumer that Ratchet and Clank was running at 60fps on PS2 and 30fps on PS4... your arguments are based on nothing or isolated examples...

Edit: you were actually serious about Nintendo; I thought it was sarcasm and that's why I gave the R&C example. The game doesn't run at 60fps because the Switch isn't powerful enough.

Clearly TLoU2 would benefit tremendously from 60fps but sadly that won't happen until PS5...

Your example is completely irrelevant... is it so difficult to understand that you can get 60fps without compromises on much more powerful hardware (PS3 vs PS4)? How many times do I have to say it?

Yeah, TLOU is cool at 60fps on PS4 because they can even upgrade the engine at that framerate. On PS3, they could not make the TLOU you know at 60fps: probably fewer enemies, a smaller map, worse animations, worse AI, etc.

Otherwise, this is what happens : http://www.ign.com/articles/2015/01/15/uncharted-4-wont-be-60-fps-if-it-harms-the-player-experience

None of the games that run at 60fps this gen approach the complexity of a game like Uncharted. None of them. Either you have very basic multiplayer games or corridor shooters.

MGS5 is the only exception, but its environment is completely empty and its action scenes aren't very complex.


Show us a game doing that at 60fps this gen on console...
 
One genre I hope is never 30fps next gen is the FPS. I can't stand playing shooters at 30fps anymore. I'm playing Titanfall 2 on PS4 at the moment and honestly it looks fantastic; perceptually it's as detailed as any FPS out there and it runs like butter. No nasty post-processing either. We're almost at that point, with Halo and BF moving to 60, but there are still some stragglers at 30fps.

TPS games like Uncharted are more tolerable at 30fps than FPS games.
 
And nothing in my post means that I'm stating that framerate is not a part of the graphics of a game...
Then we agree on that.

And Sony is so anti-consumer that Ratchet and Clank was running at 60fps on PS2 and 30fps on PS4... your arguments are based on nothing or isolated examples...

Edit: you were actually serious about Nintendo; I thought it was sarcasm and that's why I gave the R&C example. The game doesn't run at 60fps because the Switch isn't powerful enough.

Your example is completely irrelevant... is it so difficult to understand that you can get 60fps without compromises on much more powerful hardware (PS3 vs PS4)? How many times do I have to say it?

Yeah, TLOU is cool at 60fps on PS4 because they can even upgrade the engine at that framerate. On PS3, they could not make the TLOU you know at 60fps: probably fewer enemies, a smaller map, worse animations, worse AI, etc.

Otherwise, this is what happens : http://www.ign.com/articles/2015/01/15/uncharted-4-wont-be-60-fps-if-it-harms-the-player-experience

None of the games that run at 60fps this gen approach the complexity of a game like Uncharted. None of them. Either you have very basic multiplayer games or corridor shooters.

MGS5 is the only exception, but its environment is completely empty and its action scenes aren't very complex.


Show us a game doing that at 60fps this gen on console...

1) The game doesn't run at 60fps because the developers targeted 30fps.

2) Your argument is a strawman. Nobody said 60fps requires no compromises. Also, your example of complexity is a scripted scene? :LOL:


Just look how much better the action flows and feels at 60fps. "Bu-but it would have a few less polygons here and there and some pop-up I wouldn't even notice until Digital Foundry slowed down the footage and zoomed in at 400%..." :LOL:
 
This thread has become ridiculous. No shit higher framerates are better, but if you really think the devs have just thumb-sucked the idea that better graphics, AI and animation sell better, then you are delusional. The debate has turned into a kindergarten playground fight.
 
And nothing in my post means that I'm stating that framerate is not a part of the graphics of a game...
For most in this discussion, and in every discussion on computer game visuals, we divide the visuals into three aspects, resolution, framerate, and graphics, so that we can talk about each aspect and its impact on the experience. Some see the exclusion of framerate from "graphics" as a personal slight, and would rather we discuss the different aspects of computer game visuals in these terms:

Higher Screen Resolution = better graphics
Higher Refresh Rate = better graphics
High Shader Fidelity, AA, AF, better lighting, better shadowing, better rendering effects = better graphics

So obviously, if you want to talk about better graphics but want to be explicit about framerate, you should say, "higher framerate". And if you want to talk about better graphics as regards the per-pixel qualities, you should say, "higher shader fidelity, increased AA, increased AF, better lighting, better shadowing, and better rendering effects".

Some people, IMHO, are idiots, and they should just accept the terminology everyone's happy using because it's conducive to discussion, but they'd rather waste everyone's time making it impossible to talk sanely on the subject.
 
So obviously, if you want to talk about better graphics but want to be explicit about framerate, you should say, "higher framerate". And if you want to talk about better graphics as regards the per-pixel qualities, you should say, "higher shader fidelity, increased AA, increased AF, better lighting, better shadowing, and better rendering effects".
And that's just what I did several times, in this thread.

Some people, IMHO, are idiots, and they should just accept the terminology everyone's happy using because it's conducive to discussion, but they'd rather waste everyone's time making it impossible to talk sanely on the subject.
Indeed. First of all, I think that people should know how to read and understand the overall message, not just nitpick a few words so that later you have to explain again what you already said in your first post. Many people have an "all or nothing" attitude ("1200 FPS or MEGAULTRA graphics!! there's nothing in between!!"), too.
 
Among the trifecta, there are eventual upper bounds of diminishing returns on two of them, in particular resolution and frame rate. Once they exceed the capacity of the human eye, it can be argued they need to go no further.

That leaves the rest, shaders and lighting and shadows, with a pretty high ceiling.

I think this argument, taken to its eventual conclusion, would walk down this line of thinking, and the pro-framerate side would eventually need to concede this point.

However, the argument isn't about graphics at 120fps+. It's about the calls to have the option to move from the absolute bottom (30) to something higher. And these demands aren't necessarily wrong; wouldn't you be upset at texture filtering locked to bilinear for another generation? Or 720p resolution? Or super low polygon counts? Or the absolute rock-bottom shadow resolution for most titles?

Resolution-gate was the biggest thing between the Xbox One and PS4, and all it was about was, generally speaking, 30% more pixels. It's not like both consoles were held at 720p for their lifetimes. Imagine the discussion a framerate-gate would have created if all other variables were held the same but one console ran at 60 and the other at 30, instead of the PS4 pushing higher graphical settings.

I sort of get why they would get anxious over it. Everything else within the graphics cloud has really moved forward, especially since the mid-gen refresh arrived: resolution, AF, and many more items got bumped. But frame rate has still largely been held at the absolute bottom of the list.

Cutscenes can always drop to 30 to hit the graphics that people want. But for gameplay, after going a whole generation without it and having addressed everything else, maybe it is time to address frame rate next gen.

We know they will be going Ryzen, so they are taking the right steps. We also know there won't be any mandates. But we know that they will have been working with 4K, checkerboarding, and reconstruction techniques for 3-4 years before next gen, so perhaps we'll start seeing higher frame rates now that they don't need to tackle so many different areas at once.
 
For most in this discussion, and in every discussion on computer game visuals, we divide the visuals into three aspects, resolution, framerate, and graphics, so that we can talk about each aspect and its impact on the experience. Some see the exclusion of framerate from "graphics" as a personal slight, and would rather we discuss the different aspects of computer game visuals in these terms:

Higher Screen Resolution = better graphics
Higher Refresh Rate = better graphics
High Shader Fidelity, AA, AF, better lighting, better shadowing, better rendering effects = better graphics

So obviously, if you want to talk about better graphics but want to be explicit about framerate, you should say, "higher framerate". And if you want to talk about better graphics as regards the per-pixel qualities, you should say, "higher shader fidelity, increased AA, increased AF, better lighting, better shadowing, and better rendering effects".

Some people, IMHO, are idiots, and they should just accept the terminology everyone's happy using because it's conducive to discussion, but they'd rather waste everyone's time making it impossible to talk sanely on the subject.
"People who don't use my terminology are idiots". You're just used to ignoring framerate in graphics discussions.

Among the trifecta, there are eventual upper bounds of diminishing returns on two of them, in particular resolution and frame rate. Once they exceed the capacity of the human eye, it can be argued they need to go no further.

That leaves the rest, shaders and lighting and shadows, with a pretty high ceiling.

I think this argument, taken to its eventual conclusion, would walk down this line of thinking, and the pro-framerate side would eventually need to concede this point.

However, the argument isn't about graphics at 120fps+. It's about the calls to have the option to move from the absolute bottom (30) to something higher. And these demands aren't necessarily wrong; wouldn't you be upset at texture filtering locked to bilinear for another generation? Or 720p resolution? Or super low polygon counts? Or the absolute rock-bottom shadow resolution for most titles?

Resolution-gate was the biggest thing between the Xbox One and PS4, and all it was about was, generally speaking, 30% more pixels. It's not like both consoles were held at 720p for their lifetimes. Imagine the discussion a framerate-gate would have created if all other variables were held the same but one console ran at 60 and the other at 30, instead of the PS4 pushing higher graphical settings.

I sort of get why they would get anxious over it. Everything else within the graphics cloud has really moved forward, especially since the mid-gen refresh arrived: resolution, AF, and many more items got bumped. But frame rate has still largely been held at the absolute bottom of the list.

Cutscenes can always drop to 30 to hit the graphics that people want. But for gameplay, after going a whole generation without it and having addressed everything else, maybe it is time to address frame rate next gen.

We know they will be going Ryzen, so they are taking the right steps. We also know there won't be any mandates. But we know that they will have been working with 4K, checkerboarding, and reconstruction techniques for 3-4 years before next gen, so perhaps we'll start seeing higher frame rates now that they don't need to tackle so many different areas at once.
Also, the focus on asset fidelity has managed to push games into the uncanny valley, a valley not even cutting-edge film CGI has managed to get out of (Leia in Rogue One looks like a talking corpse). This actually lowers immersion, since every little flaw in rendering/animation/physics stands out like a sore thumb, especially at 4K. Massive costs are required just to avoid things looking horrible. Surface detail is just far ahead of the other graphical elements and it shows.
 
1) The game doesn't run at 60fps because the developers targeted 30fps.

1) The Switch wasn't powerful enough to run this game at 1080p/60fps...

2) Another irrelevant argument... yeah this scripted scene is really easy, that's why we see comparable scenes in other games...

http://www.liaisonpr.com/news-room/2017/Mar/03/how-uncharted-4s-artists-created-their-amazing-lan/ : "The biggest moment where tech came together was the E3 chase sequence. This by far was the biggest environment when it comes to driven distance and the amount of different environmental styles as you go from the market to the docks. When it came to memory, LOD tech, modularity, streaming, car physics, prop destructibility it was an extreme challenge. Nobody at this studio or any studio for that matter has tried to do a level section with that complexity when it came to the amount of art assets that needed to be created to make all these sections of the chase."

Same for the snake in God of War, so easy: "From the beginning the team knew it would be included in the game, since it’s “such a big character” in Norse Mythology. It’s also huge in size in the game, and pushes the technology because players will interact with it in “a lot of different ways.” We’ll get very close and very far."

https://www.dualshockers.com/god-of-war-midgard-serpent-tech/

Fact is, you don't know what you're talking about; that's why you expect every game to run at 60fps...

Just look how much better the action flows and feels at 60fps.

Yeah much better when everything is equal... not possible on PS4 and even on Pro. Sorry...

On a side note, how many people think that Cyberpunk 2077 will run at 60fps on next-gen? :LOL:
 
1) The Switch wasn't powerful enough to run this game at 1080p/60fps...

2) Another irrelevant argument... yeah this scripted scene is really easy, that's why we see comparable scenes in other games...

http://www.liaisonpr.com/news-room/2017/Mar/03/how-uncharted-4s-artists-created-their-amazing-lan/ : "The biggest moment where tech came together was the E3 chase sequence. This by far was the biggest environment when it comes to driven distance and the amount of different environmental styles as you go from the market to the docks. When it came to memory, LOD tech, modularity, streaming, car physics, prop destructibility it was an extreme challenge. Nobody at this studio or any studio for that matter has tried to do a level section with that complexity when it came to the amount of art assets that needed to be created to make all these sections of the chase."

Same for the snake in God of War, so easy: "From the beginning the team knew it would be included in the game, since it’s “such a big character” in Norse Mythology. It’s also huge in size in the game, and pushes the technology because players will interact with it in “a lot of different ways.” We’ll get very close and very far."

https://www.dualshockers.com/god-of-war-midgard-serpent-tech/

Fact is, you don't know what you're talking about; that's why you expect every game to run at 60fps...



Yeah much better when everything is equal... not possible on PS4 and even on Pro. Sorry...
1) Adding more constraints, I see...

2) The COD games are filled with that kind of stuff.

3) Increasing framerate is also pushing technology ;) In the case of those two devs, they simply chose to push other tech areas.

4) Even if the cost were as high as dropping to the fidelity of a PS3 game ( :LOL: ), the advantages of 60fps would still be there. What's the advantage of imperceptibly higher polycounts on objects that pass you by in an instant? :)
 
1) Adding more constraints, I see...

2) The COD games are filled with that kind of stuff.

3) Increasing framerate is also pushing technology ;) In the case of those two devs, they simply chose to push other tech areas.

4) Even if the cost were as high as dropping to the fidelity of a PS3 game ( :LOL: ), the advantages of 60fps would still be there. What's the advantage of imperceptibly higher polycounts on objects that pass you by in an instant? :)

1) I'm not adding more constraints; they wanted the game to run at 1080p...

2) Which is literally false. COD games are filled with far more basic action scenes.

4) They still chose to run the game at 30fps... so you're wrong at every possible level... I didn't make that choice, ND made that choice. People still buy 30fps games, etc., etc.

So you're arguing in a vacuum... if 30fps is shit, go tell that to the developers, not me... you're wasting your time...
 
Among the trifecta, there are eventual upper bounds of diminishing returns on two of them, in particular resolution and frame rate. Once they exceed the capacity of the human eye, it can be argued they need to go no further.
Yep.

But with photorealism still a long way off, the choice between res, framerate, and graphics is always going to be a compromise. What we've had in this thread is people who prefer a different balance sometimes suggesting that only their preference should be respected, as if it's wrong to shift the balance more towards res or more towards graphics or more towards framerate. As long as everyone can back away from those arguments, trying to prove their values are the right values, we can look at what the limiting factors are and how the hardware can best be optimised to get the most out of each aspect. E.g. does a 50% reduction in framerate mean a 50% increase in graphical zing, or is it only a 20% improvement? Can a 15% reduction in resolution increase framerate by 30%? Where's the sweet spot, and are there any technologies that'll shift it?

Funnily enough, we've heard pro-graphics and pro-framerate voices, but I don't recall anyone saying, "give me 4K or GTFO"! Is sub-4K the sweet spot? Sounds like it here. But then how many who are willing to take sub-4K have 4K displays?
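A quick back-of-the-envelope sketch of those trade-off questions (purely illustrative, and assuming per-pixel GPU cost dominates and scales linearly with pixel count, which real workloads rarely honour):

```python
# Illustrative frame-budget arithmetic for the resolution/framerate trade-off.
# Simplifying assumption: GPU cost scales linearly with pixel count; geometry,
# CPU and bandwidth costs don't actually shrink with resolution.

RESOLUTIONS = {"1080p": 1920 * 1080, "1800p": 3200 * 1800, "4K": 3840 * 2160}

def budget_ns_per_pixel(fps: float, pixels: int) -> float:
    return (1e9 / fps) / pixels  # nanoseconds of GPU time available per pixel

for fps in (30, 60):
    for name, px in RESOLUTIONS.items():
        print(f"{fps} fps @ {name}: ~{budget_ns_per_pixel(fps, px):.1f} ns per pixel")

# e.g. 30 fps @ 4K allows ~4.0 ns/pixel while 60 fps @ 1800p allows ~2.9 ns/pixel:
# dropping ~31% of the pixels claws back some, but not all, of the per-pixel budget
# lost by doubling the framerate.
```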
 
"People who don't use my terminology are idiots". You're just used to ignoring framerate in graphics discussions.
Correction.
People who don't use our terminology, the terminology used in the sector for as long as I can remember by everyone who discusses it.

The developers of Monster Hunter World don't call their rendering options "performance" and "graphics" because I told them to.

And putting my mod hat on, if you're not willing to embrace the language used in this conversation to facilitate effective conversation, I'm going to issue a reply ban. It's very simple - resolution, framerate, and graphics, with graphics being a quick, general, single-word reference to the content that makes up the framebuffer, resolution being the dimensions of the 2D array of pixel values that makes up that framebuffer, and framerate being the rate at which that framebuffer is changed.

This is now the official terminology of this thread (OP updated) to aid discussion. Anyone arguing against that will receive a temp ban for derailment. Let's have a real conversation about computer graphics now.
 
Funnily enough, we've heard pro-graphics and pro-framerate voices, but I don't recall anyone saying, "give me 4K or GTFO"! Is sub-4K the sweet spot? Sounds like it here. But then how many who are willing to take sub-4K have 4K displays?
Oh, we exist lol. My general argument is that I paid too much for a 4K OLED HDR screen, so 4K HDR are the settings I run on it. If I want high-framerate gaming, I can switch to 1080p@120 on my TV (unexpected feature). But generally speaking I don't and haven't, unless there is some competitive game I'm looking for an edge in, or there's a particular boss/sequence in a game where my performance would be significantly improved by moving the frame rate up.

But 4K is unfortunately dependent on a great many factors that are just not there yet.
For 4K to really shine you need solid HDR, high texture filtering, long draw distances, and high asset quality before you can see a substantial difference. When you see it in its full glory it becomes very desirable, and it looks next gen IMO; next gen being able to see minute detail on items from a very far distance, while also seeing incredible detail at a close distance, all in the same camera. Particle effects shine because you can have fine-grained particles like a rolling cloud of dust, and god rays and all that jazz become awesome. Particle effects are awesome in 4K.

Thus, I'm looking forward to next gen because of this; my basic assumption is that we'll see a lot more 'great' implementations of 4K in the years to come. But the realist in me says that the mainstream market still won't have a massive user base at 4K, so this debate on higher FPS is a sensible discussion topic: if not 4K, then we should see 60/120 fps variants at 1080p (for those that don't have 4K sets).
 
For most in this discussion, and in every discussion on computer game visuals, we divide the visuals into three aspects, resolution, framerate, and graphics, so that we can talk about each aspect and its impact on the experience. Some see the exclusion of framerate from "graphics" as a personal slight, and would rather we discuss the different aspects of computer game visuals in these terms:

Higher Screen Resolution = better graphics
Higher Refresh Rate = better graphics
High Shader Fidelity, AA, AF, better lighting, better shadowing, better rendering effects = better graphics

So obviously, if you want to talk about better graphics but want to be explicit about framerate, you should say, "higher framerate". And if you want to talk about better graphics as regards the per-pixel qualities, you should say, "higher shader fidelity, increased AA, increased AF, better lighting, better shadowing, and better rendering effects".

Some people, IMHO, are idiots, and they should just accept the terminology everyone's happy using because it's conducive to discussion, but they'd rather waste everyone's time making it impossible to talk sanely on the subject.

Sure I get this, and yes it can get out of hand. But you can't really remove framerate from the discussion if you are talking about its application in games.

Yes, for still images, motion is irrelevant. But if motion destroys detail due to too low a framerate, then you can't disassociate the two when it comes to games. Of course there are dangers with that as well, as it can lead into discussion of the impact of your display's ability to resolve detail in motion (some displays are far better than others at retaining detail during motion).

But if we limit things to just what the developer has control over, then I don't see how you can disassociate the effect of motion on graphics quality. I at least don't play games without moving. :)

And while there are diminishing returns for all of that, I think we're closer to diminishing returns on the power needed for higher quality still images than we are for higher framerates in games. Resolution (when speaking about the living room experience and living room viewing distances) has also hit a point of greatly diminishing returns: 4K isn't significantly better than 1080p, and I can't imagine how 8K would improve anything at typical living room distances. At PC viewing distances 8K may or may not still matter to some extent.

I.e., the computational power required to go from 30 to 60 Hz is lower (IMO), and the resulting graphics quality increase is very noticeable while in motion, compared to the power required to bring a similarly noticeable improvement to stationary or slow-motion scenes in games. Once the game is in motion, the power needed to achieve at 30 Hz an increase in fidelity as noticeable as the one you get from going from 30 to 60 Hz becomes rather ridiculously high, IMO.

That said, I know that some people like to just stop moving and look at things to see how good they look (as they can't do that at 30 Hz while running around, since detail is greatly diminished in motion at 30 Hz), hence letting the user choose how they want to play the game is the best option.

Hell, on PC I have some friends that will occasionally drop performance down to 10-20 Hz and crank up the quality settings to see how things can look when not in motion and to take screenshots and then promptly crank it back up to 60 Hz or more (increasingly it's more) as soon as they start to play the game.

One thing that people also fail to take note of is the increasing use of temporal rendering effects (those that use multiple frames to achieve their look). These can greatly increase graphics quality while at the same time reducing the rendering cost compared to similar non-temporal effects, but at the cost of looking like ass at lower framerates. 60 Hz is barely good enough to produce decent-looking temporal effects; 120 Hz and higher will allow much better use of them and much better-looking results.
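A minimal sketch of why framerate matters here, assuming a simple exponential history accumulation of the kind TAA-style effects use (the blend factor and pan speed below are made-up illustrative numbers, not from any specific engine):

```python
import math

# Toy exponential history accumulation (the core of many temporal effects):
#   result = alpha * current_sample + (1 - alpha) * history
# At a fixed blend factor, the history's "age" is a fixed number of frames, so at a
# lower framerate the same history spans more wall-clock time and more on-screen motion.

ALPHA = 0.1          # fraction of the new frame blended in each update (illustrative)
PAN_SPEED = 1000.0   # pixels/second of camera motion (illustrative)

def history_halflife_frames(alpha: float) -> float:
    """Frames until a sample's contribution to the accumulated history decays to 50%."""
    return math.log(0.5) / math.log(1.0 - alpha)

for fps in (30, 60, 120):
    frames = history_halflife_frames(ALPHA)
    seconds = frames / fps
    print(f"{fps:>3} fps: history half-life ~{frames:.1f} frames "
          f"= {seconds * 1000:.0f} ms, ~{seconds * PAN_SPEED:.0f} px of motion covered")

# The half-life in frames is the same regardless of framerate, but at 30 fps it spans
# twice the time and twice the screen distance of 60 fps, which is where the extra
# ghosting/smearing at low framerates comes from.
```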

Regards,
SB
 
...
Edit: Now that I'm thinking about it more, I'm curious about frame rate vs persistence, because you have two refreshes per frame at 30fps on a 60Hz screen. So your image moves twice as far per frame change at the same rate of movement (more judder), but there is no change on every second refresh. So shouldn't the sample-and-hold effect matter less on every second refresh? Maybe it's the same MPRT, just with the visible blur in a different location? Maybe the Blur Busters people would be kind enough to answer some questions.

Yeah, I was right initially. The Chief Blur Buster answered my question. On a sample-and-hold display (pretty much all LCD and OLED TVs and monitors) you're going to get approximately 33ms of persistence at 30fps and 16ms of persistence at 60fps, regardless of the refresh rate. That is 33 pixels of motion blur at 1000 pixels/second of movement vs 16 pixels. Unfortunately, backlight strobing and black frame insertion have too many side effects at 30fps, or even 60fps, to be viable options for reducing motion blur; you pretty much need to get to 120 before they're useful, and in that case you have to make sure your fps never drops below the refresh rate of your display or you'll get new artifacts. Maybe TV frame interpolation will get really good and work without artifacts and without increasing input lag, so we can take advantage of motion-blur reduction. We never should have stopped using CRTs ;)

So, yes, in motion, 30fps is going to destroy a lot more detail than 60fps, but 60fps has massive room for improvement. It's very easy to see here, even at a very slow movement speed.
https://www.testufo.com/framerates#count=3&background=stars&pps=240
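For reference, plugging the slow pan speed from that demo URL (pps=240) into the same sample-and-hold approximation used earlier (blur width roughly equal to frame time times pan speed) gives small but still visible smear widths; these are illustrative estimates, not measurements:

```python
# Approximate smear width at the linked testufo demo's pan speed (pps=240),
# using the sample-and-hold rule of thumb: blur ~= frame time * pan speed.

PAN_SPEED = 240.0  # pixels/second, as set in the demo URL

for fps in (30, 60, 120):
    blur_px = (1.0 / fps) * PAN_SPEED
    print(f"{fps:>3} fps: ~{blur_px:.0f} px of smear at {PAN_SPEED:.0f} px/s")

# -> ~8 px at 30 fps, ~4 px at 60 fps, ~2 px at 120 fps: even at a slow pan,
# 30 fps smears fine detail over several pixels.
```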
 