Are higher framerates likely to proliferate?

Hold-type display is only OK for movies, with their heavy motion blur, or for native 140 fps content; for everything else it's really bad and looks like 200-300p in motion.

The BFI implementation varies: with the Sony OLED the picture flickers strongly, but with the Panasonic and LG OLEDs it does not.
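Rough numbers on why hold-type looks soft in motion (a back-of-the-envelope sketch with an assumed pan speed, using the usual approximation that eye-tracked smear ≈ pan speed × hold time):

```python
# Minimal sketch (my own illustration, not from the post): on a sample-and-hold
# ("hold type") display the eye tracks moving objects, so a full-persistence frame
# smears by roughly the distance the object moves while the frame is held.
# BFI/strobing shortens the hold time and shrinks the smear.

def hold_blur_px(pan_speed_px_per_s: float, fps: float, persistence: float = 1.0) -> float:
    """Approximate eye-tracked smear width in pixels.

    persistence: fraction of the frame period the pixel stays lit
                 (1.0 = plain sample-and-hold, ~0.5 = 50% black frame insertion).
    """
    return pan_speed_px_per_s / fps * persistence

pan = 960.0  # assumed pan speed of 960 px/s, a moderate camera pan
for fps in (24, 60, 120):
    print(f"{fps:3d} fps, no BFI : ~{hold_blur_px(pan, fps):5.1f} px smear")
    print(f"{fps:3d} fps, 50% BFI: ~{hold_blur_px(pan, fps, 0.5):5.1f} px smear")
```

A 50% BFI pass at 60 fps roughly halves the smear, which is why strobing helps even without a higher native framerate.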
 
The darker the room, the less brightness is needed and the lower flicker sensitivity gets, so even with 100-120" screens you can get away with 60 fps on CRT projectors. As I said, some people can't even see it at 48p (and of course correct timing matters).

In the past, screen size was not an issue with analog TVs; without a big enough screen, the flicker can't tickle your peripheral vision.

Modern TVs are just in between. IMO projectors are better because it's much easier to focus on something far away.
 
Another way to look at it: if next-gen consoles are 18-20 TF monsters, then by all means proliferate as you wish, although imagine the possibility of an unrestrained 30 fps title with that much power! The truth is that a 9-10 TF weaksauce box is simply inadequate to provide next-gen graphics at a decent resolution and 60 fps. It would probably just look twice as good as Doom or CoD at 1800p? That's a pretty depressing prospect, personally.
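As a rough sanity check (my own ball-park, assuming GPU cost scales roughly linearly with shaded pixels per second, which real engines only approximate), here's the pixel throughput each target implies relative to 1080p/60:

```python
# Pixel-throughput ball-park for common resolution/framerate targets.
# Assumption (mine, for illustration): frame cost scales ~linearly with pixels shaded per second.

modes = {
    "1080p / 60 fps": (1920 * 1080, 60),
    "1800p / 30 fps": (3200 * 1800, 30),
    "4K    / 30 fps": (3840 * 2160, 30),
    "4K    / 60 fps": (3840 * 2160, 60),
}

base_pixels, base_fps = modes["1080p / 60 fps"]
base_rate = base_pixels * base_fps

for name, (pixels, fps) in modes.items():
    rate = pixels * fps
    print(f"{name}: {rate / 1e6:7.0f} Mpix/s ({rate / base_rate:.2f}x the 1080p/60 budget)")
```

That puts native 4K/60 at roughly 4x the pixel throughput of 1080p/60, which is the gap being worried about here.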

It depends. That 30 FPS game is going to LOOK significantly worse in motion than the 60 FPS game. However, in static shots (like screenshots) the 30 FPS game will look better than the 60 FPS game. In fast motion the 60 FPS game will look far higher resolution than the 30 FPS game if both are at the same resolution, and it will keep more detail. All this assumes you have a good display with fast pixel response that can show 60 FPS without significant ghosting.

That said, if the majority of games you play are 30 FPS then that is what you are used to and what you think looks good.

Similar to how some people prefer "filmic" 24 FPS movies versus a more realistic 60 FPS movie or video because that's what they grew up watching in theaters.

After not having gone to a theater in over five years, I recently went to one and couldn't believe how bad 24 FPS film actually looks. It's truly a horrifying experience, IMO: like watching a slide show, and motion resolution is the pits, making the whole thing look far lower resolution than it actually is.

So the roles for me have been reversed: 24/30 FPS video and movies look weird and off-putting, while 60 FPS video looks normal, which is contrary to how many people feel that 24/30 looks more normal while 60 FPS video looks weird. That's because I don't watch 24/30 FPS content for a large chunk of the day (I don't generally watch TV or movies), so I don't have something extremely far from reality warping my sense of how things should look. While 60 FPS still isn't as smooth as real vision (there's still noticeable judder and stutter, just not nearly as bad as at 30 FPS), it's certainly a much closer match.

Regards,
SB
 
Indeed, but 60 fps doesn't completely compensate for much lower graphical settings. This is why a game like FH3 looks better than FM7 to many players on console.

On PC, the choice between 1080p/60 fps and 4K/30 fps is obvious to me, assuming the graphical settings are identical.
 
Again, you're comparing an ideal situation where all else is equal, a PC environment I suppose? Then yes, of course 60 fps will look better. But taking into account the sacrifices needed for a console to reach 60 fps, the result will not be as clear cut. The whole point of this is the sacrifices in visual quality; otherwise we wouldn't be having this discussion, would we? Imagine running a game at low-to-medium settings with no AF at 1080p/60 fps versus high-to-max settings with 8-16x AF at 1800p-4K/30 fps: the motion clarity of the former has no chance in hell of compensating for the loss of everything else.
 
Tend to agree with the 4K camp.

I see no other shortcut to >60 fps computational fluid dynamics than a Diablo-esque top-down view with a slight perspective and pre-rendered lighting/animations. And while you're at it, why not pre-render it at 4K with 256x AA.
 
2256 x 1269 is roughly half the pixel count of 3200 x 1800. I'd take 2256 x 1269 at 60 Hz over 3200 x 1800 at 30 Hz any day. I know it doesn't work out to be quite that simple, but I'm just ball-parking a hypothetical. Dynamic resolution and temporal supersampling/upsampling will go a long way next gen. You'll get sharper-looking image quality in static situations, with good responsiveness and clarity in motion. Of course they need to do better on the CPU side so games like Assassin's Creed and The Witcher don't tank when they're CPU limited.
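Ball-parking those numbers (my own arithmetic): halving the pixel count while doubling the refresh rate keeps shaded pixels per second roughly constant, which is the trade being described.

```python
# Quick check of the hypothetical above: half the pixels at double the refresh
# rate gives roughly the same pixel throughput per second.

res_60 = 2256 * 1269   # ~2.86 Mpix
res_30 = 3200 * 1800   # ~5.76 Mpix

print(f"pixel ratio     : {res_60 / res_30:.2f}")        # ~0.50
print(f"Mpix/s at 60 Hz : {res_60 * 60 / 1e6:.1f}")      # ~171.8
print(f"Mpix/s at 30 Hz : {res_30 * 30 / 1e6:.1f}")      # ~172.8
```

About 172 Mpix/s either way, so the swap is throughput-neutral to a first approximation.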
 
It's 30 fps for the Pro resolution mode. I don't think there's a framerate mode; otherwise they would have announced it already.
 