Why have High Resolution at only 30fps?

Jupiter

The Order: 1886 has too many filters; it won't look much sharper at a higher resolution. In motion it's even worse because of its extremely heavy motion blur implementation.

In general I do not understand what high resolutions at 30fps are supposed to achieve, as 30fps blurs the whole image on 60Hz displays. The user does not benefit much from a high resolution. I find 1080p with good TAA at 60fps sharper than UHD at 30fps, where the picture is blurred every time the camera moves.

On an LCD/OLED with BFI or a plasma you get extreme sharpness at 60fps and a fuzzy image at 30fps: https://www.testufo.com/

On those TVs the moving UFO is as sharp as a screenshot at 60fps; at 30fps, not so much. But the subject of motion resolution is far too complicated for the general public, which still prefers still images (no movement) at 2160p that are then reduced to a typical 300 lines in motion on 60Hz sample-and-hold screens (LCD, or OLED without BFI or frame interpolation: about 13.9% of the 2160 lines). Too bad developers waste computing power without end for such a minimal benefit. But I think that topic (Xbox One X games, The Order: 1886 Pro patch, etc.) belongs in the PlayStation 4 Pro or Xbox thread.
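The percentage quoted above is just the motion-resolution baseline divided by the native line count. A minimal back-of-envelope sketch, assuming the ~300-line sample-and-hold figure from the FPD benchmark cited later in the thread:

```python
# Back-of-envelope: fraction of vertical detail a 60Hz sample-and-hold
# display retains in motion, using the ~300-line FPD-benchmark baseline
# against UHD's 2160 native lines. The constants are the thread's figures,
# not measurements of any specific panel.
NATIVE_LINES = 2160
MOTION_LINES = 300

retained = MOTION_LINES / NATIVE_LINES
print(f"{retained:.1%}")  # -> 13.9%
```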


More important than the resolution itself is the quality of the TAA. I have already seen games that looked blurrier in UHD with TAA than Full HD games with TAA. UHD consumes a lot of computing power and can end up with a worse picture.
 
In general I do not understand what high resolutions at 30fps are supposed to achieve, as 30fps blurs the whole image on 60Hz displays. The user does not benefit much from a high resolution. I find 1080p with good TAA at 60fps sharper than UHD at 30fps, where the picture is blurred every time the camera moves.

I guess everyone has their own sensitivity. I can easily spot a higher resolution in any situation.
 
Yes, but as soon as the camera moves it will look like 300p. 30fps is only a target because of the slow CPU. High-resolution 60fps games look many times sharper.
 
Then you are going beyond the physical capabilities of the eye. Two successive images without a dark phase between them at 30fps look like 300p to humans. That's a fact, and experts like hdtvtest.co.uk will confirm it (FPD Benchmark Software Test Disc etc.).

People enjoy these UHD screenshots, but none of those details are visible in motion. That's why the UHD Blu-ray is disappointing in my opinion. When the picture moves, nothing is left of the UHD sharpness.
 
Then you are going beyond the physical capabilities of the eye. Two successive images without a dark phase between them at 30fps look like 300p to humans. That's a fact, and experts like hdtvtest.co.uk will confirm it (FPD Benchmark Software Test Disc etc.).

RAnc.jpg


Does this look like 300p to you?

Since when is motion blur not supposed to blur the image at 60fps?:

rZ3gu5.png
 
That's an in-game screenshot. The television does not output 300p. A sequence of images creates a video for the human brain; the more images are played per second, the sharper the video looks. It also matters whether it's a sample-and-hold or an impulse-driven television. Sample-and-hold TVs display two consecutive frames with no dark phase between them. This blends the two images in the human brain, which makes movement look like 300p at 30fps even at UHD resolution. The more pictures played per second on this type of display, the sharper motion looks; that is why 144fps looks much sharper than 60fps.
If there is a dark phase between those two images (impulse-driven televisions: OLED or LCD with BFI, or simply a plasma), the images are perceived as sharper, and motion is already screenshot-like clear at 60fps.

In order not to destroy the sharpness, I would generally disable full-screen motion blur at 60fps or set it very low.
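The sample-and-hold argument above can be expressed with the common eye-tracking persistence model: for an object the eye tracks, perceived smear is roughly object speed times the time each frame stays lit. This is a sketch under that assumption; `smear_px` and its parameters are illustrative, not taken from any source cited in the thread:

```python
# Eye-tracking motion blur sketch for sample-and-hold vs impulse displays.
# Assumption: a tracked object smears across the retina by
# (speed in px/s) * (seconds each frame stays visible).
def smear_px(speed_pps: float, fps: float, duty_cycle: float = 1.0) -> float:
    """Perceived smear in pixels for an eye-tracked object.

    speed_pps  : object speed in pixels per second
    fps        : frames per second
    duty_cycle : fraction of the frame time the pixel is lit
                 (1.0 = sample-and-hold, ~0.5 = BFI, lower = strobing)
    """
    # persistence in seconds = duty_cycle / fps
    return speed_pps * duty_cycle / fps

speed = 960  # px/s, similar to the default TestUFO scroll speed
for fps, duty in [(30, 1.0), (60, 1.0), (60, 0.5), (144, 1.0)]:
    print(f"{fps}fps, duty {duty}: {smear_px(speed, fps, duty):.1f} px")
# 30fps sample-and-hold smears 32 px; 60fps halves that to 16 px;
# BFI at 60fps halves it again to 8 px; 144fps gives ~6.7 px.
```

This is why doubling the framerate and inserting dark phases both sharpen motion: each shortens how long a frame persists on the retina.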
 
Then you are going beyond the physical capabilities of the eye. Two successive images without a dark phase between them at 30fps look like 300p to humans. That's a fact, and experts like hdtvtest.co.uk will confirm it (FPD Benchmark Software Test Disc etc.).
That's absolute crock. You're saying no one can tell the difference between 300p30 and 1080p30? That's so clearly untrue based on personal experience that it should never even enter consideration.
 
It has fewer aliasing artefacts than a game at a lower resolution, but sharpness in motion is more comparable to a downsampled 300p video.

Example:
Motion
Engaging [Motionflow] allowed us to more than double the Sony KD65A1BU’s motion resolution from the sample-and-hold baseline of 300 lines to 650 lines. There are two methods to achieve this. First is by motion-compensated frame interpolation (MCFI), [...]

More exciting is black frame insertion (BFI) [...] By inserting black frames between the original video frames, BFI refreshes our retinal persistence and reduces motion blur without introducing interpolation artefacts and soap opera effect, but there are at least two well-documented downsides due to how BFI works [...]
http://www.hdtvtest.co.uk/news/kd65a1-201706044471.htm

Motion resolution: 300 lines
motionflow-off.jpg


Motion resolution: 650 lines
clearness-high.jpg



There are many tests available: https://www.testufo.com/eyetracking#count=2&background=stars&pps=960&pattern=stars

I bet the two UFOs (the standing one and the moving one) will not look the same, even at 60fps.
 
Keep in mind that the FPD motion resolution benchmark does not have any 2160p test patterns, only 1080p. This confusion is understandable: back in the day there was only HD, and 1080 lines were considered the reference. So with UHD displays, that 300-line number should double to 600 lines.

Oh, and one more thing people get confused about: the 300-line figure for LCDs and OLEDs back in the day was actually derived from a 60fps test pattern. So 30fps would have meant a further reduction to 150 lines (120 lines for 24Hz movies).
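The scaling described above is simply linear in framerate. A sketch of that hypothetical model, anchored to the 300-line/60fps figure quoted in this post (the function name is illustrative, not from any benchmark):

```python
# Hypothetical linear model: on a sample-and-hold display, measured motion
# resolution scales proportionally with framerate, anchored to the
# 300-line figure derived from a 60fps test pattern.
BASELINE_LINES = 300
BASELINE_FPS = 60

def motion_lines(fps: float) -> float:
    """Estimated lines of motion resolution at a given framerate."""
    return BASELINE_LINES * fps / BASELINE_FPS

print(motion_lines(30))  # -> 150.0 (30fps games)
print(motion_lines(24))  # -> 120.0 (24Hz movies)
```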

Whether that 300-600-line reduction from 2160p is accurate is up for debate, as lines of resolution are not scientifically derived, but I can at least say with confidence that I see more detail in motion at 30fps on my Panasonic plasma TV (700-900 lines at 1080p) than at 60fps on my Samsung LCD TV (300 lines at 1080p).
 
There are several reasons:
- They do it to enhance the perception of fluidity, since the base console hardware cannot push 1080p 60fps without making visual and game-logic compromises (CPU side).
- It's easy to offer higher resolution options in game engines. And PC gaming enthusiasts want SOMETHING, as they are sitting around with hardware that outperforms the base consoles the games are designed around.
- Some games have physics and other game logic tied to framerate, and increasing the fps breaks the game (e.g. Need for Speed).
- Display manufacturers are releasing and marketing 4K TVs that the general public is increasingly eager to buy, and 1440p, 1600p, and 2160p monitors that the PC enthusiast community is eager to buy.
- Base consoles and their mid-gen upgrades still sport Jaguar CPUs, which have difficulty maintaining a solid 60fps in a lot of games unless the developers are willing to make compromises in some aspects of their games and visuals.
 
Yes, but as soon as the camera moves it will look like 300p.

No it will not. Speed of motion is a big factor in motion resolution; it's not just still vs. motion. When the camera moves slowly you can see more resolution. Even in real life your eyes lose focus during fast motion, so I personally don't consider the motion-resolution downsides of modern displays very serious at all. The games I play often have slow-moving objects, or I stop to look around, so the games look great when it really matters. When I move, I focus on the movement and action, not on the small details. The stuff about motion looking like 300p is silly hyperbole. In practice a 300p game doesn't look anything like a 1080p game, let alone a 4K one, outside of some really rare border cases.

Black frame insertion is pretty much dead now with it being largely incompatible with proper HDR performance.
 
I also have the feeling that the higher the resolution, the worse it looks in motion (at the same framerate).
 
There are several reasons:
- They do it to enhance the perception of fluidity, since the base console hardware cannot push 1080p 60fps without making visual and game-logic compromises (CPU side).
- It's easy to offer higher resolution options in game engines. And PC gaming enthusiasts want SOMETHING, as they are sitting around with hardware that outperforms the base consoles the games are designed around.
- Some games have physics and other game logic tied to framerate, and increasing the fps breaks the game (e.g. Need for Speed).
- Display manufacturers are releasing and marketing 4K TVs that the general public is increasingly eager to buy, and 1440p, 1600p, and 2160p monitors that the PC enthusiast community is eager to buy.
- Base consoles and their mid-gen upgrades still sport Jaguar CPUs, which have difficulty maintaining a solid 60fps in a lot of games unless the developers are willing to make compromises in some aspects of their games and visuals.

It's true that it's not possible at the moment to get 60fps in a lot of games; just look at the CPU benchmarks of AC: Origins etc. However, on next-gen consoles with Ryzen-like performance, developers should target 60fps more often.

UHD at 30fps is far from optimal in terms of sharpness, and if games aim for high resolutions at 60fps they will soon be much sharper than 24p movies. Video games overtaking movies; that's funny. Some UHD/60fps games like Wipeout HD already look stunning in terms of sharpness on a suitable screen.

No it will not. Speed of motion is a big factor in motion resolution; it's not just still vs. motion. When the camera moves slowly you can see more resolution. Even in real life your eyes lose focus during fast motion, so I personally don't consider the motion-resolution downsides of modern displays very serious at all. The games I play often have slow-moving objects, or I stop to look around, so the games look great when it really matters. When I move, I focus on the movement and action, not on the small details. The stuff about motion looking like 300p is silly hyperbole. In practice a 300p game doesn't look anything like a 1080p game, let alone a 4K one, outside of some really rare border cases.

Black frame insertion is pretty much dead now with it being largely incompatible with proper HDR performance.

I tested that earlier by turning BFI on and off in MGS5, for example. The sharpness difference was huge. In motion it looked very blurry without BFI; with BFI it was clear.

There is a new technology from Philips that doubles the motion resolution without any disadvantages on their OLED TV. I have never seen anything like it before.
Engaging [Perfect Clear Motion] was far more useful, boosting motion resolution (as determined via Chapter 31 of the FPD Benchmark Software test disc) from the sample-and-hold baseline of 300 lines to 650 lines. Running our torture tests, we couldn’t detect any interpolation artefacts or SOE, and there’s no visible flicker and drop in light output that are normally associated with black frame insertion (BFI) either.

We’re still investigating how TP Vision/ Philips has achieved this, but from watching many football matches on the 55POS901, it delivered the sharpest, artefact-free motion we’ve witnessed from a 4K OLED television to date. Some LCD-based displays from Samsung and Sony could reach a motion resolution of 1080 lines (or higher) with the help of BFI +/- interpolation, but OLED’s near-instantaneous pixel response time meant that there’s no dark-coloured smearing (which can affect VA-type LCD panels) at all.
http://www.hdtvtest.co.uk/news/55pos901f-201702274433.htm

[Perfect Clear Motion] “Off”
pcm-off.jpg


[Perfect Clear Motion] “On”
pcm-large.jpg
 
It has fewer aliasing artefacts than a game at a lower resolution, but sharpness in motion is more comparable to a downsampled 300p video.

Example:
That's fast side-scrolling motion. Yes, that's low quality. But what about an FPS moving into the screen, where the pixel variation is far smaller? Or a top-down hack-and-slash? You've taken a worst-case scenario and applied it as a blanket truth while ignoring your own senses!

Take any PC FPS on any display. Play it at 1080p30 and then at 300p60, and see which looks sharper... Heck, even a side-scroller, the worst case, doesn't move fast enough to cause the same concerns. Play Trine 2 at 300p and tell me 1080p30 looks just as blurry! Furthermore, take a worst case that actually matches your scenario, like Sonic the Hedgehog. The whole screen is moving a bazillion pixels a second and the quality is shot, no better clarity than 300p. Except when you look at Sonic himself: he's crystal sharp at 1080p and fuzzy at 300p, because he's not smearing across the screen.

The blanket statement that games look like 300p as soon as the camera moves is patently untrue. And the idea that high resolution at 30fps brings nothing is clearly nonsense. With increasing movement deltas, lower framerates degrade in visual fidelity faster than higher ones, but 1080p30 does not look like 300p.
 
"Lines of motion resolution" is an outdated metric, but it's a fact that, for various reasons, LCDs and other sample-and-hold displays can resolve far less detail while the picture is moving, depending on the quality of the display. I just switched to gaming with my console hooked up to my old monitor, because my TV had too much blur even in game mode, which made playing fast shooters like Wolfenstein difficult.

As a fairer comparison, I would say 900p60 > 1080p30 in terms of image clarity while the picture is moving.
 