A potential link between frame rate and resolution.

Just saw this at techreport and thought you guys might find it interesting to discuss.
http://techreport.com/news/27540/xbox-dev-explains-why-30-fps-isnt-enough

I'm trying to figure out if I have any games that have enough detail and run at a good enough frame rate at the same time, to see if there is something to this. Then again, I only have a 1600x900 monitor.

edit - and here is a link directly to the blog post in question.
http://accidentalscientist.com/2014...e-better-at-60fps-and-the-uncanny-valley.html
 
Hm... it has been a while since my graphics and visual information classes at Uni, but... iirc eyes don't sample the image. It's more like a saturation function.

I usually prefer 60Hz in games, but I can accept 30Hz just as well, as long as the games aren't too fast. But... is that because I lack visual information, or rather reaction time? Not sure though.
 
Higher fps is always better because almost everyone is using an LCD TV, so more samples = less blur on 'sample and hold' displays, and as a result more of the detail is perceived in motion.
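To put rough numbers on that: when your eye tracks a moving object on a full-persistence display, the smear is roughly the distance the object travels while one frame is held on screen. A quick Python sketch (the 1920 px/s pan speed is just an assumed example):

def hold_blur_px(speed_px_per_s, fps, persistence=1.0):
    # persistence = fraction of the frame period the image stays lit (1.0 = full sample-and-hold)
    return speed_px_per_s * (persistence / fps)

speed = 1920  # assumed example: a pan crossing a 1080p-wide screen in one second
for fps in (30, 60, 120):
    print(f"{fps:3d} fps -> ~{hold_blur_px(speed, fps):.0f} px of tracking blur")
# 30 fps -> ~64 px, 60 fps -> ~32 px, 120 fps -> ~16 px: halving the hold time halves the smear.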
 
On my old Trinitron CRT monitor I could see improvements up to at least 100 fps. 120Hz was great too, as 60Hz material fit perfectly but with almost no perceptible flicker.

The biggest gains, however, with either triple buffer or vsync off always seemed to be up to about 45~50 fps. Beyond that the extra smoothness is mostly just delicious gravy.

Unfortunately, with fixed refresh displays this leads to either tearing or j j j j judder.
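A rough Python sketch of where the judder comes from, assuming a perfectly constant ~45 fps render time on a fixed 60Hz display with vsync: each frame has to wait for the next refresh, so the on-screen cadence comes out uneven.

import math

refresh = 1 / 60      # fixed display refresh period, seconds
frame_time = 1 / 45   # assumed perfectly constant render time, seconds

display_starts = []
for i in range(1, 10):
    ready = i * frame_time
    # with vsync, the frame is scanned out at the first refresh after it's ready
    display_starts.append(math.ceil(ready / refresh - 1e-9) * refresh)

holds = [round((b - a) / refresh) for a, b in zip(display_starts, display_starts[1:])]
print("refreshes each frame stays on screen:", holds)  # [1, 1, 2, 1, 1, 2, 1, 1] -> judder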

FreeSync will be awesome. After more than a decade, it might make me jump back to the red team.

Properly optimised PAL 50 games had a perfectly acceptable level of smoothness, and higher resolution graphics. It was the CRT flicker rather than the frame rate that made 60Hz preferable.
 
It was the CRT flicker rather than the frame rate that made 60Hz preferable.

Funny thing is that we are going back to flicker, because that is the only way to display clear motion on sample-and-hold displays without motion interpolation artifacts.
Some TVs already have some form of scanning backlight, but it reintroduces 60Hz flicker and the user can't control the ratio of image to black-frame duration; maybe in next year's models.
 
I think there is vastly less testing done on driving TVs at 50Hz. I am not so sure that every display actually works, even though they should...
 
Not really; Japan and the US are 60Hz but Europe is 50Hz, so there is a good amount of content and testing. Modern 1080p displays accept both 50 and 60, with just a flag to define the "preferred" mode, changed when the same TV is sold in Europe or the US, along with the local broadcast tuner. As for flicker from your environment lights: if it were really an issue with modern displays, European gamers and development studios would already have noticed something strange.
Meaning that for many years the European 50Hz lamp flicker has not affected displays when gaming at either 60 or 30fps, in gamers' rooms or in studios!
 
Sorry, can't edit:
The issue of mains frequency is more complex; from Wikipedia: https://en.wikipedia.org/wiki/Mains_electricity_by_country

Most game markets would be 50Hz, so 20ms frame times (50fps) can be a good enough compromise in the better-quality vs. fast-response dilemma.

"East Japan 50 Hz (Tokyo, Kawasaki, Sapporo, Yokohama, and Sendai); West Japan 60 Hz (Okinawa, Osaka, Kyoto, Kobe, Nagoya, Hiroshima). 120 V in military facilities in Okinawa."
Australia, China, France, Germany, Netherlands, New Zealand, Sweden, UK, ... all 50Hz.
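To spell out the frame-time arithmetic behind the 20ms remark (plain Python, nothing clever):

# Frame budget at the common target rates; 50Hz sits between the usual 30 and 60 fps targets.
for fps in (24, 30, 50, 60):
    print(f"{fps:2d} fps -> {1000 / fps:.1f} ms per frame")
# 24 -> 41.7 ms, 30 -> 33.3 ms, 50 -> 20.0 ms, 60 -> 16.7 ms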
 
I think there is vastly less testing done on driving TVs at 50Hz. I am not so sure that every display actually works, even though they should...

If you backtrack to the reasons for 50Hz vs 60Hz, it was entirely due to the native frequency of the AC electrical system of the region in question. Subsequently, large parts of the world, including most of Europe, settled on the PAL standard, which was higher resolution (576 vertical lines) at 50Hz, compared to the NTSC standard, which was lower resolution (400 vertical lines) at 60Hz.

It was fun being a C64/Amiga coder in the 1980s, as your code was expected to support 320x200 @ 60Hz in NTSC territories and 320x256 @ 50Hz in PAL territories. Good times :yes: Since TVs stopped using the AC frequency as their source for picture timing decades ago, there's no reason a TV shouldn't support 50Hz, 60Hz and 24Hz (24p for movies).
 
...PAL standard, which was higher resolution (576 vertical lines) at 50Hz, compared to the NTSC standard, which was lower resolution (400 vertical lines) at 60Hz.
480 lines for NTSC. And the difference meant some games actually ran slower in PAL territories. Probably a lot (all of 'em!), because the game engines were typically tied to vertical refresh.
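A small sketch of that last point, with a made-up per-vblank step: leave the code untouched and the PAL port simply runs at 50/60 of the intended speed.

NTSC_HZ, PAL_HZ = 60, 50
step_per_vblank = 2.0   # assumed: pixels an object moves per vblank, tuned on NTSC hardware

ntsc_speed = step_per_vblank * NTSC_HZ   # 120 px/s, as designed
pal_speed = step_per_vblank * PAL_HZ     # 100 px/s if the code is left untouched

print(f"unadjusted PAL port runs at {pal_speed / ntsc_speed:.0%} of the intended speed")  # 83%
# a properly optimised PAL version rescales the per-frame step instead:
pal_step = step_per_vblank * NTSC_HZ / PAL_HZ
print(f"step needed per 50Hz vblank to match: {pal_step} px")  # 2.4 px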
 
But we got more resolution... or not (well, it depends; some games used it, others didn't). I am still amazed that we're actually still actively using overscan and interlaced modes. TV could easily use variable framerates (i.e. 50/60 for sports and 25/30 for movies, or even 24p) and drop interlacing (actually, German public TV did just that, but only at 720p/50). As for overscan... I don't even need to start, I think.

Either way, I've finally managed to convince my older GPU (AMD 6870, needs driver patching to allow 4K via HDMI) to output 4K at 30Hz to my new TV, and I am actually amazed at how good it looks, even in "old" games. Dirt 3, on high settings, even managed to run at a locked 30Hz. My amazement is mostly because, even with the longer viewing distance on my 55'' compared to my old 40'', it's still a massive improvement; I'm usually at about 2-3 meters for gaming, 3-4 for movies. Before I bought it, I was actually convinced that the difference was "just for show". Now... "needing it" is a different question altogether, which I'm not asking here. I sure as hell don't regret purchasing that set.
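Out of curiosity, a rough angular-resolution sketch for that kind of setup (assuming 16:9 panels and ~2.5 m viewing distance; just geometry, not a perception model):

import math

def pixels_per_degree(diag_inches, h_pixels, distance_m):
    width_m = diag_inches * 0.0254 * 16 / math.sqrt(16**2 + 9**2)  # 16:9 panel width in metres
    fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

print(f'40" 1080p at 2.5 m: ~{pixels_per_degree(40, 1920, 2.5):.0f} pixels per degree')
print(f'55" 4K    at 2.5 m: ~{pixels_per_degree(55, 3840, 2.5):.0f} pixels per degree')
# roughly 96 vs 140 ppd: the bigger 4K panel still packs noticeably more pixels
# into each degree of the image, even at couch distance.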

So the TV supports BFI as well... in some test footage (moving lines etc.) it really improves clarity, but in movies or games it doesn't help, imho. Games with variable framerates actually suffer a whole lot (I played God of War Ascension for the fun of it, and although the game's performance is, in general, good enough, BFI makes the inconsistent framerate MUCH more apparent).

I have yet to use a modern 120Hz+ display. I used to play at 85Hz on my old CRT, at least if the games ran that fast at the time. But that's been a WHILE. Not sure if I need/want more than 60Hz. The image quality of a "good" 4K TV might offset the benefit of more than 60Hz (i.e. 1080p at 120Hz+ vs. 4K at 60Hz).
 
Black frame insertion. Stick black frames between "real" frames to reduce sample-and-hold blur.

You need a decent refresh rate to do it at all, due to the flicker fusion threshold; there would be annoying flicker in a 60Hz output made of 30 real frames and 30 black frames each second, so TVs that support BFI tend to be 120Hz or higher. It also only makes sense for high source framerates, since a pattern like...

A (BLACK) A (BLACK) B (BLACK) B (BLACK)

...would produce stroboscopic ghosting in lieu of sample-and-hold blur, and thus wouldn't really fix the issue. You want:

A (BLACK) B (BLACK)
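A tiny sketch of that, for a hypothetical 120Hz panel that blanks every other refresh (letters stand for source frames):

def bfi_sequence(panel_hz, source_fps, n_refreshes=8):
    slots = []
    for r in range(n_refreshes):
        if r % 2:  # every other refresh is blanked (50% black duty)
            slots.append("(BLACK)")
        else:
            src_index = r * source_fps // panel_hz  # which source frame is current at this refresh
            slots.append(chr(ord("A") + src_index))
    return slots

print("60 fps source:", " ".join(bfi_sequence(120, 60)))  # A (BLACK) B (BLACK) C (BLACK) D (BLACK)
print("30 fps source:", " ".join(bfi_sequence(120, 30)))  # A (BLACK) A (BLACK) B (BLACK) B (BLACK)

With a 60fps source every frame is flashed once; with a 30fps source each frame is flashed twice, which is exactly the double-strobe pattern above.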
 
EBU tests show that at large viewing distances at most a 0.5 quality-grade improvement is achieved by going to 2160p.
Every framerate doubling brings a full grade. The most efficient route to better quality is HDR, where the least amount of added resources brings the most benefit. But the way forward shines with Oculus' new prototype (2560x1440@90Hz with only 1ms persistence).
 
My TV also differs in how it does BFI depending on the input source, i.e. it produces a visibly less strobing image with 24Hz material than with 60Hz. But since I don't have a high-speed camera handy, I can't tell whether it's showing each image multiple times or not (I would imagine so, since otherwise it would mean a black image for more than 20ms per frame).
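A back-of-envelope check of that guess, assuming a hypothetical 120Hz panel: if each 24fps frame were flashed only once, the screen would stay dark for well over 20ms at a stretch, so repeating the image with short black gaps in between seems the likelier explanation.

panel_hz = 120
refresh_ms = 1000 / panel_hz        # ~8.3 ms per refresh
frame_ms_24 = 1000 / 24             # a 24fps source frame lasts ~41.7 ms

single_flash_black = frame_ms_24 - refresh_ms  # image flashed once, dark for the rest of the frame
repeated_black_gap = refresh_ms                # image repeated, black only every other refresh

print(f"single flash per 24p frame: ~{single_flash_black:.1f} ms of continuous black")  # ~33.3 ms
print(f"image repeated with 50% BFI: black gaps of only ~{repeated_black_gap:.1f} ms")
# long continuous dark gaps are what read as strobing, so repeating each film
# frame with short black gaps in between would explain the calmer 24Hz picture.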
 