Digital Foundry Article Technical Discussion Archive [2015]

That actually makes me wonder, what would game devs think about 50fps instead of 60? It's probably still reasonably fluid and much better than 30fps, but they could use up to 20% more computing power per frame...
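For anyone checking the arithmetic, here's a minimal sketch of where the "up to 20%" comes from: the per-frame time budget grows from ~16.7 ms to 20 ms.

```python
# Per-frame time budget at each target rate (pure arithmetic, no assumptions).
budget_60 = 1000.0 / 60  # ~16.67 ms per frame at 60 fps
budget_50 = 1000.0 / 50  # 20.00 ms per frame at 50 fps

extra = (budget_50 - budget_60) / budget_60
print(f"60 fps budget: {budget_60:.2f} ms")
print(f"50 fps budget: {budget_50:.2f} ms")
print(f"Extra compute time per frame at 50 fps: {extra:.0%}")  # 20%
```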

That would be a very interesting looking game, that's for certain.
 
But you couldn't, though. As I say, dropping quality (of which there were limited options back in the day) led to racing framerates, which could still crash in busy scenes.
Davros said PC wouldn't lock the framerate. That's because they couldn't. They put the control in the PC owner's hands. Consoles gave the option to tailor the game. I think basically you're agreeing with me. ;) I was just explaining to Davros the difference and why forced vSync on PC wasn't a thing where it was on consoles. There were even monitors faster than 60 Hz at the time, so capping a game at 60 Hz on PC made zero sense.

Mostly agreeing with you. But there were some titles on PC that forced V-sync (uncommon) or had an option to either force V-sync or not (more common). PC gamers generally hated the titles that didn't give an option to disable V-sync, however, and they continue to hate them even today. :) But yes, forced V-sync is far less common on PC. Optional in-game V-sync, however, is more common and more accepted. But that has more to do with PC gamers' abhorrence of it than developers being unable to implement it on PC (even back in the late 90's [1996 onwards]), I believe.

That is why, I'd imagine, many Xbox (Xbox, X360, and XBO to a lesser extent) games didn't have V-sync while PlayStation games did, and that went for multiplatform games as well, as Xbox was more closely associated with its PC gaming roots.

Regards,
SB
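To make the V-sync discussion concrete, here's a toy model (not any real swap-chain API, just an illustration of double-buffered V-sync): a finished frame is held until the next refresh tick, which is why a 60 Hz game that misses its 16.7 ms budget even slightly collapses to 30 fps rather than, say, 55.

```python
import math

def vsynced_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Toy model of double-buffered V-sync: the effective frame time is
    the render time rounded up to a whole number of refresh intervals."""
    interval = 1000.0 / refresh_hz
    ticks = math.ceil(render_ms / interval)  # refresh intervals consumed
    return 1000.0 / (ticks * interval)

for ms in (15.0, 17.0, 25.0, 34.0):
    print(f"{ms:4.1f} ms render -> {vsynced_fps(ms):.1f} fps on a 60 Hz display")
# 15.0 ms -> 60.0 fps, 17.0 ms -> 30.0 fps, 25.0 ms -> 30.0 fps, 34.0 ms -> 20.0 fps
```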
 
All TVs support 60 Hz (it seems). It's uncertain whether all TVs support 50 Hz. Some quick Googling suggests US TVs could, but the firmware doesn't allow it.

Not just the US market. TVs for the Japanese and South Korean (oddly, while S. Korea was NTSC, N. Korea is/was PAL) markets mostly only support 60 Hz as well, as there is no need for 50 Hz in those countries. And virtually all European countries appear to have dropped support for PAL or are in the process of converting to a system that doesn't require 50 Hz.

But yes, considering that >= 50% of the world market for console games doesn't support 50 Hz, it wouldn't make sense for a developer to target 50 Hz.

And dangit, I've spent too much time on the forum again today. :D

Regards,
SB
 
And here I was hoping FreeSync would make it into TVs, yet we have TVs locked out of frequencies by firmware. It makes no sense to lock them out; PAL TVs aren't locked out of 60 Hz.

(Still would like to see FreeSync in gamer-oriented TVs, and support in the PS5.)
 
FreeSync is the better solution; then framerates wouldn't matter. That'd be too forward-thinking for the TV industry, though. At least everything moving over to 60 Hz is better than maintaining the artificial PAL divide (cameras that film 1080p50 and are blocked from 1080p60 in firmware).
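Extending the V-sync toy model from earlier: with a FreeSync-style variable-refresh display, the panel scans out when the frame is ready (within its supported window), so the delivered rate tracks render time instead of snapping to integer divisors of 60. The 40-75 Hz window below is a made-up example range, not any particular panel's spec.

```python
def vrr_hz(render_ms: float, min_hz: float = 40.0, max_hz: float = 75.0) -> float:
    """Toy model of variable refresh: scanout happens as soon as a frame
    completes, clamped to the panel's supported refresh window."""
    fps = 1000.0 / render_ms
    return max(min_hz, min(fps, max_hz))

for ms in (15.0, 17.0, 25.0):
    print(f"{ms:4.1f} ms render -> ~{vrr_hz(ms):.1f} Hz scanout")
# A 17 ms frame displays at ~58.8 Hz instead of dropping to 30 fps under V-sync.
```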
 
Could be wrong, but I was under the impression that HD standards such as 720p and 1080p, and now 4K, were all based on 60Hz, and that PAL TVs still support 50Hz for standard-definition material, which is still at 50Hz. I don't think 1080p/50 as such has ever actually existed or been used. Has it?
 
I don't think it has either, but isn't The Hobbit shot at 48fps? I guess that'd be a better fit for PAL?
 
You'd wind up with the strange overspeed effect, just like the old days when 24fps films ran slightly too quick on 25fps PAL. No idea why PJ decided to push 48Hz just when we finally got 120Hz TVs for smooth 24fps playback.
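The arithmetic behind the overspeed, for reference: 24fps film played at PAL's 25fps runs 25/24 faster, about 4%, shortening runtime and raising audio pitch unless corrected. And 120 Hz divides evenly by 24 but not by 48, which is part of the oddity:

```python
speedup = 25 / 24
print(f"PAL overspeed factor: {speedup:.4f} (~{speedup - 1:.1%} too fast)")
print(f"A 120-minute film runs {120 / speedup:.1f} min on 25 fps PAL")

for fps in (24, 48):
    print(f"120 Hz / {fps} fps = {120 / fps:g} refreshes per frame")
# 24 fps maps cleanly (5 refreshes); 48 fps would need 2.5, hence judder or pulldown.
```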
 
Could be wrong, but I was under the impression that HD standards such as 720p and 1080p, and now 4K, were all based on 60Hz, and that PAL TVs still support 50Hz for standard-definition material, which is still at 50Hz. I don't think 1080p/50 as such has ever actually existed or been used. Has it?
I bought a Sony NEX 5N camera and its recording modes were only 50 Hz. A Sony HD camcorder before that was also only 50 Hz. I'm pretty sure all UK HD broadcasts (and streams) are 50 fps to boot.
 
FreeSync is the better solution; then framerates wouldn't matter. That'd be too forward-thinking for the TV industry, though.

Alas, the HDMI technical committee works slowly on debating and ratifying proposed technical additions to the standard. I know AMD did a proof of concept using HDMI 1.4a, but unless it's seamless with older equipment (and "novel programming" sounds like it may not be), we could be waiting a while.

Sony should really be pushing this for gamers; HDMI is too focussed on the wants of TV manufacturers.
 
Could be wrong, but I was under the impression that HD standards such as 720p and 1080p, and now 4K, were all based on 60Hz, and that PAL TVs still support 50Hz for standard-definition material, which is still at 50Hz. I don't think 1080p/50 as such has ever actually existed or been used. Has it?

50Hz is still pretty standard for HD material in Europe. And since there's no resolution difference between the two standards any more, you get a nice decrease in transmission bitrate, meaning more channels per multiplex.
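A crude back-of-envelope on the multiplex point, assuming bitrate scales roughly with frame rate at matched quality (interframe codecs don't really scale linearly, and the capacity and channel figures below are made up purely for illustration):

```python
mux_mbps = 40.0  # hypothetical multiplex capacity
ch60_mbps = 9.0  # hypothetical 1080p60 channel at some fixed quality

ch50_mbps = ch60_mbps * 50 / 60  # rough: fewer frames -> proportionally fewer bits
print(f"1080p50 channel: ~{ch50_mbps:.1f} Mbps vs {ch60_mbps:.1f} Mbps at 60 Hz")
print(f"Channels per mux: {int(mux_mbps // ch60_mbps)} at 60 Hz, "
      f"{int(mux_mbps // ch50_mbps)} at 50 Hz")
```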
 
The DVB standard for PAL territories mandates BT.709, which requires support of MPEG-2 ATSC, DVB 25Hz HDTV and DVB 30Hz HDTV. BT.709 more widely mandates support for 60Hz, 50Hz, 30Hz, 25Hz and 24Hz frame rates, and this is why most PAL TVs support the full monty: 24, 25, 30, 50 and 60Hz.
 
A lot of media production (like sports) in Europe is done at 50 fps, which creates problems when you stream it, since streaming output is usually 60 fps.
 
Even at 48 frames per second, the effects work in the Hobbit movies often looked a bit wonky and unfinished. Imagine the rendering capabilities needed to produce comparable results at three times the framerate.
 
There were no big issues with HDMI formats even in stereoscopic games for consoles, where there are a ton more formats than just fps.

HD ready 1080p certified displays are required to support 1080p at 50Hz and 60Hz: https://en.wikipedia.org/wiki/HD_ready
ATSC also supports PAL frame rates and resolutions, which are defined in ATSC A/63:1997.
A later update to the spec to include H.264, A/72 Part 1:2008, added 1080p at 50, 59.94 and 60; previously only 30 progressive was supported at 1080.

So the US has actually been able to broadcast 50Hz since 2008. I don't think TVs manufactured after this date ignore it, but does anyone here know of a recent 60Hz-only display?
 
The cameras they used could only do 48 fps then. Also, it is very easy to do the 24 fps version.
True, but each frame in the 24fps version has been 'exposed' for half the time, meaning that the smearing and blurring that give 24fps its 'look' are eliminated. They could presumably use s/w to simulate the extra exposure, but the BD looked odd to me when I saw it a few months back.

Interesting that manufacturers would restrict 50/60 by firmware; it just seems so unnecessary. The only reasoning that comes to mind is a concern about 'grey market' leakage?
 
True, but each frame in the 24fps version has been 'exposed' for half the time, meaning that the smearing and blurring that give 24fps its 'look' are eliminated. They could presumably use s/w to simulate the extra exposure, but the BD looked odd to me when I saw it a few months back.
They shot with a 270-degree shutter at 48fps, so a shutter speed of 1/64s even in the 24fps version (as you said, slightly less smear).

I'm quite certain that if they had shot with a fully open 1/48s shutter, the HFR would have felt better for many viewers, especially for the 24fps DVD/Blu-ray (it would have matched a 24fps 180-degree shutter perfectly).
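The shutter-angle arithmetic behind these posts, for anyone following along: per-frame exposure time is (shutter angle / 360) / frame rate.

```python
def exposure_s(shutter_deg: float, fps: float) -> float:
    """Per-frame exposure time for a rotary-shutter model."""
    return (shutter_deg / 360.0) / fps

for label, deg, fps in [
    ("The Hobbit: 270 deg at 48 fps", 270, 48),
    ("Classic film: 180 deg at 24 fps", 180, 24),
    ("Fully open: 360 deg at 48 fps", 360, 48),
]:
    t = exposure_s(deg, fps)
    print(f"{label} -> 1/{round(1 / t)} s")
# 270 deg at 48 fps gives 1/64 s; a fully open shutter at 48 fps would give
# 1/48 s, matching the classic 180-degree/24 fps exposure exactly.
```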
 
True, but each frame in the 24fps version has been 'exposed' for half the time, meaning that the smearing and blurring that give 24fps its 'look' are eliminated. They could presumably use s/w to simulate the extra exposure, but the BD looked odd to me when I saw it a few months back.
Actually, The Hobbit used a 1/64th second exposure, splitting the difference. I was quite surprised when people praised how clean the 48fps version remained in motion, because I thought it looked very overblurred for its framerate. Whereas the 24fps version looked basically normal to my eyes; underblurring means motion vectors are less obvious, but it doesn't feel like something is "actively wrong" with the image. Maybe that's just the gamer in me being acclimated to unblurred 30fps video, though.
 