Digital Foundry Article Technical Discussion Archive [2015]

Yeah, most probably; they specifically said they either want a locked 30 or a locked 60, and consistency in frame delivery is very important for ND according to the numerous interviews on the matter. Having a consistent 60 is not an easy task; I am sure it would be easier for them to just have it variable, but that would hurt the overall experience. This whole 30/60 debacle is why I kinda hope we get variable refresh rate in the next gen of consoles (à la G-Sync/FreeSync).
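To make the "consistent frame delivery" idea concrete, here is a minimal, hypothetical sketch (not ND's actual code) of a frame loop locked to a 30fps cadence: however long the frame actually takes to render, presentation always lands on the same 33.3 ms boundary, and the leftover time is simply slept away.

```cpp
// Hypothetical sketch of a fixed-cadence frame loop. The per-frame "work"
// is a stand-in sleep; a real game would simulate and render here.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(33333); // 1/30 s frame budget

    auto next_deadline = clock::now() + budget;
    for (int frame = 0; frame < 5; ++frame) {
        // Stand-in for simulation + rendering work (always under budget here).
        std::this_thread::sleep_for(std::chrono::milliseconds(10 + 3 * frame));

        std::this_thread::sleep_until(next_deadline); // burn the leftover headroom
        std::printf("frame %d presented on a 33.3 ms boundary\n", frame);
        next_deadline += budget;                      // keep the cadence fixed
    }
}
```

The same structure with a 16.7 ms budget gives a locked 60; the hard part in practice is guaranteeing the work always fits inside that budget.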
 
Yeah, most probably; they specifically said they either want a locked 30 or a locked 60, and consistency in frame delivery is very important for ND according to the numerous interviews on the matter. Having a consistent 60 is not an easy task; I am sure it would be easier for them to just have it variable, but that would hurt the overall experience. This whole 30/60 debacle is why I kinda hope we get variable refresh rate in the next gen of consoles (à la G-Sync/FreeSync).
We could get the tech in the consoles, but then TVs wouldn't be able to do anything with it. I still haven't heard anyone even remotely mentioning such tech in the TV world. Everyone is obsessing over OLED, 4K and HDR TVs, but no one has anything to say about G-Sync/FreeSync.
 
If I understand that correctly, they've got the FreeSync protocol included in an optional DisplayPort standard, but it will still require TV manufacturers to include the actual hardware for it to work.

I can't see this hugely taking off given the niche use.
 
If I understand that correctly, they've got the FreeSync protocol included in an optional DisplayPort standard, but it will still require TV manufacturers to include the actual hardware for it to work.

I can't see this hugely taking off given the niche use.
The link I gave mentions FreeSync over HDMI, which is different from the original DisplayPort outing of the FreeSync tech. If they can get console manufacturers on board (especially Sony, which also produces TVs), we could get this in the following generation of consoles, with TVs supporting it.
 
DisplayPort is part of the HDMI standard, so including it in there means it may get Trojan-horsed into HDMI - although not certainly, given that that part of the standard is optional and many parts of DisplayPort are not included in many TVs' HDMI implementations.

But having support for dynamic refresh rates as part of the standard and manufacturers actually including provision for this in hardware are very different things.
 
Is there any reason to stick with fixed refresh rates as standard rather than move to all content being dynamically refreshed? On constant content like movies, you just provide a constant timing, but it'd also mean you could adapt to different rates on the fly. Support for PAL and NTSC content, and then 24 Hz, would then be seamless, plus whatever content you get from devices in the future.
 
DisplayPort is part of the HDMI standard, so including it in there means it may get Trojan-horsed into HDMI - although not certainly, given that that part of the standard is optional and many parts of DisplayPort are not included in many TVs' HDMI implementations.

But having support for dynamic refresh rates as part of the standard and manufacturers actually including provision for this in hardware are very different things.
As far as I know, the two standards are very much separate. HDMI and DisplayPort are handled by completely different parties.
 
As far as I know, the two standards are very much separate. HDMI and DisplayPort are handled by completely different parties.
Apologies, you're absolutely correct - I am thinking of Thunderbolt!
 
Is there any reason to stick with fixed refresh rates as standard rather than move to all content being dynamically refreshed? On constant content like movies, you just provide a constant timing, but it'd also mean you could adapt to different rates on the fly. Support for PAL and NTSC content, and then 24 Hz, would then be seamless, plus whatever content you get from devices in the future.

Imo, a dynamic framerate coupled with a display with dynamic refresh would be much better and much more natural. Every time you lock framerate, you are basically wasting resources, as you need the extra headroom for the worst case, with all other cases having an easy time.
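A rough illustration of the headroom point: with a hard 30fps lock, a frame that renders in 20 ms still occupies the full 33.3 ms slot, and the difference is idle time that a variable-refresh display could have turned into an earlier present. The frame times below are invented purely for the example.

```cpp
// Made-up frame times showing how much of a fixed 33.3 ms slot goes unused
// when the lock is set for the worst-case frame rather than the typical one.
#include <cstdio>

int main() {
    const double frame_times_ms[] = {18.0, 21.0, 19.5, 31.0, 22.0}; // render cost
    const double locked_slot_ms   = 1000.0 / 30.0;                  // 33.3 ms budget

    double idle_total = 0.0;
    for (double t : frame_times_ms) {
        double idle = locked_slot_ms - t;   // time spent waiting for the slot
        idle_total += idle;
        std::printf("rendered in %5.1f ms, presented after %5.1f ms, %4.1f ms idle\n",
                    t, locked_slot_ms, idle);
    }
    std::printf("total headroom left on the table: %.1f ms over %zu frames\n",
                idle_total, sizeof(frame_times_ms) / sizeof(frame_times_ms[0]));
}
```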
 
Is there any reason to stick with fixed refresh rates as standard rather than move to all content being dynamically refreshed? On constant content like movies, you just provide a constant timing, but it'd also mean you could adapt to different rates on the fly. Support for PAL and NTSC content, and then 24 Hz, would then be seamless, plus whatever content you get from devices in the future.

If you're suggesting an adaptive framerate that is higher than 24/25/30Hz, then there is obviously bandwidth to consider. I've never yet met a video editor who found time-syncing video and audio tracks during post to be a particularly fun endeavour, so adding to the misery by introducing an adaptive framerate on the video track may well result in mass suicide in the video editing industry.

Also, for actual recording of footage for TV and movies, you want to be capturing video at the exact rate you intend that section to be played back at, or at a faster rate that is a multiple of the intended playback rate. E.g. if you capture at 60Hz but intend to play back at 40Hz, that's going to look shit.

I think it just adds considerable complexity to content creation for questionable gain.
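A toy cadence check for the 60Hz-to-40Hz example above (the rates are illustrative, not any broadcast standard): when the capture rate isn't an integer multiple of the playback rate, the source frames you keep end up unevenly spaced, which is the judder being described; an integer multiple divides out cleanly.

```cpp
// Print which captured frames land on each playback instant and the spacing
// between them; uneven spacing is the visible judder.
#include <cstdio>

void cadence(int capture_hz, int playback_hz) {
    if (capture_hz % playback_hz == 0) {
        std::printf("%d Hz -> %d Hz: clean, keep every %d-th frame\n",
                    capture_hz, playback_hz, capture_hz / playback_hz);
        return;
    }
    std::printf("%d Hz -> %d Hz: uneven cadence, source-frame gaps:",
                capture_hz, playback_hz);
    int prev = -1;
    for (int i = 0; i < 8; ++i) {
        int src = i * capture_hz / playback_hz;   // nearest-earlier source frame
        std::printf(" %+d", src - prev);          // gap between chosen source frames
        prev = src;
    }
    std::printf(" ... (alternating gaps = judder)\n");
}

int main() {
    cadence(60, 40);   // the "going to look shit" case
    cadence(120, 40);  // an integer multiple, so no judder
}
```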
 
Not adaptive framerate during a film. These would remain at a constant framerate. I'm suggesting getting rid of fixed framerates in displays, so losing the predefined 24, 25, 29.97, 30, etc. framerates supported in TV hardware and going with a flexible refresh model à la FreeSync. Whatever your source framerate, the display adapts to it, because they all use flexible source-driven refresh. This means the same display can show everything from 24 Hz up to 120, and means content providers have a choice of framerates including those not supported in current standards, say 48 fps or 35. And then for computer content, it'd give game developers the option of alternative framerates like 45 as a happy compromise between 30 and 60.

Now that the technology exists for variable refresh, is there any reason not to use it universally?
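A quick numerical sketch of why in-between rates like 45 or 48 fps only really work with source-driven refresh: on a fixed 60Hz panel they don't divide evenly into the refresh period, so frames alternate between one and two refreshes on screen, whereas a variable-refresh display simply takes whatever frame time the source produces. The rates listed are just examples.

```cpp
// For each candidate framerate, check whether it maps evenly onto a fixed
// 60 Hz refresh, and show the frame time a VRR display would use instead.
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    const double rates[] = {24.0, 30.0, 35.0, 45.0, 48.0, 60.0};

    for (double fps : rates) {
        double refreshes_per_frame = refresh_hz / fps;  // e.g. 60/45 = 1.33
        bool even = refreshes_per_frame == (int)refreshes_per_frame;
        std::printf("%4.0f fps: %.2f refreshes per frame -> %s on fixed 60 Hz, "
                    "%.2f ms per frame on a VRR display\n",
                    fps, refreshes_per_frame,
                    even ? "even cadence" : "uneven cadence (judder)",
                    1000.0 / fps);
    }
}
```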
 
Ah, I see. Sorry, it was the "on the fly" comment that made me think you wanted variable framerates within video content. Then no, I can't see what the problem is. Many TVs already support real 24Hz (24p), 50Hz and 60Hz - and their respective low-persistence versions: 100Hz, 120Hz, 200Hz and 240Hz on some sets - which suggests a wide range of operating frequencies. Nor can I imagine that panels are made to operate specifically at these rates; they are surely operating from a programmable PLL.
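To put a number on the programmable-PLL point: a panel's refresh rate is essentially the pixel clock divided by the total raster size (active pixels plus blanking), so a whole range of refresh rates falls out of reprogramming one clock. The 1080p-style blanking figures below are assumptions for illustration rather than values quoted from a timing standard.

```cpp
// refresh = pixel_clock / (h_total * v_total), so solve for the pixel clock
// each refresh rate would need on one assumed 1080p raster.
#include <cstdio>

int main() {
    const int h_active = 1920, h_blank = 280;   // assumed horizontal blanking
    const int v_active = 1080, v_blank = 45;    // assumed vertical blanking
    const long h_total = h_active + h_blank;
    const long v_total = v_active + v_blank;

    const double refresh_rates[] = {23.976, 24.0, 50.0, 60.0, 120.0};
    for (double hz : refresh_rates) {
        double pixel_clock_mhz = h_total * v_total * hz / 1e6;
        std::printf("%7.3f Hz needs a pixel clock of about %6.1f MHz\n",
                    hz, pixel_clock_mhz);
    }
}
```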
 
The re-engineering of Ethan Carter

One interesting part of this interview comparing PC, PS4 and previous-gen development:
Looking at the actual development process on PS4 itself, the Astronauts are unequivocally positive: "It was the best console development experience I've ever had," Poznanski says. "Compared to the previous generation, the current hardware is very well balanced - we didn't encounter any bottlenecks. We didn't need to bend over backwards to utilise full hardware potential, and it was a genuine pleasure to free our minds of PC driver issue worries."
 
It's a story-driven game though; cutscenes are a big deal for Uncharted 4. Plus, I am sure none of us will say "what if it was 60 fps" when we get our hands on it. Do you think ND will fail to deliver after their past success? I don't buy that. And besides, such decisions are not made lightly; I am sure they argued a lot until they reached the 30/60 target for SP/MP, and it is probably for a good reason.

Errrrrr...... no. Not even close. As Globalisator has already said, cutscenes have absolutely nothing to do with it. That's like saying that just because people like playing games at 60fps, they must prefer cinematic movies at that framerate too (instead of 23.976). No, I like 60fps exclusively because of the feel it gives to games, and because in an interactive game you will often find yourself panning the camera quickly (because you as the player dictate the camera), which is where you get fast-moving pixels covering a big distance, and a lower framerate will be jarringly visible. What framerate a cutscene has is mostly irrelevant in this context because it is not interactive, hence a controlled situation, so the team can choose how pronounced the lower framerate is in such a scene. A cutscene is also a storytelling set-piece, so a director can influence where the viewer looks by using focus and camera to highlight or emphasize what is being shown. It's entirely different to when you are playing and you, the player, decide what you do, where you look, how fast you pan the camera around, etc.

I didn't want to start another framerate discussion. I am sure I will love U4 just as much if it is 30fps (because to this date, I have no idea what a 60fps Uncharted will feel like - TLoU being a bit of a different game). That will change, however, when the remasters ship and people are able to compare how a 60fps Uncharted game compares to one at 30fps, albeit with much more complex graphics. Who knows, by the time I play U4, I might forget what it was like to play U2 @ 60. Maybe. Maybe not. Every time I play a KZ game (solo campaign), framerate is one of the things I hate about it, and despite the much better graphics, I find myself wanting to go back to BF4 or CoD for it. But we'll see.
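To put rough numbers on the camera-panning argument above (the pan speed, FOV and resolution are assumed values chosen for illustration): for the same pan, halving the framerate doubles how far the image jumps between frames, which is why fast player-driven pans are where 30fps is most visible.

```cpp
// Per-frame on-screen displacement of a horizontal camera pan at 30 vs 60 fps.
#include <cstdio>

int main() {
    const double pan_deg_per_s = 180.0;  // a fast player-driven camera turn
    const double h_fov_deg     = 90.0;   // assumed horizontal field of view
    const double screen_px     = 1920.0; // horizontal resolution

    const double rates[] = {30.0, 60.0};
    for (double fps : rates) {
        double px_per_frame = (pan_deg_per_s / fps) * (screen_px / h_fov_deg);
        std::printf("%2.0f fps: each frame the image shifts ~%3.0f px\n",
                    fps, px_per_frame);
    }
}
```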
 
Meh, I don't feel it's that much of a deal. I constantly swap between 120Hz+ games and 30 between my PC and consoles, and it doesn't bother me that much. The only case where I need the game to run at 90+ is in a competitive multiplayer game like Battlefield, various MMOs or League/Dota. I am fine with them going 30 for SP because it is exactly that, a single-player campaign; I want them to make the concluding chapter of this series look the best it can, and I am fine with the compromise personally. Would I say no to 60 fps? Hell no. Is it necessary for single player? I don't think so.
 