Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
So they were actually shot on video (and then edited etc on video as well)? Do you know any shows that were actually shot and edited this way?

I thought pretty much ALL TV stuff (shows, sports etc) was shot at 50/60hz, interlaced. Which in the UK at least has carried over to 1080i HD channels.
Could be wrong.
 
I've watched HFR 3 times. The first time, I noticed there were several moments where it looked like the projector was playing at double speed, which made the motion look ridiculously fast. My wife isn't very technical, but she loves the clarity and brightness of the Hobbit movies.

My 7 year old isn't articulate enough to say what was different about it, but she was also in love with the movie magic.

Personally, other than that first time (and then only for 2 brief moments early on), I am fully onboard with HFR cinema. This whole argument reminds me of people who complained about HD a decade ago.
 
People complained about HD? :oops:
I've watched both of the Hobbit films in HFR. In both cases, the smoothness bothered me for the first 20 or so minutes, and afterwards I got used to it. I'm not a big fan of 3D, but 3D really pops in HFR. I think it's psychological: we're used to 24fps films, and 48fps reminds us of video, which subconsciously we probably think of as 'cheap'. I think it's a bigger deal than 3D itself and hope it's here to stay. I think Edison himself once said that anything less than about 48fps might make people sick, or something like that. And they settled on 24fps because of sound, IIRC.

Back on topic regarding the DF article on Lego MSH, I wonder if PS4/XB1 parity was a choice? I would hate for this to be the case in future cross-platform titles.
 
I've recently watched the Extended Cut Blu-ray of The Hobbit, and I noticed some scenes which, even in the 24p version, looked off in terms of movement speed. There's one scene early on showing a waterfall, but the water looks like it's falling MUCH too fast, making it seem like the simulation was set to 24Hz, effectively undercranking the scene. But... I could be wrong too. It's hard to pinpoint exactly in retrospect.

In any case, I've really liked the 48p version. We went as a group, and all of them wanted to see the HFR version of the sequel^^ None of those people except me are into all the tech stuff in games, PC and movies, so I was pleased to see this reaction^^
 
People complained about HD? :oops:
I've watched both of the Hobbit films in HFR. In both cases, the smoothness bothered me for the first 20 or so minutes, and afterwards I got used to it. I'm not a big fan of 3D, but 3D really pops in HFR. I think it's psychological: we're used to 24fps films, and 48fps reminds us of video, which subconsciously we probably think of as 'cheap'. I think it's a bigger deal than 3D itself and hope it's here to stay. I think Edison himself once said that anything less than about 48fps might make people sick, or something like that. And they settled on 24fps because of sound, IIRC.

Back on topic regarding the DF article on Lego MSH, I wonder if PS4/XB1 parity was a choice? I would hate for this to be the case in future cross-platform titles.

I don't see what's wrong with parity in multiplat titles. Yes, the PS4 has more shaders and ROPs, but when you are targeting a release on 3 systems, are you really going to invest time and money into optimizing your game on one platform over the others?

I think the main differences will be similar to last gen, but in reverse. The PS4 will have higher resolutions and framerates in some multiplat games this gen, whereas last gen the 360 had the advantage in res and framerate. A lot of games this gen will have parity, just like last gen.
 
There's nothing wrong with parity at all, but there is a gap in performance this time around that really wasn't there last gen. Many 360 titles had the edge in the first few years, but most PS3 and 360 releases have been within parity overall for a while; a lot of that was due to the architectural differences. The PS4 and XB1 are directly comparable because they're architecturally the same. The difference is that the PS4 has 50% more shaders and TMUs, twice the ROPs, and is more than 40% higher in GFLOPs overall, which has been talked about at length elsewhere. Optimizing for the XB1's fast eSRAM isn't ever going to achieve parity; it will just get that 50% closer to 40%, so to speak.

If this is the best TT could do given the time, budget, and multiple platform releases, then that makes sense. But every cross-platform game has been better on PS4 so far except this one, and it's missing some of the effects that the PC version has, including SSAO. If they really are identical, I'm leaning towards platform parity being some sort of business decision rather than a question of hardware optimization. I'm thinking a lot of games will see less parity as time goes on, especially compared to the last generation.
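To put rough numbers on that "more than 40%" figure, here's a back-of-the-envelope GFLOPs comparison using the commonly reported launch specs (1152 shader ALUs at 800 MHz for PS4, 768 at 853 MHz for XB1); treat the figures as approximate, not official:

```python
# Rough compute-throughput comparison from commonly reported launch specs.
# 2 ops per clock per ALU assumes a fused multiply-add counts as two FLOPs.
def gflops(shaders, mhz, ops_per_clock=2):
    return shaders * mhz * ops_per_clock / 1000.0

ps4 = gflops(1152, 800)   # ~1843 GFLOPs
xb1 = gflops(768, 853)    # ~1310 GFLOPs
print(f"PS4: {ps4:.0f} GFLOPs, XB1: {xb1:.0f} GFLOPs")
print(f"PS4 advantage: {ps4 / xb1 - 1:.0%}")  # ~41%
```

The ratio lands just above 40%, which matches the figure quoted above; the higher XB1 clock claws back a little of the 50% ALU deficit.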
 
People complained about HD? :oops:
I've watched both of the Hobbit films in HFR. In both cases, the smoothness bothered me for the first 20 or so minutes, and afterwards I got used to it. I'm not a big fan of 3D, but 3D really pops in HFR. I think it's psychological: we're used to 24fps films, and 48fps reminds us of video, which subconsciously we probably think of as 'cheap'. I think it's a bigger deal than 3D itself and hope it's here to stay. I think Edison himself once said that anything less than about 48fps might make people sick, or something like that. And they settled on 24fps because of sound, IIRC.

Back on topic regarding the DF article on Lego MSH, I wonder if PS4/XB1 parity was a choice? I would hate for this to be the case in future cross-platform titles.

I'd say it was 'good enough' for a launch-title PC port, supporting two-player split-screen at native res and all. But unless they up the graphics considerably, I'd be disappointed if they couldn't raise the framerate for the next one.
 
I've watched HFR 3 times. The first time, I noticed there were several moments where it looked like the projector was playing at double speed, which made the motion look ridiculously fast. My wife isn't very technical, but she loves the clarity and brightness of the Hobbit movies.

My 7 year old isn't articulate enough to say what was different about it, but she was also in love with the movie magic.

Personally, other than that first time (and then only for 2 brief moments early on), I am fully onboard with HFR cinema. This whole argument reminds me of people who complained about HD a decade ago.

Looking around I found there are other ways to produce the TV soap opera look. Higher refresh rates (120-240 hz) combined with interpolation can produce the effect.

I don't think it will ever go away for film unless filmmaking introduces better lighting techniques to fool the viewer. I've run across the effect multiple times and it always stands out like a sore thumb. It's always the indoor scenes that stand out. It's like you're on set watching the scene play out in realtime, which kind of defeats the illusion.

I don't think it's related much to 3D gaming, as set pieces aren't built on dark stage sets illuminated like an office building. I have seen CGI which itself had the effect. The AMD APU technology trailer displays the effect.

http://www.youtube.com/watch?v=MUwjXQjQZZM

On the ship, starting at 0:50, the scene gives off the soap opera look to me. The lighting just seems off, like the light intensity is too bright for such a small area to have that level of unlit areas in the scene. And at times during the fight scene it looks like they're fighting on a concert stage with spotlights illuminating the knights.
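For what it's worth, the TV interpolation mentioned above can be sketched in miniature. Real sets estimate motion vectors per block of pixels; the toy version below just linearly blends neighbouring frames, which is enough to show how the in-between frames get synthesized (the function names and tiny grayscale frames are invented for illustration):

```python
# Naive frame interpolation: real TVs estimate motion vectors, but the
# simplest "in-between" frame is just a weighted blend of its neighbours.

def blend(frame_a, frame_b, t):
    """Linear blend: t=0 gives frame_a, t=1 gives frame_b."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

def interpolate_stream(frames, factor=2):
    """Insert (factor - 1) blended frames between each original pair,
    e.g. turning a 24fps stream into 48fps for factor=2."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for i in range(1, factor):
            out.append(blend(a, b, i / factor))
    out.append(frames[-1])
    return out

f0 = [[0.0, 0.0], [0.0, 0.0]]   # all-black 2x2 frame
f1 = [[1.0, 1.0], [1.0, 1.0]]   # all-white 2x2 frame
stream = interpolate_stream([f0, f1], factor=2)
print(len(stream))         # 3: original, blended midpoint, original
print(stream[1][0][0])     # 0.5 — the synthesized midpoint frame
```

The blended frame is why interpolated motion can look smeary on complex movement: without motion vectors, a mid-frame is a double exposure rather than a true in-between image.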
 
Anybody curious about this may find the book 'Racing the Beam' an interesting read. The Atari 2600 was my first experience of video games and I still remember the day my dad brought one home :D

Looking back, it's crazy what Atari (and Activision) programmers achieved with so little. You can google David Crane's recollection of developing Pitfall.

Man, I too remember my dad bringing home the Atari 2600. I also remember him bringing home the Vectrex. He basically started my gaming addiction, even though to him it seemed more about "neat" technology. He hardly ever played either console, and I had no idea what an Atari was at the time.
 
Grrr, since I was the one who brought up HFR in films, I feel obligated to reiterate the point that running at a higher frame rate can make things seem to have better clarity even when the resolution did not change.
 
Yes, HFR isn't really the discussion here; there's another thread for that. It was raised in this thread as a means (perhaps) of increasing the 'resolution' of the experience, with higher framerates in games managing to appear of higher 2D clarity than their actual backbuffer resolution. To which I speculated that very high framerates may actually be able to reconstruct information as perceived within blurred pixels. If true, 120 Hz could render 1/4 the pixels of 60 Hz and yet produce smoother movement alongside the same 2D resolution as far as the viewer is concerned.
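That speculation about reconstructing information across frames resembles what's now called temporal supersampling: if each fast, low-res frame samples the scene at a different sub-pixel offset, accumulating them can recover detail no single frame holds. A 1-D toy sketch (the 'scene' signal and function are invented for illustration):

```python
# Temporal supersampling in 1-D: two half-resolution frames, each sampled
# at a different sub-pixel offset, together recover the full-res signal.

scene = [0, 1, 0, 1, 0, 1, 0, 1]          # fine detail: alternating pattern

def render_low_res(offset):
    """Render at half resolution, sampling with a sub-pixel offset."""
    return [scene[2 * i + offset] for i in range(len(scene) // 2)]

frame_even = render_low_res(0)            # [0, 0, 0, 0] — detail aliased away
frame_odd = render_low_res(1)             # [1, 1, 1, 1] — detail aliased away
# Interleaving the two jittered frames recovers the full-res signal:
reconstructed = [s for pair in zip(frame_even, frame_odd) for s in pair]
print(reconstructed == scene)             # True
```

Each individual frame has lost the detail entirely; only across time is the full pattern present, which is the intuition behind the '120 Hz at 1/4 the pixels' idea. A static camera makes this easy; real renderers need motion reprojection to do the same while things move.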
 
Yes, HFR isn't really the discussion here; there's another thread for that. It was raised in this thread as a means (perhaps) of increasing the 'resolution' of the experience, with higher framerates in games managing to appear of higher 2D clarity than their actual backbuffer resolution. To which I speculated that very high framerates may actually be able to reconstruct information as perceived within blurred pixels. If true, 120 Hz could render 1/4 the pixels of 60 Hz and yet produce smoother movement alongside the same 2D resolution as far as the viewer is concerned.

I think people get too caught up in the word "resolution". It would be more correct to say that you can see more "information" over a given period of time, depending on resolution and framerate. And each has strengths and weaknesses relative to the other: spatial resolution excels at static or near-static images, while temporal resolution excels at moving images.

Regards,
SB
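The spatial-vs-temporal trade above can be put in crude numbers as raw pixel throughput. Notably, 120 Hz at a quarter of the pixels delivers only half the raw pixel rate of 1080p60; the argument is that motion clarity compensates perceptually, not that the raw data is equal (resolutions below are just an example pairing):

```python
# "Information over time" in crude terms: pixels delivered per second.
def pixels_per_second(width, height, fps):
    return width * height * fps

full_60 = pixels_per_second(1920, 1080, 60)     # 1080p at 60 Hz
quarter_120 = pixels_per_second(960, 540, 120)  # 1/4 the pixels, 2x the rate
print(quarter_120 / full_60)                    # 0.5
```

Halving width and height quarters the pixel count, but doubling the rate only wins half of that back, so any perceived parity has to come from how the eye integrates motion, not from bandwidth.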
 
Looking around I found there are other ways to produce the TV soap opera look. Higher refresh rates (120-240 hz) combined with interpolation can produce the effect.

I don't think it will ever go away for film unless filmmaking introduces better lighting techniques to fool the viewer. I've run across the effect multiple times and it always stands out like a sore thumb. It's always the indoor scenes that stand out. It's like you're on set watching the scene play out in realtime, which kind of defeats the illusion.

This is exactly my view as well. I've seen soaps at high framerates, using interpolation or whatever other fancy methods TVs used for the effect 10 years ago, and it looked just as unrealistic (or hyper-realistic, depending on your interpretation) relative to regular soap viewing as HFR does in movies today. I think it's more to do with revealing the limitations of sets and lighting than with me being used to something else. Incidentally, the issue was far less pronounced in the CGI scenes, to the point where I didn't really notice it.
 
Live action Uncanny Valley effect...:LOL:

The uncanny valley effect crossed my mind, too. At 24fps, a viewer of a movie is distant and can enjoy a story from an objective position. At 60fps or higher, the viewer feels part of the film. That's great for an FPS, but bad for movies.
 
I think movies just need to be made differently. For the past hundred years, movies were made for 24p viewing; every ounce of optimization went into nothing but low-framerate movie making.

Stuff like lighting and camera movement needs to change dramatically now.
 
There's nothing wrong with parity at all, but there is a gap in performance this time around that really wasn't there last gen. Many 360 titles had the edge in the first few years, but most PS3 and 360 releases have been within parity overall for a while; a lot of that was due to the architectural differences. The PS4 and XB1 are directly comparable because they're architecturally the same. The difference is that the PS4 has 50% more shaders and TMUs, twice the ROPs, and is more than 40% higher in GFLOPs overall, which has been talked about at length elsewhere. Optimizing for the XB1's fast eSRAM isn't ever going to achieve parity; it will just get that 50% closer to 40%, so to speak.

If this is the best TT could do given the time, budget, and multiple platform releases, then that makes sense. But every cross-platform game has been better on PS4 so far except this one, and it's missing some of the effects that the PC version has, including SSAO. If they really are identical, I'm leaning towards platform parity being some sort of business decision rather than a question of hardware optimization. I'm thinking a lot of games will see less parity as time goes on, especially compared to the last generation.

I agree the architectures of the next gen systems are closer than ever.
I also agree that the PS4 has a GPU advantage. That's why I said it would win out in some cases. Where I disagree is the amount of difference you will see as a result of more shaders and ROPs in multiplatform games. Third-party devs aren't likely to spend the extra time, money, and resources to take advantage of the differences.

I also disagree with your statement about most 3rd-party games looking much better on the PS4. In fact, out of all the games released on both systems to this point, most do achieve parity. I can only think of 3 exceptions: Battlefield 4, COD Ghosts, and AC4. You might be able to throw in NFS, but the only differences there are depth of field and different methods of ambient occlusion; in reality, NFS is pretty equal on both systems. I just think 1st- and 2nd-party devs are more likely to spend the time and money to take advantage of each system's architecture than 3rd-party devs.
 
I just think 1st- and 2nd-party devs are more likely to spend the time and money to take advantage of each system's architecture than 3rd-party devs.

I think that will only be true of third parties that aren't making PC versions of their games. Those that are will be able to make far fewer cuts on the PS4 than on the XB1, especially if using the XB1's more 'exotic' memory setup requires far more effort on their behalf. In the majority of cases it will be PC > PS4 > XB1.
 