The Great Framerate Non-Debate

That's for a reason with nothing to do with camera limits. Film tells a story through the eyes of a third-party observer, and deliberately shows events from multiple POVs to build up the whole story. It's much easier to understand and relate to a person when you can see them instead of inhabiting their body. Even at 120 fps, a first-person movie would likely be a bit poop. ;) Although of course some have gone there too via the handycam movie, and cinema-goers have enjoyed 24 fps horror. The low framerate probably helps with the confusion and visceral response.

I think you are missing my point, which was, rather specifically, that film-makers are well versed in the limitations of their equipment and use them to good effect. "Good effect" here means that a lot of the trickery involved in film-making (smoke & mirrors) is hidden by the low framerate. When watching a movie, the 24fps limitation is often not that noticeable, since many scenes are dialogue with the camera on a steady mount, so the amount of movement is kept to a minimum. The exception, where 24fps does become noticeable, is scenes with lots of movement: mainly action sequences, fast-moving objects (but not too fast) or landscape pans.

The effect of judder also depends on screen size relative to viewing distance. Sitting up close in a movie theater exaggerates it; watching at home on a small(er) screen from a greater viewing distance decreases it. Also, even in action sequences, 24fps is usually less noticeable because you tend to focus on what the director wants you to. Depth-of-field effects can be used to great effect in these instances, too. They're less viable in games because, whether the perspective is first-person or otherwise, the game-maker can't know what the player is focusing on, which is why the entire image is usually sharp. It's like looking through a window into the virtual world you're playing in.

The other point is that the 'cinematic' effect is far-fetched, because the two mediums have entirely different purposes. You rarely watch a movie feeling immersed to the point that you see yourself as the main character, because the story, as you pointed out, is usually told from the third-person perspective, with multiple plot threads and angles running at once. In a game this is usually different, because it's an interactive experience. You are controlling your character on screen, so there's already a level of immersion far beyond what you get from the passive role of watching a movie.

Also, as I said, a 24fps film feed is far superior in motion smoothness to a 30fps game, because each film frame integrates motion blur over the shutter interval, whereas each game frame is traditionally a perfectly razor-sharp instant in time.
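To make that difference concrete, here's a toy sketch (mine, not from any engine or camera spec - the scene, speed and sample counts are made up for illustration) of shutter integration versus instantaneous shading:

[CODE=python]
# Toy illustration: a film camera averages light over its shutter interval,
# baking motion blur into each frame; a game traditionally shades one instant.
import numpy as np

def film_frame(scene_at, t, shutter=1/48, samples=16):
    """Average sub-frame samples across the shutter interval, approximating
    the motion blur a 180-degree shutter produces at 24fps."""
    times = np.linspace(t, t + shutter, samples)
    return np.mean([scene_at(ti) for ti in times], axis=0)

def game_frame(scene_at, t):
    """Shade a single instant: perfectly sharp, but no motion information."""
    return scene_at(t)

# A 1D "scene": a 4-pixel bright bar sweeping across 64 pixels.
def scene_at(t, width=64, speed=240.0):
    img = np.zeros(width)
    x = int(speed * t) % width
    img[x:x + 4] = 1.0
    return img

blurred = film_frame(scene_at, t=0.0)  # energy smeared along the motion path
sharp   = game_frame(scene_at, t=0.0)  # a crisp bar frozen mid-motion
print(np.count_nonzero(blurred), np.count_nonzero(sharp))  # 9 vs 4 pixels lit
[/CODE]

The blurred frame spreads the bar across its motion path, which is exactly the cue the eye uses to connect one frame to the next.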

ThePissartist said:
Contrary to popular opinion, I'd like to see more games using the technique GG used on Shadow Fall's multiplayer to get higher fps (although at a higher average, not 40-50). If a game is created with 960x1080 in mind (I hope I'm remembering SF's resolution correctly) right from the start, you're either going to have a game that potentially does nearly twice the graphical effects at 30fps, or nearly twice the framerate of a game that's rendering at 1920x1080. The perceptible resolution drop isn't SO bad, especially if you consider it's less noticeable at standard viewing distances.

This is an interesting point: how many knew KZ:SF was doing something creative with the resolution to enhance the framerate in multiplayer? How many noticed the visual difference? Perhaps some of the more technically minded, who knew that a framerate improvement can only come at a certain sacrifice and thus went looking. The majority of people, though, I'm willing to bet, would rate the multiplayer visuals on par with the campaign visuals - which was sort of the point I was trying to make in the DriveClub topic. Assuming you are fully immersed in playing the game at high speed, I'm willing to bet that a hypothetical version that took a few tradeoffs to achieve 60fps would be close to what we are marveling over in the 30fps version we'll be getting.
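For what it's worth, the arithmetic behind the trade-off in the quote is just pixel counting (a crude model - real shading cost isn't perfectly linear in pixels, but it's close enough for a sanity check):

[CODE=python]
# Back-of-the-envelope check of the quoted trade-off: half the pixels per
# frame means roughly twice the per-frame budget, or roughly twice the fps.
full = 1920 * 1080   # 2,073,600 pixels per frame
half = 960 * 1080    # 1,036,800 pixels per frame
print(full / half)   # 2.0
[/CODE]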

At this point, all we are doing is marveling over screenshots and some youtube videos. Sure the graphics are impressive. That's all we have to look at, at the moment. Once you're playing the game however, the graphics tend to lose significance and the immersion and gameplay factors increase.
 
"It might work better if you alter your style" is hardly a convincing argument that 48fps is strictly superior to 24fps in film, though. Especially in a world where editors deliberately occasionally pull shenanigans like dropping framerate (i.e. to 12fps) to emphasize or surrealize moments.

I was thinking more about changing the very methodology and techniques, rather than the style.
The matter is interesting, but I fear that if I continue I will derail the thread :LOL:
 
The majority of people, though, I'm willing to bet, would rate the multiplayer visuals on par with the campaign visuals.

Actually, the SF community noticed that the MP was less "good" than the campaign.
Also, many asked GG to lock the frame-rate to 30fps in MP rather than keep the variable frame rate, because the experience isn't really smooth - and this happened before they "confessed" what the resolution in MP really was, and before they gave the option to lock the SP.
 
Actually, the SF community noticed that the MP was less "good" than the campaign.
Also, many asked GG to lock the frame-rate to 30fps in MP rather than keep the variable frame rate, because the experience isn't really smooth - and this happened before they "confessed" what the resolution in MP really was, and before they gave the option to lock the SP.

I agree that the framerate wasn't great in multiplayer, but if a game were created with that resolution and 60fps in mind from the start, it could have a fairly stable fps.
 
Actually, the SF community noticed that the MP was less "good" than the campaign.

Due to the resolution difference? Or because MP usually does look different, since there's a trade-off involved in delivering maps where multiple players share the same world in a heated, unpredictable, unscripted gameplay environment? I only played MP for a short time, and the visuals are pretty similar to the untrained eye, IMO (despite the obviously different level design etc.).

The resolution effect also only seems to be noticeable in movement (due to the trickery they are doing with blending frames), so maybe the strange artefacts are what caused people to question what exactly was happening. I also didn't get the impression that it was such a big deal - more that the difference is there if you focus on it, you see it; otherwise it's pretty much a non-issue and hardly noticeable.
 
Bang for the Buck
Overall summary:
"Sharpness" is proportional to the square of the area under the MTF curve (contrast on the y-axis, resolution on the x-axis). The EBU then tested 4K at a 2.7m viewing distance, with the improvement limited to half a quality grade. HDR improves contrast, which - as the MTF relationship shows - has a bigger impact on "sharpness" than resolution does. High-fps was then tested and consistently showed one quality grade of improvement per doubling of fps, with the low anchor below 60fps and the maximum tested at 240Hz.

What do you guys think about not only HDR but also color reproduction?

Note: nice Q&A at the end.
 
In that scenario all your circles would become ovals and your squares would become rectangles

Not if you've handled the aspect ratio correctly. Do you think all of the characters in Shadow Fall suddenly became very short and wide in multiplayer?

Your field of view might be narrower, but I guess that could be fixed by extrapolating data from the previous frame.
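A quick sanity check on the circles-staying-circles point, assuming the commonly reported 960x1080 buffer (numbers mine):

[CODE=python]
# Display aspect = (buffer columns * pixel aspect) / buffer rows.
# Stretching 960x1080 across a 16:9 panel implies each rendered pixel
# is twice as wide as it is tall.
cols, rows = 960, 1080
pixel_aspect = 2.0                            # assumed 2:1-wide pixels
display_aspect = (cols * pixel_aspect) / rows
print(display_aspect)                         # 1.777... == 16:9, circles stay circles
[/CODE]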
 
Due to the resolution difference?
Yes. Even before the announcement of reprojection from sub-native resolutions, people were commenting that the IQ seemed significantly worse in MP than in SP, even if they couldn't articulate why (since, of course, pixel counters had already come up with 1920x1080).

Though I suppose it is a somewhat complex issue, as the graphics in the MP are quite different from the SP.

The resolution effect also, only seems to be noticable when in movement
Yes. Movement makes temporal reprojection more difficult, and you can wind up effectively dropping to the "per-frame" native res. This is noticeable in some way or another in every game I've played that uses temporal reprojection.
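For anyone curious how that failure mode falls out of the technique, here's a heavily simplified sketch of column-interleaved temporal reprojection (my illustration - emphatically not Guerrilla's actual implementation; the rejection threshold and the uniform motion vector are toy assumptions):

[CODE=python]
# Each frame natively shades only half the columns; the other half is pulled
# from the previous composited frame, shifted by a motion vector. When motion
# is too large the history is rejected, so the image effectively drops toward
# the per-frame native resolution exactly when things move.
import numpy as np

def composite(native_half, prev_full, motion_px, frame_parity, reject_threshold=8):
    """native_half: freshly shaded columns (H x W/2).
    prev_full: last composited frame (H x W).
    motion_px: horizontal motion since last frame, assumed uniform."""
    h, half_w = native_half.shape
    out = np.empty((h, half_w * 2), dtype=native_half.dtype)
    out[:, frame_parity::2] = native_half          # this frame's real samples

    if abs(motion_px) > reject_threshold:
        # History unusable: fill the gaps from this frame's native columns
        # instead, i.e. fall back to the lower effective resolution.
        out[:, 1 - frame_parity::2] = native_half
    else:
        # Reproject: content that moved right by motion_px came from
        # motion_px columns to the left in the previous frame.
        reprojected = np.roll(prev_full, motion_px, axis=1)
        out[:, 1 - frame_parity::2] = reprojected[:, 1 - frame_parity::2]
    return out
[/CODE]

With a static camera the output converges to a genuinely full-resolution image over two frames; under fast motion you're looking at (filtered) half-resolution data, which matches what people reported seeing.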
 
At the end of the day, the justification for 24fps lies in the mechanical limitations of cranking film a century ago. There is no question whatsoever that the low frame rate of film will eventually be a thing of the past. The technical reasons for low frame rates today are all about industry inertia, but inertia is a strong force. The process can easily take decades.

Oh yes, let's ignore everybody who doesn't have the same perceptions as you. Very productive :rolleyes:

I was thinking more about changing the very methodology and techniques, rather than the style.
The matter is interesting, but I fear that if I continue I will derail the thread :LOL:
Well, forced perspective was gone in the Hobbit movies due to the use of stereoscopic cameras, replaced with actors on green screens. Not exactly an improvement.
 
BS, see the Gamasutra test:
push button on controller, see result onscreen = 50msec

1. controller -> 2. console process input -> 3. create framebuffer -> 4. send to TV -> 5. display result
1&2&3&4&5 = 50 msec together yet you say just 1&5 are minimum 100msec
I think you need to reassess your info

They sampled a few games; the results were:
60fps game = 50msec best case (3 frames @ 1/60)
30fps game = 100msec best case (6 frames @ 1/60)
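Those best-case numbers are consistent with a simple pipeline model - a sketch, assuming roughly three frame periods from input sample to scanout in a classic buffered pipeline (my assumption, not something the article states):

[CODE=python]
# Input is sampled for frame N, simulated during N, rendered during N+1,
# and scanned out during N+2: about 3 frame periods end to end.
def best_case_latency_ms(game_fps, pipeline_frames=3):
    frame_ms = 1000.0 / game_fps
    return pipeline_frames * frame_ms

print(best_case_latency_ms(60))  # 50.0 ms  (3 frames @ 1/60)
print(best_case_latency_ms(30))  # 100.0 ms (3 frames @ 1/30 = 6 frames @ 1/60)
[/CODE]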


Not really. Those numbers are way too low. Shmups and other precision games are unplayable on a console with a TV and a wireless controller. I have never seen a game with sub-100 ms lag.
 
At 60fps it's possible that the animations in a videogame will look worse; the motion capture (data interpolation) will need significantly more work for the animation to remain 'realistic' instead of looking strangely sped-up and robotic. 30fps hides this.

The Witcher on PC is a pretty big offender with regards to animation when you have it running at 60+ fps; the characters become strangely disconnected from their character models... if that makes any sense :D Maybe it's just the wonky animation in general, but I tend to notice it more at 60fps in that game.
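If it helps to see why, here's a toy sketch (mine, not CD Projekt's actual pipeline) of what happens when mocap keyed at 30Hz is played back at 60fps with plain linear interpolation:

[CODE=python]
# A 60fps game has to invent every other pose from 30Hz keys. Plain lerp
# fills the gap, but it turns sharp accelerations into flat line segments;
# the "disconnected" look is the eye picking up those synthesized poses.
def pose_at_60hz(keys_30hz, frame_60):
    """keys_30hz: joint angles sampled at 30Hz; frame_60: 60Hz frame index."""
    t = frame_60 / 2.0                 # position on the 30Hz timeline
    i = int(t)
    if i + 1 >= len(keys_30hz):
        return keys_30hz[-1]
    alpha = t - i                      # 0.0 on a real key, 0.5 on an invented one
    return (1 - alpha) * keys_30hz[i] + alpha * keys_30hz[i + 1]

swing = [0.0, 5.0, 30.0, 90.0, 100.0]  # a fast arm swing, keyed at 30Hz
print([round(pose_at_60hz(swing, f), 1) for f in range(8)])
# [0.0, 2.5, 5.0, 17.5, 30.0, 60.0, 90.0, 95.0] -- every odd entry is guessed
[/CODE]

At 30fps you only ever display the real keys; at 60fps half of what you see was never captured, which is where the extra cleanup work comes from.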
 
Note about "bad CGI" - this is usually the fault of the director or the producers. The actual artists tend to get impossible jobs and do what they can to still get it into the theaters.

For example the final action sequence in Hobbit 2 was conceived about 6 weeks before delivery. Weta very nearly didn't make it.
 
Nobody is stopping you from presenting your case. In fact, please do.

I already have. Read the thread.

At 60fps it's possible that the animations in a videogame will look worse; the motion capture (data interpolation) will need significantly more work for the animation to remain 'realistic' instead of looking strangely sped-up and robotic. 30fps hides this.

The Witcher on PC is a pretty big offender with regards to animation when you have it running at 60+ fps; the characters become strangely disconnected from their character models... if that makes any sense :D Maybe it's just the wonky animation in general, but I tend to notice it more at 60fps in that game.

Yes, this is what I'm talking about. At 60fps it's much easier to spot the non-realistic parts.
 
To be fair, low fps doesn't have to be stuttery/juddery.

Low framerate is always stuttery/juddery when compared to 60Hz, and especially to 120Hz or... reality. It is trivially easy to see individual frames of animation and rendering in fast motion scenes, and for games even in slow-motion pans. Even 60Hz seems a bit "stuttery" when compared to 120Hz in games. But at the moment for gaming I'm stuck at 60Hz, as there are no 120Hz 30" monitors that do 2560x1600 resolution (I use the same monitor for gaming as I do for work, I'm a cheap bastard).

Now that I've had a chance to experience high-framerate movies and video, film in the theaters is extremely stuttery to my eyes, even with the natural motion blur in action sequences. 30fps video on the internet is stuttery and feels disjointed, making it difficult to fully enjoy. I won't go so far as to exaggerate and call it a slideshow, but I am surely tempted to.

You seem to handle Ryse and Dynasty Warriors (together with all its fps issues from waaaay back in the PS2 days) perfectly well, and that was just a few months ago - why suddenly all the hate aimed at 30fps?

As you'll note if you read my post above, I can "make do" with 30Hz gameplay. But I am becoming less and less willing to do so as time goes by. Just like in the late 90s I had to "make do" with aliasing in games; I'm no longer all that tolerant of aliasing in games either.

Ryse as a specific example was fantastic visually on a frame by frame basis. Once I started panning the screen, moving, and playing the game, however, I had to consciously try to ignore the fact that it was only 30 fps. It would have been far more enjoyable (both as a game and visually) at 60 fps even with a reduction in some of the graphics effects, IMO.

And Dynasty Warriors, well it's on PC now, and thus I can play it at 60 hz. :p Although I have a beef with the fact they ported the PS3 version and not the PS4 version. Argh.

I haven't yet committed to not playing any game if it is locked to 30 fps, but I am seriously considering it. And the game will have to either hold a LOT of nostalgia value or feature something truly unique and groundbreaking for me to consider it. Currently I don't see anything of the sort on the horizon.

I'd rather not start playing a game going "oooh" and "ahhh" at the graphics when I'm not moving, and then, as soon as I start moving or panning the screen, wincing at the stuttery/juddery presentation. As I did with Ryse. Or any other 30fps game in existence.

Regards,
SB
 
At 60fps it's possible that the animations in a videogame will look worse; the motion capture (data interpolation) will need significantly more work for the animation to remain 'realistic' instead of looking strangely sped-up and robotic. 30fps hides this.

The Witcher on PC is a pretty big offender with regards to animation when you have it running at 60+ fps; the characters become strangely disconnected from their character models... if that makes any sense :D Maybe it's just the wonky animation in general, but I tend to notice it more at 60fps in that game.


If this is the case, wouldn't almost all sports and fighting games have strange animations?
 