First you cite The Hobbit in support of your argument. Since there is no visual difference between the two framerate versions of The Hobbit, this was either a very poor comparison or you were in fact trying to say that a low framerate can be artistically better than a high framerate regardless of whatever other impact it has.
Next, you specifically say that a developer may make a choice, independent of hardware limits (which is the only reason to trade framerate for pixel quality), to run the game at a lower framerate for artistic reasons.
Now, if I've misunderstood your argument then I apologise, but really, it's easy to see from the above quotes how your argument could be misunderstood.
Well then, you understood my point without even realizing it. Yes, both versions of The Hobbit have the same fidelity. What that showed is that a higher frame rate alone is not necessarily artistically better. How that relates to games is that going lower in frame rate will generally always allow better per-pixel fidelity, which is tangibly and artistically better.
This is incorrect. Films and games work differently so 24fps filmed footage is not comparable to 24fps rendered gameplay. See sebbbi's post further up for the explanation as to why.
That's not correct, because you could simulate the film look in a game too, by using motion blur and/or frame interpolation.
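A minimal sketch of the idea, under hypothetical assumptions (a toy 1D "renderer" rather than a real engine, made-up names like renderSubFrame and speedPixelsPerSecond): each displayed 24 fps frame accumulates several sub-frames spread over a 180-degree "shutter open" window, so motion smears the way film exposure does instead of producing one sharp instantaneous sample.

#include <cstdio>
#include <vector>
#include <cmath>

// Toy "scene": a bright dot moving across a 32-pixel scanline at constant speed.
static std::vector<float> renderSubFrame(double timeSeconds, int width) {
    std::vector<float> pixels(width, 0.0f);
    const double speedPixelsPerSecond = 240.0;   // hypothetical object speed
    double pos = std::fmod(timeSeconds * speedPixelsPerSecond, (double)width);
    pixels[(int)pos] = 1.0f;                     // instantaneous, sharp sample
    return pixels;
}

int main() {
    const int    width           = 32;
    const double displayFps      = 24.0;         // film-style presentation rate
    const double frameInterval   = 1.0 / displayFps;
    const double shutterFraction = 0.5;          // 180-degree shutter -> 1/48 s exposure
    const int    subSamples      = 6;            // sub-frames accumulated per displayed frame

    for (int frame = 0; frame < 3; ++frame) {
        std::vector<float> accum(width, 0.0f);
        double frameStart = frame * frameInterval;

        // Integrate the scene over the shutter-open window, like film exposure does.
        for (int s = 0; s < subSamples; ++s) {
            double t = frameStart + (s + 0.5) / subSamples * frameInterval * shutterFraction;
            std::vector<float> sub = renderSubFrame(t, width);
            for (int x = 0; x < width; ++x) accum[x] += sub[x] / subSamples;
        }

        // The moving dot now smears across several pixels per frame,
        // which is the "film look" a single sharp 24 fps game frame lacks.
        printf("frame %d: ", frame);
        for (int x = 0; x < width; ++x) putchar(accum[x] > 0.0f ? '#' : '.');
        putchar('\n');
    }
    return 0;
}

In a real engine the same effect is usually approximated with a velocity buffer and a post-process blur rather than brute-force accumulation, but the accumulation version makes the "integrate over the shutter interval" idea explicit.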