Digital Foundry Article Technical Discussion Archive [2013]

First you cite The Hobbit in support of your argument. Since there is no visual difference between the two framerate versions of The Hobbit, this was either a very poor comparison or you were in fact trying to say that a low framerate can be artistically better than a high framerate regardless of what other impact it has.

Next you specifically say that a developer may make a choice - independent of hardware limits (which are the only reason to trade framerate for pixel quality) - to run the game at a lower framerate for artistic reasons.

Now, if I've misunderstood your argument then I apologise, but really, it's easy to see how your argument could be misunderstood from the above quotes.

Well then, you understood my point and didn't even realize it? Yes, both versions of The Hobbit have the same fidelity. What that showed was that a higher frame rate alone was not necessarily artistically better. How that relates to games is the fact that going lower in frame rate will generally provide better per-pixel fidelity, which is tangibly and artistically better.

This is incorrect. Films and games work differently so 24fps filmed footage is not comparable to 24fps rendered gameplay. See sebbbi's post further up for the explanation as to why.

That's not correct because you could simulate the film look by using motion blur and/or interpolation in a game too.
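For the curious, here's a toy sketch of the shutter idea: average several sub-frame renders across a simulated shutter-open interval, so a 24fps game frame carries the same motion smear a film camera would capture. Everything here (renderScene() included) is made up for illustration, not any real engine API.

```cpp
// Accumulation-style motion blur sketch: average N sub-frames across
// the simulated shutter interval. renderScene() is a stand-in renderer.
#include <cstdio>
#include <vector>

using Frame = std::vector<float>; // one brightness value per pixel

// Stand-in renderer: a single bright "object" whose position depends on time.
Frame renderScene(double t, int width) {
    Frame f(width, 0.0f);
    int x = static_cast<int>(t * 100.0) % width; // object moves 100 px/s
    f[x] = 1.0f;
    return f;
}

// Average `subFrames` renders spread across the shutter-open interval.
Frame renderWithMotionBlur(double frameStart, double shutter,
                           int subFrames, int width) {
    Frame accum(width, 0.0f);
    for (int i = 0; i < subFrames; ++i) {
        double t = frameStart + shutter * i / subFrames;
        Frame sub = renderScene(t, width);
        for (int p = 0; p < width; ++p)
            accum[p] += sub[p] / subFrames;
    }
    return accum;
}

int main() {
    // One 24fps frame with a 180-degree shutter (open half the frame time).
    const double frameTime = 1.0 / 24.0;
    Frame blurred = renderWithMotionBlur(0.0, frameTime * 0.5, 8, 32);
    for (float v : blurred) std::printf("%.2f ", v); // a smear, not a dot
    std::printf("\n");
    return 0;
}
```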
 
This is incorrect. Films and games work differently so 24fps filmed footage is not comparable to 24fps rendered gameplay. See sebbbi's post further up for the explanation as to why.

That is up for debate, which is precisely what is going on in this thread. And you CANNOT argue about game development artistic decisions, at least not from development teams of the caliber we are discussing. You either like it or you don't. There is no further conversation on that matter.
 
The general consensus is...
The day internet consensus gets anything right will probably be the last day on Earth. And as regards the article, although there's nothing there we haven't heard before, I think there is plenty there that many gamers haven't heard before. General consumers bombarded with casual metrics deserve at least a chance to understand what's really going on with these boxes, even if they choose to ignore it and trumpet mindless numbers.
 
Well then, you understood my point and didn't even realize it? Yes, both versions of The Hobbit have the same fidelity. What that showed was that a higher frame rate alone was not necessarily artistically better. How that relates to games is the fact that going lower in frame rate will generally provide better per-pixel fidelity, which is tangibly and artistically better.

The relation doesn't exist, you're mixing two completely different arguments.

1. Filmed/movie footage may be artistically better at a lower framerate due to the perceived realism of the difference in framerates - FINE

2. Rendered game graphics may be better running at a lower framerate because this compromise allows you to output higher fidelity graphics - FINE

There is no relation between the two arguments. Low framerates in real-time rendered graphics are a compromise based on hardware limits. In movies they're an artistic choice.

You wouldn't expect the PC version of Ryse to deliberately limit the framerate to 30fps, would you? (Since increasing the visual fidelity to saturate the hardware's capability is not an option here.)

That's not correct because you could simulate the film look by using motion blur and/or interpolation in a game too.

And to think all this time we've been struggling to hit 30fps minimum when we could have been targeting 24fps with motion blur for film-level smoothness. You should spread the word amongst developers of your breakthrough.
 
It's worthy of note that some (imho misguided) developers are happy to allow framerates to drop (consistently) to 25fps in cutscenes.
 
I think all modern TVs can sync to 24 fps so it is an option for a game to consider. We've a thread discussing whether that might be a good idea or not.
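For what it's worth, the reason sync matters: on a fixed 60Hz output, 24fps content has to alternate uneven frame repeats (3:2 pulldown), which is the judder a 24Hz-syncing TV avoids. A quick sketch of the arithmetic, purely illustrative:

```cpp
// 24fps on a 60Hz panel: 60/24 = 2.5, so frames must alternate between
// being shown for 2 and 3 refreshes (3:2 pulldown), which causes judder.
// A TV that syncs its refresh to 24Hz shows every frame for equal time.
#include <cstdio>

int main() {
    const int refreshHz = 60, contentFps = 24;
    int refreshesUsed = 0;
    for (int frame = 0; frame < 8; ++frame) {
        // Refresh on which the next frame should ideally appear:
        int target = (frame + 1) * refreshHz / contentFps; // integer math
        int holds = target - refreshesUsed;                // 2,3,2,3,...
        std::printf("frame %d held for %d refreshes\n", frame, holds);
        refreshesUsed = target;
    }
    return 0;
}
```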
 
The relation doesn't exist, you're mixing two completely different arguments.

1. Filmed/movie footage may be artistically better at a lower framerate due to the perceived realism of the difference in framerates - FINE

2. Rendered game graphics may be better running at a lower framerate because this compromise allows you to output higher fidelity graphics - FINE

There is no relation between the two arguments. Low framerates in real-time rendered graphics are a compromise based on hardware limits. In movies they're an artistic choice.

How is higher-fidelity art not an artistic choice? Are you actually saying game-quality graphics are art and CG is not?

You wouldn't expect the PC version of Ryse to deliberately limit the framerate to 30fps, would you? (Since increasing the visual fidelity to saturate the hardware's capability is not an option here.)

Sure I would, because I know the most powerful PC could still be brought to its knees with CG-level assets/simulation, which is the ultimate goal. I'd cap it at 30fps and keep throwing stuff at it until it cannot render at higher frame rates. Of course the assets would be even higher quality than the assets used in the console version. There are many things in the console version that could be improved.

And to think all this time we've been struggling to hit 30fps minimum when we could have been targeting 24fps with motion blur for film-level smoothness. You should spread the word amongst developers of your breakthrough.

I know at least one released current-gen game that has already used this technique... it's not new, man... ;)

There is also a tech demo of frame interpolation being done on current gen, from that Star Wars game...
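Roughly the idea, as a toy sketch (this is not the actual technique from that demo, and it ignores occlusion entirely): render at 30fps, then synthesize an in-between frame by shifting pixels half-way along their motion vectors to reach an effective 60fps.

```cpp
// Toy motion-vector frame interpolation: shift each pixel of the last
// rendered frame half a frame forward along its motion vector. Real
// implementations must also fill disocclusion holes, omitted here.
#include <cstdio>
#include <vector>

struct Pixel { float value; int motionX; }; // motion in pixels per frame

std::vector<float> interpolateHalfFrame(const std::vector<Pixel>& rendered) {
    std::vector<float> mid(rendered.size(), 0.0f);
    for (int x = 0; x < (int)rendered.size(); ++x) {
        int dst = x + rendered[x].motionX / 2; // move half a frame forward
        if (dst >= 0 && dst < (int)mid.size())
            mid[dst] = rendered[x].value;      // no occlusion handling
    }
    return mid;
}

int main() {
    // A bright pixel at x=4 moving 4 px per rendered frame.
    std::vector<Pixel> frame(16, Pixel{0.0f, 0});
    frame[4] = {1.0f, 4};
    std::vector<float> inBetween = interpolateHalfFrame(frame);
    std::printf("interpolated pixel lands at x=6: %.1f\n", inBetween[6]);
    return 0;
}
```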
 
You do remember that Lair was rendering at 800*1080? Even fewer pixels than 1280*720.
Although they used 2xMSAA and somehow messed with the buffers to get 1600*1080 in terms of coverage, the number of shaded fragments was still lower.
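Quick arithmetic on those numbers, for reference:

```cpp
// Pixel counts being compared above: Lair's shaded resolution versus
// 720p, and its edge coverage with 2xMSAA.
#include <cstdio>

int main() {
    const int lairShaded   = 800  * 1080; // shaded fragments per frame
    const int hd720        = 1280 * 720;
    const int lairCoverage = 1600 * 1080; // edge coverage via 2xMSAA
    std::printf("Lair shaded:   %d\n", lairShaded);   // 864,000
    std::printf("1280x720:      %d\n", hd720);        // 921,600
    std::printf("Lair coverage: %d\n", lairCoverage); // 1,728,000
    return 0;
}
```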

I forgot about that! Lair's scope was still amazing when you put everything together. They squeezed in everything from mini-soldiers in formation to ginormous enemies to 1-on-1 rider close combat in a vast, seamless sea-air-land game world. They also devoted resources to the audio side with 7.1 dynamic orchestral music. All in a split-pool 512MB box near launch.

Was buggy and had game design issues. Sony should have spent more time polishing their game. I mean look at The Last Guardian. :runaway:
 
Sure I would, because I know the most powerful PC could still be brought to its knees with CG-level assets/simulation, which is the ultimate goal. I'd cap it at 30fps and keep throwing stuff at it until it cannot render at higher frame rates. Of course the assets would be even higher quality than the assets used in the console version. There are many things in the console version that could be improved.

This is getting ridiculous. PC hardware is always moving and will always outpace engine and art assets. Eventually there would be a PC that could max out everything and still get 60fps. Are you claiming that players would limit it to 30fps for some artistic reason? :rolleyes:

30fps is nothing but a compromise, no matter how you spin it. Every dev would rather have 1080P 60fps all things being equal. They then compromise on frame rate and resolution to make the best looking game and best playing game possible (30fps already compromises the "best playing").

With your logic, why don't devs just aim for 24fps with triple buffering and 540P to maximize "pixel fidelity" (whatever that is)? Because it is shit and no one wants it. 30fps leaves double the time to render each frame and 900P means fewer pixels to push. The magic word is compromise, not "cinematic", not "balance".
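To put numbers on that compromise (back-of-envelope only; the 540p24 row is the hypothetical from above):

```cpp
// Per-frame time budget and pixel throughput for the targets discussed.
#include <cstdio>

int main() {
    struct Target { const char* name; int w, h, fps; };
    const Target targets[] = {
        {"1080p60", 1920, 1080, 60},
        {"900p30",  1600,  900, 30},
        {"540p24",   960,  540, 24},
    };
    for (const Target& t : targets) {
        double msPerFrame = 1000.0 / t.fps;
        double mpixPerSec = (double)t.w * t.h * t.fps / 1e6;
        std::printf("%s: %5.1f ms/frame, %6.1f Mpix/s\n",
                    t.name, msPerFrame, mpixPerSec);
    }
    return 0; // 30fps gives 33.3ms vs 16.7ms at 60fps: double the budget
}
```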
 
This is getting ridiculous. PC hardware is always moving and will always outpace engine and art assets. Eventually there would be a PC that could max out everything and still get 60fps. Are you claiming that players would limit it to 30fps for some artistic reason? :rolleyes:

30fps is nothing but a compromise, no matter how you spin it. Every dev would rather have 1080P 60fps all things being equal. They then compromise on frame rate and resolution to make the best looking game and best playing game possible (30fps already compromises the "best playing").

With your logic, why don't devs just aim for 24fps with triple buffering and 540P to maximize "pixel fidelity" (whatever that is)? Because it is shit and no one wants it. 30fps leaves double the time to render each frame and 900P means fewer pixels to push. The magic word is compromise, not "cinematic", not "balance".

CG doesn't stand still either....so I'm not sure where you're going with your "argument"...

Look at the stuff Laa-Yosh has been working on... sorry man, PCs aren't there yet. That kind of stuff probably takes hours to render a single frame on a single PC...
 
Was also utterly, ridiculously ugly and rushed.

The landscape LOD was aggressive when you flew around, too. ^_^ The dragon and city designs look good.

It got delayed to/released in September 2007. But that game was no small engineering feat. It looks like the developers really did believe in a "nextgen" vision even by today's standards, but had to fit everything into a smaller reality called PS3. ;-)

Glad I bought and played the flawed game.
 
@DrJay24
1080p 60fps can also be a compromise: nothing more than tick boxes on the back of a case and for reviews, and now for people who seem to think that's the only thing that matters.

I don't understand why people believe every game has to be 1080p60.
If a game does not require or benefit from it, then you're compromising what you may be able to do otherwise just to achieve it.

Why are people even discussing some sort of phantom machines decades down the line?

Everything is a compromise, and you can make your compromises based on artistic reasons too. As in: whatever the power of the box, you would still aim for 900p30.
Will that, and should that, be the case for every game? No.
 
1080P is the max of most TVs; 60fps is the max of most TVs. They represent the "best of" target for most homes; they are not compromises in themselves. I never said all games have to be 1080P @ 60fps. I said "all things being equal" - never mind, I'm not going to repeat my whole post, I think it was clear.

900p 30fps is not an artistic choice, it is a technical compromise to get what you want from a fixed piece of hardware. If you took all that art and did a PC port next year, people would be playing at 1080P (or higher for monitors) and 60fps. Guess what? It would look and play better, not worse. There is no artistic choice involved; it is objectively better to have more frames and more pixels!

Did Alan Wake become a worse game on the PC when it wasn't 540P? Was 540P an artistic choice or a compromise for the hardware? The only choice Remedy had was to dumb the effects down and use more pixels, or not (simplified); if they could have gotten both, they would have.
 
900p 30fps is not an artistic choice, it is a technical compromise to get what you want from a fixed piece of hardware. If you took all that art and did a PC port next year, people would be playing at 1080P (or higher for monitors) and 60fps. Guess what? It would look and play better, not worse. There is no artistic choice involved; it is objectively better to have more frames and more pixels!

Which are not games and not real time, so why bother...

WTF are you talking about? :LOL:

If the game were made from the ground up instead of a direct port, Crytek could use higher quality source art and bring PCs down to a crawl. ;)

For one, think way higher polygon counts for everything; real-time simulation of everything in the world; real-time weather simulation; instancing on every character so no soldier looks or moves exactly alike; more sophisticated AI on all NPCs. The list goes on and on. Also, CryEngine is compatible with pro CG file formats, so it's easier than you think.

@DrJay24
1080p 60fps can also be a compromise: nothing more than tick boxes on the back of a case and for reviews, and now for people who seem to think that's the only thing that matters.

I don't understand why people believe every game has to be 1080p60.
If a game does not require or benefit from it, then you're compromising what you may be able to do otherwise just to achieve it.

Why are people even discussing some sort of phantom machines decades down the line?

Everything is a compromise, and you can make your compromises based on artistic reasons too. As in: whatever the power of the box, you would still aim for 900p30.
Will that, and should that, be the case for every game? No.

Yep, some people keep ignoring that 60fps is an upgrade in frame rate but a downgrade in graphics.
 
If the game were made from the ground up instead of a direct port, Crytek could use higher quality source art and bring PCs down to a crawl. ;)

And there would still be PCs that would run it at higher resolutions and higher frame rates. You do understand that PCs are not fixed hardware? There is an ever-changing spectrum of power, and you cannot make a game tap out a platform that is by definition unknown. Should devs try to lock that down to fit their "artistic vision", or let them run higher? Would the game look and play worse after those upgrades?


Yep, some people keep ignoring that 60fps is an upgrade in frame rate but a downgrade in graphics.

Straw man? Everyone here but you seems to understand that it is a compromise; that was my whole point. You seem to think 30fps is the perfect frame rate no matter what, which is ridiculous.
 