Digital Foundry Article Technical Discussion Archive [2013]

So now you're saying that the choice of 30fps over 60fps is to allow greater pixel fidelity. That's fine and a perfectly valid developer choice. It's not, however, what you were saying originally, which was that 30fps could be better than 60fps all other things being equal, i.e. a choice based on artistic preference and not technical limitations.

No, I never said all things being equal; stop making things up.

As I said, in the case of Ryse 30fps was a design choice because the art looked better. If they had more resources to run at 60fps, they'd still bring it back down to 30fps and add even more stuff per pixel to make it look even better... they would keep doing this until there was no power left over for higher framerates.

My position never changed from my first post regarding this topic...

http://forum.beyond3d.com/showpost.php?p=1798877&postcount=4541
 
I don't think it makes much sense to argue this further: once we set aside the whole discussion of artistic preference, which in theory applies to movies and games alike, the remaining difference is gameplay related. And one of the more important factors in gameplay is response speed to your actions. From visual stimulus to physical response, this time is about 210ms for people, with best cases typically around 130ms.

Very likely, this means that if your action has a response in the next frame on a 30fps game, that's about 33ms, and not too many people will complain, taking just the video processing into account. In practice, though, a controller action is detected during one frame and the next frame draws the response, so that time goes to an average of about 50ms (half a frame on average for the button press to register, then a frame for drawing the response). If you then add the time the TV takes to display the response, it's starting to show. And in practice, we see that a 30fps game usually takes longer than that - typically 80ms - and then the display lag is added on top.

Fortunately people can adjust to a certain amount of lag, but the quality of the experience benefits from having as little of it as possible. 60fps brings the time for detection plus the time to draw the response down to an average of about 25ms, given detection during one frame and the response drawn in the next. At close to 10% of human response time, that will be negligible for most people. Unfortunately, even here display lag can add a considerable amount (though that is not something game designers can do much about). Then there is netcode lag... it all adds up, and it is no coincidence that multiplayer game designers on next-gen are starting to understand that whatever they can do to minimise lag at the graphics-engine layer helps compensate for the lag netcode adds, to an extent that gamers do notice.
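For anyone who wants to sanity-check that arithmetic, here is a minimal sketch in Python of the model described above (half a frame on average for the press to register, then one full frame to draw the response); the 30ms TV figure is just an assumed placeholder, not a measured value:

```python
# Rough input-latency arithmetic: a button press lands somewhere inside a
# frame (half a frame on average), the response is drawn during the next
# frame, and the TV adds its own processing delay on top.

def average_response_ms(fps, display_lag_ms=0.0):
    frame_ms = 1000.0 / fps
    sampling_ms = frame_ms / 2.0   # press registers mid-frame on average
    draw_ms = frame_ms             # one full frame to draw the response
    return sampling_ms + draw_ms + display_lag_ms

for fps in (30, 60):
    print(f"{fps} fps: ~{average_response_ms(fps):.0f} ms engine-side, "
          f"~{average_response_ms(fps, display_lag_ms=30):.0f} ms with an assumed 30 ms TV")
```

This prints roughly 50ms/80ms for 30fps and 25ms/55ms for 60fps, which is the gap being discussed above.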

In a non-interactive experience, 24fps can look filmic enough, but I never liked it. Sure, people get used to it, but its limitations for capturing movement are too often visible to me. But we've had that discussion before.

The real crux of the matter is clearly that for videogames the amount of detail you can draw at 30fps is about twice that of 60fps, and that's still a big difference that game designers are going to keep being tempted to exploit; this will remain the case for any game that can still gain visual benefit from the extra time. After all, our brain seems to prioritise a lot of its resources for visual processing...
 
It will likely be a year before we can really determine the real-world difference between the consoles. Remember the whole 30fps vs 60fps debate over early Madden on the PS3 and 360? That wasn't indicative of a lack of power on the PS3, and it got resolved eventually.

But the problem is that first impressions matter, and if MS had less mature tools than Sony for a console releasing at the same time, then MS still messed up big time.

Can't just look at Madden.

It may also take more than a year to realize several "leaps" in performance?

In the PS360 era, the PS3 was gimped more by the split RAM size than anything else. The Cell + RSX combo had the necessary peak power, L1-level latency, and high internal bandwidth to work things through.

Most launch games didn't use the SPUs except perhaps for vertex culling, which explains the unrealized power. But for games that tapped the SPUs selectively, you could already glimpse some of the potential (e.g., RFOM's glass simulation, Hedgehog physics, and 40-player MP right off the bat; the 1080p vast sea-air-land Lair game world + 7.1 audio, 1080p F1 with rain, etc.). Some were early efforts, but you could see the developers try. It was not blind faith.

We also heard developers (desperately) trying to implement better streaming subsystems to alleviate the memory size limit. There was also the RSX's weakness in dealing with alpha transparency. Those problems became more challenging for later games; they never did go away, but developers worked around them. The first parties simply crafted games that were tailored to their target platform's strengths.

In the sports game genre, MLB on PS3 was actually more interesting to watch than Madden.

Next-gen? If the developers develop their content like a PC game, then the platform that is closest to PC architecture may have an easier time. But the peak power, better latency, and tool advantages should be there as long as the developers, especially first parties, tap into them.

We should even be able to see cloud contributions in various areas, from CPU MP to GPU-based remote game streaming.
 
Joker, do you also/still have the problem that you do a tech analysis of the onscreen content while gaming... like the dev said?

That curse remains but has been getting less and less as my tech knowledge slips away. It's been about 5 years since I wrote a line of code so I don't dissect games as much as I used to, but I still do sometimes stop while playing, look around, check how things work, ridicule the obviously lazy devs, etc. Hopefully one day I'll be able to just play a game like normal people do, but that day hasn't arrived just yet.


Just wondering, do you sometimes get the blues and miss all this high-tech stuff going on in cutting-edge game development, or do the 'bad' memories (like crunch time) overshadow everything cool/positive?

Hmm, do I miss it when I'm filming two redheads for my all-girl website? Umm, not really. Do I miss it when I still read about 80-hour work weeks, or when a buddy messages me that he is still in the office on a Saturday night debugging some code? Definitely not. Do I miss the instability of knowing that the company and/or job may likely be obliterated after the project is done? Nope. Having said that, I do miss the tech challenge, the camaraderie of working on a game team, and the intense satisfaction of shipping a game that provides entertainment to so many. There really was nothing like it, and the people I worked with were, for the most part, all awesome. The work/life situation of that industry means I would never go back full time, although I have contemplated contract work every now and then. Then again, my tech knowledge is now kind of ancient really; things have moved on somewhat since I last coded anything there, so more than likely I'll never return to it. In any case, filming girls is much more fun :)


function said:
For a PC example, think of the PC gamers who chose to run Crysis at 25 fps on very high instead of 50 fps on medium.

Yeah that I would totally understand, and that's probably what the Ryse dude wanted to say. But there's really no easy way for him to come out and say that 30fps was a hardware limitation choice to get the visual fidelity they wanted without pissing off the console maker, so he had to skate around it a bit.


patsu said:
It may also take more than a year to realize several "leaps" in performance?

I wonder if there will be any leap at all, unless it comes from new algorithms or implementation ideas rather than more effective use of hardware. The out-of-order x86s and AMD GPUs aren't really an unknown quantity like having to write multi-core, in-order, hand-optimized code and shaders was back in the day. You could easily look at a PS2 or PS3 and say yeah, it'll get much better just through hardware use alone, but can you say the same thing for the new hardware? Maybe if the profiling tools are still weak, but I can't imagine they would be.
 
I wonder if there will be any leap at all, unless it comes from new algorithms or implementation ideas rather than more effective use of hardware. The out-of-order x86s and AMD GPUs aren't really an unknown quantity like having to write multi-core, in-order, hand-optimized code and shaders was back in the day. You could easily look at a PS2 or PS3 and say yeah, it'll get much better just through hardware use alone, but can you say the same thing for the new hardware? Maybe if the profiling tools are still weak, but I can't imagine they would be.

Well... The CPU and GPU can work more efficiently now. And when the developers are ready, they can optimize more for these 2 SKUs. And we should have some network help in some areas. Plus developers don't necessarily perform well all the time, especially for first attempts (we all make mistakes). They will improve along the way.

I am pretty optimistic. The PS3 was held back by its memory architecture and size. If they can use GPGPU effectively, then we should see more interesting ideas later on.
 
1080p vast sea-air-land Lair game world

You do remember that Lair was rendering at 800*1080? That's even fewer pixels than 1280*720.
They did use 2xMSAA and somehow messed with the buffers to get 1600*1080 in terms of coverage, but the number of shaded fragments was still lower.
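Putting rough numbers on that claim (a quick sketch using only the resolutions quoted above; the MSAA coverage figure is taken at face value):

```python
# Shaded-pixel counts behind the Lair resolution point above.
native_720p = 1280 * 720          # 921,600 pixels
lair_framebuffer = 800 * 1080     # 864,000 shaded pixels per frame
lair_msaa_coverage = 1600 * 1080  # 2xMSAA coverage samples, not shaded fragments

print(f"720p: {native_720p:,} px, Lair: {lair_framebuffer:,} px "
      f"({lair_framebuffer / native_720p:.0%} of 720p), "
      f"2xMSAA coverage: {lair_msaa_coverage:,} samples")
```

So the shaded frame is about 94% of a native 720p frame, despite the 1080-line coverage.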
 
If he had unlimited hardware power he could have unlimited frame rate and unlimited fidelity.

I was talking about unlimited in the context that the game could run at 60fps instead of 30fps, but they still chose 30fps and added more and more stuff until there was no more power left for 60fps. Any moron understands there are always hardware limitations when you're talking about computer-generated graphics...;)

Also there's no such thing as unlimited hardware in the CG world...or even the real world...unless we're all living in a computer simulation...You'd be the dumbest of dumb to think there's such a thing as unlimited hardware for generating unlimited frame rate and fidelity in CG...:LOL: To even bring that up is mind boggling not to mention strange...

Again, in movies there's no limit on the frame rate, i.e. nothing really restricting them to 24fps, but most choose 24fps anyway. Even pure CG Pixar movies, where you don't have to worry about the "soap opera" look, don't run at 60fps. I realize most theaters don't have 60Hz playback capability, but that is beside the point, since the response to The Hobbit at 48fps was mixed anyway.

Now if you want to have a meaningful discussion... all games will have a natural limit, but this can be artificially chosen by the game designer. One can choose a frame rate, resolution, number of polygons, etc. as the limit and then build the art around that. Just because a frame rate of 30fps is chosen doesn't mean that is the "natural" limit. For example, if a game is designed to run at 640x480 on X1, is that a natural hardware limitation? Most likely no. Also, who are you to say every game needs to run at Z resolution or framerate? That's like telling an artist his painting needs to be on an X-size canvas and use Y number of paints... lol... sorry, it doesn't work that way.

I don't expect you to understand this concept but at least provide a valid argument with evidence to back it up.
 
I was talking about unlimited in the context that the game could run at 60fps instead of 30fps, but they still chose 30fps and added more and more stuff until there was no more power left for 60fps. Any moron understands there are always hardware limitations when you're talking about computer-generated graphics...;)

Unlimited but limited, choose one, don't say both.

Also there's no such thing as unlimited hardware in the CG world...or even the real world...unless we're all living in a computer simulation...You'd be the dumbest of dumb to think there's such a thing as unlimited hardware for generating unlimited frame rate and fidelity in CG...:LOL: To even bring that up is mind boggling not to mention strange...

Movies can spend hours rendering every frame if they have the time to do it. Those also qualify as CG.

Again, in movies there's no limit on the frame rate, i.e. nothing really restricting them to 24fps, but most choose 24fps anyway. Even pure CG Pixar movies, where you don't have to worry about the "soap opera" look, don't run at 60fps. I realize most theaters don't have 60Hz playback capability, but that is beside the point, since the response to The Hobbit at 48fps was mixed anyway.

24fps is very clearly a format issue.

Now if you want to have a meaningful discussion... all games will have a natural limit, but this can be artificially chosen by the game designer. One can choose a frame rate, resolution, number of polygons, etc. as the limit and then build the art around that. Just because a frame rate of 30fps is chosen doesn't mean that is the "natural" limit. For example, if a game is designed to run at 640x480 on X1, is that a natural hardware limitation? Most likely no. Also, who are you to say every game needs to run at Z resolution or framerate? That's like telling an artist his painting needs to be on an X-size canvas and use Y number of paints... lol... sorry, it doesn't work that way.

I don't expect you to understand this concept but at least provide a valid argument with evidence to back it up.

I don't think you understand why certain devs are shooting for 60fps at the cost of your beloved "pixel fidelity".
 
Unlimited but limited, choose one, don't say both.

Understanding the context or not is your choice and hopefully not a mental limitation.

Movies can spend hours rendering every frame if they have the time to do it. Those also qualify as CG.

Live action is not CG.

24fps is very clearly a format issue.

Yet The Hobbit proves that your "format issue" is not an issue at all and that a higher frame rate is not artistically better. If it were clearly better artistically, everyone in the film industry and their grandmothers with cheap DSLRs would be shooting movies at 48fps or higher. If it were clearly better, everyone who saw The Hobbit at 48fps would've unanimously agreed... instead it was a mixed bag of reactions.

I don't think you understand why certain devs are shooting for 60fps at the cost of your beloved "pixel fidelity".

I understand why those developers can't achieve CG like pixel fidelity with their 60fps games...

So would you say 60fps was a choice or limitation? Why not shoot for 120fps?
 
Understanding the context or not is your choice and hopefully not a mental limitation.

If you have unlimited hardware capabilities there is little to no reason to render at 30fps over 60fps. There just isn't.
If you have limited hardware then you can have a choice of fidelity over smoothness and we'd understand.

Please, are personal insults allowed here now?

Live action is not CG.

(facepalm)

Yet The Hobbit proves that your "format issue" is not an issue at all and that a higher frame rate is not artistically better. If it were clearly better artistically, everyone in the film industry and their grandmothers with cheap DSLRs would be shooting movies at 48fps or higher. If it were clearly better, everyone who saw The Hobbit at 48fps would've unanimously agreed... instead it was a mixed bag of reactions.

http://www.cinemablend.com/new/Scie...t-Looks-Weird-48-Frames-Per-Second-34673.html

Sure The Hobbit was weird, but it doesn't in any single way prove that higher fps is worse.

I understand why those developers can't achieve CG like pixel fidelity with their 60fps games...

So would you say 60fps was a choice or limitation? Why not shoot for 120fps?

Oh, maybe because MOST DISPLAYS DON'T GO ABOVE 60HZ? :oops:
 
Wait. What is all this film vs interactive software garbage?

Game state -> Display -> Eyes -> Decision -> Motor reflex -> Device input -> Game state update -> Display.

You can go more in-depth than that, but the fact remains that you want to reduce the time across that sequence of events as much as possible, and a higher framerate is always better regardless of artistic design decisions.

If you can design gameplay around lower framerates for the sake of graphics, then fine, it can and does work, but as long as there is a decent amount of interactivity (especially in competitive games) you always want higher refresh rates.
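To illustrate how much of that loop the engine actually controls, here is a rough sketch that plugs in the ~210ms human reaction time mentioned earlier in the thread plus an assumed 30ms of display lag; only the framerate-dependent part changes between 30, 60 and 120fps:

```python
# Ballpark end-to-end loop time: render/display -> human reaction -> input
# sampling -> next rendered frame. Only the frame-dependent parts change
# with framerate; reaction time and display lag are fixed assumptions here.

HUMAN_REACTION_MS = 210.0   # typical stimulus-to-action time quoted earlier
DISPLAY_LAG_MS = 30.0       # assumed TV processing delay, not a measured value

def loop_time_ms(fps):
    frame_ms = 1000.0 / fps
    engine_ms = 1.5 * frame_ms  # half-frame sampling + one frame to draw
    return engine_ms + DISPLAY_LAG_MS + HUMAN_REACTION_MS

for fps in (30, 60, 120):
    engine_part = 1.5 * 1000.0 / fps
    print(f"{fps:>3} fps: ~{loop_time_ms(fps):.0f} ms per loop "
          f"({engine_part:.0f} ms of that is framerate-dependent)")
```

The framerate-dependent slice shrinks from roughly 50ms to 25ms to 13ms, which is exactly the part a developer can buy back by targeting a higher refresh rate.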
 
Yet The Hobbit proves that your "format issue" is not an issue at all and that a higher frame rate is not artistically better. If it were clearly better artistically, everyone in the film industry and their grandmothers with cheap DSLRs would be shooting movies at 48fps or higher. If it were clearly better, everyone who saw The Hobbit at 48fps would've unanimously agreed... instead it was a mixed bag of reactions.

It was a mixed bag of reactions because people have been trained for what a movie is with pictures running at 24 fps for their entire lives, with soap operas running at 50/60fps depending on where you lived on earth. I think it's more of an expectancy issue than lack of being artistically better. It also proves to me that frame rate has a much more profound effect on how footage is perceived than resolution does.
 
with soap operas running at 50/60fps depending on where you lived on earth. I think it's more of an expectancy issue than lack of being artistically better.

I am unsure if this is actually true. Do you actually know if this is correct?
 
I was talking about unlimited in the context that the game could run at 60fps instead of 30fps, but they still chose 30fps and added more and more stuff until there was no more power left for 60fps. Any moron understands there are always hardware limitations when you're talking about computer-generated graphics...;)

Also there's no such thing as unlimited hardware in the CG world...or even the real world...unless we're all living in a computer simulation...You'd be the dumbest of dumb to think there's such a thing as unlimited hardware for generating unlimited frame rate and fidelity in CG...:LOL: To even bring that up is mind boggling not to mention strange...

Insults will not get you anywhere. Don't try to blame others just because you cannot express yourself or do not understand English.
 
Again, in movies there's no limit on the frame rate, i.e. nothing really restricting them to 24fps, but most choose 24fps anyway. Even pure CG Pixar movies, where you don't have to worry about the "soap opera" look, don't run at 60fps. I realize most theaters don't have 60Hz playback capability, but that is beside the point, since the response to The Hobbit at 48fps was mixed anyway.

How would they display those non-24fps movies in cinemas and homes?
 
It was a mixed bag of reactions because people have been trained for what a movie is with pictures running at 24 fps for their entire lives, with soap operas running at 50/60fps depending on where you lived on earth. I think it's more of an expectancy issue than lack of being artistically better. It also proves to me that frame rate has a much more profound effect on how footage is perceived than resolution does.

Most people agreed it was at least an improvement for 3D. Also, personally, it really brought out how much we're being held back at the moment - panning scenery and various other things just looked so much better, especially at higher speeds.
 