*spin* another 60 vs 30 framerate argument

That may apply as an argument in general, but if you look back at the beginning of this thread, it was about why Guerrilla can't make KZSF a 60fps game. Because there's "so much extra power", or because there's 8GB of RAM instead of 4, or because people are buying COD games...

That's how this thread started. Of course some trolling has made it diverge all over the map by now, and it's somehow turned into a general argument.

Oh I see. Well doubling the RAM sure wouldn't do it; I think people know that has little to do with fps (except perhaps on PC due to virtual memory, swapping over the bus, etc.). I can understand though why a premier launch title would go for 30fps. They really need to make a big visual splash to intro the new machine, but at the same time they are working with new hardware, tools and code. Throw in speed bumps like having to be ready to present at E3 and a fixed launch date, and I can understand why you wouldn't try to target 60fps right away, more so because they know this game will be dissected by all the gaming media. Shipping a game in general is an unnatural act and shipping a launch title doubly so, hence I can totally understand them aiming for 30fps. I think people need to cut them some slack.
 
Not to mention - again - that the PS4 is about 6-8 times as fast as the PS3, so going 60fps would leave only a 3-4x boost to spend on more detail, and going 1080p + 60fps would leave even fewer resources for better graphics. And then there's the question of how much of that theoretical performance a first-gen game could actually utilize...
 
I don't know, it sounds like a lot of PC gamers have become divas lol. I play all my PC games at 60 fps and most of my console games at 30 fps. It's not the end of the world dropping down to 30.
Yes, 60 fps is nice, but to say 30 fps is bad is an overstatement.
 
The best way I can reply to the 30fps/60fps argument is with a quote from Glengarry Glen Ross:

"All train compartments smell vaguely of shit. It gets so you don't mind it."

Point being that much of this comes down to what you are used to. I used to not mind 30fps, that is, until playing games exclusively on PC at 60fps for the past couple of years. Now 30fps is simply not an option anymore as it both looks and feels absolutely terrible. Two years + 1 day ago I would have never said that. Now if I can't play a game at 60fps then I won't play it at all until I get a video card fast enough to run it at 60fps. Your mileage may vary of course, but I think the only way to really know how much frame rate affects you is to game exclusively at 60fps for a long period of time, then try a 30fps game and see what you think. You may be surprised at how awful it looks; I sure was, especially given how long I happily accepted 30fps.

I personally ain't that sensitive to framerate; I can pop out the N64, which is said to have sub-30fps games, and it won't bother me.

Personally I prefer 30fps over 60fps. I've played games with both framerates and the difference is barely noticeable, so if graphics detail can be doubled by going 30fps I'm all for it. On PC I'll up effects, AA, etc. until the framerate is in the 30s.

That quote is a vast exaggeration if you're applying it towards a 30 vs 60 framerate argument.
You don't drive a beam of light; next time you're driving, get your passenger to cover your eyes for 9 out of every 10 feet.
60fps would still be 5ft jumps, which is not that much better. And even if we had something like 120fps I doubt human reaction times can keep up with it.
 
I'm all for sticking to 60, unless you're intent on doing something slow and cinematic (where the amount of graphic details you can render increases, not because I think 30fps is necessarily more cinematic ;) ). When gameplay is the primary component of a game, as I think it too often isn't, then 60 is a big win. Alternatively, make sure your systems are decoupled from the rendering framerate. This can help make a game more responsive too.

But for most 'games' (of which I think there are too few), yeah, 60fps should be the target. Less realistic graphics can still look fantastic given the right art style, but the gameplay just feels that much better at 60.
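
To make the "decouple your systems from the rendering framerate" point concrete, here's a minimal fixed-timestep loop sketch. The function names are placeholders for illustration, not any particular engine's API:

```cpp
#include <chrono>

// Illustrative stand-ins for real engine calls (hypothetical names).
void pollInput() {}
void updateSimulation(double /*dtSeconds*/) {}
void renderFrame(double /*interpolationAlpha*/) {}

int main() {
    using clock = std::chrono::steady_clock;

    const double simStep = 1.0 / 120.0;   // game logic/input advance at a fixed 120 Hz
    double accumulator = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 600; ++frame) {   // a real game loops until quit
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed steps as the elapsed time allows; responsiveness is
        // tied to simStep, not to however long a frame takes to draw.
        while (accumulator >= simStep) {
            pollInput();
            updateSimulation(simStep);
            accumulator -= simStep;
        }

        // Render whenever we get around to it (30, 60 or uncapped fps); the
        // leftover fraction can be used to interpolate between simulation states.
        renderFrame(accumulator / simStep);
    }
    return 0;
}
```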
 
I don't know, it sounds like a lot of PC gamers have become divas lol. I play all my PC games at 60 fps and most of my console games at 30 fps. It's not the end of the world dropping down to 30.
Yes, 60 fps is nice, but to say 30 fps is bad is an overstatement.

No. It is not an overstatement. (Was that categorical enough? :))

24 fps (or its close neighbour 30fps) can create a facsimile of continuous movement only under very specific constraints that movie makers strive to adhere to. Those constraints do not naturally apply to game play.
If the game is limited in what it allows in terms of movement, and uncritical in terms of timing, 30 fps can be acceptable, but we need to recognize that a low frame rate is in itself a game design constraint.
Not only that, but we also need to recognize that temporal aliasing is a visual artifact, and low temporal resolution is a graphics quality problem, not only a control issue.

It is unfortunate that the shift to LCD TVs also piled additional input lag on top of the already poor frame-rates of television. Console games suffer from this at the design stage, much as online games are designed to be tolerant of latency in the connection. (For an old timer like me it is quite apparent how the shift from LAN-play to internet play changed the multi-player shooters. I hate moving like a slug on valium.)

(I'm pretty much a PC gamer exclusively, and up until I got a 27" 2560x1440 IPS panel in 2009, I clung to my CRT as it allowed me to play all games at a solid 100-125 fps; I simply dialed down the resolution and eye candy until I got the frame rate where I wanted it.
Since I no longer play competitively, I found the immersion allowed by the larger screen more valuable than the frame rate allowed by the CRT, but it was a close trade-off, and for a couple of years or so I selected the display on a game-by-game basis before the awkwardness of the setup became too much.)
 
The best way I can reply to the 30fps/60fps argument is with a quote from Glengarry Glen Ross:

"All train compartments smell vaguely of shit. It gets so you don't mind it."

Point being that much of this comes down to what you are used to. I used to not mind 30fps, that is, until playing games exclusively on PC at 60fps for the past couple of years. Now 30fps is simply not an option anymore as it both looks and feels absolutely terrible. Two years + 1 day ago I would have never said that. Now if I can't play a game at 60fps then I won't play it at all until I get a video card fast enough to run it at 60fps. Your mileage may vary of course, but I think the only way to really know how much frame rate affects you is to game exclusively at 60fps for a long period of time, then try a 30fps game and see what you think. You may be surprised at how awful it looks; I sure was, especially given how long I happily accepted 30fps.

This is much like how I had been using an iPhone 3GS for 3 years until I got the iPhone 5 last Oct. I was just playing around with the 3GS again and am absolutely astounded at how terribly low-res the screen looks now compared to the Retina display, when before I never noticed it at all.

Thankfully I don't play 'proper' games on my computer (have a MacBook) so won't have your problem - but yeah, I can definitely see PC gaming spoiling you in terms of IQ and framerate.

That quote is a vast exaggeration if you're applying it towards a 30 vs 60 framerate argument. 60fps would still be 5ft jumps, which is not that much better. And even if we had something like 120fps I doubt human reaction times can keep up with it.

Well, games like Forza simulate the physics at 360Hz and only the rendering is done at 60Hz, so that would be good enough.

What do you guys think of having controller input and response decoupled from the framerate? I think Need for Speed HP does that and gets a controller response time of 83 ms despite running at 30Hz (most 60 fps games have response times of 66 ms).

That would be the best trade-off for demanding games I think: 30Hz with low response times coupled with good motion blur.
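
For what it's worth, here's a toy back-of-the-envelope model of why that helps. All the numbers and the pipeline depth are assumptions for illustration, not measurements of NFS or any real game; the idea is just that sampling input late, at a higher rate than the render rate, trims the staleness of the input without touching the render budget.

```cpp
#include <cstdio>

// Toy model: total response time ~= worst-case age of the newest input sample
// when simulation starts + a fixed render/display pipeline of N frames.
double worstCaseResponseMs(double renderHz, double inputHz, int pipelineFrames) {
    double frameMs  = 1000.0 / renderHz;
    double sampleMs = 1000.0 / inputHz;   // worst-case staleness of the input sample
    return sampleMs + pipelineFrames * frameMs;
}

int main() {
    // Same assumed 2-frame render/display pipeline in all three cases:
    std::printf("30 Hz render, input polled per frame : %.0f ms\n",
                worstCaseResponseMs(30.0, 30.0, 2));    // ~100 ms
    std::printf("60 Hz render, input polled per frame : %.0f ms\n",
                worstCaseResponseMs(60.0, 60.0, 2));    // ~50 ms
    std::printf("30 Hz render, input polled at 120 Hz : %.0f ms\n",
                worstCaseResponseMs(30.0, 120.0, 2));   // ~75 ms
    return 0;
}
```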
 
You don't drive a beam of light; next time you're driving, get your passenger to cover your eyes for 9 out of every 10 feet.


Well why didn't you say so earlier: "Hey AMD, don't worry about performance in racing games, Grall doesn't like them..."

Your point is moot; you couldn't move fast enough to cover your eyes that fast. You broke your own argument. ;)

Let's say you are moving at 300 feet per second, or 204.54 miles per hour, in your F1 car. Turn your head 90 degrees to the left and tell me how many details you can pick out in a 10-foot section of the road. :p
 
That would be the best trade-off for demanding games I think: 30Hz with low response times coupled with good motion blur.

Racing games may be the one genre where motion blur makes sense in game play because it can be assumed that the eye of the driver will follow the road ahead, that is, you have a decent idea of what to blur. Generally though, motion blur doesn't belong in games, since you cannot say what the player is looking at.

Example - imagine that you are sitting on a bench near a street, and there is traffic moving in both directions. The viewport doesn't move. Should the pretty girl on a bike coming in from the right be blurred? Not if she caught your attention and you followed her movement over the screen; then the static parts of the scene should be blurred instead, since in effect you are panning across it with your eyes. However, the game cannot know if you're following the girl, looking at the design of the house across the street (or following the handsome guy coming in from the left!). Since the player is free to focus on and follow anything in view, it's impossible to know what to apply blur to, and having it applied to the wrong objects is distinctly annoying.

Games are not movies.
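
A toy sketch of how per-object motion blur is usually decided (illustrative names, not any engine's actual code) makes it obvious where the assumption about the viewer's eye sneaks in:

```cpp
#include <cstdio>

// Blur each object along its screen-space velocity relative to the camera.
// That implicitly assumes the viewer's eye is locked to the camera, which is
// exactly the assumption questioned above.
struct Vec2 { float x, y; };

Vec2 screenVelocity(Vec2 now, Vec2 prev, float dt) {
    return { (now.x - prev.x) / dt, (now.y - prev.y) / dt };
}

int main() {
    float dt = 1.0f / 30.0f;

    // The cyclist crossing the screen: large screen-space velocity -> gets blurred.
    Vec2 girlNow{0.6f, 0.5f}, girlPrev{0.4f, 0.5f};
    // The house across the street: static in screen space -> stays sharp.
    Vec2 houseNow{0.2f, 0.7f}, housePrev{0.2f, 0.7f};

    Vec2 vGirl  = screenVelocity(girlNow, girlPrev, dt);
    Vec2 vHouse = screenVelocity(houseNow, housePrev, dt);

    std::printf("cyclist velocity (%.2f, %.2f) screen/s -> blurred\n", vGirl.x, vGirl.y);
    std::printf("house   velocity (%.2f, %.2f) screen/s -> sharp\n", vHouse.x, vHouse.y);
    // If the player's eye is tracking the cyclist, this is exactly backwards:
    // she should stay sharp and the background should smear.
    return 0;
}
```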
 
The games I enjoyed most this generation were all 30 fps. So people can come up with all kinds of technical reasons why 30 fps is bad, but at the end of the day I really enjoyed Uncharted, Demon's Souls, Gears of War, Killzone, and many others. IMHO I didn't like Call of Duty, and no, not because it's 60 fps but because of how it feels like you're ice skating everywhere. So for me 60 fps is nice but it's not the most important thing; yes, it would be nice if every game was 60 fps, but it's not the be-all and end-all.
 
The games I enjoyed most this generation were all 30 fps. So people can come up with all kinds of technical reasons why 30 fps is bad, but at the end of the day I really enjoyed Uncharted, Demon's Souls, Gears of War, Killzone, and many others. IMHO I didn't like Call of Duty, and no, not because it's 60 fps but because of how it feels like you're ice skating everywhere. So for me 60 fps is nice but it's not the most important thing; yes, it would be nice if every game was 60 fps, but it's not the be-all and end-all.

Good games transcend technical issues. Generally control is more important than graphics in all gaming, physical or virtual. How else could we get to where we are?

Also as I mentioned, games are designed around the limitations of their platform.
When I played a bit of WoW, I found that in 98% of my fights I could get off my chair in the middle of the battle, go down the stairs and put a cup of coffee on in the kitchen and walk back to continue gaming without affecting the outcome of the ongoing fight. Talk about input latency! :)
Doesn't mean the game wasn't enjoyable.
 
What are the chances that we'll see this Frame-Rate Upscaler technology in next gen games?

I guess it's the same thing that Timothy Lottes spoke about in his blog when commenting on the PS4 and Xbox 720. He deleted his post, though. According to Digital Foundry it delivers the best of both worlds: the graphics of a 30FPS game and the reduced input lag and smooth image of a 60FPS game.
 
@Hecatoncheires
Interpolation, as the article says, is a technique that first appeared on TVs.
My TV can do interpolation, and though the fluidity is increased there is no improvement in input lag whatsoever, given that no new frame is actually created by the game.
IMO Criterion's solution is much more interesting: NFS Most Wanted runs at 30fps but they managed to get input lag down to 83ms.
That is for me the best way to marry the "30fps eye candy" to the "input lag of 60fps games"... of course, only when 60fps and eye candy together are not possible.

@dobwal
I quote from Wikipedia: "Mean RT (Reaction Time) for college-age individuals is about 160 milliseconds to detect an auditory stimulus, and approximately 190 milliseconds to detect visual stimulus."
 
Interpolation, as the article says, is a technique that first appeared on TVs.
My TV can do interpolation, and though the fluidity is increased there is no improvement in input lag whatsoever, given that no new frame is actually created by the game.
IMO Criterion's solution is much more interesting: NFS Most Wanted runs at 30fps but they managed to get input lag down to 83ms.
That is for me the best way to marry the "30fps eye candy" to the "input lag of 60fps games"... of course, only when 60fps and eye candy together are not possible.

Interpolation on a TV increases the input lag and creates artifacts. The frame-rate upscaling method described by Digital Foundry reduces input lag and deals with artifacts by pointedly adding motion blur. In my eyes these are two different animals. 100/120 Hz interpolation on HDTVs is awful; I always deactivate it, since it also creates the ugly "soap opera effect". I'm not sure whether the frame-rate upscaling creates this ugly effect as well.

I don't know about Criterion's approach. Does it increase the smoothness of the image as well? In my eyes 30FPS is the minimum frame-rate that can be described as enjoyable. For my Gaming-PC I use a 120Hz screen (the real deal, not this interpolation crap) and it just looks excellent. Playing a game with more than 100FPS is just awesome. 30FPS is the lowest end possible. It can be enjoyable for certain genres if the frame-rate is rock solid, though.
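
Since "interpolation" and "frame-rate upscaling" keep getting lumped together, here's a toy 1D sketch of the difference. This is purely illustrative and not Digital Foundry's or any TV vendor's actual algorithm: interpolating between two real frames requires the later frame to exist, so the stream has to be delayed before display, while extrapolating forward from past frames adds no wait (at the cost of guessing wrong when the motion suddenly changes).

```cpp
#include <cstdio>
#include <cmath>

// The "game" renders this true position at 30 Hz; we want 60 Hz output.
double truePosition(double t) { return std::sin(t * 3.0); }

int main() {
    const double srcDt = 1.0 / 30.0;   // real rendered frames
    const double outDt = 1.0 / 60.0;   // displayed frames

    for (int i = 2; i < 8; ++i) {
        double tOut = i * outDt;
        // Newest real frame available at display time (no peeking into the future):
        int latest  = static_cast<int>(std::floor(tOut / srcDt));
        double p0   = truePosition((latest - 1) * srcDt);
        double p1   = truePosition(latest * srcDt);
        double a    = (tOut - latest * srcDt) / srcDt;

        // Extrapolation: predict forward from the two most recent real frames.
        double extrapolated = p1 + (p1 - p0) * a;

        // Interpolation: needs the frame *after* tOut, so the whole stream must
        // be delayed until that later frame has arrived (plus processing time).
        double pNext        = truePosition((latest + 1) * srcDt);
        double interpolated = p1 + (pNext - p1) * a;

        std::printf("t=%.3fs  true=%+.3f  extrapolated=%+.3f  interpolated=%+.3f (needs future frame)\n",
                    tOut, truePosition(tOut), extrapolated, interpolated);
    }
    return 0;
}
```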
 
Interpolation on a TV increases the input lag and creates artifacts. The frame-rate upscaling method described by Digital Foundry reduces input lag and deals with artifacts by pointedly adding motion blur.

As described above, there are inherent problems with motion blur in games.
It can look nice in demos, obviously, because the demonstration has a defined item of interest which the motion blur can optimize for. Games, in general, do not.

The profile of B3D forum members has changed over the years. When I first got in here, a fair number of visitors belonged to the Quake 1/2/3/UT/CS communities, and their reason to be interested in the technical side of 3D rendering was performance - high frame rate rendering (and high frequency keyboard/mouse polling) simply meant better competitive performance.

These days those individuals are all but gone. Their problem is solved. On the PC. You can pretty much always dial down the settings to meet 120 or 60 solid FPS, and USB polling rate is decent even without modification.

However, console gamers do not have the same graphical control options, their displays are anything but fast, their input devices do not support fast and accurate positioning. So we still have these 30 vs 60 fps discussions. People aren't allowed to make their own decisions and the underlying hardware poorly supports the benefits of high frame rates.


You guys who look at it from a static reaction time point of view are somewhat missing the point. The question isn't only, or even primarily, time from detection to response even though that is indeed a factor. Rather, it is in the process of making the appropriate response. If something is coming in fast from the left, and you need to shoot it down, you need to judge two things - speed and trajectory, and respond accordingly. The simple fact is that the more samples your brain has to work with the better it can evaluate that movement, and allow you to intercept that trajectory with a well placed bullet.

It's a bit like catching a ball under stroboscopic light - the longer between the strobes the harder it is to catch the ball. (Again, in an actual gaming situation motion blur doesn't help you here, as it doesn't know what movement you're trying to follow (if any). It would do fine at the proof of concept stage where it would only be that one opponent to care about.). Everyone can understand this, but it depends on the game, the hardware support, the individual player skill et cetera to determine when more input data (more frames) no longer result in a better prediction. It lies in the nature of the problem that the better you are, and the better your means of control, the faster the game is, the more critical this becomes as a factor.
Ergo: There will never be universal agreement on how high frame rates are desirable.
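
To put the "more samples, better trajectory read" intuition into numbers, here's a toy least-squares sketch. It is not a model of human vision, just an analogy: the spread of a velocity estimate from noisy position samples over a fixed time window shrinks as the sample rate goes up (doubling the samples roughly halves the variance of the estimate).

```cpp
#include <cstdio>
#include <cmath>
#include <random>
#include <vector>

// Fit a straight line (ordinary least squares) through noisy position samples
// taken over a fixed window, and return the estimated velocity (the slope).
double estimateVelocity(int samplesPerSecond, double windowSeconds,
                        double trueVelocity, std::mt19937 &rng) {
    std::normal_distribution<double> noise(0.0, 0.01);   // positional jitter
    int n = static_cast<int>(samplesPerSecond * windowSeconds);
    std::vector<double> t(n), x(n);
    for (int i = 0; i < n; ++i) {
        t[i] = i / static_cast<double>(samplesPerSecond);
        x[i] = trueVelocity * t[i] + noise(rng);
    }
    double meanT = 0, meanX = 0;
    for (int i = 0; i < n; ++i) { meanT += t[i]; meanX += x[i]; }
    meanT /= n; meanX /= n;
    double num = 0, den = 0;
    for (int i = 0; i < n; ++i) {
        num += (t[i] - meanT) * (x[i] - meanX);
        den += (t[i] - meanT) * (t[i] - meanT);
    }
    return num / den;
}

int main() {
    std::mt19937 rng(42);
    const double window = 0.25;        // a quarter-second look at the target
    const double trueVelocity = 5.0;
    for (int hz : {30, 60, 120}) {
        double sum = 0, sumSq = 0;
        const int trials = 2000;
        for (int i = 0; i < trials; ++i) {
            double v = estimateVelocity(hz, window, trueVelocity, rng);
            sum += v; sumSq += v * v;
        }
        double mean = sum / trials;
        double sd = std::sqrt(sumSq / trials - mean * mean);
        std::printf("%3d samples/s over 250 ms -> velocity estimate %.2f +/- %.3f\n",
                    hz, mean, sd);
    }
    return 0;
}
```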

I will say this though. Peter Jackson filmed The Hobbit using double the normal frame rate, and even without being able to A/B test, and without control issues or any means for the viewer to affect the viewport or anything else, there has been a vast internet storm decrying the movie as "not being film like". The difference is apparent enough even under such controlled circumstances. So the issue of frame rate is definitely not only about control, but also a matter of graphical quality. Although, as with The Hobbit, the lack of it may actually be what you prefer. :)
 
I disagree with the whole reaction and interception argument. 30 fps is enough to accurately predict where something is going to be. There are YouTube vids of gamers getting mid-air sniper shots and the like with incredible accuracy. The only time it's not enough is when a change happens in trajectory, and 60 fps gives an extra 17ms of advance warning. Actually I'll give 60 fps more benefit as it provides potentially 3 frames of motion where 30fps looks static, providing 50ms of useful info. But unless you're a top-tier player, it's hardly a game changer. TV and game display lag is far more of an issue than 30/60 fps in responsiveness. Internet lag is even more of an issue, and that's the predominant experience in these multiplayer games people are comparing.

60 fps just looks better. It gives a game a sense of polish and fluidity, and makes it feel more comfortable and...'responsive', even if technically it likely makes no difference.
 
In the end, the movie framerate discussion only highlighted again for me that anything that has fast panning movement needs a high framerate. Movies often avoid doing that now, because 24fps can't handle it well. A fast turn in a driving game really highlights the issue. So you can probably determine what framerate you need by looking at how much movement you have in your games.

But in games, response time (the delay between your action and something happening on screen) is a factor that does not come into play at all when watching movies. So any game where how quickly you can respond to what happens on screen, and see the result of your response, affects how well you play is going to benefit from a higher framerate.
 