Importance of 60 fps in arcade/simulation racers

but being well informed by constant input/feedback.
but your feedback isn't constant; at 200 mph you only get visual input every 10 ft or so
The visual feedback is 'constant'. You see the corners, or the cars, approaching over a period of some time that prepares you to respond at the appropriate time. The only time reaction speed matters is to avoid a crash or something.

Agreed, and if the signal to your foot to hit the brake needs to go out at the 95 ft mark but the frame showing it only arrives at the 115 ft mark, it's still less than ideal.
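To put rough numbers on those marks (a minimal back-of-the-envelope sketch, assuming the 200 mph figure from above; purely illustrative):

```python
# How far a car travels between frames at 200 mph.
MPH_TO_FT_PER_S = 5280 / 3600   # 1 mph = ~1.47 ft/s

speed_mph = 200
speed_ft_per_s = speed_mph * MPH_TO_FT_PER_S   # ~293 ft/s

for fps in (30, 60):
    print(f"{fps} fps: {speed_ft_per_s / fps:.1f} ft between frames")

# Output:
# 30 fps: 9.8 ft between frames
# 60 fps: 4.9 ft between frames
```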
No, because the underlying IO and physics engine is running at 60 fps. You watch the display updating at 30 fps. This prepares you for when you need to press the brake, with the motion on screen creating a world-view in your mind that can anticipate actions at the time they are needed. You apply the brake at exactly the right time (depending on how good a driver you are!) regardless of when between frames that occurs. The IO reads this (see MrFox's excellent explanation of how inaccurate this is; it doesn't matter, because your brain adapts to the latencies to apply the actual force at the right time) and applies it to the simulation. On the next refresh, the game draws the current state of the simulation.

The display has no direct bearing on the IO responsiveness any more. It's only important in being fast enough to provide suitable feedback for a decent mental model. 5 fps clearly wouldn't be fast enough, as too much could change between frames. Given the limited manoeuvrability of automobiles, we don't have to worry about 180 degree turns in tiny fractions of a second, so the delta between frames stays low enough that 30 fps provides enough info to track everything. 30 fps is good enough, as evidenced by existing racing games that are very playable.
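As a minimal sketch of that decoupling (a hypothetical fixed-timestep loop, not any particular engine's; the callback names are made up):

```python
import time

SIM_DT = 1 / 60     # input + physics step (60 Hz)
RENDER_DT = 1 / 30  # display refresh (30 Hz)

def game_loop(poll_input, step_physics, draw, run_for=1.0):
    """Toy fixed-timestep loop: two sim ticks per rendered frame."""
    start = last = time.perf_counter()
    accumulator = 0.0
    while time.perf_counter() - start < run_for:
        now = time.perf_counter()
        accumulator += now - last
        last = now
        # IO and physics advance at 60 Hz even though we draw at 30 Hz,
        # so a brake press lands on the next ~16.7 ms sim tick, not the
        # next ~33 ms display frame.
        while accumulator >= SIM_DT:
            step_physics(poll_input())
            accumulator -= SIM_DT
        draw()
        time.sleep(RENDER_DT)  # stand-in for vsync at 30 Hz

# Example: run for one second with stub callbacks.
# game_loop(lambda: 0.0, lambda inp: None, lambda: None)
```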
 
Are games like Burnout excluded from the racing genre? I have no trouble listing them as car-combat or something similar... but I guess they are generally referred to as racers AFAIK.
None of this holds for Burnout. Split-second reaction is a must if you cross over and drive against oncoming traffic, and you simply can't predict what's coming at you after the next turn.
This is probably true for most racers where it's OK to bump your opponents around... the more the game is grounded in reality, the fewer unpredictable issues crop up.

(And I can't see a Burnout working at less than 60 fps.)
 
Yeah, for some particular car games 60 fps is pretty important. For racers though, the usual suspects where everyone's going around a track, 30 fps provides enough feedback. Even in the case of a sudden obstacle like a rock-slide to dodge, the 30 fps issue is only a tiny drop in the issues facing a rapid response, as MrFox describes. After seeing the rock-slide (for which 60 fps will help), which is delayed by TV lag, you have to process and react over 200-ish ms, then press the button, then have the game receive the IO and respond. An extra ~17 ms from 30 fps is barely going to make a difference. As long as the update is fast enough to accommodate and communicate the in-game action, the refresh is acceptable. There doesn't need to be a blanket threshold for all games of a genre. Hell, even shooters can be comfortably 30 fps if they're slow-paced enough.
 
I prefer 60 fps to 30 fps any day in all games. I'd much rather have a silky smooth, rock solid frame rate. Even more so in a racing game. There's a better sense of immersion that comes with a higher frame rate, and for many people this results in better gameplay. I think it's pretty much the same for FPS and racing games when it comes to framerate. An FPS is playable at a stable 30 fps, but more playable at 60 fps. I'm definitely not going to ignore Drive Club because it runs at 30 fps, but I imagine I'd enjoy it more if it ran at 60 fps. If the motion blur is decent enough to get me more immersed then I'm there. The one thing I will not tolerate from a racing game is a frame rate that varies on a regular basis; if it's a rare occurrence then that is fine.

Most of my desire for 60 fps in racing games comes from my own experiences playing so many racing games through the years. I would be disappointed if the Gran Turismo or Forza series went from 60 fps to 30 fps. But for a fresh game like Drive Club it doesn't matter as much.
 
I prefer a stable lower framerate over a fluctuating higher framerate for any game that requires tight timing. For games that don't require tight timing, I prefer prettier visuals over fps (even at the cost of a stable fps).
On the topic itself, I don't think it is that important, or at least not as important as achieving a stable fps. After that, it's up to them whether to target 60 fps or prettier visuals. I don't mind either way.
Of course, I'm gaming from far away, where the TV only covers around 30% of my FOV, so 30 fps isn't really a concern for me. For people whose display covers nearly their full FOV, I don't think 30 fps is enough for a fast-moving game, especially if there are going to be lots of fast pans.
 
1. The rendering pipeline is already causing a lag of at least 32ms to as much as 100ms, I have no idea why this wouldn't be a more important issue in this case.
2. Your TV adds somewhere between 16ms and 100ms depending on the brand.
3. Your brain requires 100ms to react to a visual stimulus.
4. A force-feedback wheel will take over 100ms to turn because it has mass, and because it's pushing against your muscles. Similar situation for load-cell pedals.

That's at least a third of a second. Adding or removing 16ms of lag in the pipeline won't change your lap times. The brain adapts to reach perfect timing, as long as this lag (controller + pipeline + TV) is stable. If it had any impact, the best players in the world wouldn't use a force-feedback wheel and load-cell pedals. They would use a DS3 with hair triggers.
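Adding up the ranges from the list above (a rough sketch; the figures are the ones quoted, not measurements):

```python
# Rough end-to-end latency budget from the four points above (ms).
budget = {
    "rendering pipeline": (32, 100),
    "TV processing":      (16, 100),
    "human reaction":     (100, 100),
    "wheel/pedal travel": (100, 100),
}

low = sum(lo for lo, hi in budget.values())   # 248 ms
high = sum(hi for lo, hi in budget.values())  # 400 ms
print(f"total: {low}-{high} ms")

# The 30-vs-60 fps frame-time difference is a small slice of that chain.
frame_delta = 1000 / 30 - 1000 / 60  # ~16.7 ms
print(f"30 vs 60 fps delta: {frame_delta:.1f} ms "
      f"({frame_delta / high:.0%} to {frame_delta / low:.0%} of the chain)")
```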

Where does this 16 ms of lag come from? If you have a rendering pipeline of 3 frames from user input, you will have a 48 ms lag difference at 30 vs 60 fps.

Also, regarding 3, I always feel that I can see motion more clearly at a higher frame rate. If it takes you 10 frames to understand what is happening on screen, you would gain 16 × 10 = 160 ms at 60 fps (vs 30 fps). So your total gain would be over 200 ms!
 
Also, regarding 3, I always feel that I can see motion more clearly at a higher frame rate. If it takes you 10 frames to understand what is happening on screen, you would gain 16 × 10 = 160 ms at 60 fps (vs 30 fps). So your total gain would be over 200 ms!
This is perhaps one of the best arguments in favour of a higher framerate; regardless of direct latencies, it's easier for the brain to interpret a higher-frequency stream of information. That's an especially obvious point to make in terms of twitch shooters, where most of the on-screen content might change from one frame to the next at lower framerates, but the notion doesn't really vanish for racers (and it's strongest for the breed of cartoonishly high-speed arcade racers).

Although, by a similar token, it could maybe be argued that comparable results could be achieved with motion-blurred video at a lower framerate.
 
I do not see how hiding details would help.
The argument would be that the motion blur turns the "hard to interpret" separated ghosts of an object into a clear path with an easily distinguishable motion vector. It eliminates some of the "interpolation" the brain has to do to figure out what the elements of the scene are doing.
 
Can't you just put an arrow with the direction on the object then?
The point is that our brains are very naturally experienced with interpreting a motion blur sort of effect. It's an indicator that is a close approximation to how our visual system picks up moving objects in the real world.

In addition to looking bizarre, it's not obvious that an arrow would tip the brain off as well; it could be just another weird object jumping around the screen.
 
Where does this 16 ms of lag come from? If you have a rendering pipeline of 3 frames from user input, you will have a 48 ms lag difference at 30 vs 60 fps.
It depends on what's causing the lag. If a process has to wait for a frame, then faster framerates can reduce overall lag, but if the latencies are caused by the time taken to actually do work, the latency will be measured in ms, not frames. A game rendering at 60 fps will add at most ~17 ms more lag if dropped to 30 fps, regardless of how many frames of latency that is. That is, 5 frames of 60 fps latency will equate to 3 frames of 30 fps latency, +17 ms, while 12 frames of 60 fps lag will equate to exactly 6 frames of 30 fps lag.
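In other words, if the work takes a fixed number of milliseconds, the visible lag is that work rounded up to whole display frames. A minimal sketch of that rounding:

```python
import math

def display_lag_ms(work_ms, fps):
    """Fixed-duration work, quantised up to whole display frames."""
    frame_ms = 1000 / fps
    return math.ceil(work_ms / frame_ms) * frame_ms

# ~83 ms of work (5 frames at 60 fps):
print(display_lag_ms(83.3, 60))  # ~83.3 ms (5 frames)
print(display_lag_ms(83.3, 30))  # ~100 ms  (3 frames) -> +17 ms

# 200 ms of work (12 frames at 60 fps):
print(display_lag_ms(200, 60))   # 200 ms (12 frames)
print(display_lag_ms(200, 30))   # 200 ms (6 frames)  -> +0 ms
```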

Also, regarding 3, I always feel that I can see motions more clearly with a higher frame rate. If it takes you 10 frames to understand what is happening on screen you would gain 16X10 (160) ms at 60 fps (vs 30 fps). So your total gain would be over 200 ms!
That's absolutely true, which is where higher framerates are beneficial. However, if the amount of change on screen between frames isn't too high, 30 fps is perfectly adequate. Watch real-life driving videos at 30 fps to see that it's perfectly fine for following what's happening in a typical racer.

Although, by a similar token, it could maybe be argued that comparable results could be achieved with motion-blurred video at a lower framerate.
Absolutely. Mo-blur produces larger points of recognition with defined motion vectors and greater continuity with previous frames. Fast interpretation of motion data doesn't require clear surface details. An object moving 1/10th the distance across the screen with no motion blur is basically teleporting, and there's nothing to connect it from its old position to its new one. The same object can't be made out clearly with mo-blur, but its passage over time is made obviously visible. At higher framerates and shorter distances, mo-blur has less value, but it is otherwise an aid to motion interpretation, the only downside being that it's applied according to the game engine and not the viewer's eye, which makes it somewhat unnatural.
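To see when that "teleporting" kicks in, here's a quick sketch of per-frame screen displacement (the 1/3 s crossing time is chosen to match the 1/10th-of-the-screen example above; purely illustrative):

```python
# How far an object jumps per frame if it crosses the screen in 1/3 s.
time_to_cross_s = 1 / 3

for fps in (30, 60):
    jump = 1 / (time_to_cross_s * fps)
    print(f"{fps} fps: {jump:.0%} of the screen per frame")

# 30 fps: 10% of the screen per frame
# 60 fps: 5% of the screen per frame
# Without blur, nothing connects the two positions; blur smears each
# jump into a streak that reads as a motion vector.
```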
 
Where does this 16 ms of lag come from? If you have a rendering pipeline of 3 frames from user input, you will have a 48 ms lag difference at 30 vs 60 fps.

Also, regarding 3, I always feel that I can see motion more clearly at a higher frame rate. If it takes you 10 frames to understand what is happening on screen, you would gain 16 × 10 = 160 ms at 60 fps (vs 30 fps). So your total gain would be over 200 ms!
Yes, I agree it would be 16 ms × the number of pipelined passes that need twice the time; that could be all of them or just a single one... but it's still assumed that the game engine itself runs at 60 fps, so we can ignore that pass, and at the other end the buffering and vsync delay also happen at 60 fps. Whether it's 16 ms, 32 ms or 48 ms, my arguments still stand for this type of game.
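A sketch of what "only some passes slow down" means for the total lag (the pass split and counts here are hypothetical, just following the assumptions above):

```python
# Hypothetical split: sim tick and vsync stay at 60 Hz; only the render
# passes double in duration when targeting 30 fps.
def total_lag_ms(render_fps, render_passes=2):
    sim_ms = 1000 / 60    # engine always ticks at 60 Hz
    vsync_ms = 1000 / 60  # buffering/scanout also at 60 Hz
    render_ms = render_passes * (1000 / render_fps)
    return sim_ms + render_ms + vsync_ms

print(f"{total_lag_ms(60):.1f} ms")  # 66.7 ms at 60 fps
print(f"{total_lag_ms(30):.1f} ms")  # 100.0 ms at 30 fps (+16.7 ms per slowed pass)
```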

The most important feedback that needs fast reflexes is the tires' behavior, and that isn't visual feedback; if your brain detects a loss of grip by decoding your change of trajectory, it's already too late. Physical feedback is king here.
 
Btw, I don't think I ever heard the answer to this question. I wanted to ask in the just-locked 45 fps thread, but this thread seems relevant enough: why not target 50? Is it that TVs in NTSC land can't accept 1080p50? I live in PAL land and as far as I'm aware, all modern (LCD) TVs accept both NTSC and PAL signals. Resolution-wise, I believe it's the same for HD content (unlike SD, where it was a trade-off between refresh rate and resolution).
 
*If* all modern TVs support it (and the CE industry is notorious for imposing arbitrary limits in their devices' firmware), you'd still have an issue for those in NTSC land on older displays. You'd probably need some major-league stats to decide to support 1080p50. Conceptually it's a valid choice, though.
 
Do the new consoles not come in PAL and NTSC versions like the old ones?
PS: I don't know if the old PAL/NTSC refresh rates still apply when it comes to 1080 resolutions.

PPS: Sort of related to 30/60 fps: in PC space there's a program, HiAlgo Switch, that will double your fps by halving your resolution at the press of a button:
http://www.hialgo.com/TechnologySWITCH.html
 
My TV is definitely receiving and displaying a 60 Hz signal for games (24 Hz for Blu-ray), so I'm pretty sure the whole 50/60 Hz split was normalised when we got HD resolutions. Broadcast TV is still at 50 Hz though, for whatever reason.
Could be wrong, so don't quote me on that.
 
That's absolutely true, which is where higher framerates are beneficial. However, if the amount of change on screen between frames isn't too high, 30 fps is perfectly adequate. Watch real-life driving videos at 30 fps to see that it's perfectly fine for following what's happening in a typical racer.

Do you mean a video of a racing game at 30 fps, or a video of real-life driving at 30 fps? If the latter, then the visual experience of watching a real-life driving video at 30 fps should still be smoother than a racing game running at 30 fps. At least that's my experience from watching in-car camera footage of races and playing racing games at 30 fps.
 
Is PAL and NTSC still a really big deal? I'm ignorant of Europe's physical connections. I had always assumed that their HD resolutions and standards were similar to North America's. Are the HDMI connections different in a major way? Shouldn't both XB1 and PS4 be able to support NTSC and PAL resolutions with a full 60 fps and possibly FSAA to boot?
 
Is PAL and NTSC still a really big deal? I'm ignorant of Europe's physical connections. I had always assumed that their HD resolutions and standards were similar to North America's. Are the HDMI connections different in a major way? Shouldn't both XB1 and PS4 be able to support NTSC and PAL resolutions with a full 60 fps
NTSC and PAL themselves aren't really relevant anymore, as HDMI cables are never used for either standard. But television broadcasts are still handled at 50Hz in some regions and 60Hz in other regions.

This is a problem when TVs built for one region don't support the refresh rates used by the other region, which I've read is the case especially for some NA televisions.

and possibly FSAA to boot?
Err, what does that have to do with this? You mean like sending a larger frame and letting the TV do the downscaling?
 