Importance of 60 fps in arcade/simulation racers *spawn

I meant it in terms of the game rendering at SD resolutions with respect to the differences between PAL and NTSC. There should be enough GPU power left over to go balls to the wall on AA.
 
Do the new consoles not come in PAL and NTSC versions like the old ones?
Nope. HD supports 720p and 1080p outputs. We have three different framerates available though - 60, 50 and 24 fps. Not all TVs support all framerates so I don't think devs can use them. I don't even know if a game can switch video output rate - it's only important for video output so may only be accessible via certain limited OS options.

Do you mean a video of a racing game at 30 fps or a video of real life driving at 30 fps? If the latter then the visual experience of watching a real life driving video at 30 fps should still be smoother than a racing game running at 30 fps. At least that's my experience from watching in dash camera of races and playing racing games at 30 fps.
Whatever is being done in that real-life 30 fps movie to make it smoother (motion blur!) can be applied to a game to achieve the same result.

Is PAL and NTSC still a really big deal?
CE companies still make the divide. Buy a camcorder here and you'll likely only have the option to record at 50/25 fps, despite your TV being able to show 60/30. Fundamentally every device should be switchable between refresh rates as the user requires, so record in 24, 25, 30, 50, or 60 fps and play back at native refresh on any modern display. Because the US is clearly 60 fps based, I think it likely that some modern TVs just cap them to 60 fps and don't even have the option to switch to 50 Hz. Googlage shows someone whose US TV can display 50 Hz, someone saying US TVs disable 50 Hz, and someone saying their UK NowTV box only outputs 60 Hz (like Android), which introduces judder in UK programming.

Typical media mess. 50 Hz should have been dropped when we went to digital. But it wasn't, so we're stuck with it, yet probably can't even use it as an alternative framerate for games. It's bad in every single way!
 
Whatever is being done in that real-life 30 fps movie to make it smoother (motion blur!) can be applied to a game to achieve the same result.

How? Adding fake motion-blur effects I would think is not the same as the natural blurring a video camera recording will produce, because a frame is effectively a picture at a shutter speed of 1/30th - while a video game renders 30 perfectly sharp images. Any blurring you induce at that point (if your game does it) is a fake blurring, perhaps using the last frame, but the result isn't quite the same - I would think?

MrFox said:
The most important feedback that needs fast reflexes is the tires behavior, and this isn't a visual feedback, if your brain detects a loss of grip by decoding your change of trajectory, it's already too late. Physical feedback is king here.

Physical feedback is worthless without the visual cue in a video game. Even in real life, noticing grip loss is not down to the sole fact that the steering wheel feels different - it's a continuous stream of visual, motion (acceleration/centrifugal force) and tactile information being processed by your brain simultaneously to form a picture of what is happening. Playing a video game, the majority of information you get is predominantly through what you see on screen. Even a force-feedback device will be limited and will not make up for the visual feedback you lose by halving the framerate.

In other words - you could be running your game internally at 600 frames per second - it still wouldn't matter much if you're only seeing your updates on screen every 20th of a second.
 
How? Adding fake motion-blur effects I would think is not the same as the natural blurring a video camera recording will produce, because a frame is effectively a picture at a shutter speed of 1/30th - while a video game renders 30 perfectly sharp images. Any blurring you induce at that point (if your game does it) is a fake blurring, perhaps using the last frame, but the result isn't quite the same - I would think?
Okay, it may not be exactly the same, but the principle will be there. If it's motion blur that makes 30 fps real-life footage smoother (and that depends on the shutter speed - fast-shutter 30 fps and 60 fps sports footage is still 'jerky'), adding motion blur will make 30 fps similarly 'smoother'.
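To put a rough number on the shutter-speed point: a camera frame integrates light for some fraction of the frame interval (the 'shutter angle'), so the blur streak scales with both exposure time and on-screen speed. A minimal sketch, assuming a 180° shutter and a purely illustrative panning speed of 1920 px/s:

```python
def blur_length_px(speed_px_per_s, fps, shutter_angle_deg=180):
    """Length of the motion-blur streak, in pixels, for an object moving
    at a given on-screen speed. A 180-degree shutter exposes each frame
    for half the frame interval (a common film convention)."""
    exposure_s = (shutter_angle_deg / 360) / fps
    return speed_px_per_s * exposure_s

print(blur_length_px(1920, 30))  # ~32 px streak per frame at 30 fps
print(blur_length_px(1920, 60))  # ~16 px at 60 fps - half the blur
```

Halving the framerate doubles the per-frame blur, which is why 30 fps camera footage can still read as smooth while 30 sharp game frames per second do not.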
 
Sorry Shifty, I wasn't correcting - it was more of a serious question. I guess I just never really played a game that featured a form of motion blur that looked natural or similar to when watching TV footage of a bumper cam. The motion blur I've usually seen, gives some form of effect like when you're drunk - or to a degree the 'Star Trek' effect before they go into warp drive.

To replicate perfect 30Hz recorded footage in a game, I would think you would need a 60fps game that blurs pairs of frames together to form a 30fps equivalent? At which point you'd probably be better off just displaying the 60fps.
 
60 fps blurred wouldn't cut it. You need to sample and accumulate multiple subframes with the contents moving. Clearly this is impractical, so instead you approximate by blurring objects based on their motion vectors. The results vary in quality; the worst techniques blur the contents of an object within its bounds, creating weird, hard borders that fail to convey the motion. Done correctly, a good motion-blur technique should be pretty convincing and help convey motion.
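The difference between the brute-force and the approximate approach can be sketched in one dimension (a toy illustration, not any engine's actual implementation; a real version operates on 2D images with a per-pixel velocity buffer):

```python
def accumulate_subframes(position, velocity, subframes, width):
    """'Ground truth' blur: average several sub-frame renders of a
    1-pixel object sweeping across a scanline within one frame."""
    frame = [0.0] * width
    for s in range(subframes):
        p = int(position + velocity * s / subframes) % width
        frame[p] += 1.0 / subframes
    return frame

def velocity_blur(position, velocity, width):
    """Approximation: render once, then smear the object along its
    motion vector with equal-weight taps in a single post pass."""
    frame = [0.0] * width
    taps = max(1, int(abs(velocity)))
    for t in range(taps):
        p = int(position + velocity * t / taps) % width
        frame[p] += 1.0 / taps
    return frame
```

Both conserve the object's total energy and spread it along the motion path; the approximation breaks down at occlusion boundaries, which is where the 'hard border' artifacts come from.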
 
Physical feedback is worthless without the visual cue in a video game. Even in real life, noticing grip loss is not down to the sole fact that the steering wheel feels different - it's a continuous stream of visual, motion (acceleration/centrifugal force) and tactile information being processed by your brain simultaneously to form a picture of what is happening. Playing a video game, the majority of information you get is predominantly through what you see on screen. Even a force-feedback device will be limited and will not make up for the visual feedback you lose by halving the framerate.

In other words - you could be running your game internally at 600 frames per second - it still wouldn't matter much if you're only seeing your updates on screen every 20th of a second.
I think we're diving into nitpicking territory. Maybe a better lighting model will allow you to judge the corner distance much more effectively, and 30fps would win in this case :LOL:

Of course you need a lot of visual information, but it's not in the area of a sub-100ms reaction time. We're not talking about playing a video game in general, only realistic racing games. Twitchy FPS are a different discussion entirely.

Would you be much better with a wheel that takes 1/10 second less time to turn?

What about pedals with a shorter travel and less resistance, giving you 1/10 second improvement?

Is someone with a Sony TV (<16ms) a much better sim racer than someone with a different brand?

How would frame interpolation fit into this? Is it better to delay the frame an additional 48ms on top of all this, to get a sharper image and better motion?
 
Would you be much better with a wheel that takes 1/10 second less time to turn?
Actually I think yes.

What about pedals with a shorter travel and less resistance, giving you 1/10 second improvement?
As it is I have roughly 1mm per 1% of throttle; having 1mm per 5% of throttle would be too sensitive for me. The chances of me hitting, for example, exactly 42% throttle would be massively reduced if I had to press the throttle 8.4mm instead of 42mm.
As for resistance: a stiffer spring gives more feeling and prevents accidental depression of the pedal from having your foot resting on it. Plus I use a custom graduated spring, so resistance increases with travel.
[image: the custom graduated pedal spring]
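The travel figures above are simple linear arithmetic; a tiny sketch (the 1mm-per-1% pedal is from the post, the 0.2mm-per-1% pedal is the hypothetical shorter-travel alternative):

```python
def travel_for_throttle(throttle_pct, mm_per_pct):
    """Pedal travel in mm needed to hold a given throttle percentage,
    assuming an idealised, purely linear pedal."""
    return throttle_pct * mm_per_pct

fine = travel_for_throttle(42, 1.0)    # 1mm per 1%: 42% needs 42mm
coarse = travel_for_throttle(42, 0.2)  # 1mm per 5%: 42% needs only 8.4mm
print(fine, coarse)
```

With five times less travel per percent, the same foot-position error costs five times more throttle, which is the whole objection.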
 
Of course you need a lot of visual information, but it's not in the area of a sub-100ms reaction time. We're not talking about playing a video game in general, only realistic racing games. Twitchy FPS are a different discussion entirely.

Perhaps you misunderstood. I am not arguing that reaction time in the sub-100ms range is important in a racing game - the point was more that a physics engine running at 10 times the precision/speed will mean squat for reaction time if you're watching a slide show (yes - I'm exaggerating now).

You run a physics engine at a higher framerate not only to improve precision, but also because that data is needed elsewhere and to keep latency down. The data you're calculating is needed to shape the virtual world you're rendering. Each of these steps takes time - and adds to the overall latency. Whether this is noticeable pretty much depends on the type of game you're playing and how much latency you have.
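As a back-of-the-envelope illustration of why the output rate dominates the latency chain (all stage timings here are hypothetical round numbers, not measurements of any real console or TV):

```python
def end_to_end_latency_ms(sim_hz, display_hz, display_lag_ms):
    """Crude worst-case input-to-photon estimate: one simulation tick to
    sample the input, one display frame to scan it out, plus TV
    processing lag. Real pipelines add further buffering stages."""
    return 1000 / sim_hz + 1000 / display_hz + display_lag_ms

# Physics at 600 Hz barely helps if the screen only updates at 30 Hz:
print(end_to_end_latency_ms(600, 30, 16))  # ~51 ms
print(end_to_end_latency_ms(60, 60, 16))   # ~49 ms, with a far cheaper sim
```

The simulation tick is a rounding error next to the display interval, which is the "600 fps internally" point made above.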

What I find especially amusing about this framerate talk is that some seem to have the idea that better graphics make for better games. They don't. That's why, even as PCs push forward with ever better graphics, those of us on stagnant consoles are still happy enough to play what we know and like.

As I said - this whole topic would be best resolved with a blind-test study. Get people to play the same game in two different versions: one running at 30fps, the other running at 60fps, sacrificing whatever is necessary to keep it stable. Then let people decide which they prefer to play.

While one might definitely look better (better PR for sure), I'm willing to bet a lot that most people, you included, would prefer to play the smoother-running game. Because with the right compromises, the game, despite its doubled framerate, will probably look close enough to the 30fps version - but will play significantly better.
 
That's a clearly untrue statement as described by users on this board. Graphics contribute to the entertainment experience, and some games are better for more graphical swing than higher framerate. That's the 30 fps vs 60 fps discussion though. This thread is just about whether 60 fps is essential for a good (entertaining) driving game.
 
Clearly, if this topic is moving to the point where we randomly pick out singular statements we don't agree with and draw a conclusion from them - while ignoring the rest - we sadly won't get very far.

Let me explain the logic:

In the context of judging the merits of 30 vs 60fps in a game, we need to understand the trade-off. If we are comparing a game that runs at 60fps but looks like a last-generation title to, say, DriveClub in all its 30fps glory - then yes, the visual difference is quite big. If we assume identical hardware though - and measure the drawback of 30fps vs 60fps against what is hypothetically possible - then, as I said, I'm not sure that the higher graphical fidelity at the expense of framerate will amount to a better game, given the right compromises.

Let's take the PS3:
We've had GT5/6 running at 60fps; compare that to NFS: Shift (or any other 30fps racing game of the same type). GT5/6 still looks pretty damn good, arguably because there's a lot of talent involved and the art and lighting are outstanding. Arguably, the 30fps game offers neither noticeably better visuals across the board, nor better immersion. In fact, the 30fps game is severely lacking in the immersion factor because of its framerate. Now, you might say - not the same team, not the same talent, so not a fair comparison.

Maybe. Unfortunately there aren't that many games we can effectively compare. GT (and Forza on Xbox) has more or less been the yardstick for most racing games out there. Maybe a developer could comment better on the relative difference we'd be looking at if we took a 30fps game and cut it back to the point where it could run at double the framerate - or took a 60fps game like GT5 and enhanced it while dropping the framerate to 30fps.

If we take DriveClub - which is where this discussion has spawned from - how much worse would it look if the developer took that game and targeted 60fps from the get-go? Perhaps take a cut on resolution, a few cars less. Would it be a worse game or a better game?

Obviously, this is pretty subjective, but considering that while playing the game you're likely to be immersed in the game itself - I don't think you'll be in awe of every single pixel moving at high velocity in the far distance, but enjoying the game for what it is. This is probably different in a game of another genre, like an adventure game, where you're likely to find yourself staring at a more or less static screen and taking in the scenery for clues on what to do or where to go next. This is a racing game. This is probably also why, for the most part, the GT series has always gotten away with subpar scenery - because while actually playing the game, you are far too immersed to notice, and you're more likely to take in the higher fidelity of the cars than the moving landscape.
 
In the context of judging the merits of 30 vs 60fps in a game, we need to understand the trade-off. If we are comparing a game that runs at 60fps but looks like a last-generation title to, say, DriveClub in all its 30fps glory - then yes, the visual difference is quite big. If we assume identical hardware though - and measure the drawback of 30fps vs 60fps against what is hypothetically possible - then, as I said, I'm not sure that the higher graphical fidelity at the expense of framerate will amount to a better game, given the right compromises.
"Better game" is subjective. For some people, better visuals will result in the better game. It's not possible to conclude that 60 fps will always be better as long as the visual compromises can be assumed to be little.

Let's take the PS3:
We've had GT5/6 running at 60fps; compare that to NFS: Shift
Instead of comparing it to a different game, look at GT5's shortcomings. Static time of day, lousy shadows, 2D billboard people, etc. Now imagine all those shortcomings cleaned up by going 30 fps. That would make a prettier game. Some people would prefer that prettier game. Others would prefer the smoothness of the uglier game. It's impossible to call either version the better version. The best one could do there is claim one to be the preferred by more gamers version.

If we take DriveClub - which is where this discussion has spawned from - how much worse would it look if the developer took that game and targeted 60fps from the get-go? Perhaps take a cut on resolution, a few cars less. Would it be a worse game or a better game?
Without knowing what the actual changes are, that's impossible to answer for each individual you ask the question of. It's very easy to propose a few, barely noticeable downgrades as evidence that the game overall would be better off at 60 fps, but in reality the differences might be far more significant and enough that various gamers presently interested in DriveClub would be less so because it's not as pretty.

All I'm arguing is that there is no frame-rate requirement for a racing game (or any other game, come to that). Framerate is one of the subjective quality aspects that matter to each gamer. I like 60 fps, but I recognise other people value pixel quality more. For them, a 30 fps racer (or any other game) pushing the pixels will be more appealing, and ultimately still playable in this case. A racing game is not broken by dropping to 30 fps unless it's an exceptional sort of car game with lots happening at once.

There's no point trying to argue if a 60 fps racer is better or not as that's subjective. The only real discussion is if 30 fps is 'broken'. If someone can prove a 30 fps racer is ineffective, there's an argument. Until then, given the existence of well received 30 fps racers and that framerate vs eyecandy is subjective, I think the subject is pretty conclusively summed up as '60 fps is nice but not essential'. ;)
 
With all due respect, if you are going to dumb the whole topic down to the point of it being a subjective matter (which all things in life are), or of whether 30fps is "broken", then we might as well argue the merits of going all the way and downgrading the framerate to 15fps in order to enhance the game for those people who would rather look at games than play them.

This being a technical forum, we can be a lot more specific and make educated guesses. Your point about the shortcomings of the GT5 engine is precisely the point:

"Instead of comparing it to a different game, look at GT5's shortcomings. Static time of day, lousy shadows, 2D billboard people, etc. Now imagine all those shortcomings cleaned up by going 30 fps. That would make a prettier game. Some people would prefer that prettier game."

The game would perhaps be prettier when looking at static screenshots, or if you're a bystander watching someone else play (with all the time in the world to focus on visuals). However, while playing the game yourself at high speed and being totally immersed in the gameplay, the difference between 2D billboard people and fully 3D ones, as an example, is likely to make little to no measurable difference to the overall immersion and gameplay.

Coming back to the educated guesses we can make about the difference between an optimized game at 30fps vs. an optimized game at 60fps - I think we can at least conclude that not everything we see on screen is impacted by framerate to the same degree. Some things are more costly than others.

If we have one game running at 30fps at 1080p and another at 60fps at 720p (plus some slight other drawbacks, since it's not all about fillrate), I'm still willing to bet we'd end up with two games that looked quite similar to the uneducated eye, with one playing vastly better than the other. In the end, it all comes down to which compromises one is willing to make to achieve the better framerate. If what is better in the 30fps version is mostly confined to fast-moving objects, you are less likely to notice it in the first place.

This is different in most other genres, where you either have static scenery (in a 3D game where you might be walking at slow speed with a large draw distance, or simply stationary) or slow-moving scenery. In a racing game, the faster you drive, the less clear the details you are trading framerate for become - and the more framerate becomes an important feature of the game. By the same token, if you create a game where you drive a go-kart around the track at <10kph, you probably would be wise to go with 30fps or even 15fps, because at that speed you have more time to focus on visuals; everything is relatively slow-moving. But if you are racing cars at >100kph, I'm yet to be convinced you are likely to notice all those details the team invested in for their better PR campaign.
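On the fillrate point specifically, the raw pixel budgets of the two configurations are closer than intuition suggests - a quick sanity check (fill cost only; overdraw, AA and per-frame fixed costs are deliberately ignored):

```python
def pixels_per_second(width, height, fps):
    # raw fill requirement: pixels drawn per second at the given mode
    return width * height * fps

p1080_30 = pixels_per_second(1920, 1080, 30)
p720_60 = pixels_per_second(1280, 720, 60)
print(p1080_30)  # 62208000
print(p720_60)   # 55296000 - roughly 11% fewer pixels/s than 1080p30
```

So in pure fill terms 720p60 is actually the cheaper mode; the real cost of 60fps lies in everything that must be recomputed per frame regardless of resolution.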
 
The game would perhaps be prettier when looking at static screenshots, or if you're a bystander watching someone else play (with all the time in the world to focus on visuals). However, while playing the game yourself at high speed and being totally immersed in the gameplay, the difference between 2D billboard people and fully 3D ones, as an example, is likely to make little to no measurable difference to the overall immersion and gameplay.
I think you're talking too much from a personal perspective. For me, coming around a corner in GT and seeing cardboard cutouts does break the immersion, as does having a saw-edge of a shadow creeping up the side of the car interior. They're things I can live with and learn to ignore, but I'd prefer them gone. Would I prefer them solved over 60 fps? No. Would everyone else categorically prefer weaker visuals and higher framerates? I don't imagine so. Saying that the player is focussed on the game and not taking in the peripheral graphics is only true to a degree. Replace all GT's scenery with coloured blocks for a really simple, unrealistic view and it'll still race well and be good gameplay, but it'd be a different experience. There's a balance between artistic superiority and fluidity, which is always, still, decided on a per-title basis depending on the devs' intentions, and the results will be subjectively interpreted. Some gamers will want simpler scenery and a higher framerate. Others will want the scenery to look as realistic as possible (like babcat on this forum! :oops:).

If we have one game running at 30fps at 1080p and another at 60fps at 720p (plus some slight other drawbacks, since it's not all about fillrate), I'm still willing to bet we'd end up with two games that looked quite similar to the uneducated eye, with one playing vastly better than the other.
If this is true, how come there are sooo many 30 fps games out there? Insomniac, ardent supporters of 60 fps, said gamers don't care enough and 30 fps with more eyecandy was more important. There are also clearly gamers who prefer the sharp visuals of 1080p over 720p, so a 720p60 would be the inferior game for them.
 
With all due respect, if you are going to dumb the whole topic down to the point of it being a subjective matter (which all things in life are), or of whether 30fps is "broken", then we might as well argue the merits of going all the way and downgrading the framerate to 15fps in order to enhance the game for those people who would rather look at games than play them.

Because of judder.

If we were using an adaptive-sync monitor, then ~24fps would be the limit under which most human eyes see constant judder (a slideshow) instead of smooth motion (smooth as in without judder or stuttering - not talking about the framerate-dependent decrease in ghosting blur, which many people also call "smooth").
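On a fixed-refresh display the judder is easy to see numerically: each content frame must be held for a whole number of refreshes, so any non-integer ratio forces an uneven hold pattern. A small sketch for the standard 60 Hz case:

```python
def refresh_pattern(content_fps, display_hz, frames=6):
    """How many display refreshes each content frame is held for on a
    fixed-rate display; uneven patterns are perceived as judder."""
    pattern = []
    shown = 0.0
    for _ in range(frames):
        shown += display_hz / content_fps
        total = int(shown + 0.5)  # round to the nearest whole refresh
        pattern.append(total - sum(pattern))
    return pattern

print(refresh_pattern(24, 60))  # [3, 2, 3, 2, 3, 2] - classic 3:2 pulldown
print(refresh_pattern(30, 60))  # [2, 2, 2, 2, 2, 2] - even, no judder
```

An adaptive-sync display avoids the uneven cadence entirely by making the panel wait for each frame, which is the point being made above.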
 
If this is true, how come there are sooo many 30 fps games out there? Insomniac, ardent supporters of 60 fps, said gamers don't care enough and 30 fps with more eyecandy was more important. There are also clearly gamers who prefer the sharp visuals of 1080p over 720p, so a 720p60 would be the inferior game for them.

Because games are sold and predominantly marketed through visuals. Because framerate and motion are hard to demonstrate using 30fps YouTube captures. Plain and simple: a lot of games run at 30fps because a lot of (less technically minded) people are ignorant of what effect framerate has on a game. It's not as if there's a counter at the top right where you can read it off.

At the end of the day, a game becomes much more marketable if it has impressive graphics. Creating 60fps games isn't easy - it's probably a whole lot more difficult than creating 30fps ones. Then there is of course the point that many games don't require a fast framerate; as noted above, not all games are as fast-paced, so drawing pixels at that speed yields a smaller benefit than in a racing game, where conveying speed is a feature of the game.

Then there's also this point: if all games were developed at 30fps, creating a 60fps title at the expense of visuals would probably yield less favorable results when marketing the game. I believe developers like Insomniac, who previously created games at 60fps and then went the 30fps route, are fighting exactly that problem: in a world where many games push visuals to the forefront (at the expense of framerate), it's simply too difficult to reach parity if you're running at twice the framerate of the games you are competing with. At some point, you either join them or potentially lose sales. 30fps is especially a bullet point for the PR division - not for the gamer.

And regarding your comment about the relative failings of GT5 - I feel sorry for the people still having to play games on older consoles while we get to experience the luxury of next-gen consoles and super-clean graphics. It makes me wonder how I was ever able to enjoy games (and be equally immersed in them) 10-20 years ago, when we were effectively playing what can today best be described as a pixelated mess.

How times change... tell me: how do you even play and enjoy games on a PS4 knowing that there are PC games out there that are even cleaner and prettier? Gaming will always have some sort of compromise. Even a 30fps game today has it. The bigger point is: is the slightly bigger compromise needed to achieve a 60fps game really that much of an issue if the benefit from a gameplay perspective potentially far outweighs the improvement in graphics - a feature that will effectively always get better as technology progresses?
 
Because games are sold and predominantly marketed through visuals. Because framerate and motion are hard to demonstrate using 30fps YouTube captures. Plain and simple: a lot of games run at 30fps because a lot of (less technically minded) people are ignorant of what effect framerate has on a game. It's not as if there's a counter at the top right where you can read it off.
That's not true. I have a real life example of a game on PS3 that runs at 30 fps at 1080p, 60 fps at 720p, and my two friends couldn't clearly spot the difference and didn't care for it anyway. That had nothing to do with marketing and everything to do with user experience. In the same way some people like heavily compressed music while others hate it, some people honestly don't care about lower framerates.

How times change... tell me: how do you even play and enjoy games on a PS4 knowing that there are PC games out there that are even cleaner and prettier?
You live with it. Borderlands 2 on PS3 has a lousy framerate and is a jag-fest. I'd far prefer the clean visuals of a decent PC version. However, I picked it up for £7 and get to play it online with my friends without needing to buy new hardware, so the graphical shortcomings are something I live with as the overall package is a decent one.

Gaming will always have some sort of compromise. Even a 30fps game today has it. The bigger point is: is the slightly bigger compromise needed to achieve a 60fps game really that much of an issue if the benefit from a gameplay perspective potentially far outweighs the improvement in graphics - a feature that will effectively always get better as technology progresses?
"Far outweighs" hasn't been proven and I don't think can be proven. Unless some experienced gamer with a PC wants to go race a game at 30 fps and then at 60 fps and record their lap times to prove if 60 fps provides a better game experience, you're taking that on faith alone. IMO the greatly fluidity of 60 fps 'far outweighs' the 30 fps in exactly the same way the improved visual quality of 30 fps 'far outweighs' the quality of 60 fps, and the mild improvements that you feel 30fps brings to a racing game are important to some gamers for whom the 60fps is a similarly mild and unimportant improvement.
 
That's not true. I have a real life example of a game on PS3 that runs at 30 fps at 1080p, 60 fps at 720p, and my two friends couldn't clearly spot the difference and didn't care for it anyway. That had nothing to do with marketing and everything to do with user experience. In the same way some people like heavily compressed music while others hate it, some people honestly don't care about lower framerates.

Care to point out which game? I'd argue that part of the reason they didn't care for it was because the two games were vastly different beyond just the framerate differences. They also don't really sound like interested gamers either, although I'm happy to take your word for it if I am mistaken on that point.

Comparing different games (even in the same genre) is always going to skew the picture at some point, since there's more to a game than just framerate and visuals. That's why I keep coming back to the argument that it would be nice to conduct a blind study using the same game with two different design goals: one developed to perform at 30fps, maximizing what's available, and a second targeting 60fps, with compromises. Like this, you eliminate the other factors and end up with two more or less identical games - but playing vastly differently. The players taking part in this test would then have to decide specifically whether the better visuals are worth the trade-off - because there's effectively nothing more to judge.

This whole topic can pretty much be summed up in two relatively simple arguments: I believe the cost of making a game achieve 60fps is not necessarily that big if you are willing to sacrifice resolution (fillrate) and make up the rest through additional trade-offs. The second argument is that in a racing game specifically, the trade-offs you make are less obvious because they mostly impact fast-moving pixels. This is unique to racing games.

Of course, we can hypothetically imagine a game that absolutely maxes out a 30fps-dedicated engine and would be impossible to create at 60 - but I don't believe we have seen such a game, and it definitely isn't DriveClub (or else 60fps, which at some point was a probability, could never have been an option in the first place).
 
If this is true, how come there are sooo many 30 fps games out there? Insomniac, ardent supporters of 60 fps, said gamers don't care enough and 30 fps with more eyecandy was more important. There are also clearly gamers who prefer the sharp visuals of 1080p over 720p, so a 720p60 would be the inferior game for them.

Insomniac's review survey was really shoddily done and impossible to replicate, because they didn't publish their data. They also left out games like COD for no reason whatsoever.

Also, take a look at the metacritic scores and sales for their games after they published that survey. Both went down quite a bit.
 
Because games are sold and predominantly marketed through visuals. Because framerate and motion are hard to demonstrate using 30fps YouTube captures. Plain and simple: a lot of games run at 30fps because a lot of (less technically minded) people are ignorant of what effect framerate has on a game. It's not as if there's a counter at the top right where you can read it off.

I do not think this is the case. Look at COD for example.
 