Nope. HD supports 720p and 1080p outputs. We have three different framerates available though - 60, 50 and 24 fps. Not all TVs support all framerates, so I don't think devs can use them. I don't even know if a game can switch video output rate - it's only important for video output, so it may only be accessible via certain limited OS options.

Do the new consoles not come in PAL and NTSC versions like the old ones?
Whatever is being done in that real-life 30 fps movie to make it smoother (motion blur!) can be applied to a game to achieve the same result.

Do you mean a video of a racing game at 30 fps, or a video of real-life driving at 30 fps? If the latter, then the visual experience of watching a real-life driving video at 30 fps should still be smoother than a racing game running at 30 fps. At least that's my experience from watching in-car camera footage of races and playing racing games at 30 fps.
CE companies still make the divide. Buy a camcorder here and you'll likely only have the option to record at 50/25 fps, despite your TV being able to show 60/30. Fundamentally, every device should be switchable between refresh rates as the user requires, so record in 24, 25, 30, 50, or 60 fps and play back at native refresh on any modern display. Because the US is clearly 60 fps based, I think it likely that some modern TVs just cap at 60 fps and don't even have the option to switch to 50 Hz. Googlage shows someone whose US TV can display 50 Hz, someone saying US TVs disable 50 Hz, and someone saying their UK NowTV box only outputs 60 Hz (like Android), which introduces judder in UK programming.

Are PAL and NTSC still a really big deal?
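The judder mentioned above falls out of simple arithmetic: each source frame has to be shown for a whole number of display refreshes, so when the content rate doesn't divide the refresh rate evenly, the hold times alternate. A minimal sketch (illustrative toy code, not tied to any particular TV or box):

```python
# Sketch: why 50 Hz-region content judders on a 60 Hz display.
# Each source frame must be held for a whole number of refresh
# cycles, so the repeat pattern can't be even when rates mismatch.

def repeat_pattern(content_fps, display_hz, frames=5):
    """For each content frame, how many display refreshes show it."""
    pattern = []
    shown = 0  # display refreshes consumed so far
    for i in range(1, frames + 1):
        # Ideal cumulative refresh count after frame i, rounded
        target = round(i * display_hz / content_fps)
        pattern.append(target - shown)
        shown = target
    return pattern

# 25 fps on a 50 Hz display: every frame held exactly 2 refreshes.
print(repeat_pattern(25, 50))   # [2, 2, 2, 2, 2]
# 25 fps on a 60 Hz display: uneven 2:3 cadence -> visible judder.
print(repeat_pattern(25, 60))   # [2, 3, 2, 3, 2]
```

The even 2:2 cadence at 50 Hz looks smooth; the alternating 2:3 cadence at 60 Hz is exactly the judder UK programming picks up on a 60 Hz-only box.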
MrFox said: The most important feedback that needs fast reflexes is the tires' behavior, and this isn't visual feedback; if your brain detects a loss of grip by decoding your change of trajectory, it's already too late. Physical feedback is king here.
Okay, it may not be exactly the same, but the principle will be there. If it's motion blur that makes 30 fps real-life footage smoother (and that depends on the shutter speed - low-shutter-speed 30 fps and 60 fps sport is still 'jerky'), adding motion blur will make a 30 fps game similarly 'smoother'.

How? Adding fake motion-blur effects, I would think, is not the same as the natural blurring a video camera recording will produce, because a frame is effectively a picture at a shutter speed of 1/30th - while a video game renders 30 perfectly sharp images. Any blurring you induce at that point (if your game does it) is a fake blurring, perhaps using the last frame, but the result isn't quite the same - I would think?
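The difference the two posts above are circling can be made concrete. A camera's shutter integrates the scene over the whole exposure, smearing a moving object across every position it occupied; a cheap game-style blur just blends the current sharp frame with the previous output, ghosting old positions instead. A toy 1-D sketch (hypothetical code, not any engine's actual technique):

```python
# Toy contrast of camera-style shutter blur vs. cheap frame-blend
# blur. A "frame" is a 1-D strip of pixels with a bright dot on it.

def sharp_frame(pos, width=10):
    """Instantaneous game render: the dot lights exactly one pixel."""
    return [1.0 if x == pos else 0.0 for x in range(width)]

def camera_frame(start, end, width=10, subsamples=8):
    """Shutter open for the whole frame: average many sub-exposures,
    so the dot smears across every pixel it crossed."""
    acc = [0.0] * width
    for s in range(subsamples):
        pos = start + (end - start) * s // subsamples
        for x, v in enumerate(sharp_frame(pos, width)):
            acc[x] += v / subsamples
    return acc

def fake_blur(current, previous, alpha=0.6):
    """Game-style blur: blend this sharp frame with the last output.
    Cheap, but it only echoes old frames instead of integrating the
    true motion within this frame."""
    return [alpha * c + (1 - alpha) * p for c, p in zip(current, previous)]
```

With a dot moving from pixel 0 to 4, `camera_frame(0, 4)` spreads the energy evenly over pixels 0-3, while `fake_blur(sharp_frame(4), sharp_frame(0))` leaves one ghost at the old position - which is roughly why the two don't look identical, even if the blended version still reads as 'smoother' than raw sharp frames.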
I think we're diving into nitpicking territory. Maybe a better lighting model will allow you to judge the corner distance much more effectively, and 30fps would win in this case.

Physical feedback is worthless without the visual cue in a video game. Even in real life, noticing grip loss is not down to the sole fact that the steering wheel feels different - it's a continuous stream of visual, motion (acceleration/centrifugal force) and feel information that your brain processes simultaneously to form a picture of what is happening. Playing a video game, the majority of information you get comes predominantly through what you see on screen. Even a force-feedback device will be limited and will not make up for the visual feedback you lose by halving the framerate.
In other words - you could be running your game internally at 600 frames per second - it still wouldn't matter much if you're only seeing your updates on screen every 20th of a second.
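That latency floor can be shown with a couple of lines of arithmetic. A sketch, assuming a hypothetical 600 Hz simulation feeding a 20 Hz display (toy timings, no real engine):

```python
import math

# A simulation can tick very fast, but the player only learns
# anything when a frame is actually displayed. Here the sim runs at
# 600 Hz while frames go out at 20 Hz.

SIM_DT = 1.0 / 600      # simulation step, seconds (~1.7 ms)
DISPLAY_DT = 1.0 / 20   # display frame interval, seconds (50 ms)

def first_visible_time(event_t, display_dt=DISPLAY_DT):
    """Time of the first displayed frame at or after the event."""
    frames = math.ceil(event_t / display_dt)
    return frames * display_dt

event_t = 0.001  # grip loss happens 1 ms after the last frame
seen_at = first_visible_time(event_t)
print(f"event at {event_t * 1000:.1f} ms, seen at {seen_at * 1000:.1f} ms")
# The sim knew within one ~1.7 ms tick; the player couldn't see it
# until the next 50 ms frame boundary.
```

However fast the internal update, the player's visual reaction clock only starts at the next displayed frame.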
Actually, I think yes.

Would you be much better off with a wheel that takes 1/10 second less time to turn?
As it is I have roughly 1mm per 1% of throttle; having 1mm per 5% of throttle would be too sensitive for me. The chances of me hitting, for example, 42% throttle would be massively reduced if I had to press the throttle 8.4mm instead of 42mm.

What about pedals with a shorter travel and less resistance, giving you a 1/10 second improvement?
Of course you need a lot of visual information, but it's not in the area of a sub-100ms reaction time. We're not talking about playing a video game in general, only realistic racing games. Twitchy FPS are a different discussion entirely.
"Better game" is subjective. For some people, better visuals will result in the better game. It's not possible to conclude that 60 fps will always be better, even if the visual compromises can be assumed to be small.

In the context of judging the merits of 30 vs 60fps in a game, we need to understand the trade-off. If we are comparing a game running at 60fps but looking like a last-generation title to, say, DriveClub in all its 30fps glory, then yes, the visual difference is quite big. If we assume identical hardware though, and measure the drawback of 30fps versus 60fps against what is hypothetically possible, as I said, I'm not sure that with the right compromises the higher graphical fidelity at the expense of framerate will amount to a better game.
Instead of comparing it to a different game, look at GT5's shortcomings: static time of day, lousy shadows, 2D billboard people, etc. Now imagine all those shortcomings cleaned up by going 30 fps. That would make a prettier game. Some people would prefer that prettier game. Others would prefer the smoothness of the uglier game. It's impossible to call either version the better version. The best one could do is claim one to be the version preferred by more gamers.

Let's take the PS3:
We've had GT5/6 running at 60fps - compare that to NFS: Shift.
Without knowing what the actual changes are, that's impossible to answer for each individual you ask. It's very easy to propose a few barely noticeable downgrades as evidence that the game overall would be better off at 60 fps, but in reality the differences might be far more significant - enough that various gamers presently interested in DriveClub would be less so because it's not as pretty.

If we take DriveClub - which is where this discussion spawned from - how much worse would it look if the developer had targeted 60fps from the get-go? Perhaps take a cut in resolution, a few fewer cars. Would it be a worse game or a better game?
I think you're talking too much from a personal perspective. For me, coming around a corner in GT and seeing cardboard cutouts does break the immersion, as does having a saw-edge of a shadow creeping up the side of the car interior. They're things I can live with and learn to ignore, but I'd prefer them gone. Would I prefer them solved over 60 fps? No. Would everyone else categorically prefer weaker visuals and higher framerates? I don't imagine so. Saying that the player is focused on the game and not taking in the peripheral graphics is only true to a degree. Replace all GT's scenery with coloured blocks for a really simple, unrealistic view and it'll still race well and be good gameplay, but it'd be a different experience. The balance between visual fidelity and fluidity is still struck on a per-title basis depending on the devs' intentions, and the results will be subjectively interpreted. Some gamers will want simpler scenery and a higher framerate. Others will want the scenery to look as realistic as possible (like babcat on this forum!).

The game would perhaps be prettier when looking at static screenshots, or if you're a bystander watching someone else play (with all the time in the world to focus on visuals) - however, while playing the game yourself at high speed and being totally immersed in the gameplay, the difference between 2D billboard people and fully 3D ones, as an example, is likely to make little to no measurable difference to the overall immersion and gameplay.
If this is true, how come there are sooo many 30 fps games out there? Insomniac, ardent supporters of 60 fps, said gamers don't care enough and that 30 fps with more eye candy was more important. There are also clearly gamers who prefer the sharp visuals of 1080p over 720p, so a 720p60 game would be the inferior game for them.

If we have one game running at 30fps at 1080p and another at 60fps at 720p (plus some slight other drawbacks, since it's not all about fillrate), I'm still willing to bet we'd end up with two games that looked quite similar to the uneducated eye, with one playing vastly better than the other.
With all due respect, if you are going to dumb down the whole topic to the point of it being a subjective matter (which all things in life are), or of whether 30fps is "broken", then we might as well argue the merits of going all the way and downgrading the framerate to 15fps in order to enhance the game for those people who would rather look at games than play them.
That's not true. I have a real-life example of a game on PS3 that runs at 30 fps at 1080p and 60 fps at 720p, and my two friends couldn't clearly spot the difference and didn't care either way. That had nothing to do with marketing and everything to do with user experience. In the same way some people like heavily compressed music while others hate it, some people honestly don't care about lower framerates.

Because games are sold and predominantly marketed through visuals. Because framerate and motion are hard to demonstrate using 30fps YouTube captures. Plain and simple: a lot of games are at 30fps because a lot of (less technically minded) people are ignorant of what effect framerate has on a game. It's not as if you have a counter at the top right where you can read it off.
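For what it's worth, that missing "counter at the top right" is trivially computable from frame-presentation timestamps. A minimal sketch with synthetic timestamps (hypothetical code, not tied to any console's OS):

```python
# Sketch of an fps counter: estimate the rate from the timestamps of
# recently presented frames (synthetic data below).

def measure_fps(frame_times):
    """Average fps over a list of frame-presentation timestamps (s)."""
    if len(frame_times) < 2:
        return 0.0
    elapsed = frame_times[-1] - frame_times[0]
    return (len(frame_times) - 1) / elapsed

# A steady 30 fps capture: one frame every 1/30 s.
times_30 = [i / 30 for i in range(31)]
print(round(measure_fps(times_30)))       # 30
# The same second with every other frame dropped reads as 15 fps.
print(round(measure_fps(times_30[::2])))  # 15
```

A real overlay would use a sliding window of recent frames rather than the whole history, so dips show up immediately, but the arithmetic is the same.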
You live with it. Borderlands 2 on PS3 has a lousy framerate and is a jag-fest. I'd far prefer the clean visuals of a decent PC version. However, I picked it up for £7 and get to play it online with my friends without needing to buy new hardware, so the graphical shortcomings are something I live with, as the overall package is a decent one.

How times change... tell me: how do you even play and enjoy games on a PS4, knowing that there are PC games out there that are even cleaner and prettier?
"Far outweighs" hasn't been proven, and I don't think it can be proven. Unless some experienced gamer with a PC wants to race a game at 30 fps and then at 60 fps and record their lap times to prove that 60 fps provides a better game experience, you're taking that on faith alone. IMO the greater fluidity of 60 fps 'far outweighs' 30 fps in exactly the same way the improved visual quality of 30 fps 'far outweighs' that of 60 fps, and the mild improvements that you feel 30fps brings to a racing game are important to some gamers, for whom 60fps is a similarly mild and unimportant improvement.

Gaming will always involve some sort of compromise; even a 30fps game today has one. The bigger point is: is the slightly bigger compromise needed to achieve a 60fps game really that much of an issue, if the benefit from a gameplay perspective potentially far outweighs the improvement in graphics - a feature that will effectively always get better as technology progresses?