Quake 4 Xbox 360


Acert93 said:
I am challenging that fact. Anecdotal points from an online forum are not enough. We are all aware that there are people who are dead set on a certain framerate, resolution, etc... No one is questioning that. The question is: do as many people hate 30fps as you claim?

Just as I can't prove that there are x number of people who dislike 30 fps, you can't prove otherwise - so let me suggest we just agree to disagree, because we obviously won't get much further and arguing semantics will ultimately only get another topic closed.

Still, I think there are a lot of arguments suggesting that many gamers are very well aware of framerate differences - much more so on PC than in the console space, simply because PC gamers suffer framerate problems to a much larger extent than we do. It's quite obvious why: PC gaming revolves around owning the necessary equipment to run the game - the minimum requirements are usually quite basic, but if you really want the game to run well, you need a fast computer with a good graphics card. A lot of time is spent tuning all the effects to make the game run sufficiently well (= smooth enough) before most can enjoy PC games to the fullest extent. This is why I would assume that many Xbox users (who have a PC-gaming background) would be well informed about framerate as well.

Acert93 said:
My experience has been the complete opposite and I have NEVER seen any facts, studies, etc that show that people hate 30fps as much as 15fps.

You are looking far too deep into that statement I made. There aren't any 15 fps games in the console space as far as I know - it was hypothetical, as in: your disliking a 15 fps game is like many people disliking 30 fps games.

Acert93 said:
Actually, everything I have read shows that *most* people cannot distinguish the difference between a solid 30fps from 60fps.

Can you link me to any article suggesting this? I know it's a general impression one can get from reading forums. I'd like to point out, though, that this varies greatly with the type of game, and that many uninformed users notice a difference but fail to put it down to something specific like framerate. I'm sure a big majority does notice it, but they don't know it's framerate - how could they? These things aren't written in game manuals. They perceive a difference but can't pin down what exactly it is (certainly nothing technical like framerate - they wouldn't even know what framerate is).

I remember in the olden days, when I played NFS1 on my P133 MHz rig. It ran quite smooth, and when I finally upgraded to NFS1: Special Edition, I read all about the "better graphics" but at the same time failed to understand why the game felt "choppier". I did notice the framerate difference - it was still constant, but it was noticeably "slower". I really didn't know what framerate was though, not until much later, when I started using the internet and surfed technical boards that dealt a lot with this topic. I see many uninformed individuals falling into the same category - they notice the difference, but can't put it down to "framerate". And let's face it, the majority of console buyers are average Joes - people who are hopelessly ignorant of the technical side of things. They're not stupid; they would perceive the difference, but just because they can't put it down to framerate (the technical term) doesn't mean it doesn't matter.

There are a few basic facts, one of them being: the human eye can perceive differences in framerate up to a certain point - and this point is not 30 frames per second. It's a simple, undebatable fact that the human eye can perceive up to 60 pictures per second (after that, it becomes a non-issue because the difference isn't perceived anymore) - this isn't something that varies from human being to human being. We all have our preferences and opinions, but we are virtually identical in many of our limitations (in that we can distinguish differences in smoothness of motion up to a certain point).

Given that we human beings can distinguish differences in the smoothness of a given motion up to around 60 Hz (it's actually a bit higher, I think, according to various papers on eye research), I fail to see why 30 Hz should even still be a target for game developers, especially for faster-paced games. 60 fps enhances the experience in many ways (noticeably, even if not every single person puts it down to framerate) - I would think it's worth it on that basis alone!

Acert93 said:
e.g. PGR3 and NFS are both at the stage where they may, or may not, hit 60fps (IGN today on NFS). Both have said if given more time they COULD hit 60fps. So it is not a matter of graphics vs. framerate. It is a matter of, "My PUBLISHER wants the game out the door NOW. If we miss this holiday season we lose money and gamers miss out on the game".

This is not true - game development (or development of any kind) involves a lot more than blind faith and deadlines. As I already told you, framerate is something that needs to be planned into the design process, because everything depends on it: physics synching, how many textures you push through the pipeline, the geometry you transform - EVERYTHING. You don't just hit 30 fps or 60 fps on faith - you hit it when you plan for it, and you fail when you pack too much into the game. Slowdown occurs when too much is being processed, because at a given moment you may have too many objects on screen, all doing something that requires more processing time than under "ideal conditions". These things need to be factored in, and if they are factored in properly, slowdown will not be an issue.
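To make the budgeting idea concrete, here's a minimal sketch in C - the update()/render() stubs and all the names are invented for illustration, not taken from any real engine: every frame's work has to finish inside a fixed time budget, which is why the framerate target has to be a design decision rather than an afterthought.

```c
/* Minimal sketch of a per-frame time budget, assuming hypothetical
 * update()/render() stubs (invented names, not a real engine).
 * At 60 fps every frame's work must fit in ~16.7 ms; at 30 fps,
 * ~33.3 ms. Exceed the budget and the frame slips to the next
 * refresh, which the player sees as slowdown. */
#include <stdio.h>
#include <time.h>

#define TARGET_FPS      60
#define FRAME_BUDGET_MS (1000.0 / TARGET_FPS)   /* ~16.7 ms */

static double now_ms(void)
{
    return (double)clock() * 1000.0 / CLOCKS_PER_SEC;
}

static void update(void) { /* physics, AI, ... */ }
static void render(void) { /* draw calls, ... */ }

int main(void)
{
    for (int frame = 0; frame < 10; ++frame) {
        double start = now_ms();
        update();
        render();
        double elapsed = now_ms() - start;
        if (elapsed > FRAME_BUDGET_MS)
            printf("frame %d blew its budget: %.2f ms > %.2f ms\n",
                   frame, elapsed, FRAME_BUDGET_MS);
    }
    return 0;
}
```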

For example, Wipeout Fusion. The team aimed for 60 Hz, yet there are some very unfortunate moments where you have too many ships on screen, all firing a weapon at the same time - the result being that more needs to be processed, which obviously takes too much time (more than 1/60th of a second), so slowdown occurs. The team should have factored in these possibilities better, either by programming the AI not to fire once x ships had already fired in the last 10 seconds, or by making sure less processing goes on. These things need to be planned - they obviously didn't think of every possibility, and thus we have a game that suffers from slowdown when one of those situations occurs. Maybe at some point they thought "it's more important to launch the game and live with this 'minor' flaw" rather than solve it. The point, though, is that framerate was a part of the design, and somewhere down the line someone slipped up by not taking some (likely) scenarios into account, which ultimately leads to the slowdown in those unfortunate situations.

The better the planning, the better the end product - that's an integral part of any project with goals you want to meet. A team lowers its original design goal from 60 fps to 30 fps when it realizes at some point that its budget planning was unrealistic: it overestimated what its engine could handle, found out that in real life the engine is too slow, and thus 30 fps is the only possibility without losing a lot more time on engine tuning. This can be avoided, though - either with more realistic (easier) targets and estimates, or sometimes with more efficient programming. It's not uncommon to know you have to write a function that requires a certain speed and then simply find out that there's no way to write that function fast enough. Correct estimation is a very important aspect, and one where many teams are too optimistic and have to retract their original goals - which logically can be a significant factor when framerate targets aren't hit anymore.
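As a hedged sketch of the kind of throttle I'm describing (the cap, the counter and the function names are all invented for illustration): the AI simply refuses to start a new weapon effect once enough are already active, so the worst case can never exceed what was budgeted.

```c
/* Invented illustration of capping simultaneous weapon effects so
 * the worst case stays inside the frame budget. An AI ship calls
 * try_fire() and holds fire while the cap is reached. */
#include <stdbool.h>

#define MAX_ACTIVE_SHOTS 4   /* chosen so the worst case fits in 1/60 s */

static int active_shots = 0;

bool try_fire(void)
{
    if (active_shots >= MAX_ACTIVE_SHOTS)
        return false;        /* budget preserved: AI holds fire */
    ++active_shots;
    return true;
}

void shot_finished(void)
{
    if (active_shots > 0)
        --active_shots;
}
```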

I suspect many teams aim for 60 fps and then simply find out that they were overly optimistic, and downgrade to 30 fps. A possible solution would be to plan better (i.e. aim for less but achieve it, instead of aiming for something that sounds impressive but could be unrealistic).

In the end, I stand by my point that achieving a rock-solid 60 fps is as possible as achieving a rock-solid 30 fps. It all depends on what goals the developer teams set and how realistic those goals are. It also depends on how important achieving that framerate is to them. Unfortunately, I think in many cases 60 fps isn't a high priority, because 90% of gamers are ignorant fools who, while they would notice the difference, can't put it down to framerate, and because games are sold on the premise of "nice graphics". This is how games are sold, and this is what publishers want from their teams - give us a game that looks good and that we can market around key points like brilliant graphics. Graphics are sold through impressive screenshots - why do you think every team sends out oversized dev screens? So that it looks super duper impressive - something that will ultimately impress gamers and sell their game. No one sees framerate by just looking at a screenshot - usually, games reach the market without any definite answer as to what the framerate even is. The only ones who find out the definite framerate are those who know what to look for and can draw a conclusion (based on estimates). All the others will simply notice that it doesn't run as smooth or fluent - nothing more. They'll still love the game for its gameplay, and they'll already have bought it - but that doesn't mean the experience couldn't be much more fulfilling if the framerate were better.

Again, we humans can see up to 60 fps - we all have this gift - and while not everyone can put it down to framerate, we do perceive these differences. It should be a minimum requirement, especially now that HD resolutions are becoming the standard. It's like watching a movie on the big screen at the cinema as opposed to the smaller TV at home - it's much easier to notice a low framerate on a big screen than on a small screen of lower size and resolution.

But hey, the only reason for supporting 30 fps as opposed to 60 fps is more eye-candy!!!111 More impressive screens. More detail in the game. Sad but true.
I admit not all games require 60 fps, but games of certain genres do. If you can't appreciate that, it's time you compared Forza to a game like GT4 (side by side), or Halo to TimeSplitters. Feel the difference as you move your view from one side to the other... notice the responsiveness, the accuracy. It's worth it, which is why it should be mandatory.

If the majority of games were 60 fps, you can bet more people would notice the difference. There is an influx of 30 fps games on the market, which is why people tend not to notice - they can't compare, because so few games aim for 60 fps. But just because we are damned to have too many 30 fps games hindering our judgement - don't we still deserve better? :devilish:


[EDIT]
:oops: cookies?
 
Phil said:

What, you want a cookie for the nice long reply?

Sure!

[image: chocolate chip cookies]
 
:oops: thank you. :smile:

BTW, that last bit about comparing games wasn't necessarily directed at you - I know you aren't the average Joe.
 
I've been a long-time lurker on these forums and I just registered to say thank you, Phil, for putting into writing what I felt so strongly about but could never find the proper words for.
 
Phil said:
It was an example. NTSC is 60 Hz (NTSC games either run at 60 or 30 fps, PAL games at 50 or 25 fps). Also, Google is your friend.
Google is also my homepage, and I know the difference between NTSC and PAL refresh rates; but I started by asking about 30 fps or 60 fps and you responded with a comment about 50 Hz or 100 Hz, so I hardly see what your point is in telling me to search for anything. Beyond that, I still haven't seen an answer to my question, but I suppose it is a bit off the starting topic, so I'll just start my own thread on it.
 
kyleb said:
Google is also my homepage, and I know the difference between NTSC and PAL refresh rates; but I started by asking about 30 fps or 60 fps and you responded with a comment about 50 Hz or 100 Hz, so I hardly see what your point is in telling me to search for anything. Beyond that, I still haven't seen an answer to my question, but I suppose it is a bit off the starting topic, so I'll just start my own thread on it.

To answer your question: 45 fps on a TV would produce tearing. So I'm not sure how you're playing your games on a TV at 45 fps. Can't you notice the tearing?
 
kyleb said:
Google is also my homepage, and I know the difference between NTSC and PAL refresh rates; but I started by asking about 30 fps or 60 fps and you responded with a comment about 50 Hz or 100 Hz, so I hardly see what your point is in telling me to search for anything. Beyond that, I still haven't seen an answer to my question, but I suppose it is a bit off the starting topic, so I'll just start my own thread on it.

Just to add to what London-Boy just replied: if your TV refreshes at 60 Hz and your game runs at 45 frames per second, the game and the TV's refresh rate wouldn't be properly synched (in parallel), thus producing tearing.

If your game, however, runs at 30 fps and your TV refreshes at 60 Hz, then, in a simplified example, the TV would receive a new frame every second refresh - no fluctuations, and as a result no tearing. A game running at 60 fps on a TV with a refresh rate of 60 Hz would be running in parallel - as a result, you wouldn't have any tearing either.

If your game runs at 45 fps and your TV refreshes at 60 Hz, the fundamental problem is that while they both start out in parallel (the TV's 1st refresh = the game's 1st frame), from that point on they keep missing each other as the refreshes occur.

My example using PAL refresh rates was an unlucky one - living in PAL territory, I sometimes use those numbers while forgetting that the majority uses NTSC and the popular 30 / 60 fps. Hope that answers your question.
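To put a number on that, here's a small arithmetic sketch (purely illustrative, not any real engine code): it prints how far each frame's completion lands from the nearest 60 Hz refresh boundary. 30 and 60 fps always land exactly on a boundary; 45 fps drifts a third of a refresh either way, and that mismatch is what shows up as tearing without vsync.

```c
/* Arithmetic sketch: offset of frame completions from the nearest
 * 60 Hz refresh boundary, measured in refresh units (0.00 = aligned). */
#include <stdio.h>

int main(void)
{
    const double fps[] = { 60.0, 30.0, 45.0 };

    for (int i = 0; i < 3; ++i) {
        double frame_time = 1.0 / fps[i];       /* seconds per game frame */
        printf("%2.0f fps:", fps[i]);
        for (int f = 1; f <= 4; ++f) {
            double phase  = f * frame_time * 60.0;        /* in refreshes */
            double offset = phase - (long)(phase + 0.5);  /* from boundary */
            printf(" %+.2f", offset);
        }
        printf("\n");   /* 60 and 30 fps print zeros; 45 fps drifts +/-0.33 */
    }
    return 0;
}
```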
 
You won't get tearing with a lock to the vertical refresh though. Movies run at 24 fps and I don't see any tearing when I watch them on TV. Isn't this also the case for PAL>NTSC conversion, where frames are doubled? 45 fps could be shown by doubling up every third frame.
 
Shifty Geezer said:
You won't get tearing with a lock to the vertical refresh though. Movies run at 24 fps and I don't see any tearing when I watch them on TV. Isn't this also the case for PAL>NTSC conversion, where frames are doubled? 45 fps could be shown by doubling up every third frame.

DVD conversions run at 30 fps, so your TV always gets a 30 fps signal even though the source (the movie) ran at 24 fps in the cinema. I think it's all down to the DVD conversion, or even broadcasts. Basically, it fills in those 6 missing frames by showing one of every 4 frames twice.

Might be completely wrong though. I'm sure I read something like that somewhere.
 
Yes, they show frames twice. Likewise, a game could: it could show every third frame twice on a 60 Hz display to produce a 45 fps output, without tearing and, AFAIK, without visible jitters.
 
Shifty Geezer said:
Yes, they show frames twice. Likewise, a game could: it could show every third frame twice on a 60 Hz display to produce a 45 fps output, without tearing and, AFAIK, without visible jitters.

At some point, though, it will fluctuate, giving the sense of a fluctuating framerate (at times a frame would be displayed twice, at times not).
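A quick sketch of what we're both describing (illustrative only): map each display refresh to the source frame visible during it. For 45 fps on 60 Hz, one frame in every three is held for two refreshes; for 24 fps film on 30 Hz video, one in every four - and those unequal hold times are exactly the fluctuation I mean.

```c
/* Illustrative sketch: which source frame is visible during each
 * display refresh when a lower framerate is locked to the refresh
 * by repeating frames. Repeated numbers are frames held twice. */
#include <stdio.h>

static void show_pattern(int src_fps, int refresh_hz, int refreshes)
{
    printf("%2d fps on %2d Hz:", src_fps, refresh_hz);
    for (int r = 0; r < refreshes; ++r)
        printf(" %d", r * src_fps / refresh_hz);  /* frame at refresh r */
    printf("\n");
}

int main(void)
{
    show_pattern(45, 60, 12);  /* 0 0 1 2 3 3 4 5 ... : every 3rd frame held */
    show_pattern(24, 30, 10);  /* 0 0 1 2 3 4 4 5 ... : every 4th frame held */
    return 0;
}
```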

(PS: we should stick to one topic :D )
 