Quake 4 Xbox 360

Phil said:
(i.e. PAL are 50Hz or more often 100 Hz)
PAL is always 50Hz. Some TVs framedouble internally to 100, but there's no new information displayed in that process, no new frames appear out of thin air. Just the same old frames displayed twice to get rid of interlace flickering.

This is only applicable to CRTs and CRT projection sets, naturally...
 
Inane_Dork said:
Wha? We already have developers not too happy with MS for forcing 720p or greater. The simple reason being that it's better for the developer to decide whether the game looks better at 480p or 720p or whatever else. Maybe some effect is possible at 480p that is simply out of the question at 720p.

Forcing 60 fps is just as bad. It's better for developers to decide whether the game looks better at 30 fps or at 60 fps.

I think it's quite obvious that a game will always look better at a lower framerate. Still, I think there should be a minimum default that developers should aim at. I think it's safe to assume you wouldn't be happy with 15 fps games either - as much as you'd dislike that, there's an equal number of people that are unhappy with 30 fps games as well. Especially knowing that the Xbox draws a lot of PC gamers due to the nature of the support it's getting - how many of those do you actually think are happy to play their Halos and Gears of War at 30fps when all their life they've been playing at over double the framerate? 60 fps should be a minimum requirement at least for first-person shooters, racers and platformers - preferably also for slower-paced action games, or 3D games in general that rely on big worlds and fast responses. Aiming at 60 fps is a small sacrifice with many benefits (games are more immersive, smoother and allow for better responsiveness through more visual feedback). If everyone followed that guideline, things would be half as bad.

The higher the resolution and the larger the TV screen, the more obvious it gets that framerate is a very important aspect. 30 fps is simply not sufficient given we're moving into a "high definition" era. It wasn't this generation and it certainly isn't going to be next generation either.

Personally, I think Microsoft, or whoever can enforce such guidelines, had better start doing so for the sake of the gaming experience. I think it's about time developers started concentrating on how to deliver the best possible experience for the player, not the shiniest details in screenshots.

----------

Guden: I know. ;) I should have worded my reply better.
 
I think that some people here have to understand that 60 fps is a must with fast games where reaction is VERY important (racing games), but not necessary with certain other games. E.g. GOW is a game that can look smooth at 30 fps since it's a slow-paced game.
 
Alstrong said:
Plus, I'd rather have the framerate locked at 30fps than jumping in between 30 and 60.
Why?

For example, an FPS that runs at 60 fps when you're just walking around, but drops to 30fps when there's some heavy fighting going on.
A racer that runs at a smooth 60fps most of the time, but drops to 30fps at the start line when all the cars are on screen at the same time.

I'd rather take the 30-60fps, than 30fps throughout the game.
What are the benefits of constant 30fps over 30-60fps in your opinion? Do you mean that if the framerate is constant, it helps the immersion as the fluctuation causes distraction or what?

A 30-60fps game rarely, if ever, jumps between 30 and 60 fps every second or two; I don't think the change in framerate is as distracting as a bad framerate all the time.
 
I prefer a consistent framerate. If locked to a framerate you get used to it, but if it changes up a gear you get silky smooth visuals and then, oh, it's 30 fps. Ho hum. Ah, I've been playing a while now and don't notice it and... oh, look, 60 fps! This is more like it! Really smooth and... oh, back to 30 fps. People get used to stuff and then don't notice it so much until it changes.
 
Phil said:
...because the tv has a refresh-rate as well, so the framerate needs to be synched... (i.e. PAL are 50Hz or more often 100 Hz)
Discounting the fact that some console games don't even use vsync, how are you suggesting that 30fps or 60fps are more suited to a 50Hz or 100Hz refresh than something like 45fps would be? I'm at a loss to follow you, as I often play PC games on my TV (at 60Hz) and adjust the graphics options to shoot for the 45fps mark with vsync on. So again I ask: I don't understand why there is a fixation on 30fps or 60fps - why not shoot for a middle ground of about 45fps?
 
Last edited by a moderator:
Phil said:
I think it's safe to assume you wouldn't be happy with 15 fps games either - as much as you'd dislike that, there's an equal number of people that are unhappy with 30 fps games as well.
Link.

Especially knowing that the Xbox draws a lot of PC gamers due to the nature of the support it's getting - how many of those do you actually think are happy to play their HALOs and Gears of War at 30fps when all their life they've been playing at over double the framerate?
I'm a PC gamer, and I rarely play at 60 fps. My rig isn't good enough. I think you dramatically oversimplify the PC and Xbox audiences with that remark.

Aiming at 60 fps is a little sacrifice with many benefits (games are more immersive, smooth and allow for better responsiveness through more visual feedback).
By the same logic, aiming for 720p should be universally promoted, yet not all developers are on board. And I trust them a heck of a lot more than I trust just some guy on the net.

Personally, I think Microsoft, or whoever can enforce such guidelines, had better start doing so for the sake of the gaming experience. I think it's about time developers started concentrating on how to deliver the best possible experience for the player, not the shiniest details in screenshots.
How horribly arrogant of you to say such a thing. The only thing 30 fps games get that 60 fps games do not is shiny details? There is absolutely NO support for that claim.

Developers should and will choose for the good of the game. Frame rate is one of the variables. To take away one of the variables is to make the game possibly worse. I mean, if it's really such a slam dunk in favor of frame rate, why is developer support so split on the topic? If it's really so cut-and-dried, you'll have absolutely nothing to lose by making it optional.
 
The simple truth is the game-purchasing public at large couldn't give two shits about 60fps.

When have you ever seen a game ad advertising framerate? When has a trailer ever mentioned 60FPS as a selling feature? Has a game ever sold less because of being 30FPS? Has a game ever received a lower review for being 30FPS?

No.

There's just a very small, outspoken PC base that whines and complains at every game not being 60FPS, but the vast majority of people can't even tell the difference, let alone care enough to make a big deal out of it.
 
Halo was sub 30fps a lot of the time. It was enough for me.

The 60fps thing honestly seems to come more from gamers who started gaming in 1999 and onward. They didn't grow up on computers without acceleration, or games where fps wasn't really important. I can play Guild Wars at 1920x1200 at 20fps or so and be fairly happy. For the A.D.D. FPS gamer 30fps isn't enough, and I can pretty much agree with that. But there are FPS games that can be played at that speed. Morrowind is just fine at that pace.

Thus, the niche isn't even "online gaming freaks" in general but "online FPS gaming freaks". :) Hell, my sister is a pretty big gamer and she played Morrowind on a Celeron 700 with a Radeon 7200 for a long time. It's all relative. The FPS people are just WAY more whiny cuz their games demand it.
 
Hardcore gamers aged 25+ must have played a lot of games BACK IN THE GOOD TIMES at 5-15fps :)

I enjoyed many flight simulators and tank combat games on my Amiga and PC at those kinds of framerates...
 
rabidrabbit said:
"HD Era" really only begins at beyond 32", you don't benefit from the extra resolution enough until 40" and above.

I keep hearing this once in a while, and I can't understand why people keep saying this. Even on a 22" tv 720p looks far superior to 480p.
 
Dr Evil said:
I keep hearing this once in a while, and I can't understand why people keep saying this. Even on a 22" tv 720p looks far superior to 480p.

Yep. That's pretty much like saying that on a 17" monitor, there should be no point in playing around with resolutions, cause you only start seeing the results after 32"...??? :???:
 
Sure it looks better, but unless you sit 3-4 feet from your TV, you won't see all the definition that HD provides.

On a 22" TV the correct viewing distance would be 3 feet.

Who sits 3 feet from the TV? Any more than that and you won't be able to make out any of the HD details.

In a regular living room people sit at least 5 feet from the TV, the correct viewing distance for HDTV is around 12" from the TV for every 7" of screen. So for 5 ft, you would want a 35" TV minimum in order to make out all the HD details.

It's a fairly valid statement to say that 32" is pretty much the bottom for effective HDTV, but of course, if you wanted to sit 3ft-4ft from the TV you could get away with a 21"-28", don't think it would be too great on your eyes though.
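The rule of thumb quoted above - roughly 12 inches of viewing distance per 7 inches of screen diagonal - can be turned into a quick calculation. This is just a sketch of the post's own arithmetic, not an official standard:

```python
# Rule of thumb from the post (not a standard): to resolve full HD detail,
# sit no farther than ~12 inches away per 7 inches of screen diagonal.

def min_screen_for_distance(distance_ft):
    """Smallest screen diagonal (inches) that still shows full HD detail
    at the given viewing distance (feet)."""
    distance_in = distance_ft * 12
    return distance_in * 7 / 12

def max_distance_for_screen(screen_in):
    """Farthest viewing distance (feet) at which a screen of the given
    diagonal (inches) still resolves full HD detail."""
    return screen_in * 12 / 7 / 12

print(min_screen_for_distance(5))   # 5 ft sofa -> 35 in screen
print(max_distance_for_screen(22))  # 22 in TV -> ~3.1 ft
```

The numbers agree with the post: a 5 ft viewing distance calls for a 35" screen minimum, and a 22" set puts you about 3 feet away.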
 
Phil said:
I think it's quite obvious that a game will always look better at a lower framerate. Still, I think there should be a minimum default that developers should aim at. I think it's safe to assume you wouldn't be happy with 15 fps games either - as much as you'd dislike that, there's an equal number of people that are unhappy with 30 fps games as well.


I believe Microsoft does have a 'minimum framerate' spec as part of their "720p, 2x AA, etc" mandate. It's not 60 tho. :)

J
 
kyleb said:
Discounting the fact that some console games don't even use vsync, how are you suggesting that 30fps or 60fps are more suited to a 50Hz or 100Hz refresh than something like 45fps would be? I'm at a loss to follow you, as I often play PC games on my TV (at 60Hz) and adjust the graphics options to shoot for the 45fps mark with vsync on. So again I ask: I don't understand why there is a fixation on 30fps or 60fps - why not shoot for a middle ground of about 45fps?

It was an example. NTSC is 60 Hz (NTSC games either run at 60 or 30 fps, PAL games at 50 or 25 fps). Also, Google is your friend.
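Phil's point about refresh rates can be made concrete with a small sketch (mine, not from the thread). With vsync on a 60 Hz display, each finished frame waits for the next vertical blank, so presentation intervals are quantized to multiples of ~16.7 ms: a ~30 fps renderer lands on every second vblank and paces evenly, while a ~45 fps renderer alternates between one- and two-vblank intervals, which shows up as judder.

```python
import math

# Sketch: vsynced presentation on a 60 Hz display. A frame that finishes
# rendering at time t is shown at the first vertical blank at or after t,
# so on-screen intervals are quantized to multiples of the refresh period.

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def displayed_frame_times(render_ms, n_frames):
    """Return the intervals (ms) between successively presented frames."""
    intervals = []
    finish = 0.0       # when rendering of the current frame completes
    last_shown = 0.0   # vblank at which the previous frame was presented
    for _ in range(n_frames):
        finish += render_ms
        shown = math.ceil(finish / VBLANK_MS) * VBLANK_MS
        intervals.append(shown - last_shown)
        last_shown = shown
    return intervals

print(displayed_frame_times(33.0, 6))  # ~30 fps render: even ~33.3 ms pacing
print(displayed_frame_times(22.0, 6))  # ~45 fps render: mix of ~16.7/~33.3 ms
```

The 45 fps case never produces evenly spaced frames on a fixed 60 Hz display, which is presumably why console games target integer divisors of the refresh rate (60 or 30 fps on NTSC, 50 or 25 on PAL).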


inane_dork said:

What, you want me to link to each individual out there? Get a grip - look around some boards, especially check the Forza threads to see the bickering many have with the framerate. Those that do notice do care. (And it is noticeable - if you don't notice it, you probably haven't played enough 60 fps games.)

inane_dork said:
How horribly arrogant of you to say such a thing. The only thing 30 fps games get that 60 fps games do not is shiny details? There is absolutely NO support for that claim.

Of course there is. Slower framerate, more details; faster framerate, fewer details. The first will always look better, simply because there's twice as much time to render each frame (more can be done).

inane_dork said:
Developers should and will choose for the good of the game. Frame rate is one of the variables. To take away one of the variables is to make the game possibly worse. I mean, if it's really such a slam dunk in favor of frame rate, why is developer support so split on the topic? If it's really so cut-and-dried, you'll have absolutely nothing to lose by making it optional.

It's split because games are being sold on the premise of having good graphics - graphics that are marketed through screenshots, because those end up in magazines, on the internet and on the cover of every single game, and leave an impression. There are numerous examples of games where framerate shouldn't have been sacrificed - namely Forza, PGR2, Halo, Killzone... The framerate was a flaw in these games.

And as for this whole talk about "rather a constant 30 fps than a fluctuating 60 fps" - that is obvious, but it's also obvious that it's just as possible to make a game run at a constant 60 fps as at a constant 30 fps, given it was part of the design choices and a high priority. When you develop a game, goals are set and the aim is to achieve them. Whether the target is 30, 60 or 120 fps - if it's factored into the planning process, any planned framerate is achievable without drops.
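Phil's "plan for it from day one" argument boils down to a fixed per-frame time budget that every subsystem has to fit inside. A hypothetical sketch (the subsystem costs below are made up purely for illustration):

```python
# Hypothetical sketch: a framerate target is really a per-frame millisecond
# budget. Halving the target framerate doubles the budget, which is where
# the extra detail in 30 fps games comes from.

def frame_budget_ms(target_fps):
    """Milliseconds available per frame at the given target framerate."""
    return 1000.0 / target_fps

# Made-up subsystem costs (ms) for illustration only:
costs = {"game logic": 4.0, "physics": 3.0, "rendering": 7.5, "audio": 1.0}

for fps in (30, 60):
    budget = frame_budget_ms(fps)
    used = sum(costs.values())
    print(f"{fps} fps: budget {budget:.1f} ms, "
          f"used {used:.1f} ms, headroom {budget - used:.1f} ms")
```

With these assumed numbers the 60 fps target leaves barely 1 ms of headroom while 30 fps leaves almost 18 ms, which is why the framerate decision has to be made early: the whole content pipeline is sized against the budget.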
 
Phil said:
I think it's safe to assume you wouldn't be happy with 15fps games either - as much as you'd dislike that, there's an equal amount of people that are unhappy with 30 fps games as well.

Inane_Dork said:

Phil said:
What, you want me to link to each individual out there? Get a grip - look around some boards, especially check the Forza threads to see the bickering many have with the framerate. Those that do notice do care. (And it is noticeable - if you don't notice it, you probably haven't played enough 60 fps games.)

You made a statement of fact.

# of people who dislike 15fps = # of people who dislike 30fps

Technicalities aside (obviously the number of people upset with 30fps would be upset with 15fps in addition to those who can tolerate 30fps and not 15fps), I think your point was the number of people who can tolerate 30fps and not 15fps is equal to the number of people who can tolerate 60fps and not 30fps.

I am challenging that fact. Anecdotal points from an online forum are not enough. We are all aware that there are people who are dead set on a certain framerate, resolution, etc... No one is questioning that. The question is whether as many people hate 30fps as you claim.

My experience has been the complete opposite, and I have NEVER seen any facts, studies, etc. that show that people hate 30fps as much as 15fps. Actually, everything I have read shows that *most* people cannot distinguish a solid 30fps from 60fps. And what keys most people off is not 30fps, but when a game jumps up and down. When the framerate fluctuates between 30 and 60 fps, the eye catches the difference. The eye can capture only so much, but it can perceive shifts in fluidity quite well.

You are a self-admitted 60fps junkie. I can respect that--I am too :devilish: That is fine--we all have strong preferences and buy accordingly. But to insist our preference is the preference of the industry, and yet not offer any clear numbers to back it up, is too much. No one is questioning that 1.) some people are sensitive to sub-60fps or 2.) a large widescreen TV will exaggerate the lack of frames.

But I do not agree, based on anything I have read, that in general that most people are unsatisfied with 30fps.

From my understanding most people cannot tell the difference between a stable 30fps and stable 60fps.

Obviously, given a choice, every consumer would choose the bigger number (duh), but I too would like a link showing a great number of consumers can tell the difference between 60fps and 30fps (let alone prefer one over the other on pain of death).

I think you and I are a select breed and do NOT represent the industry. A link of a study disproving this would be nice of course... because then I can rant about 60fps all the time as well :devilish:

Until then I don't buy this position. I can tell the difference and much prefer 60fps, but I can live with 30fps if it is solid. There were quite a few N64 games I never finished due to the combination of fast action + uneven framerate.

Ps- A couple of people, even dozens, on an internet forum are not representative of the typical consumer, or even a lot. That is not to invalidate their opinion and preference--because it does not--but the question is one of numbers. You suggested an equal contrast in numbers between 15/30 and 30/60. So like Inane, I would like some proof of this.

Until then I have to chalk this up to your general fps ranting :p

Phil said:
And as for this whole talk about "rather a constant 30 fps than a fluctuating 60 fps" - that is obvious, but it's also obvious that it's just as possible to make a game run at a constant 60 fps as at a constant 30 fps, given it was part of the design choices and a high priority. When you develop a game, goals are set and the aim is to achieve them. Whether the target is 30, 60 or 120 fps - if it's factored into the planning process, any planned framerate is achievable without drops.

There is a balance between project completion and optimization.

e.g. PGR3 and NFS are both at the stage where they may, or may not, hit 60fps (IGN today on NFS). Both have said if given more time they COULD hit 60fps. So it is not a matter of graphics vs. framerate. It is a matter of, "My PUBLISHER wants the game out the door NOW. If we miss this holiday season we lose money and gamers miss out on the game".

If most gamers really cared they would not buy these games. Yet they do. Why? Because most people can overlook the issue.

This is an example where, since the devs are under the gun (they need to get the game out and start a new one) and most consumers don't care, 30fps is an acceptable product to the market. Stinks for us... YES!! :devilish: This is an example where the mainstream "acceptance" level means a less-than-perfect product.

But there are a LOT of games out there. MS has announced 160+ for the Xbox 360 and Sony 100+ for the PS3. Each game will cater to their market.

And that goes for ALL graphics. Some of us HATE anime. Others hate retro or grunge/earthy graphics. The balance of "what looks good" is different across the board.

Those who love 60fps will get games that are 60fps. Those who prefer 30fps games will have those as well.

There is no "one is better" in that not all consumers have the same tastes, cares, or perception. Theoretically, if GPU rendering was the bottleneck, a 30fps game could look 2x as "good". Some gamers would PREFER that over a game that is 60fps.

I am not willing to say, as a rule, 60fps > looks.

Many gamers will prefer a 30fps game with 2x the detail.

Publishers should get to choose that. If gamers want it they will buy it. If they don't the publisher will hurt. So far gamers are not telling publishers with their $ they hate 30fps games.

Obviously I hate rushed titles, and prefer 60fps, but I am not convinced it is the end of the world. I remember someone mentioning the other day that one of the MGS games ran at 30fps and it did not hurt sales. Ditto Halo (everyone I know with an Xbox loves it... much MUCH more than TS, which I like a lot!). Mario 64 was not a constant 30fps, nor Zelda: OoT at times, yet people LOVED these games. Ditto GTA3. I can think of quite a few 60fps games that underwhelmed. If a lower FPS means better gameplay/AI etc., then so be it. And given the choice of Halo 2 in fall of 2004 @ 30fps or in fall 2005 @ 60fps, well, I think the answer is simple. People will put up with fewer frames for an earlier release. So we are victims of the market.

Choppy framerate may be a flaw, but it is one most people do not notice. Good or bad, that just seems to be the case.
 
london-boy said:
Yep. That's pretty much like saying that on a 17" monitor, there should be no point in playing around with resolutions, cause you only start seeing the results after 32"...??? :???:
Yes, but I meant in livingroom conditions, where normal people usually sit further away from the telly than when sitting in front of a PC monitor.
I play console games in my livingroom, sitting about 3 meters from the telly, and I'm sure that on a 32" TV the high definition resolution would pretty much be wasted. I am not planning to play my consoles and watch HD films the way I play on a PC, sitting uncomfortably near and blocking the view for others.
I think the majority of families have their livingrooms like I do. I'm aware that nerds like to use the PC, watch TV and movies from a computer monitor, but they're in the minority.

If I had my sofa a meter from the telly, then I sure could see the difference between SD and HD on a 32".
 
expletive said:
I believe Microsoft does have a 'minimum framerate' spec as part of their "720p, 2x AA, etc" mandate. Its not 60 tho. :)

J
IIRC the minimum was 15fps, which isn't a minimum worth mentioning.
 