Gears of War behind the scenes

PARANOiA said:
But do you honestly think every game on PS3/X360/Rev will run at 50/60fps? I'd like to say it will, but I don't see a hope in hell of it happening.

There should be more than enough power to do that. Most of the PS2 games I own run at 50/60 fps. Now and then I'm sad when I see new PS2 games that don't run at 50/60 fps.
If I could make the priority list for how things should be next gen:

1. A solid 50/60 fps framerate.
2. 4x or better AA (depending on resolution, 480p or 720p...).
3. Overall eye candy: textures, models, etc.
4. 720p resolution.
 
I don't want to turn this into another framerate discussion. There are plenty of those. All I will say is that how important it is depends on the game, and as long as there is new hardware there will be developers who will attempt to bring it to its knees if it means outputting a prettier picture at a solid 30fps. Even some PS3 developers have said they are shooting for 30fps.
 
Jogi said:
There should be more than enough power to do that. Most of the PS2 games I own run at 50/60 fps. Now and then I'm sad when I see new PS2 games that don't run at 50/60 fps.

IMHO 60fps is a must for racers, but in other games I'd rather have a solid 30fps with loads of enemies and detail on screen...
 
I agree that a steady, stable and fluid framerate is something most next gen games should have, and that framerate should be 60 or above.
It might not be so important in fixed camera adventure games, but in faster moving games like racers, fps and tps shooters, fighters, football/soccer... 60fps is a must imo.
I wouldn't tolerate choppy video in movies, so why should I happily let my eyes be strained and get a headache playing visually detailed next gen games that run at a jerky 30fps?
Even if the 30fps is "stable", i.e. not dipping below it, it still gives a strong impression that there's something very wrong with the motion.
 
rabidrabbit said:
Why is >60 fps bullshit for next gen consoles?

Because a TV won't be able to display it; the screen would tear.
A monitor can handle higher refresh rates, but not that much higher. And it still looks odd when the framerate is higher than the refresh rate.
 
As the next gen consoles are trumpeted to be the "HD era", they'll likely be used with high resolution progressive scan display devices, not unlike PC monitors, with higher refresh rates than tube TVs.
OK, maybe I put it wrong; I should have said "at least 60fps".
 
No. The games must be compatible with old TVs...

So you're looking at 30 or 60 (as 50 is supposedly dropped).

IMHO > 60 is a waste of resources anyway.
 
rabidrabbit said:
As the next gen consoles are trumpeted to be the "HD era", they'll likely be used with high resolution progressive scan display devices, not unlike PC monitors, with higher refresh rates than tube TVs.
OK, maybe I put it wrong; I should have said "at least 60fps".

Well, the HD era refers to the resolution. All HDTVs still run at 60Hz.
Though I can see how consoles will be able to connect to monitors, I'm not sure developers will want to change their engines for it. It's much better for them to maximise detail in the game while still being able to run at a stable 60fps. On a monitor, 60fps is crap, but it's still the same discussion: how many people will hook consoles up to monitors?

Much better to target 60fps for everyone - we already have enough "options" with resolutions and stuff - even then, we're still seeing 30fps games in the next generation so... :|
 
london-boy said:
On a monitor, 60fps is crap, but it's still the same discussion: how many people will hook consoles up to monitors?

I will ;) And as long as they support it I will use it!

Much better to target 60fps for everyone - we already have enough "options" with resolutions and stuff - even then, we're still seeing 30fps games in the next generation so... :|

30fps will be with us forever. As long as the HW leaves developers wanting more power, there will always be a tradeoff of graphics vs. framerate. If you are limited in some graphic aspect (let's say polygons), you could theoretically double the graphical finesse of your title by cutting the framerate in half, while keeping the game playable. And we all know that screenshots and stills have an influence.
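
A rough back-of-the-envelope sketch of that tradeoff, nothing engine-specific, just the per-frame time budget arithmetic (illustrative numbers only):

```python
# Halving the target framerate roughly doubles the time (and thus the polygon
# or effects budget) available to each frame. Illustrative arithmetic only.

def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds of CPU/GPU time available per frame at a given framerate."""
    return 1000.0 / target_fps

for fps in (60, 30):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# 60 fps -> 16.7 ms per frame
# 30 fps -> 33.3 ms per frame
```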

And the other factor is rushed games... publishers will always want a game out the door. If it is 30fps and the deadline has come, that is good enough. Waiting 2 months to get 60fps and missing the holidays is NOT an option.

Sad facts, but they will be with us for as far as I can see :( Ironically, I know too many people who cannot tell the difference. Seeing someone call a 30fps game a 60fps game indicates that a lot of people just cannot tell, or don't care to tell, the difference. As long as it is not jumpy, a lot of consumers do not care. Sad, but true.
 
rabidrabbit said:
Ok, a fixed 60fps then, no more no less :)

I have had lots of fun with Halo 2 even though it's only 30fps, and I rarely hear people complaining about Halo 2's framerate. A solid 30fps is enough; 60 is better, and in racers its benefits are clear, but even there I wouldn't say it's a must. Forza is a pretty damn good game even though it's only 30fps. If you play a 60fps game first and then switch to 30fps it feels really choppy, but after a while your eye gets adjusted to it and it doesn't bother you that much anymore, imo of course.
 
Well, I think companies should force developers to go 60fps except in very strictly regulated situations.

Like MS said all games will have 720p and 4xAA. I wish they could also force "all games will have a stable 60fps". If the game doesn't, lower the detail till it does. Don't like it? Tough.
 
Yes, fortunately the eye is very adaptive.
Even GTA:SA's framerate doesn't bother you that much after you've got used to it, and after you've had your first couple of migraine attacks.
But in all cases, a 30fps framerate does imo make the game experience much less than what it would be at 60fps; it reduces the "feeling of being there" considerably.
Don't know, maybe some people are just more sensitive to motion.
 
I could have sworn it was 2x AA that was required. I cannot find the original source. There is a post on Major Nelson that says this (but again, not a primary source). The closest I could find:

FiringSquad: You said earlier that EDRAM gives you AA for free. Is that 2xAA or 4x?

ATI: Both, and I would encourage all developers to use 4x FSAA. Well I should say there’s a slight penalty, but it’s not what you’d normally associate with 4x multisample AA. We’re at 95-99% efficiency, so it doesn’t degrade it much is what I should say, so I would encourage developers to use it. You’d be crazy not to do it.

If 4x AA were required there would be no debate; but the ATI spokesman seems to indicate that they are "encouraging" all developers to use 4x, while the Xbox "requirements" said all games would have AA. That indicates, to me, that 2x AA is required but that developers might as well use 4x AA.

Semantics to a degree, because even if only 2x is required it seems most would use 4x. If there is truly only a 1-5% performance hit, I doubt any sane developer would not take the small penalty. At worst, if 5% is the realistic worst case scenario, a game running at 30fps would drop to 28.5fps, and a 60fps game would drop to 57fps.
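
Just to sanity-check that arithmetic, assuming the AA cost simply scales the framerate down linearly (my assumption, not something ATI stated):

```python
# Effective framerate after giving up a fraction of the frame budget to AA,
# assuming the penalty scales linearly and the game is GPU-limited.

def fps_after_hit(base_fps: float, hit_fraction: float) -> float:
    """Framerate left over after losing hit_fraction of performance."""
    return base_fps * (1.0 - hit_fraction)

for base in (30, 60):
    print(f"{base} fps with a 5% AA cost -> {fps_after_hit(base, 0.05):.1f} fps")

# 30 fps with a 5% AA cost -> 28.5 fps
# 60 fps with a 5% AA cost -> 57.0 fps
```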

But then again, for a game like a racer or fighter, where framerate is really important, I guess I could see them trying to squeeze out an extra fps if they were at the edge between playable and unplayable.
 
rabidrabbit said:
Don't know, maybe some people are just more sensitive to motion.

It's got to do with IQ.

The higher the framerate changes you see, the higher the IQ and the bigger the penis.
 