Sooo...that's probably how they are going to do it...???

Conker Live

Newcomer
I think I know how the developers are going to get that big graphical leap over the current generation on X2, PS3 and N5.... They are going to render most games at 30 fps! OK, not too long ago I read an Xbox 2 thread (here or at gaming-age) that claimed a developer said most games of the next generation will run at 30 fps (at least the first generation will). I thought, what the hell is this crap... why would developers go with 30 fps with all this power coming their way? Is it due to the complexities of these multi-core CPUs? Maybe in part, but I bet most games will be at 30 fps because I don't see an advantage to 60 fps for most next-gen games. Why? Because if the physics and animation are calculated at 120 fps or so, then the 30 fps will be as smooth as TV. I believe this technique is what GT5 will be using on PS3. This can be done by using different threads for physics, animation, etc.... not an easy task but definitely doable. I'm not saying that ALL games will be at 30 fps... I'm saying that most games will be just fine at this render rate, including fighting and racing.
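Something like this is what I mean (just a rough sketch I threw together; the function names are made up, nothing from a real engine), with the simulation ticking at 120 Hz while the renderer only presents at 30 Hz:

Code:
#include <chrono>

// Very rough sketch of a decoupled update loop. All function names are
// hypothetical. The simulation (physics/animation) steps at a fixed 120 Hz,
// while a new frame is only presented at 30 Hz.
int main() {
    using Clock = std::chrono::steady_clock;
    const double simStep    = 1.0 / 120.0;  // physics/animation timestep (s)
    const double renderStep = 1.0 / 30.0;   // presentation interval (s)

    double simAccum = 0.0, renderAccum = 0.0;
    auto previous = Clock::now();

    for (;;) {  // a real game would have an exit condition here
        auto now = Clock::now();
        double elapsed = std::chrono::duration<double>(now - previous).count();
        previous = now;
        simAccum    += elapsed;
        renderAccum += elapsed;

        // Step the simulation as many times as needed to hold 120 Hz.
        while (simAccum >= simStep) {
            // updatePhysicsAndAnimation(simStep);  // hypothetical
            simAccum -= simStep;
        }

        // Only draw and present when a 30 Hz frame is due.
        if (renderAccum >= renderStep) {
            // renderAndPresentFrame();             // hypothetical
            renderAccum -= renderStep;
        }
    }
}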

Think about it.... DVD movies are smooth but they are only 24 fps (frames are repeated every so often to fit 60 Hz video; 3:2 pulldown, I believe). FMV sequences are smooth as hell but run at 30 fps. Intel is demoing an Unreal Engine 3 based game that does what I'm talking about.

I bet that most, if not all, high-end next-gen games will be at 30 fps without the shortcomings of many 30 fps games today. This will allow for an overall smoother gaming experience with those visuals that everyone is expecting.

What do you think? If you disagree...then why so much CPU power?
 
Movies and (good) pre-rendered video have temporal antialiasing* (motion blur). If certain types of games realistically replicate the effect, I won't mind them running at only 30fps, but I'd still want my racing and fighting games to be 60fps because there's still a visual benefit to it. Some IMAX formats even run at a higher framerate than standard film because the details are represented better. Implementing temporal antialiasing, intelligently and dynamically blurring the scene, is more taxing on the hardware than just displaying the details that are already being rendered (there's a rough sketch of the brute-force approach after the footnote), so it's not likely that it'll be used instead of running at 60fps. And without temporal AA, 30fps games just aren't going to look as fluid as video or film.

*note: Don't confuse this with what ATi calls temporal antialiasing.
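Here's a minimal sketch of the brute-force accumulation approach I have in mind (renderSubFrame() is a hypothetical placeholder, not any real API): each displayed 30fps frame is the average of several sub-frames spread across its 1/30 s exposure window, which is exactly why it costs a multiple of the normal rendering work.

Code:
#include <cstddef>
#include <vector>

// Stand-in for a hypothetical renderer that produces an RGB float image for a
// given point in time. Placeholder only; a real implementation would rasterize
// the scene as it exists at time t.
std::vector<float> renderSubFrame(double /*t*/, std::size_t pixelCount) {
    return std::vector<float>(pixelCount * 3, 0.0f);
}

// Accumulation-style motion blur: average several sub-frames spread across one
// displayed frame's exposure window (1/30 s at 30 fps). Rendering N sub-frames
// costs roughly N times the work of a single sharp frame.
std::vector<float> blurredFrame(std::size_t pixelCount, double frameStart,
                                double frameDuration, int subSamples) {
    std::vector<float> accum(pixelCount * 3, 0.0f);
    for (int s = 0; s < subSamples; ++s) {
        const double t = frameStart + frameDuration * (s + 0.5) / subSamples;
        const std::vector<float> sub = renderSubFrame(t, pixelCount);
        for (std::size_t i = 0; i < accum.size(); ++i)
            accum[i] += sub[i] / static_cast<float>(subSamples);
    }
    return accum;
}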
 
The problem isn't really 30fps. It's a constant 30fps that you need. Any dips or spikes will make that target unplayable.


As for power and whatnot: the systems will be powerful and there will be an improvement from last gen, but this improvement has been happening in the PC space for many, many years, and the tech going into these boxes (excluding the CPUs) is not going to be much more advanced than what we have in the PC space.
 
jvd said:
The problem isn't really 30fps. It's a constant 30fps that you need. Any dips or spikes will make that target unplayable.


As for power and whatnot: the systems will be powerful and there will be an improvement from last gen, but this improvement has been happening in the PC space for many, many years, and the tech going into these boxes (excluding the CPUs) is not going to be much more advanced than what we have in the PC space.

I doubt that. The minimum I expect from this next gen is UE3 quality.
 
a688 said:
jvd said:
I doubt that. The minimum I expect from this next gen is UE3 quality.
which runs on current PC hardware :rolleyes:

It's not his fault he is from the future.

Might as well hang out with this guy then...


;)

Seriously though, I think we should be glad if a large scale game with UE3 quality graphics can be achieved on next-gen in the first place, with all the high-end AI, physics,... that are expected to come with it.
 
I think there will be a much higher % of games running at 30fps next gen; it's already happening on Xbox with blockbuster titles. People aren't complaining about Halo 2 or Forza running at 30fps, are they? The way I see it, Microsoft might just encourage devs to sacrifice the 60fps standard for certain game types for the extra performance.
 
jvd said:
I doubt that. The minimum I expect from this next gen is UE3 quality.
which runs on current PC hardware :rolleyes:

Define "run". I could be wrong, but I doubt even a FX-55 with SLI 6800 ultras could run a full UE3 game at acceptable frame rates.


Evil_Cloud said:
Might as well hang out with this guy then...

;)

LMAO! Where did you find THAT?!


Seriously though, I think we should be glad if a large scale game with UE3 quality graphics can be achieved on next-gen in the first place, with all the high-end AI, physics,... that are expected to come with it.

This is exactly the area where I'm expecting to see great progress in next-gen console games, because of their enormous CPU power (at least we expect so).
 
Let's wait until NDAs are lifted and devs start talking rather than basing everything off what some 3rd rate developer said on a 4th rate forum.
 
Xeno said:
I think there will be a much higher % of games running at 30fps next gen; it's already happening on Xbox with blockbuster titles. People aren't complaining about Halo 2 or Forza running at 30fps, are they? The way I see it, Microsoft might just encourage devs to sacrifice the 60fps standard for certain game types for the extra performance.
Halo 2 doesn't ever get praised for its "steady" framerate, does it? And I (and a lot of other people across many forums) have done a lot of complaining about Forza's 30fps ceiling. I'd rather they drop the "realtime" (slideshow) environment maps and crank the game up to 60fps. And while they're at it, bundle a USB adapter so I can use the Driving Force Pro I'm gonna get for GT4 on the Xbox.
 
Anyone who thinks 60 fps is going to be prominent next generation has another thing coming. It will have nothing to do with how hard it is to achieve; it will come down to the same excuses developers use these days.

I'm upset that Forza is only 30 fps and claims to be a GT4 killer. That is definitely going to hurt the game when it's compared to GT4.

Halo 2 doesn't have a steady framerate as it is, so I really don't think consumers would complain about it being 30 fps. It would have to be a constant 30 for anyone to complain about it.

A steady framerate is important in the next generation. There are too many games that have dips when going into highly populated scenes, and this is a major distraction in any game. I think a policy each console manufacturer should adopt is requiring a steady framerate on their consoles. 60 fps for fighting, racing, and other fast-twitch games that require more timing and attention would be good. 30 for anything else is fine as long as it is locked and steady.
 
I think Halo 2's framerate is fine, at least in multiplayer. Couldn't care less about SP. It very, very rarely dips unless it's an extreme situation, like a vehicle blowing up as a guy shoots a rocket launcher into a crowd of 6 players, who are all double-fisting and firing weapons, and a Ghost comes screaming by at 60mph. Then it might dip just slightly enough that you can appreciate the beautiful mess for a split second.

Wanna talk about a horrible framerate? Play Ghost Recon 2 on the Xbox. Good game, but it feels like it's running at 20fps, especially in SP. Or better yet, Vice City on the PS2 feels like Doom 3 on a TNT2.
 
Conker Live said:
I think I know how the developers are going to get that big graphical leap over the current generation on X2, PS3 and N5.... They are going to render most games at 30 fps!

I hate to break it to you, but many, if not most, current console games run at 30 fps. It takes a trained eye to distinguish a solid 30 from a solid 60 fps. It's more a matter of feel than look. As many here have mentioned, running 60 is great for pride, but in practice it is more important that the frame rate remain steady. A game that bounces between 30 and 60 will get a worse reaction than one that vsync-locks to guarantee 30, though most consumers won't be able to explain their own reaction.

My company has shipped games that the reviewers praised as having "incredible graphics at a silky smooth 60fps" when the reality was that we were running 30 and using triple buffering to smooth over any rough frames.

If your game is twitchy and the reactive feel is critical to your gameplay then you should shoot for 60. For most games though, 30 lets you do twice as much stuff and that is more than worth an extra 16 milliseconds of lag.
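To put rough numbers on that trade-off (simple frame-budget arithmetic, nothing proprietary):

Code:
#include <cstdio>

// Frame budgets behind the 30 vs 60 fps trade-off: at 60 fps you have roughly
// 16.7 ms of CPU/GPU time per frame, at 30 fps roughly 33.3 ms, i.e. about
// twice the work per frame in exchange for ~16.7 ms of extra latency.
int main() {
    const double budget60 = 1000.0 / 60.0;  // ms per frame at 60 fps
    const double budget30 = 1000.0 / 30.0;  // ms per frame at 30 fps
    std::printf("60 fps budget: %.1f ms\n", budget60);
    std::printf("30 fps budget: %.1f ms\n", budget30);
    std::printf("extra time per frame at 30 fps: %.1f ms\n", budget30 - budget60);
    return 0;
}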
 
30fps hurts my eyes in most games, and I think I can tell nearly 100% of the time whether a racing game or FPS is running in the 30s. In a game like Silent Hill the 30fps would most certainly go unnoticed.

True, it might be more a matter of "feel" than look, but one should not underestimate the "feel" of a game, as it is still a very crucial part of the gaming experience. The difference between 30fps and 60fps is the difference between feeling good playing the game versus feeling "stuttery" and getting eyestrain.

The human eye, after all, is adapted to natural movement. Imagine if similar stuttering were present in your field of vision while you were driving a car in real life; you'd go see a doctor!

The difference between 30fps and 60fps might be less noticeable on smaller screens, or even on big screens viewed from a distance.
But on a 42" display viewed from less than 3 meters, for example, the difference is like night and day.

Claiming 30fps is "enough" sounds like an excuse.
Bad 30 fps may not be noticeable in screenshots, in videos downloaded from the web, or maybe even when you see the game played on TV.
But when played in person, the difference is there, and most consumers will either feel it or see it.

But if next-gen hardware has some motion blur filter that does for 30fps what film does for 24fps, it'll be less noticeable in many cases; quick pans, though, will still look awful compared to 60fps or above.
 
rabidrabbit, I second your opinion. Still, IMO even with motion blur, movies are not perfect at 30 or 25 or 24 fps. TV programs that are recorded at 60/50 fps are much more pleasant to watch. I really can't understand it if any next-gen game ends up using just 25/30fps. Even with this gen's games I sometimes choose not to buy one that runs at only 25/30fps. Buying Richard Burns Rally for PS2 was one mistake I made; oh my, I was so angry when I found out it was running at just 25fps (PAL). Oh yes, MGS3 is out... with 25 and lower fps!! So no thanks, even though I really liked MGS2!! :devilish:

Antialiasing is another thing that I hope they take care of next gen... see the new thread about that :)
 
The lower framerate in MGS3 is most noticeable in first-person view, since in that view you can only pan in x-y, and the less smooth panning is obvious in larger areas. I think the framerate is about the same as MGS2's in closed areas; at least I could not tell the difference.
I'm not sure, but are you also able to move the view much faster in MGS3's first-person view than you were in MGS2? At least I remember that the MGS2 first-person camera was limited to very slow movement.

In a couple of areas the framerate is very noticeably below MGS2's, even outside first-person view. But in most places I must say I cannot tell the difference between MGS2 and MGS3; maybe if I played MGS2 and then MGS3 right after, I could tell.

These kinds of games are the ones that suffer the least from sub-60fps.
 
My reaction to sub-60/50 fps games was :cry: It's a hang-up from the PC days, IMO. In the era of the 16-bitters, the Amiga, ST, etc. locked to the VSync and had smooth graphics, whereas the PC jittered all over the place and still does to this day.

However, a well-antialiased, constant 30/25 fps frame rate should be fine. It is the rate of TV and film, after all. That super-smooth 60/50 is very nice and, to me, what differentiates a console from a gaming computer. But I can live without it if the eye-candy is up to snuff and the gameplay actually DOES make use of super AI, physics, et al.

I do think, though, that frame rate comes second to the quality of stills. Games sell on the quality of stills, not the quality of moving images. I grabbed a copy of FFX yesterday. Screenshots look very nice. The shimmer and jaggies are HORRENDOUS!!!!
 
Maybe the million $ question for the lot of ya is: would you prefer your action-based game at 30 fps progressive, or 60 fields interlaced? Eh? ;)

More simply, if you could get 60 Hz refreshes in the time domain, would you be willing to trade off progressive formatting?
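For what it's worth, a quick back-of-the-envelope comparison, assuming a 480-line target (my numbers, nothing official): both options push the same number of lines per second; the question is whether you spend them on spatial or temporal resolution.

Code:
#include <cstdio>

// 30 fps progressive vs 60 fields/s interlaced at a 480-line target.
// Both deliver the same lines per second; interlacing trades vertical
// resolution per refresh for twice the temporal resolution.
int main() {
    const int lines = 480;
    const long progressiveLinesPerSec = 30L * lines;        // 30 full frames/s
    const long interlacedLinesPerSec  = 60L * (lines / 2);  // 60 half-height fields/s
    std::printf("480p30: %ld lines/s, 480i60: %ld lines/s\n",
                progressiveLinesPerSec, interlacedLinesPerSec);
    return 0;
}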
 