Resolution vs Framerate on next gen consoles?

I was reading a discussion about the whole 720p vs 1080p argument when some interesting things came up.

Some argued that you could only do 1080p at 30fps, due to bandwidth reasons and performance. In some ways that makes perfect sense. Upping the resolution in Unreal sure as hell makes the framerate go down.

However, if all next gen games were made with 720p in mind (or 1080p for that matter), what does that mean for 480i/480p performance of said games?

Would it be correct to assume that a game with a "60-ish" framerate at 720p would be a rock solid 60fps at 480p?

Or does it not work that way with consoles?
 
It would depend on whether the game is actually using a different framebuffer size when the console is set to 480p vs. 720p vs. anything else. Usually they'll just use the same size regardless of the setting and just scale it up or down to match the target output resolution. It's just easier that way - less variability. So the answer is 'Probably not', although they could code it that way if they really wanted to.
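A quick bit of resolution arithmetic illustrates why keeping one internal framebuffer and scaling is the path of least resistance: the pixel counts (and hence the rough fill-rate load) differ enormously between output modes. The numbers below are just arithmetic on standard resolutions, not measurements from any particular console.

```python
# Pixel-count comparison between common output modes.
# Fill-rate cost scales roughly with pixel count, so rendering
# internally at one fixed size and scaling the output keeps that
# cost constant regardless of the console's display setting.
modes = {
    "480p": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

base = modes["720p"][0] * modes["720p"][1]  # 921,600 pixels

for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 720p load)")
```

480p is only a third of the pixels of 720p, and 1080p is 2.25x as many, which is why the choice of internal framebuffer size matters so much more than the output setting.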
 
Shrike_Priest said:
I was reading a discussion about the whole 720p vs 1080p argument when some interesting things came up.

Some argued that you could only do 1080p at 30fps, due to bandwidth reasons and performance. In some ways that makes perfect sense. Upping the resolution in Unreal sure as hell makes the framerate go down.

However, if all next gen games were made with 720p in mind (or 1080p for that matter), what does that mean for 480i/480p performance of said games?

Would it be correct to assume that a game with a "60-ish" framerate at 720p would be a rock solid 60fps at 480p?

Or does it not work that way with consoles?

Depends why something is running at a particular framerate. If something is slow because of issues unrelated to resolution (scene complexity, AI, physics, streaming overheads, whatever) then it won't help. On the other hand, if it's down to fill-rate (overdraw, pixel-shader complexity), it certainly would help to run at a lower resolution.

So sometimes your assumption would be true, sometimes it wouldn't. In some games it might even vary depending on situation anyway.
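That point can be sketched with a toy frame-time model: split the frame cost into a resolution-independent part (AI, physics, etc.) and a fill-rate part that scales with pixel count. The specific millisecond figures below are made up purely for illustration.

```python
def new_frame_time_ms(cpu_ms, fill_ms, pixel_scale):
    """Toy estimate of frame time after a resolution change.

    cpu_ms: resolution-independent work (AI, physics, streaming...)
    fill_ms: pixel-bound work (overdraw, pixel shaders) at the old resolution
    pixel_scale: new pixel count divided by old pixel count
    """
    return cpu_ms + fill_ms * pixel_scale

# Fill-rate-bound game: dropping 720p -> 480p (1/3 the pixels) helps a lot.
print(new_frame_time_ms(cpu_ms=4.0, fill_ms=14.0, pixel_scale=1 / 3))

# CPU-bound game: the same resolution drop barely moves the needle.
print(new_frame_time_ms(cpu_ms=15.0, fill_ms=3.0, pixel_scale=1 / 3))
```

In the first case the frame drops from 18ms to under 9ms (well past 60fps territory); in the second it only goes from 18ms to 16ms, still short of the 16.7ms budget a solid 60fps requires.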
 
HDMI can currently handle 1080p/60, just not with some of the newer audio formats.

HDMI 1.3 will be able to handle 1080p/60 with the newer formats, on one cable.
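As a rough sanity check on the video side of that claim, here's the raw active-pixel arithmetic for 1080p/60. Note this is only the payload: the actual HDMI link also carries blanking intervals and 10-bit TMDS symbols, so the wire rate is higher than this figure.

```python
# Raw active-video data rate for 1080p at 60 Hz, 24 bits per pixel.
# Blanking and TMDS encoding overhead are deliberately ignored here;
# this is just the pixel-payload arithmetic.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

bits_per_second = width * height * fps * bits_per_pixel
print(f"{bits_per_second / 1e9:.2f} Gbit/s of active pixel data")  # 2.99 Gbit/s
```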
 
Titanio said:
HDMI can currently handle 1080p/60, just not with some of the newer audio formats.

HDMI 1.3 will be able to handle 1080p/60 with the newer formats, on one cable.

Yeah, but what about the processing capability of the TV?

Most 1080p sets around at the moment only have the bandwidth to display 1080p/30.

For example: a 480i set refreshes at 60Hz but only has enough bandwidth to display 240 lines per field, while a 480p set has the bandwidth to display 480 lines per frame.
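The 480i vs 480p comparison comes down to lines delivered per second: both refresh at roughly 60Hz, but interlacing only sends half the frame's lines in each pass. A quick illustration:

```python
# Interlaced: 60 fields/s, each field carrying half the frame's lines.
# Progressive: 60 full frames/s. Same refresh rate, double the lines.
lines_per_frame = 480
refresh_hz = 60

interlaced_lines_per_sec = (lines_per_frame // 2) * refresh_hz   # 240 lines x 60 fields
progressive_lines_per_sec = lines_per_frame * refresh_hz         # 480 lines x 60 frames

print(interlaced_lines_per_sec, progressive_lines_per_sec)  # 14400 28800
```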
 
I will take rock solid 60fps framerates in all games over higher resolution any day. Maybe the very nature of Revolution gaming, with its controller, will demand better framerates. I hope!

If so, Revolution could become my most played console.
 
Framerate is king!

Megadrive1988 said:
I will take rock solid 60fps framerates in all games over higher resolution any day.
Same thing here.

I'm one of those who think that framerate is king.
Render first at a steady 60fps, and then talk about the rest, be it resolution, AA, AF, complex shaders, lighting effects...

I can "understand" when certain types of games render at 30Hz, like RPGs or other slow-paced games.
 
I don't think it really matters if its 30fps or 60fps as long as it is stable.

I would rather have 720p at 30fps with 4x FSAA, than 720p at 60fps with no FSAA.
 
Framerate will never sell games like purty GFX do, so it will always take a back seat.

Personally, I couldn't care less about framerate; if it's good, it's good. Aliasing ruins my immersion in a game 10x more than a little stutter in the framerate.
 
It's all personal perception really.

Personally, I "feel" much, much better when the game I'm playing runs perfectly smooth. Stuttering or a slow framerate always reminds me of PC gaming for some reason, which starts getting really, really irritating after a while. It's more of a psychological thing, probably bringing back memories of when the games I played on my first PC were horrible framerate-wise... It's fine today, unless I wanna play F.E.A.R.

A non-natural framerate really kills the experience for me, much more so than some lack of detail here and there. We all love eye candy, but I like games because they're an experience to me, and if I don't feel right, the experience is ruined.

I mean, the graphics are videogame graphics anyway; they don't look real, and even if they did, I would know it's a videogame, so having a slightly more "realistic" image really adds nothing to my experience of a game. It's a game, it looks like a game. Even the best-looking game out there still looks like a game, so to me they're all the same, after the initial "wow" effect a particularly pretty game might give me. If I want to finish a game, it needs to be a good game; it doesn't need to look pretty. Which is also the reason why I never finished HL2, FarCry and Doom3. I bought them purely because they looked pretty (and yes, on my rig they all ran at more than 60fps all the time on max detail), but I got so bored with them I couldn't force myself to finish them.
 
True. For example, while Shadow of the Colossus is an absolutely beautiful game, I can't help but constantly think it would be even better with a better framerate throughout, and some more polish in the graphics.
But it's one of the few games where you're ready to forgive the technical shortcomings because of the impressiveness of the game elsewhere.
 
rabidrabbit said:
True. For example, while Shadow of the Colossus is an absolutely beautiful game, I can't help but constantly think it would be even better with a better framerate throughout, and some more polish in the graphics.
But it's one of the few games where you're ready to forgive the technical shortcomings because of the impressiveness of the game elsewhere.

But would SotC be as good an experience if it had 60Hz graphics of 1st-gen quality (no fur shading, no pseudo-HDR, loading screens instead of streaming environments)?

I would probably argue that for this particular game, I'll let them off the hook on the framerate because their ambition (expressed in every aspect of the game) was part of what made it so enjoyable.

Meanwhile for a racing game, I don't care how shiny the cars are, if it doesn't run at 60Hz I reserve the right to mock the developer. Wipeout on the PSP for example is a fine game - but it suffers from running at less than 60 (yes, I do remember that the original PS1 games didn't run at 60 either and that the PS2 version was rubbish despite the framerate - I want a perfect version of Wipeout someday damnit). Sony Liverpool, consider yourselves mocked. I am mocking you. Mock mock mock.

Ahem.

My point, is that some games don't *need* 60Hz, even if it would be nice to have, whereas the improved quality of the graphics can add a lot. But some games definitely do benefit from a higher framerate, and for those I completely agree that other niceties should be sacrificed.

Try telling that to a marketing department keen to send screenshots out to websites and magazines though. Framerate does not translate too well to a still image - but nicer pixels do...
 
MrWibble said:
But would SotC be as good an experience if it had 60Hz graphics of 1st-gen quality (no fur shading, no pseudo-HDR, loading screens instead of streaming environments)?
...
Of course not, but stripping all of those features at once from the game would likely be a bigger sacrifice than the one we'd have suffered had, for example, PGR3 been made 60fps.
Yes, I do agree with your post, MrWibble.
 
I gotta have consistency, and it's gotta be above 30. Anything else, and my head hurts. I don't care how many dot products it took to render an image...I've got to look at the damn thing, and if it's stuttering and jumping all over the place, I can't stand it. For that reason (and the horrible color dithering), I always considered N64 games to look much worse than SNES games. I know Factor 5 expressed a few reservations about HD and framerate, but some X360 games are already turning out pretty smooth, so I'm not too worried.
 
The ideal would be having mandatory 60fps, 1080p w/ 4xAA for every game. Average "visuals" would be less impressive, but in general videogames would be orders of magnitude better.
 
Vaan said:
The ideal would be having mandatory 60fps, 1080p w/ 4xAA for every game. Average "visuals" would be less impressive, but in general videogames would be orders of magnitude better.


The ideal would be TVs that can actually take that signal (none out yet). Oh and free for everyone obviously. And donuts to go with it. With coffee. Oh and the PS3 here now wouldn't hurt either.
 
Between poor quality at 60fps and great quality (more detail, more effects, more eye candy, more enemies at a time) at a stable 30fps, I will always choose the 30fps eye candy.
 
Why wouldn't the 60fps games be impressive, exactly?

I mean, some of the most impressive games on their respective machines and in their respective categories, be it PS2, Xbox or GC, are in 60fps: the Jak series, Tekken 5, GT3/4, God of War, Ninja Gaiden, OutRun2, RalliSport Challenge 2, DOAU, Panzer Dragoon Orta, Riddick, Metroid Prime, Rogue Leader, F-Zero, etc...

A 60Hz refresh rate, for me, for most games, is as mandatory as texture filtering or perspective correction. I don't care if a game could boast more polygons or more effects if it had to cut back on some of the basics just to obtain an impressive "looking" result.

The thing is, it's just harder to get a technically impressive game running at 60Hz than it is to get it to run at 30.
 