1080p 30fps VS 720p 60 fps - The 2012 poll

What's your preference?


What about people like me running Linux? You have it much better on Windows, where installing a beta driver or anything else is done by double-clicking setup.exe and it works 99.99% of the time, with huge backward and forward compatibility.

I'm nostalgic for consoles, and especially the NES, these days; for some reason I feel the urge to play the first Ninja Turtles games :LOL: Damn, it was hard and would really punish you (when both Leo and Donatello end up dead) and reward you (pizza slices and freeing Donatello or Leo).

And yes, Linux is so fucked up it's hard to get a NES emulator running (you get a couple of them in your distro's packages and pray that one of them works, instead of choosing between twenty. Well, I know what I'll do: get a NES emulator running under DOSBox :D)

The NES had smooth scrolling, so I guess I should have voted for 720p60.
 

I do not know if Windows is any better. You have 20 different choices, but they all seem to suck in some sense. I tried to get Wine running, but it seemed like such an investment that I gave up. Didn't really try it on Linux, though...
 
Where's the middle ground options like 900p 45fps?
 
Unless the next-gen consoles have either far too much or far too little processing power, I don't see why there won't be a lot of games striking a balance between the image quality of 1080p/30fps and the smoothness of 720p/60fps by going for a middle-ground option around 1440x900 (WXGA+), especially if more rendering engines switch to dynamic resolutions. Fixed resolution targets seem so last gen.
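
As a rough illustration of why that middle ground is tempting, here is a quick back-of-the-envelope sketch (Python, purely for the arithmetic; the 1440x900 / 45 fps combination is just the hypothetical option mentioned above) comparing raw pixel throughput:

```python
# Rough pixel-throughput comparison of the render targets being discussed.
# This ignores everything that actually matters in practice (per-pixel
# shading cost, CPU load, memory bandwidth), so treat it purely as illustration.

options = {
    "1080p @ 30 fps": (1920, 1080, 30),
    "720p  @ 60 fps": (1280, 720, 60),
    "900p  @ 45 fps": (1440, 900, 45),   # the hypothetical middle ground
}

for name, (w, h, fps) in options.items():
    pixels_per_frame = w * h
    pixels_per_second = pixels_per_frame * fps
    print(f"{name}: {pixels_per_frame:>9,} px/frame, {pixels_per_second:>11,} px/s")
```

In raw pixels per second, 900p/45 lands roughly between the other two, which is the whole appeal of a middle-ground target.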
 
Just gotta make sure it stays a fixed res during development so the artists don't just keep on adding shit thinking the framerate is ok while the res keeps dropping. ;)
 
I have a feeling 1280x1080 will be a very popular res next gen. That would keep marketing happy since they can still trumpet 1080p and it can let devs push pixels a bit more if needed.
 
Well, it would of course be scaled horizontally to 1920x1080 before being displayed.

That could be the case, though if next-generation consoles have lots of very fast memory it may not be a problem going all the way to full HD. I guess it really depends on what the limitation is for next-gen titles, eh?
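
For a sense of scale, a quick calculation under the (very rough) assumption that rendering cost scales with pixel count:

```python
# How much rendering work a 1280x1080 framebuffer saves versus native
# 1920x1080, assuming GPU cost scales roughly with pixel count (a big
# simplification; it ignores vertex work, CPU time, bandwidth, etc.).

native = 1920 * 1080    # 2,073,600 pixels
reduced = 1280 * 1080   # 1,382,400 pixels, stretched 1.5x horizontally on output

saving = 1 - reduced / native
print(f"native 1080p: {native:,} px")
print(f"1280x1080   : {reduced:,} px  ({saving:.0%} fewer pixels to shade)")
```

That is roughly a third fewer pixels to shade per frame, while marketing can still advertise 1080 vertical lines.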
 
It's not always simple for a developer to "turn down a few settings" in order to double the framerate. If a studio targets 30 fps, there's probably a good reason for it.

I laid out the reasons why a studio might decide to target 30 fps: because graphics sell. It's easier to sell your game on the premise of good graphics, because that's what people tend to see first. Games are marketed through magazines and the internet, and even as videos on YouTube and other platforms become bigger factors in game sales, the way that footage is captured and compressed makes it easy to mask a game running at a less than optimal framerate. Add to that that framerate is not just something you see, it's something you feel, something you can't relate to at all when just watching a pre-recorded video review of a game.

Graphics sell.

Maybe I can illustrate my point with this little diagram I've drawn up:

[Attached image: graphics_timeline.gif]


What you see here is a timeline that not only shows time but, more importantly, also shows technological progress and graphical perception. I say perception because, technically, what you have when your hardware is finalized is what you have, and everything, right down to game mechanics, framerate and texture resolution, is a compromise limited by what your hardware can do.

What I tried to portray is that graphics and framerate are directly related to each other. Increase the graphical fidelity and it comes at a direct cost in framerate, and vice versa. Of course, this is a simplified way to look at it; technically you'd have to add the other aspects to this diagram, but given the focus of this topic it's easiest to just compare the two in isolation.
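
The trade-off is easiest to see as a frame-time budget. A trivial sketch of the budgets involved (nothing here is specific to any particular console):

```python
# Frame-time budget at each target framerate. Everything the engine does
# per frame (geometry, shading, post-processing, game logic) has to fit
# inside this budget, which is why 60 fps generally implies simpler
# visuals than 30 fps on the same hardware.

for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")
```

Halving the budget from 33.3 ms to 16.7 ms means roughly halving the work the engine can do per frame, which is the compromise the diagram is meant to capture.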

My point is, when the new hardware launches in 2013 or 2014, you'll have a line somewhere on that diagram, with graphics limited by the technological progress of that given time. We don't yet know what kind of graphics that will translate to; we can only make educated guesses based on what is possible on the PC today, because it's an evolving platform.

What we get is what we will get used to. If console makers enforced a hypothetical 60 fps minimum, we'd get used to that. We wouldn't know how much more would have been possible, just as today, on our current platforms, we can only guess how much better games would look if they shipped with a target framerate of 15 fps (half of what is the norm).

In other words, this argument about wanting better graphics is all relative anyway. We can only judge what we get and see. If we stick to what we know is a solid framerate for all genres (60fps), our graphical perception will be limited to that. So, I really don't see how this is bad. If we want better graphics, by the same argument, why not just wait another half a year to launch your hardware? Or buy a platform that is evolving, like the PC?

Setting a mandatory framerate floor of 60 fps instead of 30 fps is IMO a small price to pay, relatively speaking. But it's only really possible at the start of a new platform, enforced by the hardware maker, because given the way the industry works and how games are sold, it's doubtful that studios will make the choice on their own.



EDIT: Realistically, I would say that the lines in my diagram are far too close together. I would guess the difference between e.g. 480p/30 and 480p/60 must be bigger than a few months, probably closer to a year? It's just a minor point, but one that I wanted to make before it gets pointed out. And I was too lazy to change the diagram. :p
 
There is something I don't quite get about the way current consoles output the image compared to previous-gen consoles.

The PS1 and Saturn output resolutions from 256 × 224 up to 640 × 480, and the DC output mostly at 640 × 480. Using SCART for the former and VGA for the latter gave a crisp but pixelated image on my TV, just like they should, and just like low-res PC games looked on a PC monitor. They don't look blurred. The same goes for old console games running through emulation on a PC: crisp but pixelated.
But on the same TV, when the PS3 and the 360 output their games at sub-HD or SD resolutions, the image doesn't look crisp and pixelated but like a blurry hell. I never understood why.
 

CRTs didn't have an "optimal" resolution. So on a CRT, a game at 256x224 is most likely displayed at 256x224 on the TV. 640x480 was an odd one, as that actually had to be downscaled depending on your region and TV. :D Fortunately downscaling is much easier and far less likely to introduce blurring. Think of it as SSAA: render at 640x480 and downsample to the TV's maximum resolution.

With LCDs, the farther you get from the native res, the more upscaling needs to be done, and that upscaling introduces more blurring as the difference gets larger and larger. So on a 1080p set, for example, EVERYTHING is displayed at 1920x1080. A 720x480 source has to be upscaled to 1920x1080, along with whatever artifacts (blurring) that might introduce. If it were a CRT, there would be no upscaling; the TV would theoretically just display it at 720x480.

In other words, hook up that Dreamcast to a 1080p TV and it'll display a blurry image when running a 640x480 game. How blurry the image is will depend entirely on how good the upscaler is in the TV. But it will be blurry.

Regards,
SB
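
To make the difference concrete, here is a minimal sketch of the two scaling behaviours described above, using Python with Pillow (the input file name is hypothetical; any 640x480 capture would do):

```python
# A minimal sketch of upscaling with and without interpolation, using Pillow.
from PIL import Image

src = Image.open("dreamcast_640x480.png")   # hypothetical 640x480 screenshot

# The 4:3 area of a 1080p panel, i.e. 2.25x on both axes
# (ignoring overscan and pillarboxing details).
target = (1440, 1080)

# Nearest-neighbour keeps hard pixel edges: crisp but blocky.
crisp = src.resize(target, Image.NEAREST)

# Bilinear filtering interpolates between source pixels, which is where
# the perceived blur comes from when a TV scales a sub-native source.
blurry = src.resize(target, Image.BILINEAR)

crisp.save("upscaled_nearest.png")
blurry.save("upscaled_bilinear.png")
```

TV scalers generally use some form of interpolating filter rather than nearest-neighbour, which is why the result looks soft rather than blocky.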
 
Good explanation, but I am confused.
Is there something related to the cables as well?
Two days ago I plugged a Saturn into my 1080p 42" LCD TV using SCART and it didn't produce any blur. It looked crisp but pixelated, like it should. If I recall correctly, though, playing my PS1 games on the PS3 (over HDMI) did produce the blur.
 
Because the pixels are massive and exact factors of 1080p. You can triple the 640 pixels horizontally to produce a 1920-pixel-wide image, meaning 3x3 TV pixels per source pixel. The PS1 emulator applies an upscale filter that tweens values, which is equivalent to a blur if it isn't using a fancy interpolator.
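
A tiny sketch of the scale-factor arithmetic involved (the source resolutions are just common examples; whether a given TV or emulator actually scales by pixel replication is another matter):

```python
# Scale factors from a few classic source resolutions to a 1920x1080 panel.
# Integer factors can be handled by simple pixel replication (stays crisp);
# non-integer factors force the scaler to interpolate (or round/crop),
# which is where softness can creep in.

display_w, display_h = 1920, 1080

for src_w, src_h in [(640, 480), (320, 240), (256, 224)]:
    sx = display_w / src_w
    sy = display_h / src_h
    if sx.is_integer() and sy.is_integer():
        note = "integer on both axes"
    else:
        note = "non-integer on at least one axis"
    print(f"{src_w}x{src_h} -> {sx:.2f}x horizontal, {sy:.2f}x vertical ({note})")
```

Note that only the horizontal factor for a 640-wide source comes out as a whole number; the vertical factor is exactly what the follow-up post below asks about.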
 

So anything that is an exact factor of the native resolution of the TV should be expected to be output on the display as-is, as it should be, regardless of whether the running resolution is lower than the maximum output of the display. Correct?
OK, then to understand this better we should check the resolution of the game that was running.
The game I ran last time was Panzer Dragoon 1. What is the resolution of that game? It certainly wasn't 640x480. So... was it really an exact factor of 1080p? And what about the vertical resolution? 480 is not an exact factor of 1080.
Any Saturn game of varied resolution that I remember trying on my TV in the past produced the same crisp image.
 