1080p 30fps vs 720p 60fps - The 2012 poll

What's your preference?


  • Total voters: 105
  • Poll closed.
Methods of scaling vary dramatically in their results. Funnily enough, the simplest one is the one which gives you a crisp and pixelated NES or SNES game on an emulator, whereas that blurry mess you speak of is like using bilinear filtering on your NES game; still a simple algorithm and one that gives consistent results, but yes, it's very blurry. I don't use it.
 
Phil, you continue to miss the point. With your logic, all games could easily run at 30fps because we would never know how "60fps would feel". See the flaw here? I don't get why it's so hard for you to understand that some people prefer 1080p @ 30fps over 720p @ 60fps. It's personal preference! You can make all the pictures you want, but it won't change this fact.
 
So anything that is an exact factor of the native resolution of the TV should be expected to be output on the display as is and as it should be, regardless of whether the running resolution is lower than the max output of the display. Correct?
OK, then to understand better whether this is the case, we should check the resolution of the game running.
The game I ran last time was Panzer Dragoon 1. What is the resolution of that game? It certainly wasn't 640x480. So... was it really an exact factor of 1080p? And what about the vertical resolution? 480 is not an exact factor of a 1080-pixel-tall image.
Any Saturn game of varied resolutions I remember trying on my TV in the past produced the same crisp image.
It might not be an exact factor, but when you have large enough pixels, a little interpolation on one side goes pretty much unnoticed. Let's say you have a black pixel, a white pixel, and another black pixel in your game. A perfect upscale would be 3 pixels of black, 3 of white, and 3 of black. If the scale factor isn't quite a whole number, you might get 3 black pixels, one grey, 2 white, one grey, 3 black. In a simply linearly upscaled game, you'll instead get a gradient from black to white to black, looking all blurry.
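To make that concrete, here's a minimal sketch in plain Python (the three-pixel row and the 10-pixel output width are made up purely for illustration, nothing console-specific) comparing nearest-neighbour with linear interpolation when the scale factor isn't a whole number:

```python
# Minimal sketch (made-up values: 0 = black, 255 = white) of upscaling a
# 3-pixel row to 10 output pixels with a non-integer scale factor.

def upscale(row, out_width, mode):
    scale = len(row) / out_width
    out = []
    for x in range(out_width):
        # map the centre of output pixel x back into source-pixel coordinates
        src = (x + 0.5) * scale - 0.5
        src = max(0.0, min(src, len(row) - 1))        # clamp to the valid range
        if mode == "nearest":
            out.append(row[round(src)])               # snap to the closest source pixel
        else:                                         # linear: blend the two neighbours
            left = int(src)
            right = min(left + 1, len(row) - 1)
            frac = src - left
            out.append(round(row[left] * (1 - frac) + row[right] * frac))
    return out

row = [0, 255, 0]                                     # black, white, black
print(upscale(row, 10, "nearest"))   # hard edges, just uneven runs of 0 and 255
print(upscale(row, 10, "linear"))    # greys creep in: a blurry black-white-black gradient
```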

Aren't there PSone settings for playing on PS3? - http://manuals.playstation.net/document/en/ps3/current/game/2settings.html Try switching off smoothing.
 
So anything that is an exact factor of the native resolution of the TV should be expected to be output on the display as is and as it should be

It can, but it really shouldn't. It's typically the result of simple doubling (or tripling) of pixel values and is equivalent to nearest neighbor filtering.

Proper upscaling hardware usually uses cubic interpolation or windowed sinc - or at the very least bilinear.

Remember: pixels aren't squares, they're points!
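For what "windowed sinc" means in practice, here's a rough sketch of a Lanczos-2 kernel in Python - a generic textbook version, not what any particular TV or scaler chip actually runs:

```python
import math

def lanczos(x, a=2):
    # sinc windowed by a wider sinc; support is |x| < a source pixels
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample(row, out_width, a=2):
    # treat source pixels as points and reconstruct the signal in between
    scale = len(row) / out_width
    out = []
    for x in range(out_width):
        src = (x + 0.5) * scale - 0.5
        total = weight = 0.0
        for i in range(math.floor(src) - a + 1, math.floor(src) + a + 1):
            w = lanczos(src - i, a)
            total += w * row[max(0, min(i, len(row) - 1))]   # clamp at the edges
            weight += w
        out.append(total / weight)                           # normalise the weights
    return out

# Slight over/undershoot near edges (ringing) is characteristic of sinc-based filters.
print([round(v) for v in resample([0, 255, 0], 10)])
```

Because it treats the pixels as point samples, it keeps edges tighter than bilinear without the blockiness of nearest neighbour.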

Cheers
 
It can, but it really shouldn't.
When going via an analogue component or composite input, as Nesh is, I wouldn't expect a TV to use anything other than nearest neighbour - or rather, to sample the value according to scan timings. To do otherwise, the frame would need to be digitised and upscaled.
 
It might not be an exact factor, but when you have large enough pixels, a little interpolation on one side goes pretty much unnoticed. Let's say you have a black pixel, a white pixel, and another black pixel in your game. A perfect upscale would be 3 pixels of black, 3 of white, and 3 of black. If the scale factor isn't quite a whole number, you might get 3 black pixels, one grey, 2 white, one grey, 3 black. In a simply linearly upscaled game, you'll instead get a gradient from black to white to black, looking all blurry.

Aren't there PSone settings for playing on PS3? - http://manuals.playstation.net/document/en/ps3/current/game/2settings.html Try switching off smoothing.

With smoothing turned both on and off, I get a slightly blurry image. I need to check the PS1 over SCART vs the PS3's HDMI emulation just to make sure there is any difference, but I get the impression that the Saturn via SCART looks cleaner than PS1 games played on the PS3 with smoothing turned off.

I believe Panzer Dragoon 1 is 256 × 224. Damn, I wish I had a capture card of some sort. It would have been an interesting test to compare how these games are upscaled and displayed on a 1080p TV via different methods.

It can, but it really shouldn't. It's typically the result of simple doubling (or tripling) of pixel values and is equivalent to nearest neighbor filtering.

Proper upscaling hardware usually uses cubic interpolation or windowed sinc - or at the very least bilinear.

Remember: pixels aren't squares, they're points!

Cheers

Knowing that old PS1/Saturn games come in all kinds of different, mostly low resolutions, the image clarity (referring to how crisp the image looks) should differ according to the game's native resolution and how it gets upscaled, right? I mean, 256 × 224 could be upscaled differently than, let's say, 320x240.
It really makes me curious, and I wish I could check it game by game through a capture card.
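For what it's worth, the raw scale factors to a 1920x1080 panel can be worked out without a capture card. This assumes a straight full-screen stretch and ignores overscan, borders and the non-square pixel aspect ratios these consoles actually output, so treat it as a rough guide only:

```python
# Scale factors from a few common low-res modes up to a 1920x1080 panel
for w, h in [(256, 224), (320, 240), (640, 480)]:
    print(f"{w}x{h}: horizontal x{1920 / w:.2f}, vertical x{1080 / h:.3f}")

# 256x224: horizontal x7.50, vertical x4.821
# 320x240: horizontal x6.00, vertical x4.500
# 640x480: horizontal x3.00, vertical x2.250
```

At least for these modes, the vertical factor is never an integer, so the set always has to interpolate somewhere - just by differing amounts, which would fit the idea that crispness varies from game to game.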

When going via an analogue component or composite input, as Nesh is, I wouldn't expect a TV to use anything other than nearest neighbour - or rather, to sample the value according to scan timings. To do otherwise, the frame would need to be digitised and upscaled.

OK. HDMI is a digital signal, whereas SCART is analogue... Considering that, I suspect the blurry image could be a result of how content is being upscaled over HDMI vs SCART. In that case, SCART could perhaps produce better results than HDMI when low-res content is displayed on a 1080p TV.

But don't take my word on that. I'd need to bring out my old consoles and start checking, and that takes time to dedicate which I don't really have.
 
So anything that is an exact factor of the native resolution of the TV should be expected to be output on the display as is and as it should be, regardless of whether the running resolution is lower than the max output of the display. Correct?
OK, then to understand better whether this is the case, we should check the resolution of the game running.
The game I ran last time was Panzer Dragoon 1. What is the resolution of that game? It certainly wasn't 640x480. So... was it really an exact factor of 1080p? And what about the vertical resolution? 480 is not an exact factor of a 1080-pixel-tall image.
Any Saturn game of varied resolutions I remember trying on my TV in the past produced the same crisp image.

Then you basically have the best LCD/plasma TV that is possible to make, one that pretty much no TV maker is actually capable of making.

There is no LCD/plasma TV currently available that I have viewed that can display SD content without blur related to upscaling.

That said, if you sit far enough away from a TV, that blur will diminish in visibility until it may no longer be easily seen. The farther apart the resolution of your source is from the native resolution of your TV, the farther away you'll have to sit.
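As a back-of-the-envelope illustration (the 40-inch 16:9 screen and the roughly 1 arcminute figure for 20/20 acuity are assumptions picked for the example, not measurements of anyone's actual setup), here's the angular size of one source pixel at a few couch distances:

```python
import math

# Angular size of a single source pixel on a 40" 16:9 panel at various distances,
# compared against the ~1 arcminute (1/60 degree) rule of thumb for 20/20 acuity.
screen_width_in = 40 * 16 / math.hypot(16, 9)       # ~34.9" picture width

def arcminutes_per_pixel(source_width_px, distance_in):
    pixel_in = screen_width_in / source_width_px
    return math.degrees(2 * math.atan(pixel_in / (2 * distance_in))) * 60

for dist_ft in (4, 6, 8, 10):
    d = dist_ft * 12                                # feet -> inches
    print(f"{dist_ft} ft: 640-wide SD pixel = {arcminutes_per_pixel(640, d):.1f}', "
          f"1920-wide HD pixel = {arcminutes_per_pixel(1920, d):.1f}'")
```

The SD pixel stays three times larger on the retina than the HD one at any given distance, so you have to move a fair bit farther back before the scaled-up detail, and the blur around it, drops below what the eye resolves.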

Regards,
SB
 
Then you basically have the best LCD/plasma TV that is possible to make, one that pretty much no TV maker is actually capable of making.

There is no LCD/plasma TV currently available that I have viewed that can display SD content without blur related to upscaling.

That said, if you sit far enough away from a TV, that blur will diminish in visibility until it may no longer be easily seen. The farther apart the resolution of your source is from the native resolution of your TV, the farther away you'll have to sit.

Regards,
SB

Perhaps what I see is crisper quality in relative terms, then.
Curious to check again.

The Saturn is so old that his TV could have been a CRT.
I am not the Angry Video Game Nerd. When I want to play on an old console, I will use a TV from our age :p
 
There is no LCD/plasma TV currently available that I have viewed that can display SD content without blur related to upscaling.
At normal viewing distance, my Sammy upscales SD beautifully. It's not HD quality so isn't pin sharp, but it's smooth and balanced like a CRT a lot of the time. One reason for picking it was the rave reviews about its SD content quality. ;)

Perhaps what I see is crisper quality in relative terms, then.
Curious to check again.
How large was your old TV and what was your viewing distance? CRTs were typically a lot smaller than our flat panels, so the games of old occupied a much smaller FOV. I'm sure if you tried your DC on a 40" CRT today, you'd be shocked how blurry it is.
 
How large was your old TV and what was your viewing distance? CRTs were typically a lot smaller than our flat panels, so the games of old occupied a much smaller FOV. I'm sure if you tried your DC on a 40" CRT today, you'd be shocked how blurry it is.
Well, it doesn't matter how big my old TV was, because I am referring to how my current HD TV displays the content, and that's the only TV I have right now.
The examples I mentioned with the Saturn, PS1 and DC (using SCART or VGA) are examples played on an HD TV/monitor vs sub-HD or SD game content played from a PS3/360 on the same HD TV/monitor :)
 
40fps would work on a 60Hz monitor too. It's just one duplicated refresh in every three instead of one in every two like it is with 30fps. It fits in nicely with triple buffering too.
 
40fps would work on a 60Hz monitor too. It's just one duplicated refresh in every three instead of one in every two like it is with 30fps. It fits in nicely with triple buffering too.

Though that might create a worse judder effect that could arguably be considered less fluid than 30fps on a perceptual level.
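A quick sketch of where that judder comes from, assuming plain vsync on a 60Hz display (the numbers are just 1000/60 and 1000/40 ms, nothing console-specific):

```python
# With vsync, each rendered frame is shown at the first refresh after it is ready,
# so 40fps content alternates between being held for one refresh and for two.
refresh = 1000 / 60            # ms per display refresh
frame_time = 1000 / 40         # ms per rendered frame

shown_at = []
next_refresh = 0.0
for frame in range(6):
    ready = frame * frame_time
    while next_refresh < ready:        # wait for the next refresh at or after "ready"
        next_refresh += refresh
    shown_at.append(next_refresh)
    next_refresh += refresh            # this refresh is consumed by the flip

holds = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
print(holds)    # ~[33.3, 16.7, 33.3, 16.7, 33.3]: uneven on-screen times
```

Every frame stays on screen for either ~16.7 ms or ~33.3 ms, alternating, whereas at 30fps every frame gets exactly ~33.3 ms - that uneven cadence is the judder being described.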
 
I don't get why it's so hard for you to understand that some people prefer 1080p @ 30fps over 720p @ 60fps. It's personal preference! You can make all the pictures you want, but it won't change this fact.

I never argued personal preference. I'm taking the stance on what is the greater good and what would need to be done for it to work for everyone. Graphics will always get better - and better graphics will always be nice. But not at the expense of framerate!

Just about any game that I can think of would benefit a lot from a better framerate at the expense of some tradeoffs in the graphics department. I played the latest CoD last night, and while it certainly isn't as pretty as KZ3, the latter would seriously be a much better game at twice the framerate and worse graphics. Would it have sold as much, though? Probably not. And I think this highlights the problem of the gaming industry: graphics are too big a factor in the marketing and sales of games.
 
You really haven't given much of a reason why this is a bad idea.

I believe I did. The fact that you prefer higher framerates to better IQ doesn't make those reasons any less valid, IMO.


Cutting image quality just to contribute to the "fairness" of a competition that's not theirs isn't really an issue, because the rules would be the same for everyone - and because of that, the perception of the individual would be bound to what everyone expects to be possible at that framerate.

But why would there be rules at all?
We have people who buy games and express their opinions. We have reviewers who will definitely tell you if the game stutters, and we have QA people working for Microsoft, Sony and Nintendo trying to make sure the game doesn't give the system a bad reputation.

What if my studio wants to make a Walking Dead-esque point-and-click game with spectacular IQ that doesn't need any more than 25 FPS?
Framerate should never - ever - be a limiting factor for making games.


Imagine for a moment how much better-looking games would be if developers targeted 15fps. Would we really want that?

Maybe I would. That's not for anyone to decide before seeing the game in question.
What if I want to make the most gorgeous looking chess game in the world? Is 15FPS going to make my eyes hurt?

By the same account, graphics are clearly better on PC, so why are we still playing on consoles? Because there's more to gaming than just looking at nice pictures. Because people prefer to actually sit in a living room, or simply want that "plug & play" experience that you can't get on the PC. Because of that, there is a compromise - we live with worse visuals (compared to PCs) because we prefer the different gaming environment. So why would it be bad if we just limited our expectations for graphics a little more, for the advantage of having a better framerate (which benefits controls, responsiveness and the motion of the game)?

You're basing your whole "the world should be like this" opinion solely on your subjective preference of IQ vs. framerate.

I play games on a PC because I get better image quality, and only very rarely do I lower the settings in order to get a higher framerate. I only play on a console when there's an awesome title that I want to play and the corporate dudes mandated that it wouldn't come out on PC.

I never argued personal preference. I'm taking the stance on what is the greater good and what would need to be done for it to work for everyone. Graphics will always get better - and better graphics will always be nice. But not at the expense of framerate!

You're not arguing personal preference, but you're saying the whole game-development world should abide by a rule that respects your personal preference and not mine. How is your personal preference any better than mine for what you're calling the greater good?


Framerate can't be set in stone. This should obviously be decided case by case, based on gameplay and genre, which is what the third option in the poll is closest to. I chose the first option because, for the kind of games I play and how I play them, 1080p30 feels better to me than 720p60.


Take most people. Sit them on a couch at typical/recommended TV viewing distances.

Show them content (movies, games, photos, etc.) on the TV without telling them what resolution something is running at.

The vast majority of them wouldn't be able to tell the difference or even know which was at 720p and which was at 1080p. That's why I do ALL of my PC gaming on my living room TV at 720p instead of 1080p. I'd be willing to bet most of them wouldn't be able to tell if a game was running at 540p versus 1080p either, if the game had good MSAA or RGSSAA. Pretty much the only thing that would give it away is the size of the jaggies from aliasing. Again, assuming typical/recommended TV viewing distances for a given screen size.

From my experience, this is definitely not true for my HTPC setup, and I know the Anchor Bay upscaler in my AV receiver is pretty good.
Unless you're using a combination of some very aggressive FXAA + 8xMSAA, in which case you might as well just increase the resolution, or everything will look like old pictures.
 