Digital Foundry Article Technical Discussion Archive [2013]

But it's the devs who have to make the call. If they think most of their audience is still on 720p TVs, it will obviously be a better decision to go with more eye candy at a lower res. After all, we have been enjoying games at resolutions even lower than 720p.

I think there are very few true 720p TVs out there. Also, most of the TVs you call 720p probably have some other odd native resolution.
 
I think the differences are noticeable when you can view screenshots side-by-side, but if you were just to have the game running on a TV, without a comparison to another TV, you'd be hard-pressed to guess the resolution.
 
I wonder if people will still argue that 1080p doesn't matter.

[Image: L5PHwXB.png]

Did they take captures at 1920x1080, 1600x900 and 1280x720 and then resize them by 1.2x and 1.5x with a good filter, or not?
If they had just taken three captures, those thumbnails should have three different sizes, or should show more around the gun in the second and third ones.
 
Did they take captures at 1920x1080, 1600x900 and 1280x720 and then resize them by 1.2x and 1.5x with a good filter, or not?
Delta9's image is from the comparison tool. It shows the three images upscaled to the same size. You can click the excerpts to see the full image.

http://cfa.eurogamer.net/2013/articles//a/1/6/2/2/4/3/2/1_1080_copy.png
http://cfa.eurogamer.net/2013/articles//a/1/6/2/2/4/3/2/1_900_copy.png
http://cfa.eurogamer.net/2013/articles//a/1/6/2/2/4/3/2/1_720_copy.png

So what Delta9's image shows is native captures upscaled with a cheap, probably trilinear, resample. That's nothing like the real experience of a native 1080p image on a TV versus a 900p/720p render upscaled with a good algorithm on the console prior to output to the TV. For those comparisons we'll need HDMI captures of the output, which I'm sure we'll get when the machines are out. But before then, DF did an article on what impact 900p really makes, and people can see for themselves that it's nothing like as horrendous as Delta9's misrepresentative image suggests.
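For anyone who wants to see how much the choice of filter matters, here's a minimal sketch using Pillow; the filename capture_720.png is just a placeholder for a native 720p grab, and the exact filter the comparison tool uses isn't documented, so the "cheap" option below is only a stand-in.

```python
# Minimal sketch: upscale a native 720p capture to 1080p with a cheap filter
# versus a higher-quality one. "capture_720.png" is a placeholder filename;
# the actual filter used by the comparison tool is not documented.
from PIL import Image

src = Image.open("capture_720.png")      # assumed 1280x720 native capture
target = (1920, 1080)

cheap = src.resize(target, resample=Image.BILINEAR)   # cheap resample, softer edges
better = src.resize(target, resample=Image.LANCZOS)   # closer to a decent hardware scaler

cheap.save("upscaled_bilinear.png")
better.save("upscaled_lanczos.png")
```

Viewed at 1:1, the Lanczos result generally keeps noticeably more edge definition than the cheap resample, which is the point being made above.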
 
I agree 1080p looks better, but at what cost? Framerate is not unimportant, and at 720p a game can push more complex shaders than at 1080p at the same performance level. The benefit of resolution depends heavily on viewing distance and screen size. Console players tend to sit further away than PC users (even if consoles tend to be connected to bigger screens), which makes resolution less of an issue for image quality.
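To put rough numbers on the per-pixel budget point (simple arithmetic only, not a claim about any particular engine):

```python
# Back-of-the-envelope pixel counts: at a fixed GPU budget, the available
# per-pixel shading work scales roughly inversely with the pixel count.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    relative = pixels["1080p"] / count
    print(f"{name}: {count:,} pixels -> ~{relative:.2f}x per-pixel budget vs 1080p")

# 720p:  921,600 pixels   -> ~2.25x per-pixel budget vs 1080p
# 900p:  1,440,000 pixels -> ~1.44x
# 1080p: 2,073,600 pixels -> 1.00x
```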

And they are doing these blowout image comparisons for 4K. To everybody's surprise, 4K looks better!

Until you do another blowout and compare 4K with 8K and even 16K. Then 4K looks very blurry.

We should not forget that the eye can resolve about 1/60th of a degree (one arcminute); few people can go beyond this, as it's a limit of how many cones we can pack into the retina.
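As a rough sanity check on that figure (a simplification; real acuity varies with contrast and between people), ~60 pixels per degree puts an upper bound on how much horizontal resolution is useful for a given field of view:

```python
# Rough sanity check of the ~1 arcminute figure: at about 60 pixels per
# degree a display matches that acuity limit, so the useful horizontal
# resolution is roughly 60 * (horizontal field of view in degrees).
# Simplification: acuity varies with contrast and between individuals.
PIXELS_PER_DEGREE = 60

for fov_deg in (30, 45, 60, 90):
    useful_pixels = PIXELS_PER_DEGREE * fov_deg
    print(f"{fov_deg} deg horizontal FOV -> ~{useful_pixels} useful horizontal pixels")

# 30 deg -> ~1800 px (1080p territory)
# 60 deg -> ~3600 px (4K territory)
# 90 deg -> ~5400 px (more than 4K provides)
```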
 
Unless you are being shown what upgrades you are getting, I wouldn't be so quick to assume that less resolution is getting you better quality.
 
Unless you are being shown what upgrades you are getting, I wouldn't be so quick to assume that less resolution is getting you better quality.

Unless you are being shown what upgrades you are sacrificing, I wouldn't be so quick to assume that higher resolution is getting you better quality.
 
Indeed, the proof is in the showing.

Since extra resolution doesn't come free (unless the game is CPU-bound, but I guess we have left the CPU-bound days behind, no?), it is safe to assume that graphics would have to take a hit elsewhere to hit performance targets.

I was hoping this gen we'd get a choice in games between 720p/60fps and 1080p/30fps, so devs could target 720p/60fps and offer a 1080p/30fps option. It's easier to go that route than the opposite (if they can't get it to run at 60fps at 1080p, for example).
 
Indeed, the proof is in the showing.

Exactly. The raw values representing the resolutions of CG images are irrelevant in isolation. What are relevant are the effects that the choices of those different resolutions have on the resulting images. Nothing irritates me more than the dumbing-down (especially on this forum) of analysis of graphical quality into using a single number as a pass/fail metric.
 
Indeed, the proof is in the showing.

No, the proof is in the practice. Or do you expect devs to produce natively rendered demos at both 1080p and 900p to convince the public of their choice of resolution? The number of sub-1080p and sub-720p games from last gen pretty much shows how willingly devs sacrificed resolution when it came to presenting the best visual quality.

All PS3 games could have been natively rendered at 1080p or "Full HD", yet most aren't, and the ones that are aren't readily used as examples of the PS3's prowess as a console. All the best-looking games seem to fall into the "almost Full HD resolution" category.
 
No, the proof is in the practice. Or do you expect devs to produce natively rendered demos at both 1080p and 900p to convince the public of their choice of resolution? The number of sub-1080p and sub-720p games from last gen pretty much shows how willingly devs sacrificed resolution when it came to presenting the best visual quality.

All PS3 games could have been natively rendered at 1080p or "Full HD", yet most aren't, and the ones that are aren't readily used as examples of the PS3's prowess as a console. All the best-looking games seem to fall into the "almost Full HD resolution" category.

No, but I also don't expect the public to blindly accept that they are getting more with less resolution. Sometimes less is more, but a lot of times less is just that, less.
 
For gamers with a small screen, less is more, because they don't see the difference in res, but they'll see the 60fps.

For those with a large TV or a good projector, given the choice many will prefer 30fps if it means full resolution and good AA instead of the post-process blur-o-matic.
 
First the viewing distance has to be taken into account; after that the appropriate resolution can be chosen. For my 1080p 42-inch TV I usually have to set my desktop to 720p because my viewing distance is around 2 meters. At 1080p I have to be at 1.5m or normal-size text is difficult to read. In that sense, people playing from the sofa usually sit at more than 1.8 meters, and 1080p is a waste of computation.

Developers may be biased by using monitors at less than 1.5m, and such close-range samples are not a good measure of the trade-off involved.
Gamers have an increased contrast sensitivity function, so, in the words of the Imaging Science Foundation, the most important aspects of picture quality are (in order): 1) contrast ratio, 2) color saturation, 3) color accuracy, 4) resolution. Resolution is 4th on the list, so look at the other factors first. Also, be sure to calibrate your display!
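For what it's worth, those distances line up with the one-arcminute figure mentioned earlier in the thread; a rough sketch (simplified flat-screen geometry, and individual acuity varies):

```python
import math

# Rough estimate of the farthest distance at which a single 1080p pixel on a
# 42-inch 16:9 panel still subtends about one arcminute (~1/60 degree).
# Simplified geometry; individual visual acuity varies.
diagonal_in = 42.0
width_m = diagonal_in * (16 / math.hypot(16, 9)) * 0.0254   # ~0.93 m wide
pixel_pitch_m = width_m / 1920                              # ~0.48 mm per pixel

one_arcminute_rad = math.radians(1 / 60)
max_distance_m = pixel_pitch_m / math.tan(one_arcminute_rad)

print(f"Pixel pitch: {pixel_pitch_m * 1000:.2f} mm")
print(f"Full 1080p detail resolvable out to roughly {max_distance_m:.2f} m")  # ~1.7 m
```

Beyond roughly that distance the extra detail of 1080p over 720p starts to fall below the acuity limit, which matches the 1.5m/2m experience described above.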

http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance

http://s3.carltonbale.com/resolution_chart.html

http://www.videoclarity.com/PDF/WPSubjectiveTestingMethodsReviewed.pdf


http://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.500-11-200206-S!!PDF-E.pdf
 
No, but I also don't expect the public to blindly accept that they are getting more with less resolution. Sometimes less is more, but a lot of times less is just that, less.

Yes, that is true... less often means less... fewer frames per second means exactly that...
 
No, but I also don't expect the public to blindly accept that they are getting more with less resolution. Sometimes less is more, but a lot of times less is just that, less.

Sure. Sometimes a lower resolution reflects the developers choosing to render at a lower resolution because it is a relatively easy optimization and time/budget constraints prevent another approach. I take issue with the common assumption that a lower resolution *always* indicates compromised image quality.
 
No, but I also don't expect the public to blindly accept that they are getting more with less resolution. Sometimes less is more, but a lot of times less is just that, less.

The public blindly accepts a plethora of concepts every time they purchase a game. The public willingly accepts the notion that, for the most part, devs are making an effort to put out the best product possible.

It's not a common belief that a resolution reduction is a product of laziness or a lack of image quality. If you make a half-assed attempt at a game, it will show through regardless of the target resolution. And a "native 1080p" sticker means very little to the general public; otherwise publishers would require all games to render natively at that resolution, if a lack of such labeling affected sales.

The general gaming public measures the quality of a title against other titles they have experienced, not some metric that in and of itself means nothing.
 