Digital Foundry Article Technical Discussion Archive [2014]

So, if I'm to understand this correctly, the beautiful games of last gen, like The Last of Us, had absolutely horrible image quality. Just completely awful. They should have made the game 1080p, because the PS3 could output 1080p, and cut back on the lighting, shading, etc. to make sure the image wasn't being upscaled? Are upscale artifacts really as bad as you make them sound? I highly doubt it. I play upscaled games all the time, and they look great. I'd rather let a dev decide, after rigorous testing, what looks good and what doesn't, and release it as they please, whatever resolution that happens to be.

We are talking about upscaling in the context of current gen, not old gen.

In the old gen, the comparison would have been: 720p upscaled once by the TV versus sub-720p upscaled by the console and then upscaled again by the TV.

Relative to most sub-720p games like BioShock Infinite, yes, TLOU at 720p had great image quality.

Last gen the debate (well, it wasn't, but it should have been) was between one upscaling pass and two; this gen the heated debate is between zero upscaling passes and one. But the relative difference between the two is roughly the same.

Also, it depends on people's perception; I never said it was important for everyone. But the fact is that the people who are bothered by it are very vocal!
 

I was only pointing out that with a statement as hyperbolic as this:

It's the developers that should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, number of polygons pushed etc. will be completely negated/wrecked by any upscaling (and blurry post effects for that matter).

Then 720p shouldn't have even been an option last gen. By your standard, it's basically 1080p or bust, because everything else is "completely negated/wrecked." That is obviously not true, and I doubt you even believe what you wrote. Upscaling is a compromise and the question is whether devs should be able to make that compromise, or sacrifice other advances in the name of 1080p.
 
And for me upscaling is the worst image quality you can get from a game; it makes otherwise great graphics look bad by comparison.
That's hyperbolic. Native 1080p with no AA, texture shimmer, shader aliasing and moiré-type patterns is clearly going to give worse overall IQ to most people than a slightly lower res with the aliasing issues nicely solved. Native 1080p adds fidelity, but that's only part of the overall quality of the image we're seeing. Another very important one is framerate, which even affects 2D resolution perception (detail can be perceived at lower spatial resolutions when supplied at higher temporal resolutions).

Take BF4 on PS4/XB1. For all its rendering technique and stuff it displays on screen, everything it does is wrecked... It's the developers that should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, number of polygons pushed etc. will be completely negated/wrecked by any upscaling.
Again with the hyperbole! Look at The Tomorrow Children. It looks almost photoreal. It looks great. That game at 720p, if that were necessary, would still look better than a lot of other games at 1080p. Rendering at a lower resolution doesn't wreck the art style and rendition; it just removes some fidelity. One can even scientifically pick that claim apart. Tessellation is not affected by resolution unless you're tessellating to the pixel level. The number of polygons pushed isn't affected at all. Shaders aren't affected unless they are producing high-frequency details. Ergo they can't all be completely negated/wrecked by upscaling. Upscaling can damage image fidelity - nothing else. A subsurface shader making skin look realistic is going to produce a realistically skinned face in a blurred upscaled image just as it will in a native one.
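
As a rough illustration of that point: a scaler only ever touches the finished frame. The sketch below uses plain bilinear filtering in Python/NumPy, which is just an assumed stand-in for whatever filter the console hardware scalers really use; the geometry, tessellation and shading work that produced the pixels has already happened by the time it runs, so the only thing it can cost is sharpness.

Code:
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Upscale a rendered frame (H x W x C) by blending between existing
    pixels. Nothing about the scene (polygons, tessellation, shaders) is
    touched -- only finished pixels are mixed, so the cost is softness in
    high-frequency detail, not 'negated' rendering techniques."""
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]   # vertical blend weights
    wx = (xs - x0)[None, :, None]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# e.g. a 900p frame stretched to a panel's native 1080p
frame_900p = np.random.rand(900, 1600, 3)     # stand-in for a rendered frame
frame_1080p = bilinear_upscale(frame_900p, 1080, 1920)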

What many PC gamers do when setting up their games (and what I was doing in the early years of LCD screens with my PC, without at first really understanding the superiority of native resolution like I do now) is first select the native resolution of their monitor, because they know that anything non-native will be awful, and that better effects + upscaled resolution is worse than fewer effects + native resolution.
In the days of LCDs with nearest-neighbour upscaling, sure. But nowadays that's not an issue, and I doubt you have any stats to support your view that most PC gamers pick native res first and then tweak everything else. Personally I played Awesomenauts at 720p instead of native 1680x1050 because it ran smoother (although still not PS3 smooth). I expect PC gamers run the full gamut of preferring resolution, preferring framerate and preferring eye-candy.

Hence it's ridiculous to favour one aspect over the others as a technical requirement. If you go with higher resolution at the cost of framerate, you'll please some people and offend others. If you choose 60 fps instead of 30 fps, you'll upset people who prefer better rendering detail, and if you favour 30 fps instead of 60, you'll upset those who prefer higher framerates. If you go with 1080p60, you'll upset those who want photorealism and would prefer a lower resolution and framerate, like TV/movies, that allows games to look real. It's impossible to please everyone, and ridiculous to try to prioritise according to some scientific analysis, so it should be left to the devs to make those choices.


Considering One started it by analyzing Halo 3 promotional material before its release, I'm pretty sure it was before 2008.
You're right, that thread is a continuation of an older thread.
 
Why is 900p to 1080p now considered a marginal difference? It's more than any resolution gulf we had last gen. It's effectively upscale vs no upscale.

It might be less noticeable than usual depending on where the user sits and the size of the TV, and we don't all have perfect vision, but there is a difference.

The key word is "perceived difference". If you only look at the percentage difference, then of course there is a big difference, but you have to take into account that the human eye has a limit to how much detail it can perceive; the closer you get to that limit, the harder it becomes to notice anything when you compare two images. That's why we need pixel counters at this point, so they can tell you what the difference is.

For example, how long would it take you to recognize the difference when you compare an image with one pixel to another with two pixels? It should be immediate, right? Now let's compare one image with 30 billion pixels to another with 100 billion pixels on a 60 inch screen - more than a 100% difference, but let's see if you even notice it. That's why 720p to 1080p is "perceived" as a bigger jump than 1080p to 4K screens.
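
To put rough numbers on the "limit of what the eye can perceive" point: assuming the common 20/20 figure of about one arcminute of acuity (roughly 60 pixels per degree - an assumption, not a hard law), you can estimate how finely a given resolution resolves at a given screen size and viewing distance.

Code:
import math

def pixels_per_degree(horizontal_res, screen_diag_in, viewing_dist_in, aspect=16 / 9):
    """Pixels packed into one degree of the viewer's field of view.
    ~60 px/deg corresponds to the common 1-arcminute acuity figure."""
    width_in = screen_diag_in * aspect / math.hypot(aspect, 1)   # panel width from diagonal
    px_size_in = width_in / horizontal_res                       # width of one pixel
    px_angle_deg = math.degrees(2 * math.atan(px_size_in / (2 * viewing_dist_in)))
    return 1 / px_angle_deg

# 60-inch 16:9 TV viewed from 8 feet (96 inches)
for h_res in (1280, 1600, 1920, 3840):
    print(h_res, round(pixels_per_degree(h_res, 60, 96), 1), "px/deg")

On that 60" set at 8 ft, the 1920-wide case lands at roughly 60 px/deg - right around the acuity figure - while the 1280- and 1600-wide cases sit below it and 3840-wide sits far above it. That's one way of framing why 720p vs 1080p reads as a bigger jump than 1080p vs 4K at typical living-room distances.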
 
Ryse and BF4 are both 900p, but if I did not know, I would say that Ryse was higher res due to its better overall image quality.
That's not surprising, although just in case you're discussing them both in the context of XB1, BF4 is lower-res than Ryse; PS4 BF4 is 900p, but XB1 BF4 is 720p.
 
There's even a recent blind TV test where a 4K LCD TV display lost to a better quality 1080p plasma display.

I'm guessing viewing distance was a big factor there, though. I've personally never seen 4K myself, but I have seen 1080p x2 in the form of 3D Blu-ray on my home TV, which I'm guessing gives an effect somewhat akin to 4K, and the picture quality there is insanely gorgeous, well in excess of the standard 2D picture from the same Blu-ray. That's judging from about 6 ft away on a 50" TV, though.
 
That's not surprising, although just in case you're discussing them both in the context of XB1, BF4 is lower-res than Ryse; PS4 BF4 is 900p, but XB1 BF4 is 720p.

Yes, I meant BF4 on the PS4.

But the thing to avoid is playing a 30fps game right after having played a 60fps one.
 
Considering One started it by analyzing Halo 3 promotional material before its release, I'm pretty sure it was before 2008.
And IIRC all that promotional material was 720p (unlike the actual game).
Why was that? After all, people can't tell the difference between 640p and 720p :p
 
That's not surprising, although just in case you're discussing them both in the context of XB1, BF4 is lower-res than Ryse; PS4 BF4 is 900p, but XB1 BF4 is 720p.

Will be interesting to see what BF: Hardline eventually settles on.
 
3) The "better pixels" vs "more pixels' argument has been tested countless times in blind tests not just by game developers but also by tv makers with better pixels usually winning out. There's even a recent blind tv test where a 4k lcd tv display lost to a better quality 1080p plasma display.

Absolutely. I have been watching some nature documentary DVDs on my 15" rMBP and the image quality is quite unbelievable considering the video stream is only 720x576 pixels and I'm watching it on a 2880x1800 display (i.e. the video is only 8% of the display's pixel count!).
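
(Quick check on that 8% figure, just multiplying out the pixel counts; 720x576 is the PAL DVD frame size mentioned above.)

Code:
dvd_px = 720 * 576          # 414,720 pixels in the video stream
panel_px = 2880 * 1800      # 5,184,000 pixels on the rMBP panel
print(dvd_px / panel_px)    # 0.08 -> the video supplies ~8% of the panel's pixels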

And sub-HD games like Alan Wake, despite running at a measly 540p, had far better overall IQ than a lot of 720p games with no AA. Even Halo 3 looked a lot better on my old CRT than my HDTV as it was effectively supersampled when displayed at 480p vs the horrible jaggies + upscaling from 640p that you got when playing on a 1080p set.
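
For anyone wondering why displaying a render on a lower-resolution screen acts like anti-aliasing: each displayed pixel ends up as an average of several rendered samples, which is essentially what supersampling does to tame edge jaggies. A minimal box-filter sketch follows - an idealisation, since a CRT fed a downscaled signal isn't literally applying a box filter.

Code:
import numpy as np

def box_downscale(img, factor):
    """Average factor x factor blocks of rendered pixels into one output
    pixel. Each displayed pixel then carries several scene samples, which
    smooths edge aliasing -- the 'free supersampling' you get when a
    higher-res render is shown on a lower-res display."""
    h, w = img.shape[:2]
    h, w = h - h % factor, w - w % factor            # crop to whole blocks
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# e.g. a frame rendered above the display's resolution, shown smaller
render = np.random.rand(960, 1280, 3)    # hypothetical render above display res
shown = box_downscale(render, 2)         # 480 x 640 output, edges averaged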
 
The wait for the full DIII Face-Off is killing me.

I want to see what the full impact of the push for a newly optimised 1080p is. The latest X1 DF videos seem to show some frame rate drops in areas that aren't terribly busy, while at other times the game handles a ton of alpha at a rock steady 60 fps.

Will be interesting to see if there's any sign - at all - of dropped frames from the PS4.

Even Halo 3 looked a lot better on my old CRT than my HDTV as it was effectively supersampled when displayed at 480p vs the horrible jaggies + upscaling from 640p that you got when playing on a 1080p set.

The jaggies in Halo 3 were distracting. No AA and a lowish resolution, along with some highly contrasted edges, really conspired to impact the look.

I tried it on a couple of CRTs too. When supersampled, the edges were much less distracting, even if there was a loss of detail, and on a good CRT the lighting was glorious.

LCDs - particularly cheap IPS* rather than good IPS and PVA - have really hurt. Plasma and CRT are awesome, but both are now dead.

Would love to see a DF article on how different screens fare and how well they handle motion. 1080p+ with heavy 30 fps motion blur + LCD motion smudging can easily leave chuff-all detail visible.
 