Digital Foundry Article Technical Discussion Archive [2015]

If you don't like the black bars, sit a bit closer so the vertical FOV is the same as you'd have without the letterboxing, and you'll get the same vertical view plus a little extra width...

Another solution, albeit an expensive one, is to do what one person did with a 21:9 monitor. Someone posted a photo of it in The Order: 1886 topic.
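
As a rough sketch of the sitting-closer idea (assuming 2.40:1 content letterboxed on a 16:9 panel): the vertical angle is 2*atan(height / (2*distance)), so matching it only requires scaling your seat distance by the ratio of image height to panel height. Just a back-of-the-envelope calculation, not anything from the article:

def letterbox_distance(current_m, panel_aspect=16 / 9, content_aspect=2.40):
    """Seat distance at which the letterboxed image spans the same vertical
    angle as the full panel height does from the current seat.

    Vertical angle = 2*atan(h / (2*d)), so equal angles reduce to scaling
    the distance by the image-height / panel-height ratio."""
    return current_m * (panel_aspect / content_aspect)  # ratio ~= 0.74

print(round(letterbox_distance(3.0), 2))  # ~2.22 m instead of 3.0 m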
 
Another problem: trying to prove that the resolution difference doesn't matter using a YouTube video of Ryse is not very credible.

It seems the GameSpot test was done on a 27-inch PC monitor... I didn't watch the video...
 
Although I don't personally have a huge issue with letterboxing, sitting closer doesn't fix the issue of consistent-looking bezel framing.
This really can't be a huge problem because everyone watches movies letterboxed. Or do they use some ghastly upscale to stretch/crop the vertical to fit their 16:9 TVs?
 
Resolution and image quality are not the same thing. There can be titles with the exact same output resolution that aren't at parity IQ-wise. There are several factors that have an effect on IQ, like texture filtering, anti-aliasing, texture resolution, and shader and lighting quality, just to name a few. So please stop judging who is qualified to talk about image quality. Your average gamer who plays to win and have fun doesn't necessarily care what a game's native res is. Not every gamer is a member of B3D or even a lurker.

The article was titled "Does resolution really matter?", not "should games have really good texture filtering and AA?"

The answer is yes. Resolution does matter and its contribution to image quality is extremely high. People claiming otherwise are IMHO just making themselves look foolish. The relative weights of each individual thing contributing are highly subjective but anyone can see the difference between 900 and 1080 side by side.

The question always seems to come with the accusation that resolution could be traded for other effects. Maybe it can... no one knows but the developers of each individual title and they probably only know if they tried it. DF certainly has no idea. But the question alone shows that the asker doesn't value resolution. Backseat graphics programming aside, I think that's a huge mistake. But one I welcome from my competition to be honest.
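
For a sense of scale on that 900 vs 1080 side-by-side difference, here is a quick pixel-count comparison (plain arithmetic, not anything from the DF article):

resolutions = {"900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'900p': 1440000, '1080p': 2073600}
print(f"1080p has {pixels['1080p'] / pixels['900p'] - 1:.0%} more pixels than 900p")  # 44%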
 
I think this article is not very good, for multiple reasons. Far Cry 4 has better resolution and framerate on PS4, and Dark10 (John Linnerman of Digital Foundry), trying to defend Richard Leadbetter on GAF, said that he was talking about COD, not Far Cry 4. COD: Advanced Warfare was running better on Xbox One during the campaign and has the same framerate in multiplayer. CPU bound? It seems the framerate is better on PS4 after the patch. Is this true? I don't have the game. Dark10 said they haven't tested the game after the patch.

Well, they apparently have in the last-gen revisited article:

http://www.eurogamer.net/articles/d...l-of-duty-advanced-warfare-last-gen-revisited

And yes, in some areas we can see the framerate is improved on PS4, to the point that some passages (well, at least one) now run better in the PS4 version than in the XB1 version (singleplayer).

Remember, in the PS4 / XB1 video there was a real-time cutscene where the PS4 ran at a sustained ~50fps (for ~20 seconds) and the XB1 at ~58fps (with screen tearing), at 1:30 in the video:


Well, that area was tested again on PS4 and XB1 against the old-gen versions; here is PS4 vs X360 vs PS3 at 2:40:


Now the game runs at a perfectly locked 60fps on PS4 in this area, a big improvement over a sustained ~50fps for ~20 seconds. The only explanation is that they used a patched version of the game... so they evidently tested a patched PS4 game, but simply decided not to compare it with the XB1 version again... how odd.

By the way, the same passage in the XB1 vs X360 vs PS3 comparison still shows ~58fps plus screen tearing on XB1. The pattern of the graph is different from the PS4 comparison video (one isolated drop at ~1:22 on XB1 has disappeared in the recent video at ~2:30), so I assume they also used a recent (updated) XB1 version here:


Another remark that has been completely overlooked by everybody is that in singleplayer the XB1 version is the only one with screen tearing.
 


Thanks for the answer
 
I spent far too much time this weekend playing Assassin's Creed Unity (900p on PS4) on a 50" 4K set and the lack of resolution didn't bother me. Generally I'd be arguing for higher resolutions for games that are trying to convey great distances, because nothing dissolves distant detail like a lack of resolution, but with that rendering engine the game still looks great.

Were it a straight choice of 900p or 1080p I'd take 1080p, but I doubt I'd pick 1080p at the expense of something else, having seen how the game looks.
 

Here the choice is not so difficult: the framerate is already bad at 900p, and I can't imagine 1080p if memory bandwidth is the bottleneck.
 
For me, 1080p is very noticeably sharper than 900p. Whether the resolution is native or not really does show, and my TV is a whole lot more pixelly than my tablet. I only sit about 2.5m from my 42" TV though, so that does matter.
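
Out of curiosity, here is a rough angular-resolution check for that setup (42" 16:9 panel at 2.5 m; the ~60 pixels-per-degree figure is the commonly quoted 20/20 acuity ballpark, so treat it as a rule of thumb rather than gospel):

import math

def pixels_per_degree(diag_in, horiz_px, distance_m, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat panel."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    distance_in = distance_m / 0.0254
    h_fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horiz_px / h_fov_deg

# 42" panel viewed from 2.5 m; 20/20 vision is often quoted as ~60 px/deg
for label, px in [("1080p native", 1920), ("900p source detail", 1600)]:
    print(f"{label}: {pixels_per_degree(42, px, 2.5):.0f} px/deg")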
 
I played Assassin's Creed Unity today. I don't try to notice problems like resolution, but this was noticeable. It was like my eyes were trying to focus on detail that wasn't there. I just wanted it to be sharper. Battlefield 4 looks sharper, but has way too much aliasing.
 
The article was titled "Does resolution really matter?", not "should games have really good texture filtering and AA?"

The answer is yes. Resolution does matter and its contribution to image quality is extremely high. People claiming otherwise are IMHO just making themselves look foolish. The relative weights of each individual thing contributing are highly subjective but anyone can see the difference between 900 and 1080 side by side.

The question always seems to come with the accusation that resolution could be traded for other effects. Maybe it can... no one knows but the developers of each individual title and they probably only know if they tried it. DF certainly has no idea. But the question alone shows that the asker doesn't value resolution. Backseat graphics programming aside, I think that's a huge mistake. But one I welcome from my competition to be honest.

This post nails it for me... native resolution > non-native. Everyone knows that sacrificing 1080p for performance is a personal choice dependent on the scenario. I'd really like consoles to offer options: PC gamers get to tweak to their hearts' content, so a simple toggle for console gamers would be nice.
 
I played Assassin's Creed Unity today. I don't try to notice problems like resolution, but this was noticeable. It was like my eyes were trying to focus on detail that wasn't there. I just wanted it to be sharper.
This is, for me, the issue, and I think it's actually partly to blame for my lousy eyesight. When eyes try to focus on detail that isn't there, they learn to relax to focus at the level of detail attainable, as it were. And then they get lazy. My TV actually looked pretty crap when I got new glasses - its UI is fuzzy, like interlaced edges or a PenTile arrangement (which it isn't). That's what has me wanting a new TV that's pin sharp, although by now my eyes have relaxed back to the perpetual soft focus that they prefer. ;)
 
1080p isn't just a number, it's a magic number, like 30fps and 60fps. These numbers are magic for digital displays, which typically have 1920x1080 pixels and a 60Hz refresh. This is the first generation in the digital era where consoles are close to being able to match the native display resolution. Last gen was a mismatch. This is the generation of 1080p.

I want a good frame rate first and foremost. If it's a 30fps game, then hit 30fps most of the time. I don't care about dips every now and then. Once you hit that target, aim for 1080p, the native resolution of my display. No upscaling blur, 1:1 (even with black bars).

I understand games like BF4 which strive for 60fps have a much harder time hitting 1080p. Fine, I'll live with the muddy look, but I did prefer COD's look that year.
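
On the upscaling-blur point, the softness of non-native output mostly comes down to the scale factor; a small sketch (pure arithmetic, assuming a simple axis-aligned scale to 1080 lines):

for src in (900, 720, 540):
    ratio = 1080 / src
    kind = ("integer (source pixels map to clean blocks)" if ratio.is_integer()
            else "non-integer (output pixels blend neighbouring source pixels)")
    print(f"{src}p -> 1080p: x{ratio:.2f}, {kind}")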
 

I hope the next Frostbite game will be 900p on Xbox One and 1080p on PS4, like COD.
 
I think he may have been referring to your TV resolution being significantly lower than 1600 x 900, and that being the TV that you notice the blurriness of 900p on.

I use a lower-res plasma for gaming myself, and I can notice the difference between the 1080 Panny plasma panel and my 1024 x 768 gaming Panny panel, even with 1280 x 720 content (I sit far too close). But my low-res Panny has very low input lag and still has high motion resolution, so it wins out. Also, downscaling via supersampling doesn't particularly bother me.

My primary PC monitors are 1600 x 1200 PVA and 1920 x 1200 IPS. My low-res Panny looks massively better than both. The victory of LCD over plasma makes me cry. Next step is to run my PC through my gaming telly, with 4x Nvidia driver supersampling. For justice. For Skyrim.
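
For reference, a quick sketch of what 4x driver supersampling works out to in render-target terms, assuming the factor multiplies the total pixel count (so each axis scales by sqrt(4) = 2):

def supersample_target(native_w, native_h, factor=4.0):
    """Render-target size when 'factor' multiplies the total pixel count,
    i.e. each axis scales by sqrt(factor); the result is then downscaled
    back to the native panel resolution for display."""
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

print(supersample_target(1024, 768))   # (2048, 1536) for the low-res plasma
print(supersample_target(1920, 1200))  # (3840, 2400) for the IPS monitor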

I don't think this DF article is the best TBH. I'd rather see a less speculative article with more emphasis on testing a range of resolutions, AA solutions (including aniso) and different panel types. PC games might not give an entirely accurate picture of the performance tradeoffs that consoles face, but a more fleshed out article would be better to pick over and discuss.
 

Yeah, they could have done better, but I'm happy they are at least bringing attention to this serious problem. And I do feel resolution becoming a bullet point is a serious problem. To drag the inevitable car analogy into the fray, it would be like mandating a 0-60 time, and then you can only buy a Honda Accord that has no air conditioning, no radio and only two seats because they had to remove weight to hit the 0-60 mandate. Hooray for mandates, we all lose. We're going to see that in games as well with this 1080p-or-bust requirement that has unfortunately surfaced this generation. Rather than spending that relatively large amount of spare ALU on good stuff, it all gets wasted on more pixels that people typically can't see anyway unless they are told about them. To that end I kinda like what The Order did: rather than forgo their vision for the graphics, they letterboxed it to hit the stupid 1080p mandate and at the same time hit their visual goals. Good for them.
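
To put a rough number on that letterboxing trade-off (assuming The Order's widely reported 1920x800, roughly 2.40:1, frame inside a 1080p output):

full_frame = 1920 * 1080   # 2,073,600 pixels at full 1080p
letterboxed = 1920 * 800   # ~2.40:1 image area, black bars excluded
saving = 1 - letterboxed / full_frame
print(f"{letterboxed:,} vs {full_frame:,} pixels: ~{saving:.0%} less to shade per frame")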
 