If you don't like the black bars, sit a bit closer so the vertical FOV is the same as you'd have it sans letterboxing, and you'll get the same vertical and a little wider view...
This really can't be a huge problem, because everyone watches movies letterboxed. Or do they use some ghastly upscale to stretch/crop the vertical to fit their 16:9 TVs?
Although I don't personally have a huge issue with letterboxing, sitting closer doesn't fix the issue of consistent-looking bezel framing.
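To put rough numbers on the "sit a bit closer" suggestion above, here is a minimal sketch of the viewing-distance geometry, assuming 2.39:1 content letterboxed on a 50" 16:9 display and an illustrative 2.5 m seating distance (all of these figures are assumptions, not from the posts above):

```python
import math

def vertical_fov_deg(image_height_m: float, distance_m: float) -> float:
    """Vertical field of view subtended by an image of the given height."""
    return math.degrees(2 * math.atan(image_height_m / (2 * distance_m)))

# Illustrative 50" 16:9 display: roughly 1.11 m wide, 0.62 m tall.
screen_w, screen_h = 1.11, 0.62
letterboxed_h = screen_w / 2.39   # active picture height for 2.39:1 content
d_original = 2.5                  # assumed original seating distance in metres

# To keep the same vertical FOV as full-height 16:9 content,
# move closer in proportion to the shrunken picture height.
d_closer = d_original * (letterboxed_h / screen_h)

print(f"full-height vFOV at {d_original} m: {vertical_fov_deg(screen_h, d_original):.1f} deg")
print(f"letterboxed vFOV at {d_closer:.2f} m: {vertical_fov_deg(letterboxed_h, d_closer):.1f} deg")
```

Both cases come out at the same ~14 degree vertical FOV (with the horizontal view a little wider when sitting closer), which is the point of the suggestion.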
Resolution and image quality are not the same thing. There can be titles with the exact same output resolution that don't have parity IQ-wise. Several factors affect IQ: texture filtering, anti-aliasing, texture resolution, and shader and lighting quality, just to name a few. So please stop judging who is qualified to talk about image quality. Your average gamer who plays to win and to have fun doesn't necessarily care what a game's native res is. Not every gamer is a member of B3D, or even a lurker.
I don't think this article is very good, for multiple reasons. Far Cry 4 has better resolution and framerate on PS4, and on GAF dark10x (John Linneman of Digital Foundry), trying to defend Richard Leadbetter, said that he was talking about COD, not Far Cry 4. COD Advanced Warfare was running better on Xbox One during the campaign and has the same framerate during multiplayer. CPU bound? It seems the framerate is better on PS4 after the patch. Is this true? I don't have the game. dark10x said they haven't tested the game after the patch.
Well, they apparently have in the last-gen revisited article:
http://www.eurogamer.net/articles/d...l-of-duty-advanced-warfare-last-gen-revisited
And yes, in some areas we can see the framerate is improved on PS4, to the point that some passages (well, at least one) now run better in the PS4 version than in the XB1 version (singleplayer).
Remember, in the PS4 / XB1 video there was a real-time cutscene where the PS4 ran at a sustained ~50fps (for ~20 seconds) and the XB1 at ~58fps (with screen tearing), at 1:30 in the video:
Well, that area was tested again on PS4 and XB1, compared with the old-gen versions; here is PS4 vs X360 vs PS3 at 2:40:
Now the game runs at a perfectly locked 60fps on PS4 in this area, a big improvement over ~50fps sustained for ~20 seconds. The only explanation is that they used a patched version of the game... so they evidently tested a patched PS4 build... but simply decided not to compare it with the XB1 version again... how odd.
By the way, the same passage in the XB1 vs X360 vs PS3 comparison still shows ~58fps plus screen tearing on XB1. The pattern of the graph is different from the PS4-compared video (one isolated drop at ~1:22 on XB1 has disappeared in the recent video at ~2:30), so I assume they also used a recent (updated) XB1 version here:
Another remark that has been completely overlooked by everybody is that, in the singleplayer, the XB1 version is the only version that has screen tearing.
I spent far too much time this weekend playing Assassin's Creed Unity (900p on PS4) on a 50" 4K set, and the lack of resolution didn't bother me. Generally I'd be arguing for higher resolutions in games that are trying to convey great distances, because nothing dissolves perspective detail like a lack of resolution, but with that rendering engine the game still looks great.
Were it a straight choice of 900p or 1080p I'd take 1080p, but I doubt I'd pick 1080p at the expense of something else, having seen how the game looks.
I wouldn't call 900p a blurry mess. But it's definitely noticeable. And that is coming from someone who still plays on a 32" 1366x768 HDTV which I feel has been pretty well calibrated going by AVSForums standards...
The article was titled "Does resolution really matter?", not "should games have really good texture filtering and AA?"
The answer is yes. Resolution does matter, and its contribution to image quality is extremely high. People claiming otherwise are, IMHO, just making themselves look foolish. The relative weights of each individual contributing factor are highly subjective, but anyone can see the difference between 900p and 1080p side by side.
The question always seems to come with the accusation that resolution could be traded for other effects. Maybe it can... no one knows but the developers of each individual title and they probably only know if they tried it. DF certainly has no idea. But the question alone shows that the asker doesn't value resolution. Backseat graphics programming aside, I think that's a huge mistake. But one I welcome from my competition to be honest.
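For reference, the raw pixel budgets behind the 900p vs 1080p comparison above are easy to put numbers on; a quick sketch using the standard 1280x720, 1600x900 and 1920x1080 framebuffer sizes:

```python
# Compare the pixel budgets of common console render resolutions.
resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MPix "
          f"({pixels / full_hd:.0%} of 1080p, {1080 / h:.2f}x scale per axis)")
```

Which works out to 900p carrying roughly 69% of the pixels of 1080p, with a 1.2x stretch on each axis to reach the panel.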
I played Assassin's Creed Unity today. I don't try to notice problems like resolution, but this was noticeable. It was like my eyes were trying to focus on detail that wasn't there. I just wanted it to be sharper.
This is, for me, the issue, and I think it's actually responsible for my lousy eyesight. When eyes try to focus on detail that isn't there, they learn to relax to focusing at the level the attainable detail allows, as it were. And then they get lazy. My TV actually looked pretty crap when I got new glasses - its UI is fuzzy, like interlaced edges or a pentile arrangement (which it isn't). That's what has me wanting a new TV so it's pin sharp, although by now my eyes have relaxed into the same perpetual soft focus that they prefer.
You should take Ubisoft to court.
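As a rough sanity check on the "detail that isn't there" point above, here is a small sketch estimating angular pixel density for an assumed 50" panel viewed from 2.5 m, compared against the ~60 pixels per degree commonly cited for 20/20 acuity (the screen size and distance are illustrative assumptions, and scaler blur is ignored):

```python
import math

def pixels_per_degree(source_pixels: int, screen_width_m: float, distance_m: float) -> float:
    """Angular density of source detail across the screen's horizontal FOV."""
    horizontal_fov = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return source_pixels / horizontal_fov

# Assumed setup: 50" 16:9 panel (~1.11 m wide) viewed from 2.5 m.
width_m, distance_m = 1.11, 2.5

# For an upscaled image, the resolvable detail is limited by the source width.
for label, pixels in [("native 1080p", 1920), ("900p source, upscaled", 1600)]:
    ppd = pixels_per_degree(pixels, width_m, distance_m)
    print(f"{label}: ~{ppd:.0f} pixels/degree (20/20 acuity is roughly 60)")
```

Whether the gap is actually resolvable depends heavily on how close you sit: at this assumed distance both figures sit above the acuity limit, but the 900p number drops below it quickly as the seating distance shrinks.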
1080p isn't just a number, it's a magic number, like 30fps and 60fps. These numbers are magic for digital displays, which typically have 1920x1080 pixels and a 60Hz refresh. This is the first generation in the digital era in which consoles are close to being able to match the native display resolution. Last gen was a mismatch. This one is the generation of 1080p.
I want a good frame rate first and foremost. If it's a 30fps game, then hit 30fps most of the time. I don't care about dips every now and then. Once you hit that target, aim for 1080p, the native resolution of my display. No upscaling blur, just 1:1 (even with black bars).
I understand games like BF4 which strive for 60fps have a much harder time hitting 1080p. Fine, I'll live with the muddy look, but I did think COD looked better that year.
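The "no up-scaling blur, 1:1" point comes down to the scale factor: at 1080p each rendered pixel maps to exactly one display pixel, while 900p has to be stretched by a non-integer 1.2x, so output pixels sample between source pixels and a bilinear-style scaler blends neighbours. A minimal 1D sketch of that resampling (a generic linear upscale, not any particular console's scaler):

```python
import numpy as np

def upscale_1d(row: np.ndarray, out_len: int) -> np.ndarray:
    """Linearly resample a 1D row of pixel values to out_len samples."""
    src_pos = np.linspace(0, len(row) - 1, out_len)  # where each output pixel samples the source
    return np.interp(src_pos, np.arange(len(row)), row)

# A hard black/white edge in a 900-pixel-wide row...
row_900 = np.zeros(900)
row_900[450:] = 1.0

# ...upscaled to 1080: the non-integer 1.2x ratio means output samples fall
# between source pixels, so the scaler produces blended (grey) values at the edge.
row_1080 = upscale_1d(row_900, 1080)
print("values around the edge after 1.2x upscale:", np.round(row_1080[535:545], 2))
```

With a native 1080p framebuffer there is no resampling step at all, so a hard edge stays exactly one pixel wide, which is what the 1:1 argument is about.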
I don't get it.
I don't think this DF article is the best, TBH. I'd rather see a less speculative article with more emphasis on testing a range of resolutions, AA solutions (including aniso) and different panel types. PC games might not give an entirely accurate picture of the performance tradeoffs that consoles face, but a more fleshed-out article would be better to pick over and discuss.
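For what a less speculative test could look like: sweep a grid of resolutions, AA modes and aniso levels on PC and record performance plus a capture for each combination. A rough sketch of such a harness, where run_benchmark is a hypothetical stand-in for whatever benchmark/capture tooling is actually used:

```python
from itertools import product

# Hypothetical settings grid for a PC-side image-quality/performance sweep.
resolutions = [(1280, 720), (1600, 900), (1920, 1080)]
aa_modes = ["none", "FXAA", "SMAA", "4xMSAA"]
aniso_levels = [1, 4, 16]

def run_benchmark(width: int, height: int, aa: str, aniso: int):
    """Placeholder: launch the game/capture tool with these settings and
    return (average_fps, screenshot_path). Not a real API."""
    raise NotImplementedError("hook up your own benchmark/capture tooling here")

def sweep():
    """Run every combination and collect results for later comparison."""
    results = []
    for (w, h), aa, aniso in product(resolutions, aa_modes, aniso_levels):
        fps, shot = run_benchmark(w, h, aa, aniso)
        results.append({"res": f"{w}x{h}", "aa": aa, "aniso": aniso,
                        "fps": fps, "screenshot": shot})
    return results
```

It wouldn't map perfectly onto console trade-offs, as the post says, but it would give concrete numbers and screenshots to argue over instead of speculation.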