Digital Foundry Article Technical Discussion Archive [2013]

Well, I'll be the first to admit that my living room makes some trade-offs vs. a regular living room; it's basically a dedicated entertainment room.

The video is pretty bad quality, but that is my current setup...



You can also view the TV unobstructed from that couch behind the divan. If necessary I could move about 45 cm closer to the screen with this setup, and naturally further back as well. Before I bought my 50" TV I was contemplating a 42" model because of the ability to change the distance at will.

I would like to have a 65" 4K TV at this current viewing distance, and perhaps move the couch in the back 50 cm closer and move the rear speakers (and the divider they stand on) behind the couch. That way I'd have two usable stationary viewing distances for varying quality of content, with a surround speaker setup, and still some room to adjust either viewing position if needed.

edit:

[2 pictures]




Nice setup, looks comfy :p
 
30 fps for a racing game is bad, like Forza 1 bad.
Native resolution is a limit; frame rate is also a limit.
Some prefer one, others prefer the other.
As a gamer I expect most things to run above 30fps because 30fps is pretty unbearable to my eyes.

This gen must have been horrible for you; most games were 30fps, if that.
 
May have to move the thread from "Console Technology" to "PC Technology".

While they have converged in the meantime, it will be interesting to see where living room gaming is heading.
 
That's why I game on PC when I can, duh.

I was thinking of all the 30fps console exclusives: Halo, TLoU, Crackdown, Gears, Motorstorm, Uncharted, Infamous, Fable, RDR, GTAV, etc. etc. Other than two driving sims and some fighting games, you must have been miserable.
 
I was thinking of all the 30fps console exclusives: Halo, TLoU, Crackdown, Gears, Motorstorm, Uncharted, Infamous, Fable, RDR, GTAV, etc. etc. Other than two driving sims and some fighting games, you must have been miserable.

You have no idea.
Some of them applied motion blur, which makes it a little better; then again, I'm one of those people who sees interlacing on TV.
 
I don't see how 1080p 30fps is bad or how it is PS4-specific. I think gamers are expecting 1080p 30fps as the new baseline, with 60fps being achieved via concessions. I don't think anyone was expecting sub-1080p 30fps or 720p anything.

I think gamers in general are simply looking for an upgrade over the 360 and PS3. As long as the game is pleasing to the eye, doesn't play like a choppy mess and is fun, most couldn't care less whether that game is sub-1080p or sub-60fps. 900p and 30fps aren't going to be the reasons Ryse or KZ bombs if those two titles fail to resonate with the general gaming market.

Never mind that most 720p/sub-720p games probably have better clarity than most of the 720p/1080i content delivered by compression-happy cable and satellite companies, which probably makes up the majority of HD content consumed by TV viewers.
 
To exclude confirmation bias, did you invent/run this test before, or after you learned about the Xbox1 specs?

Way before. I started testing the theory out back in 2005 or 2006 when I started to contemplate making a dedicated HTPC. And then I further refined it starting in 2008 when I decided to make the HTPC gaming capable and not just for movies, music, and images.

The conclusion I came to was that people could only notice the difference in reality if the TV featured a bad scaler. The worse the scaler, the easier it was to tell the difference. Most modern TVs that aren't super-budget Chinese imports generally have decent scalers now, however, so that shouldn't be as much of a problem.

Even for people who sit close to the TV, as long as there isn't a side-by-side comparison and they do not know the source resolution, it is extremely rare for anyone to reliably guess it. I test this by having them sit in front of my coffee table instead of on the sofa. Basically, it's mostly the "hard core" videophiles that know what artifacts to look for. Most of the PC and console gamers that have gone through my "torture test" (evil grin, it's very long) can't reliably guess the source resolution. Pretty much everyone else is unable to guess at that close distance unless I put it side by side, at which point the number of people that can notice increases significantly.

At the closer distance, the lack of AA is what could potentially give away a game. But at living room distances people were still generally unable to reliably guess the source resolution even without AA. Again, this is without a side-by-side comparison, which is valid since most people don't have 2 TVs in their living room side by side.

And more recently I've been doing this with more people to show them why a 4K TV is basically worthless in most people's living rooms unless they plan on getting a screen well over 100". In other words, if they can't tell the difference between 720p and 1080p, there is no way they'll notice any benefits from 4K.

[2 pictures]




Yeowza, that's too close for me. :) Then again I'm the sort of person that prefers sitting in the back of a theater versus the front of a theater so I don't have to move my head while watching the movie. :)

Although it does get you pretty close to how much of your FOV a 24" monitor would fill up.

Regards,
SB
 
Are these test subjects people who normally have a vested interest in such things? I have friends who can only spot the difference between DVD and Blu-ray if you show them back to back, otherwise they're clueless. They'd never be able to guess at a video game's resolution, and they wouldn't care enough to bother. Whereas I'm a resolution hound from my CGI days, and all those little flaws that most people don't even see stand out to me like sore thumbs because I'm used to looking for them in my own work (jaggies and the like). Combine that with pretty damned good eyeballs (paid enough for 'em.. hehe), and I'm probably more likely to be able to tell the difference than the vast majority of people off the street that you could plop down in front of a TV and start quizzing them on resolution and image flaws.
 
Yeowza, that's too close for me. :) Then again I'm the sort of person that prefers sitting in the back of a theater versus the front of a theater so I don't have to move my head while watching the movie. :)

Although it does get you pretty close to how much of your FOV a 24" monitor would fill up.

Regards,
SB


Yeah, I get about a 36-degree horizontal viewing angle from this setup, which is exactly what the THX standard recommends. For 1080p that works well. edit: With 4K, the viewing angle can be over 50 degrees.

http://www.acousticfrontiers.com/home-theater-blog/2013/3/14/viewing-angles
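
For anyone who wants to check their own setup, the horizontal viewing angle is just basic trigonometry. A minimal sketch, assuming a 16:9 panel; the 50"/170 cm pairing in main() is an example combination that lands near 36 degrees, not a measurement of the setup pictured above:

Code:
#include <cmath>
#include <cstdio>

// Horizontal viewing angle of a 16:9 screen, given its diagonal in inches
// and the viewing distance in centimetres.
double viewing_angle_deg(double diagonal_in, double distance_cm)
{
    const double kPi = 3.14159265358979323846;
    const double diagonal_cm = diagonal_in * 2.54;
    // For 16:9, width = diagonal * 16 / sqrt(16^2 + 9^2).
    const double width_cm = diagonal_cm * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    return 2.0 * std::atan((width_cm / 2.0) / distance_cm) * 180.0 / kPi;
}

int main()
{
    // Example numbers only: a 50" screen at ~170 cm works out to roughly 36 degrees.
    std::printf("%.1f degrees\n", viewing_angle_deg(50.0, 170.0));
    return 0;
}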

Here is my old monitor setup.


(click the pic to make them larger)

That's a 27" monitor. I found that table from Ikea for 19.95€, it let me to move the legs of the table under the divan and then move the monitor back and forth on top of it, allowing pretty good possibilities to change the viewing distance. This setup didn't require space that much.
 
Couldn't Digital Foundry ask MS engineers if the Xbox One Tiled Resources technology is Tier 2 or Tier 1? I wonder.... That would be a great idea.
 
Couldn't Digital Foundry ask MS engineers if the Xbox One Tiled Resources technology is Tier 2 or Tier 1? I wonder.... That would be a great idea.

Why would it matter? The tiers are just (somewhat) arbitrary classifications used to broadly bucket PC hardware into multiple categories. The XB1 GPU will support whatever feature set it supports, and their API will expose those features accordingly. There won't be much reason for an XB1 developer to care about "Tier 1" vs "Tier 2", they will just care about whatever specific features are available.
 
Why would it matter? The tiers are just (somewhat) arbitrary classifications used to broadly bucket PC hardware into multiple categories. The XB1 GPU will support whatever feature set it supports, and their API will expose those features accordingly. There won't be much reason for an XB1 developer to care about "Tier 1" vs "Tier 2", they will just care about whatever specific features are available.
Dave hinted at the fact that there are important differences from a performance point of view, because some of the features available on Tier 2 cards are done in hardware, while Tier 1 cards either don't support them or can only do so in software.
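
For what it's worth, on the PC side the tier isn't something a developer hard-codes against; it's queried from the runtime. A minimal D3D11.2 sketch of that query (standard desktop API, nothing XB1-specific):

Code:
#include <d3d11_2.h>
#include <cstdio>

// Ask the device which tiled-resources tier (if any) it exposes on PC D3D11.2.
void ReportTiledResourcesTier(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                           &opts, sizeof(opts))))
    {
        std::printf("Runtime/driver predates D3D11.2, no tiled resources.\n");
        return;
    }

    switch (opts.TiledResourcesTier)
    {
    case D3D11_TILED_RESOURCES_NOT_SUPPORTED: std::printf("Not supported\n"); break;
    case D3D11_TILED_RESOURCES_TIER_1:        std::printf("Tier 1\n");        break;
    case D3D11_TILED_RESOURCES_TIER_2:        std::printf("Tier 2\n");        break;
    default:                                  std::printf("Newer tier\n");    break;
    }
}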
 
Are these test subjects people who normally have a vested interest in such things? I have friends who can only spot the difference between DVD and Blu-ray if you show them back to back, otherwise they're clueless. They'd never be able to guess at a video game's resolution, and they wouldn't care enough to bother. Whereas I'm a resolution hound from my CGI days, and all those little flaws that most people don't even see stand out to me like sore thumbs because I'm used to looking for them in my own work (jaggies and the like). Combine that with pretty damned good eyeballs (paid enough for 'em.. hehe), and I'm probably more likely to be able to tell the difference than the vast majority of people off the street that you could plop down in front of a TV and start quizzing them on resolution and image flaws.

It spans all sorts of people. I wasn't only interested in what casual/core gamers, casual/core film enthusiasts, etc. were able to distinguish. Like I've mentioned in the past, I've had some pretty rabid "videophiles" (who would argue with me endlessly about how 1080p at any distance was far superior to 720p) take the test, some of whom actually had 20/15 and 20/10 vision.

The key here is that...

1. They do not know the source resolution ahead of time. It's almost impossible to test yourself because of this.
2. There is no comparison image. Meaning there are no side-by-side comparisons and no one-after-another showings. You are shown an image, video segment, or gameplay segment and have to guess its resolution blindly, without comparison to the same material at a different resolution. Each piece of material is shown at differing resolutions during different parts of the test; for example, the same video segment is shown at both 720p and 1080p, just not one after the other.

Note that for [2] above, even when differing resolutions of the same source material are shown consecutively, people still couldn't reliably guess the resolution from the sofa. From in front of the coffee table, however, those with keener eyes and a knowledge of what to look for did relatively better when differing resolutions were shown consecutively. Still nowhere near 100%; closer to 60-70% of the time, depending on how good their eyesight was, the source material, and how knowledgeable they were about what to look for.
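
If anyone wants to replicate that kind of blind test, the main ordering constraint is just that the same material never shows up at two resolutions back to back. A minimal sketch of one way to randomize the trials (hypothetical clip names, not the actual test material):

Code:
#include <algorithm>
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct Trial { std::string clip; int height; };  // source height: 720 or 1080

int main()
{
    // Hypothetical material: each clip appears once per resolution, order randomized.
    std::vector<Trial> trials;
    for (const char* clip : { "skyline", "racing", "film_scene" })
        for (int height : { 720, 1080 })
            trials.push_back({ clip, height });

    std::mt19937 rng(std::random_device{}());
    bool ok = false;
    while (!ok)  // reshuffle until no clip is shown twice in a row
    {
        std::shuffle(trials.begin(), trials.end(), rng);
        ok = true;
        for (size_t i = 1; i < trials.size(); ++i)
            if (trials[i].clip == trials[i - 1].clip) { ok = false; break; }
    }

    for (const Trial& t : trials)
        std::printf("show %s at %dp, viewer guesses blind\n", t.clip.c_str(), t.height);
    return 0;
}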

So I don't doubt that, if Dr Evil has good eyesight, he might be able to tell the difference with his setup (I'd still love to test him though. :D), but that represents a non-typical living room arrangement.

Regards,
SB
 
What is the distance from "in front of the coffee table" and is this the 55" screen we are talking about?

In any case this chart basically covers this issue.

[Image: resolution_chart.png]


Yes, you have to be really close. edit: Actually it's not about having to, it's about being able to.
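
The numbers behind charts like that are easy to reproduce. A minimal sketch, assuming 20/20 vision resolves roughly one arcminute per pixel and a 16:9 panel; the 55" figures it prints are worked examples, not values read off the chart:

Code:
#include <cmath>
#include <cstdio>

// Farthest distance (metres) at which a single pixel still subtends ~1 arcminute,
// i.e. beyond this, extra horizontal resolution is no longer resolvable at 20/20.
double max_useful_distance_m(double diagonal_in, int horizontal_pixels)
{
    const double kPi = 3.14159265358979323846;
    const double width_m = diagonal_in * 0.0254 * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double pixel_m = width_m / horizontal_pixels;
    const double one_arcmin_rad = (1.0 / 60.0) * kPi / 180.0;
    return pixel_m / std::tan(one_arcmin_rad);
}

int main()
{
    for (int px : { 1280, 1920, 3840 })  // 720p, 1080p, 4K widths
        std::printf("55\" at %d px wide: differences resolvable out to ~%.1f m\n",
                    px, max_useful_distance_m(55.0, px));
    return 0;
}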
 
Yup, 55" and from where they'd sit in front of the coffee table, it'd be ~5-6 feet to the TV. So well within the ability of people with 20/20 vision or better to be able to technically see the difference. And it is VERY obvious when shown my computer desktop at that distance. That changes when it comes to photos, video, and games (without UI displayed).

That's the point, though: even when that is the case, people generally can't tell what the resolution is without a side-by-side comparison. Some can tell most of the time if there's a "one after another" comparison, but still not nearly 100% of the time. Depending on the source material, some is easier to identify than others. But even then, as I said, most people can't reliably tell the difference even when they should be able to.

I have no doubt that if I had a side-by-side comparison from in front of the coffee table, most people would be able to tell which resolution is which, depending on the source material. Something simple like a pie chart or a static scene of a city skyline (high contrast, lots of sharp lines) would be easy. A 720p game with 4xSGSSAA or RGSSAA would be relatively difficult compared to a 1080p game with AA. In that case, if the 1080p-rendered game had no AA while the 720p game had the aforementioned AA, some of the knowledgeable computer videophiles might correctly pick the source resolution, but most others would probably pick the 720p image as the higher resolution because it looks better and cleaner (fewer jaggies).

Regards,
SB
 