4k resolution coming

But we can't even see the difference between 1080 and 720...
That depends on the quality of the visuals, the display size, and the viewing distance. This demo was on a small cinema-sized screen at a distance of a few metres:
http://www.gtplanet.net/gran-turismo-tech-demo-shown-running-at-4k-resolution/

So of course it's going to be impressive and far better quality than 1080p stretched to that same size. But unless there's good reason to expect mass adoption of 80+" wide displays, or people deciding to sit 2 feet from their TVs, 4k is still not going to happen in next-gen consoles or mainstream media. Heck, the graphical horsepower won't be there to drive 4k anyway.
 
I can clearly see the difference between 1280x720 and 1920x1080 on a large 55" TV from as far as 20 feet away.
I'm guessing that's a typo, because you can't. Unless you're super-human. The very best measures of human acuity place it at 0.4 arcminutes resolution, or 0.07 of a degree. A 55" TV is 48" wide. Viewed at 20 feet, that's an FOV of ~11 degrees. 1920 pixels would have each pixel occupying 0.00573 of a degree. 1280 x 720 would have each pixel occupying 0.00859 of a degree. Both are an order of magnitude smaller than the human eye has been determined capable of resolving.

Furthermore, the benefits of 4k need to be considered relative to the general populace, and further to that, the cost is a factor. So you enjoying gaming on a $30,000 4k monitor, or Joe Public having a great experience on a 4k 20' wide projection, isn't a reference point for what sort of targets are sensible for a next-gen gaming console.
 
I'm guessing that's a typo, because you can't. Unless you're super-human. The very best measures of human acuity place it at 0.4 arcminutes resolution, or 0.07 of a degree. A 55" TV is 48" wide. Viewed at 20 feet, that's an FOV of ~11 degrees. 1920 pixels would have each pixel occupying 0.00573 of a degree. 1280 x 720 would have each pixel occupying 0.00859 of a degree. Both are an order of magnitude smaller than the human eye has been determined capable of resolving.

I really don't care what your charts say, the difference is there, I can see it. Stop assuming that everyone has the same limitations because 'smart' people said so.

It's even more pronounced when you have finer details on the screen, the grass in the film Gladiator for example.

And personally, as a senior member on this forum, don't ever tell me "I can't"

That gives me the impression that you think I'm making it up, and that infuriates me.
 
@ "Shifty Geezer":

Your calculations and arguments seem to ignore that aliasing is much more visible at 1280x720 compared to 1920x1080 in videogames.

Your calculations and arguments also seem to ignore that when discussing 1280x720 vs. 1920x1080, you also have to consider the blurring that happens on today's fixed-pixel-display technologies when scaling 1280x720 up to 1920x1080.

This aliasing and blurring is quite noticeable. Regardless of your calculations.
 
This aliasing and blurring is quite noticeable. Regardless of your calculations.

Yes, but that is separate from the resolution. You spot artifacts, not only the difference in resolution. Blur due to scaling or aliasing etc. is easy to spot, but very high quality material, free of artifacts, is probably impossible to tell apart under the conditions Almighty is describing (20 feet away from a 55" TV).
 
And personally, as a senior member on this forum, don't ever tell me "I can't"

That gives me the impression that you think I'm making it up, and that infuriates me.

Then go chill out in a quiet way, this is not kindergarten.
 
Exactly, so I don't expect to be spoken to like I am.

Then stop being a whiner, and just debate. Simples. Nobody cares about what you can or cannot be told; if you want to let us know about that, do it through a blog, or in RPSC. Here you debate a technical topic, so stick to that.
 
Yes, but that is separate from the resolution.

In reality it's rather not. Almost all, if not all of today's displays make use of fixed-pixel-display-technologies. So, unfortunately, there will almost always be blurring when the input resolution doesn't match the native resolution of the display.

So, in reality, as of today, it's rather not separate from the resolution.

You spot artifacts, not only the difference in resolution. Blur due to scaling or aliasing etc. is easy to spot, but very high quality material, free of artifacts, is probably impossible to tell apart under the conditions

As of today, "very high quality material, free of artifacts" unfortunately is not the reality for real-time-rendered 3D videogames though ;).

And this thread takes place in the:

Embedded 3D Forums » Console Forum

;)

So what's your point ;)?
 
I really don't care what your charts say, the difference is there, I can see it. Stop assuming that everyone has the same limitations because 'smart' people said so.

It's even more pronounced when you have finer details on the screen, the grass in the film Gladiator for example.

And personally, as a senior member on this forum, don't ever tell me "I can't"

That gives me the impression that you think I'm making it up, and that infuriates me.

He doesn't think you're making it up, he thinks you're mistaken. So do I. Given a choice between what someone believes to be true based on nothing but their own perception, and facts that have been arrived at through multiple independent tests performed using sound scientific methods, I am more likely to accept the latter than the former.
 
So what's your point ;)?

The point is that Almighty's posts on the subject "sound" as if he has the ability to spot the difference in images due to resolution, when really he is spotting something else, like blur due to scaling or heavier aliasing. There is a clear difference there. I'm sorry if you don't understand that.

<-edit: 4K in a thread about 4K, nice :)
 
I cannot believe this is such a problem. Can't we all just agree that Almighty is seeing artifacts not related to resolution differences (perhaps a poor scaling technique, etc.)? While it may be physically impossible to tell the difference between 1080p and 720p at ~20 feet, there may be other factors involved. These other factors could lead someone to believe he can tell the difference between 1080p and 720p at ~20 feet.

The debate seems to be whether there will be practical benefits of buying a 4k TV in the near future for consoles. Let's stick to that.
 
Regardless of the actual or perceived benefits, looks like the industry is moving towards 4K.

TV makers have obvious reasons, even though 3D didn't necessarily result in bigger sales or margins.

The studios have seen a drop in DVD sales, and Blu-ray sales haven't made up for it. So 4K could give them a way to increase ASPs of media.

I've seen references to "Ultra HD" which may be the moniker they use. Even Direct TV has said they will support it. Not sure how, unless they plan to launch a lot more satellites.
 
I really don't care what your charts say, the difference is there, I can see it. Stop assuming that everyone has the same limitations because 'smart' people said so.
Then you need to go see a doctor or something, because you are a medical marvel. You have 10x the visual receptor density of everyone else. This isn't based on a chart derived from medical and scientific research, but on the actual research into optics and eyeball anatomy. Not research that measured 100 people and decided that because they couldn't see the difference between 1080p and 720p then no-one else can, but real research spread over decades into rods and cones and pupils and light and nerves and stuff.

Scientific fact of the same sort that says matter is made out of elements made out of electrons, neutrons, and protons. Scientific fact of the same sort that says humans produce energy by controlled oxidation of carbohydrates, producing CO2 and H2O and turning ADP into ATP. Same sort of science that means if someone posted on a PC forum that they had overclocked their i7 to 12 GHz using air cooling with the stock fan and heatsink at 2000 rpm, you'd reply that they hadn't, because that's scientifically impossible.

There are several possibilities for your being able to see the difference between 1080p and 720p on a 55" set at a distance of 20 feet.

1) Science is wrong, and the actual visual acuity of the human eye is 10x what the scientists say
2) You are a remarkable individual with 10x the visual acuity of everyone else
3) My maths is wrong and I've made a ridiculous cock-up
4) The differences you saw were imagined
5) The differences you saw were caused by something else

In response to these:

1) Science is often wrong, and I don't place 100% faith in it. If someone tells me they can hear radio on their fillings, I'd consider it a possibility and seek a proper test rather than just dismissing it out of hand. If someone tells me they can hear above 25kHz, I might believe them. In this case though, the research is well documented and makes sense across a wide range of disciplines. I see no reason to doubt the visual acuity target of 1 arcminute for the average Joe, and 0.4 on the more optimistic evaluation. If you want to convince me otherwise, you'll need a stronger argument than, "well I can".

2) Very unlikely, but not impossible. This could only be proven with a proper investigation.

3) Quite possible. I'll recheck now...

Using Pythagoras, 55" diagonal on a 16:9 screen gives:
(16x)^2 + (9x)^2 = 55^2
256x^2 + 81x^2 = 3025
337x^2 = 3025
x^2 = 8.9
x ~ 3 inches

Therefore 16 units across = 16 x 3 = 48"
Viewing at a distance of 20' = 240" straight on forms an isosceles triangle. Taking half of that triangle, we have a right-angled triangle of width 24" and height 240". The angle of that triangle is found with arctan(24/240) = 5.71 degrees.

Therefore the FOV of the display is 11.42 degrees.

11.42 degrees / 1920 pixels = 0.00595 degrees per pixel
11.42 degrees / 1280 pixels = 0.00892 degrees per pixel

Human visual acuity at 1 arcminute means 11.42 x 60 = 685.2 samples in that viewfield
Human visual acuity at 0.4 arcminute means 11.42 x 60 / 0.4 = 1713 samples in that viewfield

Ah, hang on...

There's your problem; a fault in my initial maths. My bad. I was missing a decimal place in my theoretical limit calculation. Quite possibly because I keep hitting the '.' instead of the '0' on this mobile phone calculator! The actual higher-end resolution is 0.007 degrees, not 0.07. That places 1080p above the threshold, but 720p below. Although that still places your eyesight as remarkably good, and I'd like to see a test of what you can and can't differentiate, it's not the order of magnitude difference I thought. :D That's why ideas should be double-checked and independently verified. It also shows science can be trusted, but my maths can't.
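
For anyone who wants to check the arithmetic for themselves, here's a minimal Python sketch of the same calculation (the 55"/20-foot setup and the 1.0 and 0.4 arcminute acuity figures are just the numbers from this post, not new data):

```python
import math

# Screen geometry: 55" diagonal, 16:9 aspect ratio (same figures as above)
diagonal_in = 55.0
unit = math.sqrt(diagonal_in**2 / (16**2 + 9**2))   # ~3 inches per "unit"
width_in = 16 * unit                                # ~48 inches wide

# Viewing distance: 20 feet, straight on
distance_in = 20 * 12.0

# Horizontal field of view occupied by the screen, in degrees
fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))  # ~11.4 degrees

for h_pixels in (1920, 1280):
    print(f"{h_pixels} px wide: {fov_deg / h_pixels:.5f} degrees per pixel")

# Acuity thresholds: 1.0 arcminute (average) and 0.4 arcminutes (best measured)
for arcmin in (1.0, 0.4):
    threshold_deg = arcmin / 60.0
    print(f"{arcmin} arcmin acuity: {threshold_deg:.5f} degrees per sample, "
          f"~{fov_deg / threshold_deg:.0f} resolvable samples across the screen")
```

At 0.4 arcminutes the threshold works out to roughly 0.0067 degrees, so a 1080p pixel (~0.0059 degrees) sits beyond what even the optimistic acuity figure can resolve, while a 720p pixel (~0.0089 degrees) does not, which matches the corrected conclusion above.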
 
I'm back, spent a good few hours with my PC plugged into my 55" LCD and played with native 720p and 1080p feeds.

Test was to see if the difference was big on games and the results were surprising!

It's also strange seeing twin Intel Core i7 990X chips clocked at 5.3GHz bottlenecking 7950 Tri-Fire, I never thought I would see the day!!

Anyway, results were all over the place at 20 feet away so I'll just list the games and what I found..

Crysis - Tried the Harbor level as it's my favourite level, the difference is apparent at 20 feet and really apparent at my normal sitting distance of ~9 feet. Biggest difference was in the trees and bushes, some extra definition of the branches at 20 feet away and a lot of extra definition at 9 feet. Going close up to a wall texture showed next to no difference at all.

STALKER : COP - Outside there was that same extra sharpness and clarity of the foliage and other finely detailed assets, inside the buildings the difference was minimal. I then decided to load a texture pack which doubles the texture resolution, and with that there was a difference between 720p and 1080p, more of the fine detail was visible at 1080p. I assume it's because the textures were lower in resolution than the device they were being displayed on, and increasing the texture resolution changed that?

Doom 3 - After all the talk of the BFG Edition I fired up the classic version on Steam. Could not tell the difference between 720p and 1080p at all at 20 feet away, the game looked a little sharper at 9 feet but not as much as I was expecting. On another note, the stock game has not held up well!

NFS : SHIFT - Tried a few genres so why not try Shift. Pretty much the same as Doom 3, hard to tell at 20 feet, sharper and clearer at normal viewing distances, and shimmering was reduced a lot when running 1080p.

Resident Evil 5 - Same as the above games, hard to spot at 20 feet, sharper and clearer at 9 feet.

Crysis 2 - Same as Crysis 1, noticeable at 20 feet, really noticeable at 9 feet.

My theory...

So after that the other half was moaning as her soaps were about to start so I had to unplug my computer and give her the TV back :rolleyes:

Sitting down and thinking about it, the difference is going to depend primarily on 2 things:

1. The most important one: how good your eyesight is. Mine is better than average, more than I can say for my hearing! If you have poor eyesight then the difference will be reduced.

2. The quality of the assets and the game itself. The games that showed a small difference were either really old and thus had low resolution assets, or were console ports that also had low quality assets. Crysis 1 & 2 and STALKER are all top end games on PC; they have loads of high quality assets, fine details in trees and grass, and high resolution textures. It's the fine details that are more noticeable at 1080p, tree branches and individual leaves are much easier to make out in the distance.
 
So Shifty's maths was wrong, perhaps Almighty does have the visual acuity to distinguish resolutions at that distance.

Though it would be interesting to see if you can differentiate between resampled 720p and native 1080p static real-life images, as perhaps it has something to do with your TV's scaling, or aliasing and other artifacts that are more noticeable at 720p in games.

Try viewing these two images on your display and see if you can distinguish the scaled 720p from the native 1080p.
http://i.imgur.com/FQgS1.jpg
http://i.imgur.com/6aBlt.jpg

But even if you can, you'd really have to do a double-blind test on all the media you're testing to make sure it's not just the placebo effect or bias (which are very real effects).
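
If anyone wants to make their own test pair from any 1080p source, here's a rough Python sketch using Pillow (the filenames are just placeholders, and the filter choices only loosely approximate what a TV scaler does):

```python
from PIL import Image

# Placeholder filename - substitute any native 1920x1080 source image.
src = Image.open("native_1080p.jpg")

# Downscale to 720p, then scale back up to 1080p, roughly what a
# fixed-pixel 1080p display has to do with a 720p input signal.
downscaled = src.resize((1280, 720), Image.LANCZOS)
upscaled = downscaled.resize((1920, 1080), Image.BILINEAR)

# Save both versions for an A/B comparison on the TV.
src.save("A_native_1080p.png")
upscaled.save("B_720p_rescaled.png")
```

Have someone else rename or shuffle the two output files before you view them, so the comparison is at least single-blind.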
 
So Shifty's maths was wrong, perhaps Almighty does have the visual acuity to distinguish resolutions at that distance.
It's hard to say. It's within the realm of possibility, unlike my initial response, just as 6 and 7 fingered people and 10 foot tall people are, but he's right at the physiological limits if he's seeing the difference at 20'. We're talking 20/8 vision, and the pupil at the perfect dilation for optimal focus. The only way to be sure would be a proper study that avoids alternative influences (you know, the sort that sees audiophiles adamant that they can hear the difference between £5 and £1000 HDMI cables even when it's proven the signal received is identical in both). But with 6 billion people in the world, there will be a few with visual acuity that high, and Almighty could well be one of them. He should sell himself to medical science. ;) It's also worth noting that perception isn't just about the eye's resolving power, but the brain's ability to interpret and construct understanding, which is capable of 'seeing' finer resolution. This is where higher framerates could improve both perceived resolution and game smoothness, which is why I'd rather 4k were shelved and focus placed on framerates until they have caught up with resolution progress.

However, it's worth pointing out that though my implementation of the science was wrong, the science itself is still valid and to be trusted, and prior assertions that "there is no human limit because everyone is different" are incorrect. People will fall somewhere within the bell-curve of limits, and the companies targeting human perception (or any other field) will take their targets from the general populace and not the outliers (common denominators). If independent research, correctly performed, finds that pretty much every person can't resolve finer than 1 arcminute of detail, then that should be the target for displays and consoles as the optimal balance. A machine capable of rendering 4x as much detail would cost too much, and that detail would be lost on the general public (unless sitting closer to the TV).
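
To put that 1-arcminute target in concrete terms, here's a rough Python sketch (using the same ~48" wide, 55"-class screen assumption from earlier in the thread) of the farthest distance at which a single pixel still spans a full arcminute, i.e. beyond which extra resolution is wasted on an average viewer:

```python
import math

def max_useful_distance_ft(screen_width_in, h_pixels, acuity_arcmin=1.0):
    """Distance (feet) at which one pixel subtends `acuity_arcmin` arcminutes."""
    pixel_in = screen_width_in / h_pixels
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    # Small-angle approximation: pixel_in / distance ~= acuity_rad
    return (pixel_in / acuity_rad) / 12.0

width_in = 48.0  # ~55" 16:9 screen, as calculated earlier in the thread
for h_pixels in (1280, 1920, 3840):
    d = max_useful_distance_ft(width_in, h_pixels)
    print(f"{h_pixels} px wide: resolvable out to ~{d:.1f} feet")
```

By this rough measure, the full benefit of a 3840-wide image on a 55" screen only shows up within roughly 3-4 feet, which is the same point made earlier about needing either enormous screens or very short viewing distances for 4k to pay off.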
 