Will gaming ever go 4K/8K?

I'm not surprised, you're probably only getting a few thousand pixels per metre. How big is your local IMAX?

The screen is not that big, 24 * 18 m, but the room only has 12 rows.

They opened in 2008 with a 70mm projector and replaced it with a 2K digital one about a year ago.
 
- I would prefer to have high-res textures first. Is this really a matter of enough memory (>> 8 GB) or just lazy developers?

Surely it's just laziness when they want to spend some time outside the studio, instead of painting 4 times the detail.


I propose that the string combo "lazy developers" should automatically trigger a day-long temp ban.
 
Right, viewing distance is obviously a big factor as well in terms of benefiting from 4K. However, I'm speaking in terms of a typical viewing environment and how it's a hurdle for 4K adoption rate. Most people sit 8'+ from their display. And the projector userbase mainly consists of techies/videophiles... not representative of the mass market. In general, larger displays benefit more from 4K.

I think there are two adoption rates here: the rate for the "true" 4K experience and the rate for 4K displays. The latter will be driven by smaller displays that fit into the "typical living room / typical viewing distance" scenario. These won't benefit from 4K nearly to the degree the first group will, but the adoption rate itself will be quite fast imo, as panel manufacturing will ramp up and simply jump there. The "true 4K" experience will require people to abandon the "typicals" and build things around the display, or dedicate a place for it, not the other way around. A lot of that won't happen soon, but this topic certainly has people clinging to an old way of thinking, trying to squeeze a new thing into an old setup when the new thing should reshape the old stuff.

Huge perceived screens will probably remain quite niche unless 4K VR/personal displays become popular. Even great Full HD setups are quite rare, and the benefit of, say, Blu-ray vs DVD is somewhat questionable on a 40" TV from 3.5 m away.
edit: But setting up, say, a 40" 4K TV at a very close viewing distance and hooking it up to a PC is easy to achieve.
 
I'm not surprised, you're probably only getting a few thousand pixels per metre.
You mean few hundred pixels per metre, surely?

- I would prefer to have high-res textures first. Is this really a matter of enough memory (>> 8 GB) or just lazy developers?
Yes, because those are the only two possible explanations. Texture resolution has been decoupled from RAM size with the development of tiled resources so, potentially, texture size can be unlimited. I'll leave it to you to cost up the financial burden of developing ridiculously high resolution textures for every object and coming up with a business plan to pay for that investment. Let me know if you go with $300 games or $20 per month per game subscription fees.
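To illustrate that decoupling (purely a toy Python sketch of the virtual-texturing idea behind tiled resources, not the actual D3D API, and with made-up page and pool sizes): the resident footprint is bounded by a fixed page pool no matter how large the virtual texture is, since only the pages a frame actually touches get streamed in.

```python
# Toy sketch: only the texture pages a frame touches stay resident, so RAM use
# is bounded by the pool size, not by the (potentially enormous) virtual texture.
# All sizes here are illustrative, not real engine budgets.
from collections import OrderedDict

PAGE_SIZE   = 128          # texels per page side
POOL_PAGES  = 1024         # fixed physical budget (1024 pages of 128x128 RGBA8 ~ 64 MiB)
VIRTUAL_DIM = 1_048_576    # a 1M x 1M texel virtual texture (~4 TB if fully resident)

class PagePool:
    """LRU cache of resident pages keyed by (mip, page_x, page_y)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()

    def touch(self, mip, px, py):
        key = (mip, px, py)
        if key in self.pages:
            self.pages.move_to_end(key)         # already resident, mark recently used
        else:
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)  # evict the least recently used page
            self.pages[key] = bytearray(PAGE_SIZE * PAGE_SIZE * 4)  # stand-in for streamed texels

def sample(pool, u, v, mip):
    """Map a UV sample to its page and make sure that page is resident."""
    dim = VIRTUAL_DIM >> mip
    pool.touch(mip, int(u * dim) // PAGE_SIZE, int(v * dim) // PAGE_SIZE)

pool = PagePool(POOL_PAGES)
for i in range(100_000):                        # pretend this is one frame's worth of samples
    sample(pool, (i % 997) / 997, (i % 991) / 991, mip=3)
print(f"resident pages: {len(pool.pages)} (capped at {POOL_PAGES})")
```

The real cost, as you say, is authoring the texels in the first place, not holding them in RAM.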
 
You mean few hundred pixels per metre, surely?

Yes, because those are the only two possible explanations. Texture resolution has been decoupled from RAM size with the development of tiled resources so, potentially, texture size can be unlimited. I'll leave it to you to cost up the financial burden of developing ridiculously high resolution textures for every object and coming up with a business plan to pay for that investment. Let me know if you go with $300 games or $20 per month per game subscription fees.

Guess my texture frustration comes from a weekend-long run through Wolfenstein: The New Order. Great game, reminds me of the good old Half-Life 2 days: mostly beautiful, sometimes amazing levels, incredible fights... but don't get too close to walls and some objects... but perhaps that's a limitation of the outdated id Tech 5 engine.
 
You mean few hundred pixels per metre, surely?

Is that right? I'm calculating 4431 pixels/m² for a 26 × 18 m screen at 1080p. I guess you're referring to vertical length only?
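Purely as a check of the arithmetic in that post (using the 26 × 18 m figure quoted there):

```python
# Pixels per square metre vs pixels per linear metre for a 26 x 18 m screen at 1080p.
w_px, h_px = 1920, 1080
w_m,  h_m  = 26, 18

per_area   = (w_px * h_px) / (w_m * h_m)   # pixels per square metre
per_width  = w_px / w_m                    # pixels per metre, horizontally
per_height = h_px / h_m                    # pixels per metre, vertically

print(f"{per_area:.0f} px/m^2, {per_width:.0f} px/m across, {per_height:.0f} px/m down")
# -> 4431 px/m^2, but only ~74 px/m horizontally and 60 px/m vertically
```

So the per-area and per-linear-metre figures differ by well over an order of magnitude, which is where the two of you are talking past each other.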
 
It's as much a limitation of production costs. If you want texture resolution good enough that you can stand inches from a wall and see it at texel : pixel quality, you'll need texture assets beyond sane sizes, 100x+ what we presently budget and generate. The solution there is probably some form of procedural detail generation. Regardless, the fact that a wall isn't pristinely sharp, or that a photo on an in-game noticeboard isn't pixel-perfect when it fills the screen, has nothing to do with the developers being too lazy to generate high-quality assets, nor too lazy to invent a way to get unlimited detail into their games (something they, as a collective, have actually worked on).
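To put a rough number on that 100x figure (a back-of-the-envelope sketch; the field of view, viewing distance, wall size and the "typical" 2048x2048 budget are all assumed values for illustration):

```python
# Back-of-the-envelope: texel density needed for texel:pixel quality up close.
# Assumptions: 1080p render, 90-degree horizontal FOV, eye 0.2 m from the wall,
# a 4 x 3 m wall, compared against a 2048x2048 texture as the "typical" budget.
import math

screen_px_w    = 1920
fov_deg        = 90.0
view_dist_m    = 0.2             # "standing inches from the wall"
wall_w, wall_h = 4.0, 3.0        # metres
typical_texels = 2048 * 2048

px_per_degree     = screen_px_w / fov_deg
metres_per_degree = view_dist_m * math.tan(math.radians(1.0))  # span on the wall per degree
texels_per_metre  = px_per_degree / metres_per_degree          # density for ~1 texel per pixel

wall_texels = (texels_per_metre * wall_w) * (texels_per_metre * wall_h)
print(f"{texels_per_metre:,.0f} texels/m needed at {view_dist_m} m")
print(f"whole wall: {wall_texels/1e6:,.0f} Mtexels, "
      f"~{wall_texels/typical_texels:,.0f}x a 2048x2048 texture (before mips)")
```

With those (arguable) assumptions a single wall already lands in the region of a hundred times a conventional texture budget, which is roughly the gap described above.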
 
A game like Wolfenstein, with its virtual textures, is never really completed, only abandoned. There's always another area where the artists could go back to drop a few more stamps and other details... but eventually they need to ship the game. So I guess there's a priority list, and whatever can be done before the release will be done; everything else gets the short end.
 
Yeah, we need true 4K content to really see the benefits of it. We also need 4K displays to be affordable if 4K is to have any chance at mass-market adoption.

Another debate within the videophile world is whether you favor sharpness/detail over contrast/picture quality. Plasmas and OLED displays are generally favored amongst videophiles; plasmas are being phased out and OLED 4K displays will be stupidly expensive for a while. LCDs will make up the majority of 4K displays (especially in the entry/mid-level bracket). Personally, I'd rather have a good 1080p plasma/OLED display than a 4K LCD.
Here is a shootout of the top TVs available: 4KShootout

Earlier in the thread I pointed out the BBC's December research on the human visual system; there is no debate that higher temporal resolution is more important. Marketing BS all over again. Even 1080p versus 720p appears to be a discussion dead in the water, only because there are no 720p sets being made, but the hard truth is that at typical viewing distances there is no difference. So why doesn't this next gen set 60 fps as the minimum again?

http://techreport.com/news/25051/bl...ers-overwhelmingly-prefer-120hz-refresh-rates

http://www.testufo.com/#test=eyetracking
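For what it's worth, the scale of the effect those test pages demonstrate is easy to put numbers on. A rough sketch, assuming a sample-and-hold display and a viewer tracking the moving object (the speed and resolution are assumed example values):

```python
# Rough sample-and-hold motion blur estimate: when the eye tracks an object on a
# display that holds each frame, the perceived smear is roughly the distance the
# object moves during one frame. Speed and resolution are example assumptions.
screen_px    = 1920
sweep_time_s = 1.0                      # object crosses the screen in one second
speed_px_s   = screen_px / sweep_time_s

for fps in (24, 30, 60, 120):
    smear_px = speed_px_s / fps         # pixels travelled during one held frame
    print(f"{fps:>3} fps -> ~{smear_px:.0f} px of smear while tracking")
```

Halving the frame time halves the smear, which lines up with viewers' strong preference for higher refresh rates in the linked tests.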
 
HDTVTest does great reviews and articles. Not surprised by the results. ;)
Well, slightly... I would've thought that the LG OLED display would maybe edge out the ZT60. But we're comparing a first-generation OLED display to a mature, top-of-the-line plasma.

They say the number one element of PQ is contrast, followed by color accuracy/vibrancy, then sharpness/detail. The only chance 4K LCDs have against a plasma or OLED display is if they have a full array of LEDs with local dimming. All of the 4K test displays were edge-lit, so I'm not surprised they didn't receive a single vote for best overall display.
 
Wow, talk about an apples-to-oranges comparison (I'm using 2K instead of 1080p).

Now please compare proper apples vs apples:

4K LED vs 2K LED, 4K plasma vs 2K plasma, 4K LCD vs 2K LCD,
not
4K LCD vs 2K plasma, 4K plasma vs 2K LCD, etc.

Or is this like, I don't know, what's the word I'm looking for? Wait, it's BLOODY OBVIOUS :)
 
Spoke too soon; it seems the same site did do an apples-to-apples comparison:
http://www.hdtvtest.co.uk/news/4k-resolution-201312153517.htm
So there you have it: the superior resolution of 4K over 1080p is visible on a 55″ screen from 9 feet away, provided the content is up to par.
4K vs 2K LED

48 out of the 49 people who did the blind test picked the 4K!

So now we can finally put this to bed: people can tell the difference with the higher resolution.
[Image: 720 vs 1080 resolution/viewing-distance graph]

Well, at least we won't be seeing these graphs any more.
According to this it's impossible to see the difference unless you're closer than 7' (and these dudes were at 9').
Maybe they just had better vision than 20/20, shock horror.
 
Basically 90% of the graphs on the internet are wrong. Here's another wrong one you've probably seen pasted around (it's been posted on this forum a few times); it's basically the same as the one above. You can quickly check by going to 55" and seeing that the point where 720 and 1080 differ is just above 11' on both graphs.
[Image: resolution vs viewing-distance chart]

I've been banging on about this ever since the Retina display, when Apple declared that at X distance you could not see the pixels. I did something so cunning that obviously no one had thought of it before, they just accepted the Apple mantra (because Apple never misrepresent the truth): I tested it. I'm a cunning man; I got my Retina device, put it at X distance, and declared "well, I can see the pixels", even though I'm short-sighted! (OK, I don't wear glasses for near stuff.)

The 720 vs 1080 vs 2160 etc at X distance thing is basically the same as the retina thing.

How did this often-repeated fallacy come about?
I assume the above graphs are for 20/20 vision, which I've heard mentioned as "perfect vision"; I can see where that comes from, e.g. "I scored 20/20 on my vision test".
What is normal human eyesight? From what I have read, somewhere between 20/12 and 20/16.
FWIW mine's a lot worse, -2.5 and -3 in each eye, but (I tested this year at the optometrist's) with my glasses on I could read the 20/15 line on the eye chart.

This would make a good article for some tech website, as you've gotta admit it's amusing that the vast majority of the graphs on the internet are wrong.
I may make an HTML page that does a proper graph if I get time.
 
I dunno, my 4S doesn't expose pixels in normal everyday usage. Neither does my 50" plasma, and Xbox360 games are 720p at best.
 
Let's say the zoomed graph is true and most gamers sit at 2.5 m, with average display sizes at 39". The conclusion follows that 720p is enough (900p at 50"), so reschedule resources to improve every other aspect, like aliasing > shading > framerate, respecting ISF parameters of image quality.

Game developers have to match those viewing conditions in their offices as well. Maybe invite the Imaging Science Foundation to GDC or another conference to provide adequate info for the industry?
 
Let's say the zoomed graph is true and most gamers sit at 2.5 m, with average display sizes at 39". The conclusion follows that 720p is enough (900p at 50")
If you're looking at the above image and concluded based on that, then you haven't read what I wrote.
I've made a page with a more accurate graph:
http://auzed.com/crap/screensize.html
From what I can find, typical eyesight for adults is about 20/15 (20/20 is only "normal" if you're a pensioner), so the page is drawn for average adult vision.
720p is definitely not enough at 2.5 m; even 1080p is not enough at 2.5 m.
For 2.5 m and a 39" screen, 2560x1440 is probably about borderline, thus 4K is not needed for normal vision.
 
Awesome visualization. My understanding of the Carlton Bale graph and the BBC viewing formulas is that a resolution is only fully visible at the lower edge of each coloured zone.
So at 20/15 with a 40" screen: at 7 feet 1080p is fully visible, at ~8.5 feet it's ~900p, and at 10.3 feet it's 720p.


Note: could you make the visualization 3D, with the Z-axis scaled by fps, like the 3D contrast sensitivity function and the required fps / horizontal-pixels tables in http://www.bbc.co.uk/rd/blog/2013/12/high-frame-rate-at-the-ebu-uhdtv-voices-and-choices-workshop ?
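Those cutoffs are easy to sanity-check (a sketch assuming the common "one pixel per just-resolvable arcminute" criterion, i.e. 20/20 = 1 arcminute, so 20/15 resolves 0.75 arcminute; a stricter Nyquist-style criterion would roughly double the required pixel counts):

```python
# Sanity check of the "fully visible" distances above for a 40" 16:9 screen,
# assuming one pixel per just-resolvable arcminute (80 px/degree at 20/15).
import math

def resolvable_horizontal_pixels(diag_in, dist_ft, acuity=(20, 15), aspect=16/9):
    """Horizontal pixel count a viewer can just fully resolve on a 16:9 screen."""
    width_m = diag_in * 0.0254 * aspect / math.hypot(aspect, 1)  # screen width in metres
    dist_m  = dist_ft * 0.3048
    h_angle_deg = 2 * math.degrees(math.atan(width_m / (2 * dist_m)))
    px_per_degree = 60 * acuity[0] / acuity[1]                   # 60 px/deg at 20/20, 80 at 20/15
    return h_angle_deg * px_per_degree

for feet, label in [(7, "~1080p"), (8.5, "~900p"), (10.3, "~720p")]:
    px = resolvable_horizontal_pixels(40, feet)
    print(f'40" at {feet} ft -> ~{px:.0f} px of horizontal resolution ({label})')
```

With those assumptions the numbers come out at roughly 1880, 1550 and 1280 pixels across, which matches the 1080p / 900p / 720p breakpoints quoted above; how far the zones shift is mostly down to which acuity and sampling criterion a given graph assumes.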
 
Once again I will mention my own personal experience with a 30" Sony reference 4K monitor.
4K60p is obviously better quality than 1080p60 from around 10 feet away.
Not only for me, but for several people in our office.

That graph of Zed's appears to be much more in line with our own findings.

(We produce hardware for capture and display of all formats up to and including 4K and 8K, in high-end cinema and production, so we sort of know what we're doing when it comes to image quality.)
 