Will gaming ever go 4K/8K?

8K really doesn't matter for consoles. They aren't going to render 4K any time soon, and 8K, with 16x the pixels of 1080p to render, is a pretty ridiculous target for computer games. Irrespective of what the TV and cinema industries go with, 8K (and even 4K) is pretty pointless in the gaming field outside of high-end PCs.
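The raw numbers make the point; a quick back-of-the-envelope sketch (standard 1080p/UHD resolutions assumed):

```python
# Back-of-the-envelope pixel counts (standard UHD resolutions assumed).
resolutions = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.0f}x 1080p)")

# 1080p: 2,073,600 pixels (1x 1080p)
# 4K UHD: 8,294,400 pixels (4x 1080p)
# 8K UHD: 33,177,600 pixels (16x 1080p)
```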
 
8K really doesn't matter for consoles. They aren't going to render 4K any time soon, and 8K, with 16x the pixels of 1080p to render, is a pretty ridiculous target for computer games. Irrespective of what the TV and cinema industries go with, 8K (and even 4K) is pretty pointless in the gaming field outside of high-end PCs.

I agree that's how things stand at the moment, but if they suddenly, magically decided to remove all the overheads that decrease performance, then we might be much closer to it. ;)
 
What's more annoying is how some people go silly with those ludicrous "what's next, 16K and 32K?" comments... :rolleyes:

There won't be 16K and 32K. :LOL:
Well, not for another two decades at least. ;)

I think 4K and 8K will last a long time. I hope the industry shifts focus to refresh rates, i.e. fps/Hz, and we see true 120Hz 4K/8K displays become the norm.
 
I agree that's how things stand at the moment, but if they suddenly, magically decided to remove all the overheads that decrease performance, then we might be much closer to it. ;)
Nothing like it. Do you seriously believe that the overheads of PC rendering are reducing the GPU's performance to 1/10th of what it's really capable of? Or that an 18 CU console with 176 GB/s RAM bandwidth is performing akin to a 180 CU GPU with 1740 GB/s because it doesn't have the PC's overhead (which is still notably below the 16x that'd be required)?
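To put rough numbers on it (the console figures are from the post above; the 180 CU / 1740 GB/s part is a hypothetical scaled-up GPU, not real hardware):

```python
# Figures from the post above; the 180 CU / 1740 GB/s GPU is hypothetical.
console_cus, console_bw = 18, 176        # CUs, GB/s
big_gpu_cus, big_gpu_bw = 180, 1740      # CUs, GB/s

print(f"Compute ratio:   {big_gpu_cus / console_cus:.1f}x")   # ~10.0x
print(f"Bandwidth ratio: {big_gpu_bw / console_bw:.1f}x")     # ~9.9x

# Even this generous 10x uplift falls short of the ~16x pixel-count
# jump from 1080p to 8K, before any PC overhead enters the picture.
pixel_ratio = (7680 * 4320) / (1920 * 1080)
print(f"8K vs 1080p:     {pixel_ratio:.0f}x pixels")          # 16x
```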
 
NHK performed live 8K transmissions from London Olympics venues in 2012 and aims to begin experimental 8K broadcasts in Japan in time for the Rio de Janeiro Olympics in 2016. NHK also joined hands with Brazilian commercial broadcaster TV Globo in February 2013 to shoot the Rio de Janeiro Carnival in 8K.

Is The Industry Ready?

Technology-wise, the world seems to be pretty much ready for 8K. The High Efficiency Video Coding (HEVC, also known as H.265 and MPEG-H Part 2) technology used for UHD video compression is ready and is even supported by existing encoder/decoder chips. NHK Engineering has developed a 2.5" 33MP CMOS sensor capable of capturing 7680*4320 video at 60fps and has created an 8K TV camera with Hitachi. Astro introduced an 8K movie camera with NHK's 33MP sensor last year. RED already sells the 6K-capable (6144*3160) Epic Dragon camera. There are various experimental tools for the production and post-production of 8K content. Sony and Panasonic are developing new optical media capable of storing at least 300GB per disc. Unfortunately, we still lack many of the industry standards needed for commercialization, as well as commercially available equipment. But the industry still has years ahead of it!
What is not completely clear is Hollywood's attitude towards 8K formats. In theory, once all the technologies are ready, major studios will simply start adopting them and eventually release movies created with them.
Before 8K becomes a mass-market standard, 4K has to replace Full HD. That will happen only when decent 4K TVs are affordable enough. The price of TV sets depends mostly on the price of panels, so, in the case of UHD 4K, the industry will need proper pricing on IGZO, OLED and other advanced panels to move forward with the new formats.

Samsung, Sharp, Philips Demonstrate Prototype 8K Ultra-High-Definition TVs at CES.

8K TV Prototypes Make Unexpected Debut at Consumer Electronics Show

http://www.xbitlabs.com/news/multim...type_8K_Ultra_High_Definition_TVs_at_CES.html

I stand corrected! Thanks for the links. I shudder to imagine the scale of compression going on here, though, H.265 or no.
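The scale of it is easy to sketch; a rough calculation, assuming 8K at 60fps with 10-bit 4:2:0 sampling (a plausible broadcast format, not a confirmed spec):

```python
# Uncompressed 8K 60fps bitrate, assuming 10-bit 4:2:0 sampling
# (a plausible broadcast format, not a confirmed spec).
w, h, fps = 7680, 4320, 60
bits_per_pixel = 10 * 1.5          # 10-bit samples, 4:2:0 = 1.5 samples/pixel

raw_bps = w * h * fps * bits_per_pixel
print(f"Raw: {raw_bps / 1e9:.1f} Gbit/s")           # ~29.9 Gbit/s

# A broadcast channel might carry on the order of 100 Mbit/s,
# implying a compression ratio in the hundreds.
print(f"Compression for 100 Mbit/s: ~{raw_bps / 100e6:.0f}:1")  # ~299:1
```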
 
HD broadcasts here, and in most countries, are still only 1080i, with no hope of getting 1080p any time soon. And often badly compressed 1080i at that.
4K broadcasts in the UK (or Europe) are pure fantasy right now and for the foreseeable future.
 
Adventure games that involve investigating clues could benefit. Imagine being able to zoom in on everything in a room. It could mean "hot spots" become unnecessary. It would really help immersion, as the clues would be literally right there in front of us.
 
You wouldn't need 4K to zoom in on anything in a room. That can be done with supersized, virtual-texture-style graphics. The only benefit of 4K is being able to occupy a very large FOV at high resolution. On a smaller screen, you could go right up close to it like a photograph and search for details, but that'd be rather odd and niche gameplay, resulting in people sitting very close to their TVs. VR is a far better solution to the realism and detail problem. Regardless of the display tech, that level of detail needs to be solved in game design and implementation rather than visualisation.
 
Adventure games that involve investigating clues could benefit. Imagine being able to zoom in on everything in a room. It could mean "hot spots" become unnecessary. It would really help immersion, as the clues would be literally right there in front of us.


Actually, greater detail makes this problem worse, not better. Old-school adventure games that avoided the 'pixel hunt' did so by making interactive objects higher-contrast than the background and reducing clutter by eschewing photorealism in favour of bold, stylised art. Contrast the clarity of Indiana Jones and the Fate of Atlantis with the cluttered mess that the Gabriel Knight games were.

Imagine a game world with 2,000-3,000 entities in a room (an item count the average front room wouldn't sweat hitting) and trying to work out which balled-up bit of paper had the killer's name on it and which were the old Aldi receipts. As Shifty said, the key to the problem you identify is good game design rather than technology; specifically, 4K only resolves more detail if you don't allow the user to zoom the image (by moving their avatar closer to the object, or an actual zoom) and instead make them plant their face against the screen.
 
I wonder how resolution-independent, if at all, the screen-space effects in this game, such as SSAO, are...
 
Are there any utilities to force games to run in slow motion, for example by manipulating system timers or such? If so, you could play the game with real-time, 60fps smoothness, just at turtle velocities... :)
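The basic principle is simple enough to sketch, though: scale the delta time the simulation sees while the render loop keeps running at full rate. A minimal illustration (hypothetical game-loop code, not a real utility; an actual tool would have to hook the OS timer APIs):

```python
import time

# Minimal sketch of the principle: scale the delta time the game's
# simulation sees, so the world advances slowly while rendering stays
# smooth. (Hypothetical game-loop code, not a real utility; actual
# tools would intercept the OS timer calls a game reads.)
TIME_SCALE = 0.25                        # quarter speed

def update_simulation(dt):
    print(f"simulated {dt * 1000:.2f} ms of game time")

def run(frames=5):
    prev = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        real_dt = now - prev
        prev = now
        game_dt = real_dt * TIME_SCALE   # what the simulation sees
        update_simulation(game_dt)       # moves at turtle velocity
        time.sleep(1 / 60)               # render ticks at full 60fps

run()
```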
 
I think you're getting into diminishing returns here. It's interesting that the first QHD smartphone, the LG G3, recently debuted, and most reviewers say you can't tell much difference from a 1080p screen. This is really the first time that has happened. The jump to 720p was big, and the jump from 720p to 1080p, while not as big, was still not met with that kind of response.
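A quick pixel-density calculation backs that up (assuming the G3's 5.5-inch panel size for both resolutions):

```python
import math

# Pixel density for a 5.5" panel (the G3's size) at each resolution.
def ppi(w, h, diagonal_inches):
    return math.hypot(w, h) / diagonal_inches

for name, (w, h) in {"1080p": (1920, 1080), "QHD": (2560, 1440)}.items():
    print(f'{name} @ 5.5": {ppi(w, h, 5.5):.0f} ppi')

# 1080p @ 5.5": 401 ppi
# QHD   @ 5.5": 534 ppi
# Both are past the ~300 ppi often cited as the limit of 20/20 acuity
# at typical phone viewing distances, which is consistent with
# reviewers struggling to see a difference.
```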

Where is the point at which you get more return from pumping graphics than resolution? I'm not sure, of course, but I would think 4K and 8K are a reasonable distance down the list of ways to spend power to make your game look better. Especially on consoles, where power is at a premium. We already see the Xbox One often sticking to 900p rather than 1080p.

All that is to say, I doubt 4K is a slam dunk for next-gen (2017+) consoles. Maybe they'll be capable of it, but who knows where you end up. I guess maybe at some intermediate resolution between 1080p and 4K.

But I could be completely wrong. Maybe diminishing returns mean the best way to spend power becomes increasing the resolution (since it soaks up so much). *shrugs*

That is, if 4K TVs are even widespread by next gen. That'd be a necessary first.
 
1080p HDTV on its own is already a huge problem throughout the industry, and the 48fps HFR presentation of The Hobbit was also criticized for related issues.

Basically, traditional movie-making techniques have long relied on the imperfection of the final picture. Every aspect of a production is streamlined to work just well enough on 24fps film stock with significant grain; the imperfect image then hides all the problems that would be immediately noticeable in real life.
Make-up, fake sets and props, unrealistic lighting, stunt doubles, wires, and so on. It's all fake, and the audience only believes it because they simply can't see it well enough.

The move to HDTV over the past decade has significantly affected TV production costs. Acting talent has to look more beautiful from the start (and get lots of plastic surgery), movie-level make-up artists are a must, the number and scale of sets and props have to be reduced, and visual effects are becoming very expensive too. HBO, I think, needs to spend something like $60-80 million on a single season of Game of Thrones, and you can still feel some of the restrictions.

4K would make all of this even more of a problem, and thus no one in Hollywood really wants it, IMHO.


Gaming also has real-time hardware performance and resource limitations to deal with. And it's not just about bigger frame buffers and more pixel processing: increasing texture and model resolution will also carry much greater costs, both in hardware capacity and in artist time. So gamers probably don't need it either.
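Even the frame-buffer side alone is easy to quantify; a rough sketch, assuming a single 32-bit (RGBA8) colour target (real pipelines carry several such buffers):

```python
# Memory for a single 32-bit (RGBA8) colour target at each resolution.
# Real pipelines carry several such buffers (depth, G-buffer, etc.),
# so multiply accordingly.
BYTES_PER_PIXEL = 4

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    mb = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: {mb:.0f} MB per buffer")

# 1080p: 8 MB, 4K: 32 MB, 8K: 127 MB per buffer -- and fill-rate and
# shading costs scale with the same 4x/16x pixel multipliers.
```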

The one exception on the asset side is VR, in that the larger FOV generally won't result in assets taking up more pixels in the final image. But the processing power requirements are still serious enough on their own.
 
That is, if 4K TVs are even widespread by next gen. That'd be a necessary first.

They absolutely will be. Even today I can see shops pushing silly 4K 42" screens (almost complete nonsense at that size), and UHD/4K is already the new thing to have if you buy a new TV. In 5 years we'll have 8K screens being pushed down our throats by the manufacturers, and 4K will be the norm. Prices are already totally acceptable, and 5 years is a very long time.
 