4k resolution coming

DOF means parts of the content are blurred, although whatever you're focused on should be in focus. Any movement of the subject will introduce blur larger than a 4k pixel. Any motion of the camera will introduce blur larger than a 4k pixel. So short of static photos, 4k's benefits should be very limited. That's why lots of folks can't tell the difference between 720p and 1080p movies - the actual difference on screen is very little. Of course for games it could be different with perfect pixels, except by the time 4k becomes commonplace I'm sure devs will be implementing photography effects like DOF and motion blur pretty universally. ;)

No, the benefit should be there, as long as the screen is big enough and able to actually display 4k.

It's obvious that Jason Bourne movies will benefit less, but not every movie has action scenes all the time, and even action movies have shots that are like "looking" through a window.

4k is very common when shooting commercials; the difference is clearly visible in post.
 
No, the benefit should be there, as long as the screen is big enough and able to actually display 4k.
You may be right, but I'd need to see good numbers to convince me. A handheld camera is going to be moving ever so slightly. 4k pixels are so dense that you'd surely be moving as much as a couple of pixels around in a stationary shot, meaning a small degree of blur equivalent to a 1080p sample. Any amount of motion blur such as leaves moving in the wind will also apply, and any camera motion with pans and tilts will be blurring significantly. The opportunities for resolving a scene to something like 0.0125 degrees (1/4000th of a 50 degree FOV) can't be that numerous. It works in nature programs staring at static rocks and the ground and stuff with creatures posing, but a lot of human interest is going to be dynamic and framed artistically.
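
To put rough numbers on that (the field of view and the wobble figure below are just assumed values for illustration, not measurements):

```python
# Back-of-the-envelope: angular size of one pixel across an assumed
# 50-degree field of view, and how far a small residual hand-held wobble
# smears the image at each resolution during one exposure.

fov_deg = 50.0      # assumed horizontal field of view
wobble_deg = 0.05   # assumed residual camera shake during the exposure

for name, h_px in {"1080p": 1920, "4K (UHD)": 3840}.items():
    deg_per_px = fov_deg / h_px
    smear_px = wobble_deg / deg_per_px
    print(f"{name}: {deg_per_px:.4f} deg/pixel, ~{smear_px:.1f} px smeared by the wobble")
```

Even that tiny assumed wobble covers roughly two 1080p pixels but roughly four 4k pixels, which is the sense in which the extra resolution gets eaten first.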

I saw the latest Batman film at a cinema in Sony 4k, and I never noticed the difference, TBH. The screen was still a big blur like usual at the cinema. ;) I'll even say that I bet viewers would consider 1080p50/48 movies as clearer and better quality over 4k 24 fps movies.
 
4k resolution will "potentially" be great for still shots, assuming the detail is preserved. In motion, however, with motion blur and other types of processing, the effect will be far more muted. Take a 1080p source and pause a heavy action scene in a movie. Notice how much detail is lost. 4k will do nothing to improve on something like that.

For slow-moving documentaries, however, it could be potentially amazing...assuming you sit inches from your TV screen. :p At typical TV viewing distances, most, if not all, people are unlikely to be able to note the differences.

Myself, I'm not at all interested in 4k for TV. I already have a tough time telling the difference between 720p and 1080p on a 55" HDTV from 3-4 meters away. 4k isn't going to improve on that.
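
As a rough sanity check on that (screen size, distance and the ~60 pixels-per-degree acuity figure below are assumptions for illustration):

```python
import math

# Pixels per degree of a 55-inch 16:9 screen viewed from 3.5 m, compared
# against the commonly cited ~1 arcminute (~60 pixels/degree) acuity limit.

diag_in, distance_m = 55.0, 3.5
width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)            # screen width, ~1.22 m
fov_deg = 2 * math.degrees(math.atan((width_m / 2) / distance_m))

for name, h_px in {"720p": 1280, "1080p": 1920, "4K": 3840}.items():
    ppd = h_px / fov_deg
    verdict = "above" if ppd > 60 else "below"
    print(f"{name}: ~{ppd:.0f} pixels/degree ({verdict} the ~60 ppd acuity limit)")
```

By this crude estimate even 720p is already hovering around the acuity limit at that distance, which matches the experience of struggling to tell 720p and 1080p apart.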

I'm interested in 4k for computing. But don't expect 4k gaming to be achievable for the vast majority of people (using integrated, budget, or midrange video cards) for years after 4k actually becomes available.

Hell, thanks to cell phones, we have more games being released now at 1024x768 on the PC than we have had in over a decade. :p

Regards,
SB
 
Question:

If a game would manage to render at 3840x2160@30fps with a given configuration, would that automatically mean that it could manage to render at 1920x1080@120fps just as well with the same configuration?
 
Question:

If a game would manage to render at 3840x2160@30fps with a given configuration, would that automatically mean that it could manage to render at 1920x1080@120fps just as well with the same configuration?

Depends on the game.

Just go and look at the scaling of some games on PC.

Things like heavy physics, for example, won't become easier with a lower resolution. So the short answer is no. It's going to vary by game as to how much or how little performance scales with resolution.

On the flip side, as resolution increases it puts a ceiling on how well something can perform. So, for example, something could potentially scale quite well going from 720p to 1080p but then fall off a cliff before hitting 1440p (2560x1440) or 1600p (2560x1600). Performance is definitely going to fall off quite quickly for anything that isn't an enthusiast-class video card (400 USD and up), with 4k potentially requiring 2 of those enthusiast cards for fluid gameplay (800+ USD), depending on the game.
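
As a rough illustration of why the scaling isn't linear, here's a toy frame-time model; the split between the fixed per-frame cost and the per-pixel cost is an assumption picked for the example, not a measurement:

```python
# Toy frame-time model: part of the frame cost (physics, draw calls, shadow
# maps, etc.) doesn't shrink with resolution, so quadrupling the pixel count
# doesn't divide the framerate by four, and dropping resolution doesn't
# multiply it by four either.

def frame_time_ms(pixels, fixed_ms=8.0, ns_per_pixel=2.0):
    # fixed_ms and ns_per_pixel are illustrative assumptions
    return fixed_ms + pixels * ns_per_pixel / 1e6

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    t = frame_time_ms(w * h)
    print(f"{name}: {t:.1f} ms/frame -> ~{1000 / t:.0f} fps")
```

With these made-up numbers 4k lands around 40 fps while 1080p only reaches about 80 fps, not 160, which is the point: dropping from 2160p30 to 1080p does not automatically buy you 120 fps.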

Consoles aren't coming anywhere near that for a while. Not if they plan on releasing them for 400 USD or less.

Regards,
SB
 
Consoles aren't coming anywhere near that for a while.

Never mind, the question was more like: "Would you really want a game to render at 3840x2160 with only 30 fps for example, if it could instead render at 1920x1080 but with 60 fps for example and maybe even enhanced visuals on top?" ;).
 
Depends. I play at 5292x1050; if the game works OK at that res and can keep a decent framerate then it's preferable. It's totally a personal preference thing, depending on the framerate and what, if any, eye candy you have to give up.
 
Never mind, the question was more like: "Would you really want a game to render at 3840x2160 with only 30 fps for example, if it could instead render at 1920x1080 but with 60 fps for example and maybe even enhanced visuals on top?" ;).

Oh, I got you. :) Yes. I don't mind 30 fps. Or even 24 fps in games. I can even tolerate occasional drops into the teens.

Then again I'm older now and not into competitive FPS play like I was in the past. I don't play in tournaments anymore where there's a cash prize for performing well. So I can relax a bit. Back then everything had to be 60-120 FPS (for FPS and action games). Also, CRT required. No LCD with their inferior response.

I definitely "like" it when it is more fluid (60 fps). But I don't mind sacrificing that for either a cleaner image (4-8x MSAA/SSAA with AF, I can't stand MLAA or FXAA alone) or more graphical bling. :)

And at least on PC, since I have a 2560x1600 monitor, sacrificing resolution either means a small window (not good) or scaling it up (again, not good). So, I'm not about to reduce that. If I get a 4k monitor (unlikely, as I expect prices for 4k monitors to be greater than 4-5 thousand USD for at least the first 4-5 years, although they may possibly drop to as cheap as 2-3 thousand by year 4 or 5 after the first products hit the streets), I'll end up with the same situation.

Regards,
SB
 
Depends. I play at 5292x1050; if the game works OK at that res and can keep a decent framerate then it's preferable.
But that's more like just increasing the FOV though, and not really enhancing the resolution, isn't it ;)?

You're talking about something like Eyefinity, aren't you? Three 1680x1050 displays? That should be 5040x1050 though?

It's totally a personal preference thing, depending on the framerate and what, if any, eye candy you have to give up.

But this is not the PC Games forum, it's the Console Forum, isn't it? And on consoles mostly the game developers decide which resolution/framerate/etc. they are going for, don't they ;)?

I definitely "like" it when it is more fluid (60 fps). But I don't mind sacrificing that for either a cleaner image (4-8x MSAA/SSAA with AF, I can't stand MLAA or FXAA alone) or more graphical bling. :)

For consoles though, which probably are getting played on HDTVs mostly, where resolution mostly maxes out at 1920x1080 at the moment, how likely would it be that a console game developer would design a game for "4K"?

Wouldn't he rather choose to go for 1920x1080 with "more graphical bling" and/or more fluidity instead ;)?
 
Wouldn't he rather choose to go for 1920x1080 with "more graphical bling" and/or more fluidity instead ;)?

For the most part, I'd imagine so. Hence 4k resolution is going to be irrelevant other than as a checkbox feature for this upcoming gen of consoles. Assuming anyone other than Sony even bothers to add that as a checkbox. It's mostly useless for any serious gaming for this next generation of consoles and quite possibly useless for the generation after it.

The one caveat is for really simplistic games that rely more on style than graphical IQ. Something like Limbo, for example, would probably be just fine rendered at 4k as its graphics are so simplistic.

Something like Halo, Gears, Killzone, Uncharted, God of War, etc.? Yeah, not going to be happening at 4k on consoles in the next decade probably.

Regards,
SB
 
4k resolution will "potentially" be great for still shots, assuming the detail is preserved. In motion, however, with motion blur and other types of processing, the effect will be far more muted. Take a 1080p source and pause a heavy action scene in a movie. Notice how much detail is lost. 4k will do nothing to improve on something like that.

If you up the frame rate there will be less blur.
 
If you up the frame rate there will be less blur.

If you up the shutter speed there will also be less blur.
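
Roughly, the blur length is just angular speed times shutter time, divided by the angular size of a pixel; a small sketch with assumed values:

```python
# Motion-blur length in pixels for an assumed slow pan, at a 180-degree
# shutter for 24 fps (1/48 s) versus 48 fps (1/96 s). FOV, pan speed and
# shutter values are illustrative assumptions.

fov_deg = 50.0
pan_deg_per_s = 10.0

for name, h_px in {"1080p": 1920, "4K": 3840}.items():
    deg_per_px = fov_deg / h_px
    for shutter_s in (1 / 48, 1 / 96):
        blur_px = pan_deg_per_s * shutter_s / deg_per_px
        print(f"{name}, 1/{round(1 / shutter_s)} s shutter: ~{blur_px:.0f} px of blur")
```

Doubling the frame rate (or halving the shutter time) halves the smear, but even then the assumed pan blurs across several pixels at 4k.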

Since when is "the resolution needed to survive blur" the standard we should build our home theaters around? It's not like every movie is shot handheld; even The Avengers had static shots.
 
It's not about moving and static shots, but the capturing system introducing a tiny degree of blur, enough to destroy the benefits of 4k. That is, a pin-sharp pixel at 1080p could be 4 blurred pixels at 4k, such that those pictures captured or upscaled won't look much different. We all agree there are upper limits on resolution above which there's no benefit, whether that's 10k pictures or 2k, or 44 kHz audio or 192 kHz, or 60 fps or 120 fps or 2000 fps - at some point the added cost of supporting nigh-imperceptible qualities means there's a ceiling there's no point in chasing past. That limit is also going to be a matter of compromises over what can be realistically achieved with current tech, so even if 192 kHz audio is better than 48 kHz, we won't be able to use that per channel for a good while yet, and investment in chasing that higher quality would be better directed at solving more important issues like video datarates and less aggressive compression.

In the case of 4k, I question the resolving power of most cameras in use (with motion, either larger scale or tiny hand movements, and DOF blur) to render significant difference between an image captured at 4k and displayed on a 4k screen, and an image captured at 1080p and upscaled to that same screen. The real difference across the frame is probably going to be about a few percent by my guess. That will be much better in certain areas, such as static eye shot, where the viewer's focus is, which is worth pursuing for those with a large enough FOV and will give a better impression of the improved quality too - with a small area of interest in higher detail, the periphery which is in no great detail (out of perfect focus) will not be appreciated as being out of focus. And that periphery isn't the focus of higher resolution anyway.

But the point is, for 4x the data you're getting a marginal increase in quality for a small niche of the population. Whereas improved framerate would get a far, far more noticeable improvement in quality for only twice or three times the data. It'd improve not only temporal quality but also perceived detail, which is constructed over multiple samples. This is true in games also, even where perfect pixels will give the best advantage to 4k. I'd be interested to see the result of gamers exposed to two different setups - a 4k game at 30 fps and the same game at the same FOV at 1080p60 - and see which they would prefer to play. Personally I'd take 1080p60 on the sets and viewing distances I'll experience. I can't envisage any situation where I have a screen large enough/close enough to benefit from 4k on games.
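
For the data side of that trade-off, the raw pixel throughputs compare like this (uncompressed pixels per second; delivered bitrates depend on the codec, so treat these as ratios only):

```python
# Raw pixel throughput of a few modes relative to 1080p30, to show the
# "4x the data vs 2x the data" comparison in the paragraph above.

modes = {
    "1080p @ 30": (1920, 1080, 30),
    "1080p @ 60": (1920, 1080, 60),
    "4K @ 30":    (3840, 2160, 30),
}
base = 1920 * 1080 * 30
for name, (w, h, fps) in modes.items():
    px_per_s = w * h * fps
    print(f"{name}: {px_per_s / 1e6:.0f} Mpx/s ({px_per_s / base:.0f}x baseline)")
```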
 
It's not about moving and static shots, but the capturing system introducing a tiny degree of blur, enough to destroy the benefits of 4k. That is, a pin-sharp pixel at 1080p could be 4 blurred pixels at 4k, such that those pictures captured or upscaled won't look much different. We all agree there are upper limits on resolution above which there's no benefit, whether that's 10k pictures or 2k, or 44 kHz audio or 192 kHz, or 60 fps or 120 fps or 2000 fps - at some point the added cost of supporting nigh-imperceptible qualities means there's a ceiling there's no point in chasing past. That limit is also going to be a matter of compromises over what can be realistically achieved with current tech, so even if 192 kHz audio is better than 48 kHz, we won't be able to use that per channel for a good while yet, and investment in chasing that higher quality would be better directed at solving more important issues like video datarates and less aggressive compression.

In the case of 4k, I question the resolving power of most cameras in use (with motion, either larger scale or tiny hand movements, and DOF blur) to render significant difference between an image captured at 4k and displayed on a 4k screen, and an image captured at 1080p and upscaled to that same screen. The real difference across the frame is probably going to be about a few percent by my guess. That will be much better in certain areas, such as static eye shot, where the viewer's focus is, which is worth pursuing for those with a large enough FOV and will give a better impression of the improved quality too - with a small area of interest in higher detail, the periphery which is in no great detail (out of perfect focus) will not be appreciated as being out of focus. And that periphery isn't the focus of higher resolution anyway.

But the point is, for 4x the data you're getting a marginal increase in quality for a small niche of the population. Whereas improved framerate would get a far, far more noticeable improvement in quality for only twice or three times the data. It'd improve not only temporal quality but also perceived detail, which is constructed over multiple samples. This is true in games also, even where perfect pixels will give the best advantage to 4k. I'd be interested to see the result of gamers exposed to two different setups - a 4k game at 30 fps and the same game at the same FOV at 1080p60 - and see which they would prefer to play. Personally I'd take 1080p60 on the sets and viewing distances I'll experience. I can't envisage any situation where I have a screen large enough/close enough to benefit from 4k on games.

The 4K demonstration I saw was done with the F65, which is based on an 8K sensor (downsampled to 4K). The C300, which is a 1080p camera, uses a 4K sensor. The industry is already moving in this direction; these are the current cameras, and we all know that the next gen will be at least 2x that.

Sure, DOF-heavy scenes/shots, blurry action sequences, heavily graded stylish movies etc. will benefit less, but I think that when even Blu-ray is able to display the difference between ordinary 35mm and IMAX, as seen in some of the action movies of late, then there should be room for 4K.
 
The one caveate is for really simplistic games that rely more on style than graphical IQ. Something like Limbo, for example, would probably be just fine rendered at 4k as its graphics are so simplistic.

What about games with pre-rendered backgrounds, like the classic "Resident Evil" games, for example though :cool:?

Such games probably could be designed for "4K" while not having to look simplistic at all, couldn't they :D;)?

But:

"Are pre-render based environments dead?"

:cry::cry:;)
 
Sure, DOF-heavy scenes/shots, blurry action sequences...
No, not DOF heavy! ;) The lens focuses light from one particular distance onto a tiny spot on the sensor, and that distance is what's in focus. Everything nearer or further than that focus distance gradually gets more and more out of focus, based on how far it is from the focal distance. The amount of blur is called the Circle of Confusion (less blur than that CoC threshold is perceived as in focus), and the higher the resolution, the smaller the CoC has to be before the blur has destroyed the advantages of the higher resolution. As viewers, all we care about is looking at the image and deciding whether it looks sharp or not. But within that data are progressive degrees of blurring, which can mean something that looks sharp has data bleeding in from adjacent samples, meaning going to a higher resolution doesn't net you any benefits.
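
To put illustrative numbers on that (the lens, aperture, focus distance and sensor width below are assumptions picked for the example, not measurements):

```python
# Thin-lens sketch of the Circle of Confusion: blur-spot diameter on the
# sensor for a subject slightly behind the focal plane, compared with the
# pixel pitch of a 1080p vs a 4K capture on the same sensor width.

focal_mm = 50.0                   # assumed lens focal length
f_number = 2.8                    # assumed aperture
focus_m, subject_m = 3.0, 3.15    # focused at 3 m, subject 15 cm behind that
sensor_width_mm = 24.9            # roughly Super 35, assumed

aperture_mm = focal_mm / f_number
s1, s2 = focus_m * 1000.0, subject_m * 1000.0
coc_mm = aperture_mm * (abs(s2 - s1) / s2) * (focal_mm / (s1 - focal_mm))

for name, h_px in {"1080p": 1920, "4K": 3840}.items():
    pitch_mm = sensor_width_mm / h_px
    print(f"{name}: pixel pitch {pitch_mm * 1000:.1f} um, "
          f"CoC {coc_mm * 1000:.1f} um -> blur spans ~{coc_mm / pitch_mm:.1f} px")
```

With these made-up values the blur spot is about one pixel at 1080p (reads as sharp) but more than two pixels at 4k, which is the sense in which something that looks sharp at 1080p is already smeared across multiple 4k pixels.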

This is definitely a factor in choosing a resolution for films, although I don't know what combination of motion and DOF is needed at a given resolution to destroy its fidelity (not to render it a blurry mess, but to make the difference between one resolution and the next size down practically zero). But it is there, with increasing sensitivity as resolution is increased, because the diameter of each sample is getting smaller and more susceptible to tiny changes. Thus it's safe to say that 10k resolution is a waste of time because there'll be too much optical noise to benefit from that sensor resolution. Similarly, 480p is too low because it's missing resolving power. The sweet spot is somewhere in between, with diminishing returns as the cost of adding more resolution increases. The argument for 4k is very limited to a subset of captures on a subset of screens.

This of course isn't the case with computer games. Super-high resolutions benefit large-FOV PC setups. That of course is only to provide wraparound vision though, and a headset could achieve the same result at a much lower resolution.
 
What about games with pre-rendered backgrounds, like the classic "Resident Evil" games, for example though :cool:?

Such games probably could be designed for "4K" while not having to look simplistic at all, couldn't they :D;)?

But:

"Are pre-render based environments dead?"

:cry::cry:;)

Think about it for just a second.

Even at 1080p or 720p all games, including PC, have (IMO) horribly blurry textures. Some are better than others, and detail textures can help alleviate that to some extent, but by and large texture sharpness for the vast majority of games hasn't advanced very far.

Sure for people coming from consoles there was a noticeable increase between going from last gen to this gen. But on PC, it's been pretty stagnant.

Why is this relevant? High-resolution gaming has been possible on PC since the 90's, when people would try to run games at 1600x1200 or 1800x1440. That eventually led up to 2560x1600 currently, or even higher with multi-monitor setups.

Anyway, high resolution gaming has been available for a while on PC, and texture resolution hasn't improved significantly from when 1280x1024 sets dominated the landscape.

Higher-resolution assets require significant space in terms of storage and memory, and by extension can significantly increase loading times. And that doesn't even start to consider the potential development and production costs of higher resolution assets for a game.
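
As a rough sense of scale (the 4:1 block-compression ratio and the one-third mip overhead below are typical assumptions, not exact figures):

```python
# Approximate memory for a single colour texture at increasing resolutions:
# uncompressed RGBA8 with a full mip chain (~4/3 of the base level), and the
# same with an assumed 4:1 block-compressed format.

for name, size in {"2K": 2048, "4K": 4096, "8K": 8192}.items():
    base_mb = size * size * 4 / 2**20        # 4 bytes per texel, RGBA8
    with_mips_mb = base_mb * 4 / 3
    compressed_mb = with_mips_mb / 4
    print(f"{name} ({size}x{size}): ~{with_mips_mb:.0f} MB RGBA8 with mips, "
          f"~{compressed_mb:.0f} MB block-compressed")
```

Multiply that by the hundreds or thousands of textures in a modern game and the storage, memory and loading-time costs pile up quickly.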

So, at my most optimistic level, I "hope" we finally get texture resolution that won't be blurry at 1080p. I certainly don't think we'll get anything that can even remotely do justice to 4k resolutions. Hence for games, 4k resolution will be far less relevant to a gamer than it will to a movie watcher, with one exception. If a game can natively render at 4k resolution, and you have a screen that is small enough with regards to your viewing distance (a 24" monitor at typical PC viewing distance for example) you minimize some of the rendering artifacts caused by aliasing. But even that won't remove all artifacts due to aliasing.

So there's one potential benefit...again assuming you have hardware that can actually render a pleasant gaming experience at 4k resolutions.

Regards,
SB
 
Did you realize that the question was about games utilizing pre-rendered environments ;)?

Yes, which the bulk of that post addressed. I was just trying to add something at the end so that people didn't think I was completely down on 4k. :) Unless someone is thinking of pre-rendered as the entire background being one bitmap/jpg/png/whatever, as in the case of FF7 on the original PlayStation. Which I guess he was. But would gamers actually be satisfied with something like that in this day and age? Even point-and-click adventure games, which don't require real-time rendered graphics, are now using real-time rendered graphics.

Regards,
SB
 