Is current technology finally making 1080p redundant?

I've been thinking about this for a while now, and when I saw Horizon Forbidden West last night on a 1080p TV (I first saw the game on a 4K TV), the difference in portrayed detail was staggering, with so much of it lost.

The difference was more than I was expecting, considering viewing distance vs resolution vs display size and everything that tells us about resolution 'sweet spots'. Even when playing at 1080p on my PC, I've found myself looking at certain games and wondering what difference there would be if I moved to a 1440p monitor.

Are we now at, or coming to, a point where the fine detail we're going to be getting in next-generation games will be too much for 1080p to show to the user?
 
Yes, we're already there. Even with downsampling, much detail is lost. But there is a catch: fine detail gets smaller and smaller, especially in the distance, and we're already at a point where it gets really hard to tell the difference if you don't compare the pictures side by side.
Currently 1440p is the best compromise (IMHO) between visual quality, price and performance. You get a good visual uplift while you can still output at 120+ Hz, and you still don't need the highest ultra-high-end card to get a fluid picture.
4K is a bit too much for now. You lose a lot of performance that could be used for more detail, better lighting, and so on.

The other thing is the AA options. They're often little more than glorified blur filters, so they destroy small details, something that can also be compensated for with a higher base resolution. TAA methods in particular give a much better (less artifacted) result at a higher resolution.
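To put a toy number on the downsampling point (this is just an illustration I knocked up, not output from any real renderer): a detail that's one pixel wide at the higher rendering resolution comes out of a box-filter downsample at a fraction of its original contrast, which is exactly the kind of fine detail that gets "lost".

import numpy as np

# A bright detail one pixel wide at 4x the base resolution...
hi_res = np.zeros(64)
hi_res[30] = 1.0

# ...box-filtered (averaged) down to the base resolution.
lo_res = hi_res.reshape(-1, 4).mean(axis=1)

print(hi_res.max())   # 1.00 -> full-contrast detail at the high resolution
print(lo_res.max())   # 0.25 -> same detail at a quarter of the contrast after downsampling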
 
Redundant, no. Suboptimal, yes. Increases in fine detail still look better at 1080p, and GPU performance is still way too low to render it obsolete, especially as reconstruction continues to improve.
 
I think it depends on the game and the platform. People seem really happy playing games at 720p on their Switch and at 800p on their Steam Deck. On other devices, plenty of folks will drop resolution for better frame rates. If you have to make that choice, it's subjective: losing detail often doesn't impact gameplay, but a poor framerate does.

I discovered years ago that 1440p was the sweet spot for my distance to my 4K TV. I really struggle to tell the difference between 1440p and anything higher unless there are artefacts introduced by a lower resolution. But I can often tell between 1080p and 1440p.
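For what it's worth, that sweet-spot feeling can be put in rough numbers. A common rule of thumb is that 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per degree; once a setup is comfortably past that, extra resolution is mostly diminishing returns. A quick back-of-the-envelope sketch (the 55"/8 ft figures are just example values I picked, and the 60 ppd threshold is only a rule of thumb):

import math

def pixels_per_degree(diagonal_in, horiz_res, vert_res, distance_in):
    """Approximate pixels per degree of visual angle at the screen centre."""
    aspect = horiz_res / vert_res
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = width_in / horiz_res
    # visual angle subtended by a single pixel, in degrees
    pixel_deg = 2 * math.degrees(math.atan(pixel_pitch_in / (2 * distance_in)))
    return 1 / pixel_deg

# Example: a 55" TV viewed from about 8 ft (96")
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(name, round(pixels_per_degree(55, w, h, 96), 1), "ppd")   # ~67 / ~89 / ~134 ppd

Move the chair closer or bump the diagonal and 1080p drops under that threshold quickly, which lines up with telling 1080p from 1440p but struggling above that.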
 
Redundant, no. Suboptimal, yes. Increases in fine detail still look better at 1080p, and GPU performance is still way too low to render it obsolete, especially as reconstruction continues to improve.
Well, I'd say reconstruction techniques will be one of the things that helps push 1080p towards obsolescence, since the actual GPU cost of a 4K output will go down a whole lot, all else being equal. Even if you're having to use something like DLSS Performance, it'll still be better than playing at 1080p.

Otherwise I do agree with the first part about it merely being sub-optimal rather than totally obsolete. Plenty of budget PC gamers will stay on 1080p for a while yet. It still accounts for 67% of people on the Steam hardware survey, after all.

Personally, I'd consider 1440p to be slightly sub-optimal as well.
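For context on the reconstruction point: the per-axis render scales commonly cited for the DLSS presets are roughly 66.7% for Quality, 58% for Balanced, 50% for Performance and 33% for Ultra Performance (ballpark figures from public reporting, not official constants), so a quick sketch of what the GPU is actually rasterising looks like this:

# Approximate per-axis render scales commonly cited for DLSS presets.
DLSS_SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution before reconstruction up to the output size."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))   # ~(1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))       # ~(2561, 1441)
print(internal_resolution(1920, 1080, "Quality"))       # ~(1281, 720)

So a 4K DLSS Performance image is reconstructed from roughly a 1080p internal render, which is exactly why opinions differ on whether it beats native 1080p.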
 
I've been thinking about this for a while now, and when I saw Horizon Forbidden West last night on a 1080p TV (I first saw the game on a 4K TV), the difference in portrayed detail was staggering, with so much of it lost.

The difference was more than I was expecting, considering viewing distance vs resolution vs display size and everything that tells us about resolution 'sweet spots'. Even when playing at 1080p on my PC, I've found myself looking at certain games and wondering what difference there would be if I moved to a 1440p monitor.

Are we now at, or coming to, a point where the fine detail we're going to be getting in next-generation games will be too much for 1080p to show to the user?

We have been there for a while. I recently had the chance to use an 8K screen with an unannounced graphics card, and it was staggering how beautiful it looked and how much detail I could pick out over 4K, even with my 40-year-old eyes.

I think, like with most visual technology, it's only when you get used to the higher fidelity that going backwards is hard. For example, if you play at 1080p all the time and try 4K out once or twice, you will convince yourself the difference isn't huge. But if you play at 4K all the time and suddenly go to a 1080p monitor at 1080p resolution, you will easily miss the detail and know you are missing out on it.
 
Yeah, 1440p seems to be the sweet spot. Although most people probably won't be able to notice the difference, as usual, unless it's side by side. Even then, they'll probably still think the differences are too minuscule.

I think, like with most visual technology, it's only when you get used to the higher fidelity that going backwards is hard.

It's true for a lot more things than just visual tech. Even non-tech stuff, like... if you're accustomed to living in a clean environment, then you go to the city and can't stand the city smell.
 
Yeah, 1440p seems to be the sweet spot. Although most people probably won't be able to notice the difference, as usual, unless it's side by side. Even then, they'll probably still think the differences are too minuscule.



It's true for a lot more things than just visual tech. Even non-tech stuff, like... if you're accustomed to living in a clean environment, then you go to the city and can't stand the city smell.
That is true also. Guess it's lifestyle creep.
 
I think I'm mostly in agreement with Eastman: once you get accustomed to the resolution of a high-definition device, it's hard to move backward. Even for games that "don't need the resolution", you still end up noticing the lack of resolution to the point of distraction. A top-of-mind example: letting my daughter play Minecraft on my old Lenovo Y460 -- the platform is entirely capable of playing Minecraft with high quality settings, simply because it's just not a demanding game. Still, watching her play at 1366x768 on the 14" screen is notably inferior to my Dell Ultrasharp U2711 2560x1440 screen on the "big" PC.

She doesn't care in the slightest; it nags me without my even consciously realizing it.
 
The difference was more than I was expecting, considering viewing distance vs resolution vs display size and everything that tells us about resolution 'sweet spots'. Even when playing at 1080p on my PC, I've found myself looking at certain games and wondering what difference there would be if I moved to a 1440p monitor.

The difference between 1080p and 4K, or even 1440p, on a PC monitor is going to be pretty big, but if we're talking about a living room TV where the user sits a good distance away from the screen, it's going to be more subtle, assuming the screen size isn't massive relative to the viewing distance. If you have a 50" TV and sit 10 ft or more away from the screen, 1080p is still going to look very good. That's one of the reasons I always liked the Xbox Series S concept: it allowed consumers who were happy with their current 1080p displays to play the newest games without needing to upgrade to get the most out of their hardware.
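Just to put a rough number on the 50"/10 ft case, using the usual 1-arcminute acuity rule of thumb (so very much a ballpark, not a hard limit):

import math

ARCMIN = math.radians(1 / 60)   # ~1 arcminute, a common 20/20 acuity figure

def acuity_distance_ft(diagonal_in, horiz_res, aspect=16 / 9):
    """Viewing distance beyond which one pixel subtends less than 1 arcminute."""
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = width_in / horiz_res
    return pixel_pitch_in / (2 * math.tan(ARCMIN / 2)) / 12

print(round(acuity_distance_ft(50, 1920), 1))   # ~6.5 ft for a 50" 1080p panel
print(round(acuity_distance_ft(50, 3840), 1))   # ~3.3 ft for a 50" 4K panel

By that estimate a 50" 1080p panel stops resolving individual pixels somewhere around 6 to 7 ft, so at 10 ft most of 4K's extra pixels are headroom rather than visible detail.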
 
I think, like with most visual technology, it's only when you get used to the higher fidelity that going backwards is hard. For example, if you play at 1080p all the time and try 4K out once or twice, you will convince yourself the difference isn't huge. But if you play at 4K all the time and suddenly go to a 1080p monitor at 1080p resolution, you will easily miss the detail and know you are missing out on it.
This is true for refresh rates as well. 60 Hz was fine until you got used to frame rates beyond that. And then going back to 60... Whoof. Might as well be a Charlie Chaplin movie.
 
This is true for refresh rates as well. 60 Hz was fine until you got used to frame rates beyond that. And then going back to 60... Whoof. Might as well be a Charlie Chaplin movie.

My 1080p monitor is 144 Hz and honestly, while it's nice, I do prefer 60 fps with more eye candy, along with lower temps and noise.
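For reference, the gap in frame time is easy to work out (trivial sketch, assuming the game actually hits the refresh rate):

for hz in (60, 120, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz holds each frame ~16.7 ms vs ~6.9 ms at 144 Hz

So each frame lingers roughly 2.4x longer at 60 Hz than at 144 Hz, which is why the downgrade is so noticeable even when the eye-candy trade is worth it.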
 
I think the problem isn't 1080p, it's the TAA we use today.

Playing the original 2007 release of Crysis with 4xMSAA + 4xTrSAA at 1080p looks so much sharper than the remaster with its TAA.

I just think you need to go above 1080p to reduce the TAA blur.
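A very crude toy of the TAA point (this isn't real TAA, there's no jitter or history rejection, just the part where the history buffer gets bilinearly resampled every frame during reprojection, which is where most of the softness comes from):

import numpy as np

def half_pixel_round_trip(signal):
    """One bilinear half-pixel shift and a shift back, roughly what history reprojection does each frame."""
    shifted = 0.5 * (signal + np.roll(signal, 1))    # interpolate halfway towards the left neighbour
    return 0.5 * (shifted + np.roll(shifted, -1))    # interpolate halfway back again

line = np.zeros(32)
line[16] = 1.0                                       # a one-pixel-wide bright detail

history = line.copy()
for _ in range(8):                                   # eight frames of history reuse
    history = half_pixel_round_trip(history)

print(line.max())       # 1.0   original contrast
print(history.max())    # ~0.2  the detail is smeared into its neighbours

Each round trip is effectively a small blur, so the longer a pixel's history survives, the softer fine detail gets. At a higher base resolution the same world-space detail spans more pixels, so the same resampling loss eats a smaller fraction of it, which is basically the "go above 1080p to reduce the TAA blur" argument.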
 
I think the problem isn't 1080p, it's the TAA we use today.

Playing the original 2007 release of Crysis with 4xMSAA + 4xTrSAA at 1080p looks so much sharper than the remaster with its TAA.

I just think you need to go above 1080p to reduce the TAA blur.
You also need less anti-aliasing as you climb up the resolutions. At 8K I doubt you'll even need to turn it on.
 
Going from a lower resolution to a higher one, it's hard to see a difference for some reason. But when you're used to high and then you go low, it's immediately noticeable.
I would say 4K is the limit, but the year I use an 8K screen daily, I know that once I downgrade back to 4K I'll see it.
 
Sorry! I rescind my answer; I thought this was PC gaming related.

For consoles, yeah, I think they should aim a bit higher; 1080p is sort of out of date for a new console... although it should still work on a 1080p display.
 
Well, I'd say reconstruction techniques will be one of the things that helps push 1080p towards obsolescence, since the actual GPU cost of a 4K output will go down a whole lot, all else being equal. Even if you're having to use something like DLSS Performance, it'll still be better than playing at 1080p.

Otherwise I do agree with the first part about it merely being sub-optimal rather than totally obsolete. Plenty of budget PC gamers will stay on 1080p for a while yet. It still accounts for 67% of people on the Steam hardware survey, after all.

Personally, I'd consider 1440p to be slightly sub-optimal as well.

Again, it's subjective, but I just cannot understand how anyone could possibly stand to play any game using DLSS Performance. To me it looks like utter garbage. I'd prefer native 720p on a 55" display to 4K using DLSS Performance.

Again, different people, different things that bug them. For me, DLSS Performance is a constant assault on the eyes: continuous rendering artifacts and anomalies that absolutely drive me bonkers.

Regards,
SB
 
Again, it's subjective, but I just cannot understand how anyone could possibly stand to play any game using DLSS Performance. To me it looks like utter garbage. I'd prefer native 720p on a 55" display to 4K using DLSS Performance.

Again, different people, different things that bug them. For me, DLSS Performance is a constant assault on the eyes: continuous rendering artifacts and anomalies that absolutely drive me bonkers.

Regards,
SB
If it's any consolation, my son totally agrees with you. I got him a 2060 Super about the time DLSS 2.0 came out and kept bugging him to try it. He did, and said it was an insult to his eyeballs.

I don't like any upscaling techniques in gaming; they just look janky to me. (Then again, I was using AMD's solution.)
 
Again, it's subjective, but I just cannot understand how anyone could possibly stand to play any game using DLSS Performance. To me it looks like utter garbage. I'd prefer native 720p on a 55" display to 4K using DLSS Performance.

Again, different people, different things that bug them. For me, DLSS Performance is a constant assault on the eyes: continuous rendering artifacts and anomalies that absolutely drive me bonkers.

Regards,
SB

DLSS Quality mode is good at 1080p; I use it when I play Dying Light 2 with all the RT options turned on.

I genuinely can't notice the difference, and that's with the sharpness slider changed to 60 (50 is the default).
 