On the TV in my room I've always used either 3840x2160 (4K) or 3840x1080 (32:9 ultrawide, for productivity), since it accepts both resolutions natively (along with 21:9 support).
Do you use typical native 4K, or this other kind of 4K (4096x2160), on your TV?
On both Windows and Linux the TV's resolutions were detected correctly from the EDID values, and 4096x2160 was among them, but Windows always indicated that the recommended resolution was either 3840x2160 or 3840x1080, depending on the mode the TV was set to (16:9 or ultrawide).
Still, whenever I set the TV to 4096x2160 the image changed completely; it kinda borked both my HDR and SDR settings, so I thought nothing of it, switched back to 3840x2160 for good, and called it a day.
Since I have a 50" TV and was fiddling with it the other day, I enabled 4096x2160 again and found out why all my HDR and SDR settings seemed to get borked when switching to it: instead of treating it as just another resolution, the TV treats it like a separate picture mode (say Movie, Standard, Dynamic, etc.). So I had to go into the Settings to recalibrate it, and now it looks like any other resolution.
Before that recalibration, switching to 4096x2160 would set the picture mode to Standard, Contrast Enhancer to High, max brightness, max contrast, and so on.
For those who play at 4K, have you ever tried that resolution on your TV?
Since I use the TV for games but also a lot for productivity, on a 50" those extra ~550,000 pixels are nice to have, because the PPI of a 50" 4K TV isn't particularly high. For games... well, that resolution is a mixed bag. Nothing that Lossless Scaling on Windows and BFI on Linux can't fix, but yeah, the GPU suffers a bit more.
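For anyone curious about the numbers, here's a quick sketch of where the extra-pixel figure comes from and a rough PPI estimate (the `ppi` helper is just an illustrative calculation, assuming the 50" diagonal applies to the active picture area in each mode):

```python
import math

# Extra pixels gained going from UHD 4K to DCI 4K (figures from the post)
uhd = 3840 * 2160   # 8,294,400 pixels
dci = 4096 * 2160   # 8,847,360 pixels
extra = dci - uhd   # 552,960 extra pixels, i.e. roughly 0.55 million

# Rough pixel density of a panel: diagonal pixel count over diagonal inches
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(extra)                            # 552960
print(round(ppi(3840, 2160, 50), 1))    # ~88.1 PPI on a 50" UHD panel
```

So the density stays under ~90 PPI either way, which is why the extra width is noticeable for desktop work at that size.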