4096 x 2160

On the TV I have in my room I've always used either 3840x2160 (4K) or 3840x1080 (32:9 ultrawide, for productivity), as it accepts both resolutions natively, along with 21:9 support.

Do you use the typical native 4K (3840x2160) or this kind of 4K (4096x2160) on your TV?

On both Windows and Linux the TV's resolutions were detected correctly from the EDID values, and 4096x2160 was listed too, but Windows always indicated that the recommended resolution was either 3840x2160 or 3840x1080, depending on the mode the TV was set to (16:9 or ultrawide).

Still, sometimes I'd set the TV to 4096x2160 and the image changed completely; it kinda borked my HDR and SDR settings, so I thought nothing more of it, switched back to 3840x2160 for good and called it a day.

Since I have a 50" TV and I was fiddling with it the other day, I enabled it again and found out that the reason all my HDR and SDR settings seemed to get borked when switching to 4096x2160 is that, instead of treating it as just another resolution, the TV treats it like a different picture setting (say Movie, or Standard, or Dynamic, etc.). So I had to go into the settings to recalibrate it, and now it looks like any other resolution.

If you don't recalibrate, when you switch to 4096x2160 the TV sets the picture to Standard, Contrast Enhancer to High, max brightness, max contrast, and so on and so forth.

For those who play at 4K, have you ever tried that resolution on your TV?

Since I use the TV for games, but also a lot for productivity, on a 50" those extra ~550,000 pixels are nice to have, since the PPI of a 50" 4K TV isn't particularly high. For games..., well, that resolution is a mixed bag. Nothing that Lossless Scaling on Windows and BFI on Linux can't fix, but yeah, the GPU suffers a bit more.
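
For the record, the rough numbers behind that (a quick Python sketch, using just the 50" and 16:9 figures mentioned above):

```python
import math

# Rough figures for a 50-inch 16:9 panel (just my TV's specs, adjust to taste)
diagonal_inches = 50
uhd = (3840, 2160)
dci = (4096, 2160)

ppi = math.hypot(*uhd) / diagonal_inches      # pixels along the diagonal per inch
extra = dci[0] * dci[1] - uhd[0] * uhd[1]     # extra pixels in the wider signal

print(f"~{ppi:.0f} PPI at 3840x2160 on a 50-inch panel")   # ~88 PPI
print(f"{extra:,} extra pixels per 4096x2160 frame")       # 552,960
```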
 
Your 4K TV's native resolution is 3840x2160, which is four 1920x1080 screens' worth of pixels. (The first 4K TV I used required four 1080p 1.5Gb/s BNC inputs to get an image onto the screen.)
Anything other than this resolution will require scaling by the TV, and I doubt much care or thought has gone into scaling down a resolution like 4096x2160 to native, which is probably why your image looks a bit rough.
 
Your 4K TV's native resolution is 3840x2160, which is four 1920x1080 screens' worth of pixels. (The first 4K TV I used required four 1080p 1.5Gb/s BNC inputs to get an image onto the screen.)
Anything other than this resolution will require scaling by the TV, and I doubt much care or thought has gone into scaling down a resolution like 4096x2160 to native, which is probably why your image looks a bit rough.
What do you mean by rough? I meant that the TV treats that resolution as an extra setting: if you select it, the picture settings you had at 3840x2160 kinda reset and you have to adjust them again, but once that's done you're good to go.

The TV accepts 4096x2160 as a native resolution, so it seems to be genuine. I can't count the pixels, I'm not Digital Foundry 😁. It's not much, but for a 50" TV those ~550,000 extra pixels would be an improvement. Most 4K displays seem to support 4096x2160, as that is true DCI 4K, with 3840x2160 being "UHD". It also seems to decrease the framerate in videogames; gotta test it.

It seems to be a standard thing on TVs; mine is a Samsung Q80A QLED (European model, which has a VA panel).

On Windows and Linux it is a resolution option you can choose.

LG OLED also supports it.


Thing is... aren't these TVs natively and physically 3840 pixels wide? They should be, judging by the specs, hence I don't get that setting.
 
I have tested it. The actual physical pixels of the screen are 3840x2160, but both Windows and Linux accept 4096x2160 as a resolution. On a 50" the result is increased clarity when it comes to text, because of the extra PPI, which I'm grateful for. It has a very slight impact on gaming performance, but I'm OK with that.
 
I have tested it. The actual physical pixels of the screen are 3840x2160, but both Windows and Linux accept 4096x2160 as a resolution. On a 50" the result is increased clarity when it comes to text, because of the extra PPI, which I'm grateful for. It has a very slight impact on gaming performance, but I'm OK with that.
You're downsampling. It's no different in principle from an SDTV that accepts an HDTV signal: you feed it 1080p, and it shows the image it received shrunk to fit its 480p pixels. This will certainly look better than showing 480p content natively, but worse than 1080p data on a 1080p display. This is far more prominent on projectors, which typically support higher-resolution inputs than they have pixels to present them. Many cheap ones have 4K input and 1080p output.

Your 'increased clarity in text' is almost certainly how you are perceiving the associated 'blur'. That slight downsampling will break up perfect pixels and reduce jaggies. Many years ago I preferred using a VGA connection over DVI to a 1024x768 monitor because it was softer, which actually made the OS text more comfortable to look at than the raw pixels of the digital input.
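
If you want to see that softening in isolation, here's a minimal sketch (Python with Pillow; nothing to do with what the TV's own scaler actually does, and the Lanczos filter is just an assumption about a reasonable resampler):

```python
from PIL import Image

# A strip of 1-pixel black/white columns, 4096 wide, squeezed to 3840 columns.
src = Image.new("L", (4096, 64))
src.putdata([255 * (x % 2) for _ in range(64) for x in range(4096)])

dst = src.resize((3840, 64), Image.LANCZOS)

# The source only contains pure black and white; the resized strip picks up
# intermediate greys, which is the softening that reads as smoother text.
print(sorted(set(src.getdata())))   # [0, 255]
print(len(set(dst.getdata())))      # many in-between grey levels
```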
 
Downscaling is much trickier than upscaling, where you have extra destination resolution to help in the process, particularly when using AI upscaling.
In this case, 4096 columns of generated video have to be squeezed onto 3840 physical columns (a 16:15 ratio, so every 16 source columns have to land on 15 physical ones). You are going to lose clarity on high-frequency data and be susceptible to data-dependent repeating artifacts. You might get something like the resolution-artifact equivalent of 3:2 pulldown motion judder.
If you have monitor test pattern generator software (not compressed stills, it needs to be software generated), try outputting an animated zone plate at 3840x2160 and then at 4096x2160 to see the difference. That'll really test your TV's scaler.
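
For anyone without such software, a static zone plate frame is easy enough to generate yourself. A rough Python/numpy sketch (the constant k is arbitrary and just controls how quickly the rings tighten; a proper animated version would sweep the phase over time, and the file names are placeholders):

```python
import numpy as np
from PIL import Image

def zone_plate(width, height, k=0.9):
    """Static zone plate: cosine of a quadratic phase, centred on the frame."""
    y, x = np.mgrid[0:height, 0:width]
    r2 = (x - width / 2) ** 2 + (y - height / 2) ** 2
    img = 0.5 + 0.5 * np.cos(k * np.pi * r2 / max(width, height))
    return Image.fromarray(np.round(img * 255).astype(np.uint8), mode="L")

# One frame per resolution; view each full screen at the matching desktop mode.
zone_plate(3840, 2160).save("zoneplate_3840x2160.png")
zone_plate(4096, 2160).save("zoneplate_4096x2160.png")
```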
 
Surely this .png viewed full screen would be enough for a basic consideration without needing an app? It's 4096x2048 alternating black/white columns. Scaling should make a complete mess of it and illustrate what the monitor is really doing and how it's not adding clarity.

Image1.png
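
(If you'd rather generate the pattern than download it, something along these lines does the job for whichever sizes you want to compare; Python with Pillow, file names just placeholders:)

```python
from PIL import Image

def column_pattern(width, height):
    """1-pixel-wide alternating black/white vertical columns."""
    img = Image.new("L", (width, height))
    img.putdata([255 * (x % 2) for _ in range(height) for x in range(width)])
    return img

# One image per size, so the scaled and unscaled cases can be compared.
column_pattern(4096, 2160).save("columns_4096x2160.png")
column_pattern(3840, 2160).save("columns_3840x2160.png")
```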
 
Shouldn't the opposite be true, since it's a compressed image (if it actually only has 3840x2160 pixels)?

There are no extra PPI.
The text looks a bit less aliased though, as Shifty commented. I compared just using my own eyes, and while it's not a huge upgrade, it's noticeable with the exact same calibration settings.

Isn't that some kind of supersampling? The text looks a bit nicer to me; not huge, but noticeable.

You're downsampling. It's no different in principle from an SDTV that accepts an HDTV signal: you feed it 1080p, and it shows the image it received shrunk to fit its 480p pixels. This will certainly look better than showing 480p content natively, but worse than 1080p data on a 1080p display. This is far more prominent on projectors, which typically support higher-resolution inputs than they have pixels to present them. Many cheap ones have 4K input and 1080p output.

Your 'increased clarity in text' is almost certainly how you are perceiving the associated 'blur'. That slight downsampling will break up perfect pixels and reduce jaggies. Many years ago I preferred using a VGA connection over DVI to a 1024x768 monitor because it was softer, which actually made the OS text more comfortable to look at than the raw pixels of the digital input.
Now that you mention it, I experienced something similar with the Xbox 360, which accepted VGA cables. I had a Chinese VGA cable (the official VGA cables were so difficult to find back then) that did the job, and I remember buying an HD Ready TV, 1366x768 native, to play X360 games on. Over HDMI there were top and bottom black borders ('cos the console only output either 720p or 1080p via HDMI, so the resolution was 720p), but with the VGA cable the image looked MUCH, MUCH better than over HDMI, the contrast especially. Maybe it was the colour space, a concept I didn't know at the time tbh, and the console probably output the correct colour space for a monitor over VGA? It was night and day. The clarity was also much nicer.
 
Downscaling is much trickier than upscaling, where you have extra destination resolution to help in the process, particularly when using AI upscaling.
In this case, 4096 columns of generated video have to be squeezed onto 3840 physical columns (a 16:15 ratio, so every 16 source columns have to land on 15 physical ones). You are going to lose clarity on high-frequency data and be susceptible to data-dependent repeating artifacts. You might get something like the resolution-artifact equivalent of 3:2 pulldown motion judder.
If you have monitor test pattern generator software (not compressed stills, it needs to be software generated), try outputting an animated zone plate at 3840x2160 and then at 4096x2160 to see the difference. That'll really test your TV's scaler.
Many thanks for the explanation. Does your TV have one of those test pattern generator apps? Mine doesn't, afaik.

So which do you think would be better for general use? I mean productivity, and then gaming on Windows. Just curious... The text seems to look less aliased to me, as mentioned before, and regarding the PPI maybe @Davros is right: 50" 4K is about 88 PPI, not that great, and 4096x2160 on a physically 3840x2160 screen won't increase the PPI, right?


Surely this .png viewed full screen would be enough for a basic consideration without needing an app? It's 4096x2048 alternating black/white columns. Scaling should make a complete mess of it and illustrate what the monitor is really doing and how it's not adding clarity.

Image1.png
Thanks Shifty for the idea. Tried that image. When I maximised it the difference didn't look huge. However, at 4096x2160 some lines appeared to be vertically doubled? Dunno how to explain it. At 3840x2160 the image looked more uniform. Not a super huge difference, but there was something going on with how they look.

I'll give it a second test later; my vision and my head feel kinda foggy today, lots of work and I slept like 5 hours.
 
Many thanks for the explanation. Does your TV have one of those test pattern generator apps? Mine doesn't, afaik.

So which do you think would be better for general use? I mean productivity, and then gaming on Windows. Just curious... The text seems to look less aliased to me, as mentioned before, and regarding the PPI maybe @Davros is right: 50" 4K is about 88 PPI, not that great, and 4096x2160 on a physically 3840x2160 screen won't increase the PPI, right?



Thanks Shifty for the idea. Tried that image. When I maximised it the difference didn't look huge. However, at 4096x2160 some lines appeared to be vertically doubled? Dunno how to explain it. At 3840x2160 the image looked more uniform. Not a super huge difference, but there was something going on with how they look.

I'll give it a second test later; my vision and my head feel kinda foggy today, lots of work and I slept like 5 hours.
No, I work in broadcast, so I use hardware test pattern generators. Many years ago, when interfaces were still analogue, there used to be plenty of freeware PC monitor-testing software that generated patterns, but I'm having difficulty finding any now.
Shifty's .png image example is fixed resolution, so you'll need to keep your PC at 4096x2160 to test the TV's scaling performance. If you lower the PC resolution to 3840x2160 then you're testing the PC's software scaling. What you really need is a native 3840x2160 version of that image as well, so you can carry out a scaled/unscaled comparison.

I would always choose native panel resolution for output. If you go higher, you'll get more text on the screen, but at the cost of artifacts and fringing.
 
My previous image was actually 2048 vertical. Doh. I'll add two here. These .pngs compress down nicely so filesize isn't an issue. ;)
Many thanks for the images, Shifty. Wish I could say I noticed much of a difference, but I didn't. Tried several times and, well... maybe I looked at the wrong place or pattern. Both display as grey blocks with darker vertical lines. Not a biggie though.

I set the resolution to the native 3840x2160. Compared to 4096x2160 the text looks sharper; it's not as antialiased as when you use 4096x2160, but the UI of the desktop and of apps, especially where images are involved (say Steam showing a game cover in your library), looks different.

If you look at the Steam UI when browsing your library, the image at 4096x2160 looks ok but it has some kind of flattening to it. At 3840x2160 it looks more stylised, if that's the word.
 
Many thanks for the images, Shifty. Wish I could say I noticed much of a difference, but I didn't. Tried several times and, well... maybe I looked at the wrong place or pattern. Both display as grey blocks with darker vertical lines. Not a biggie though.

I set the resolution to the native 3840x2160. Compared to 4096x2160 the text looks sharper; it's not as antialiased as when you use 4096x2160, but the UI of the desktop and of apps, especially where images are involved (say Steam showing a game cover in your library), looks different.

If you look at the Steam UI when browsing your library, the image at 4096x2160 looks ok but it has some kind of flattening to it. At 3840x2160 it looks more stylised, if that's the word.
Something's wrong: you shouldn't see any blocks or stripes if you're outputting the 3840 image at 3840 resolution. Are you looking at it borderless, full screen?
Do you have overscan enabled? That'll discard the outside edges of the image and cause a slight scale to 3840. It's a legacy feature required for early analogue TV transmissions that's not needed now, but it is often turned on by default.
Do you have any processing turned on?
 
Many thanks for the images, Shifty. Wish I could say I noticed much of a difference, but I didn't.
To echo MrSpiggot, something is wrong. ;) Native will display alternating black/white columns, clearly visible up close across the screen and resulting in a uniform grey appearance when viewed at a distance. The 3840x2160 image on a native 3840x2160 display should look just like that, whereas any other image scaled to fit would have irregular light/dark stripes.

Here's a closeup of the native pattern on my monitor
Image1.jpg

And here's the pattern scaled
Image2.jpg

If you aren't seeing the first image on your display, there's something up with it.
 
You have to factor in how the image is being displayed. If it's in a viewer, it might be cropped for 1:1 display. You need the 4096x2160 image shown full screen on your 3840x2160 display, with the viewer scaling it to fit. Again, if it's not, there's something amiss. It's mathematically impossible to present 4096 alternating BW columns 1:1 on a 3840-column display, so something somewhere would be truncating.

Maybe I should add measurements to test for cropping?
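
To put a number on the "mathematically impossible" bit, a toy sketch (plain Python; a crude nearest-neighbour mapping rather than anything a real scaler does):

```python
# 4096 source columns onto 3840 destination columns is a 16:15 squeeze,
# so a naive nearest-neighbour mapping never shows 1 source column in every 16.
src_w, dst_w = 4096, 3840
picked = {round((d + 0.5) * src_w / dst_w - 0.5) for d in range(dst_w)}
print(f"{src_w - len(picked)} of {src_w} source columns never appear")  # 256
```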
 