1080p HDR image better than 4K non-HDR?

I went to Walmart the other day and compared 1080P displays vs. 4K displays. The 4K displays did look clearer and better, but not by a huge amount. I had to get very close (perhaps a foot away) to see the pixels in the 4K monitor. However, I also had to get pretty close to see them in the 1080P monitor/TV. At the normal distance I would be gaming if I were to get a console and monitor, I wouldn't notice a big difference.

If I were to go out and buy a display, I'd stick with 1080P unless I was wealthy and had money to burn.
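
(Aside: a rough way to put numbers on the "how close before I can see pixels" observation. 20/20 vision resolves roughly one arcminute, so you can estimate the angle a single pixel subtends at a given distance. This is only a sketch in Python; the 40" size and 2 m viewing distance are illustrative assumptions, not anyone's actual setup.)

import math

def pixel_pitch_mm(diagonal_inches, horizontal_pixels):
    # Width of one pixel on a 16:9 panel, in millimetres.
    diag_mm = diagonal_inches * 25.4
    width_mm = diag_mm * 16 / math.sqrt(16**2 + 9**2)
    return width_mm / horizontal_pixels

def pixel_angle_arcmin(pitch_mm, distance_m):
    # Angle one pixel subtends at the eye, in arcminutes.
    angle_rad = math.atan2(pitch_mm / 1000, distance_m)
    return math.degrees(angle_rad) * 60

# 40" panels viewed from about 2 m: 1080p vs 4K
for name, horizontal_pixels in [("1080p", 1920), ("4K", 3840)]:
    pitch = pixel_pitch_mm(40, horizontal_pixels)
    print(name, round(pixel_angle_arcmin(pitch, 2.0), 2), "arcmin")

At 2 m from a 40" panel, both come out under one arcminute (roughly 0.8 arcmin for 1080p, 0.4 for 4K), which fits the "I had to get very close to see the pixels" experience.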

My opinion too. I can see the benefits of 4K, but to me they seem nowhere near the benefits we got from going from SD to HD. I see 4K as more of a refinement than a real need.
So if you ask me whether I think it is worth spending money on a 4K TV if you have a good 1080p one, I would say no. And the same for using extra Tflops to process at 4K instead of improved 1080p graphics with a good AA solution.

As for HDR, I see more benefits. But go to a large shop with lots of TVs on display and look at all of them.
Is the image quality the same on all, even if all are 1080p with no HDR?
No way. Some screens have better quality than others, and some of them have really, really better colors than others.
Besides, how many people really spend time fixing the settings on their TV for better image quality?
Most use default settings, or just tweak the color, brightness, sharpness and so on a little bit. But how many go deep into the configuration using the advanced settings?

I used to have two Samsung 40" 1080p TVs, and two years ago one of them had problems, so I replaced it with a 40" 1080p LG.
I was amazed by its image quality. Way superior to my series 6 Samsungs. Besides, it had an expert mode that I used to improve the image even more. The image was so good that people coming to my house noticed it.

Early this year, my father decided to buy a bigger TV, so I gave him my other Samsung and went for a 55" 1080p LG (4K sets were already available and not expensive, but if I wanted to keep all the extras this TV had, the price would go up like crazy).
Although the image was good, it was not as good as on the old TV. And webOS did not have the same easy-to-use advanced options the old LG OS had (they were there, but there was no step-by-step assisted setup as on the old TV). My wife kept telling me that the TV was not as good, and that she liked the other LG better.

It took me about a week of tweaking, but finally I managed to get both pictures very similar. And now both are excellent. And this just served to show me that the quality of a TV picture depends on a lot of variables, some we can adjust and control, and others we cannot.

At the moment I am super happy with both TVs and have no intention of replacing them. I keep seeing 4K and HDR at stores, with some TVs showing super amazing quality and others not so much. But, overall, nothing that I find really worth the upgrade. Because, same as you, I find the difference, at normal viewing distance, small.

Besides, HDR is in its infancy. Standard HDR uses 10 bits of color, but Dolby Vision uses 12. I do not like plunging into tech that is still in development. My neighbour bought a €6000 Sony 4K TV and a GeForce Titan when they appeared, and he kept complaining about the mistake because he could not get more than 24 fps at 4K.
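
(Just to put those bit depths in perspective, here is a trivial back-of-the-envelope count of code values per channel; it ignores the limited/"legal" signal range real video uses, so actual usable levels are a bit lower.)

# Levels per colour channel for common bit depths (full range).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} total colours")

So 10-bit HDR gives 1,024 levels per channel versus 4,096 for 12-bit Dolby Vision.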

Take note that this is just my opinion, based on my experience, about upgrading from a good 1080p TV to a 4K one at the moment.
 
There are almost no non-4K displays sold over here in stores. HDR and contrast are going to be the main factors in both perceived quality and cost.
 
There are almost no non-4K displays sold over here in stores. HDR and contrast are going to be the main factors in both perceived quality and cost.
But you need an HDR TV and most of them are 4K (just did some searching)...
 
Well, that's my point basically. The additional cost of 4K vs 1080p is very likely negligible versus the cost of 1000+ nit HDR.
 
For those interested in a fine 4K TV that isn't extremely big (the table where I'm going to put it and my bedroom's space are limited), this Eurogamer article might be interesting:

http://www.eurogamer.net/articles/digitalfoundry-2016-samsung-ku6400-4k-tv-review

"However, it's when gaming at 4K where the real benefits to the added resolution over 1080p really become clear - especially at smaller screen sizes. Overall sharpness is given a considerable boost, with the presentation appearing more like looking through a window rather than a digital display. At 40 inches the high pixel density makes images appear like they are running on a large iPhone retina display, and the sense of clarity is very impressive. Geometry edges in games also appear smoother and more refined, and overall this helps to make games using low levels of anti-aliasing look cleaner than a 1080p display."
 
I saw some HDR gaming demos on the flagship LG OLED display. I'm very much sold on HDR in gaming. I would personally much rather take HDR over higher resolution. Unfortunately there is no choice, and it even looks like not all HDR displays are equal. At the moment there is a huge difference between different displays.
 
You can get very good 27" 4K screens for PC. I'm considering getting the LG myself. But HDR is not there yet at the smaller sizes.
 
I believe HDR, when properly adjusted and displayed on a Full Array Local Dimming set with a high nit count or a 2016 LG HDR OLED, shows its best form. A FALD set adds so much to the HDR impact by minimizing blooming and skyrocketing the contrast ratio, displaying the dynamic range far better than an edge-lit set. Personally I haven't seen many FALD users, and the only ones praising HDR are some OLED owners. Strangely enough, the edge-lit owners are the ones frequently complaining about the effect.
So yeah, it's important to pair yourself with a good set; not a cheap price to pay, but well worth it.
 
I believe HDR, when properly adjusted and displayed on a Full Array Local Dimming set with a high nit count or a 2016 LG HDR OLED, shows its best form. A FALD set adds so much to the HDR impact by minimizing blooming and skyrocketing the contrast ratio, displaying the dynamic range far better than an edge-lit set. Personally I haven't seen many FALD users, and the only ones praising HDR are some OLED owners. Strangely enough, the edge-lit owners are the ones frequently complaining about the effect.
So yeah, it's important to pair yourself with a good set; not a cheap price to pay, but well worth it.
Thanks for the useful advice. I can wait, because I don't have a device at home that can run anything at 4K save for YouTube videos. Hopefully next year screens with 1000 candelas or so will be available at a fine price, and, adding that space in my room is limited, maybe I can find a 4K HDR set within a price range that is acceptable for my battered economy.

On a different note, FFXV is going to support HDR on the X1 S; Phil Spencer has confirmed it.

.@FinalFantasy XV is confirmed to support HDR on #XboxOneS at launch. Can't wait to play on 11/29. @FFXVEN @SquareEnix @xbox
 
The price will come down for sure, and the most bang-for-the-buck FALD set currently out is the Samsung 65" KS9800, with ample peak brightness (1400+ nits), quantum dot for 94% of DCI-P3 (awesome wide color gamut) and a high contrast ratio. The only downside is the slight haloing in a dark room, but that's hardly significant compared to an edge-lit set. Depends on where you live; here in Australia TVs are way overpriced compared to the States, so think about the poor Aussies who have to endure a grand extra for the same item :(

FFXV would sure look lovely with HDR, especially with its vast use of magic and outdoor wilderness.
 
I saw the Eurogamer article as well:

- A 4K display means that 1080p content will look softer due to the 2x upscaling (horizontal and vertical). On the other hand, I watch lots of "TV series off the internet" in SD resolution and it looks fine to me (upscaled to 1080p). 4K would make it blurrier.
- HDR is a buzzword. You need either an OLED screen or an expensive LCD with local dimming, otherwise it makes no sense.

Gaming is not a reason for me. The 4000 euro TV has to come down to 1000 euro first.

I guess 4K TV broadcasts will come in 2030 here. The switch to HD took ages here; they only managed decent HD studios and live broadcasts last year. However, linear TV is under pressure... but 4K streaming also means you need better than ADSL2+ speeds. We're not there yet.

I think I'll wait at least 2 years, unless my TV stops working.
 
- A 4K display means that 1080p content will look softer due to the 2x upscaling (horizontal and vertical).
1080p at the same screen size will have visible pixels. These will be 'softened' on the 4K set but even if bilinearly filtered, it'll look no worse, and in reality you'll have a decent upscale algorithm filling in the missing data with something reasonable.
 
1080p at the same screen size will have visible pixels. These will be 'softened' on the 4K set but even if bilinearly filtered, it'll look no worse, and in reality you'll have a decent upscale algorithm filling in the missing data with something reasonable.

How will they be visible?

If the screens are the same size, a 1080p image will use 4 pixels per input pixel, but those 4 will together be the same size as a single pixel on a 1080p screen, surely?
 
How will they be visible?
Not sure what you mean.
If the screens are the same size, a 1080p image will use 4 pixels per input pixel, but those 4 will together be the same size as a single pixel on a 1080p screen, surely?
No image displayed on your TV is going to consist of the same pixel duplicated 4x with nearest neighbour sampling (unless that's how low latency 'game mode' upscaling works?). If simple bilinear filtering is used, you'll have alternating true pixel and half-blend pixel values, producing a 'blur' or softness. However, these will be occupying a single 1080p pixel in area, so it's not like it'll be a noticeable softness unless the size of the 1080p pixel in your view is large enough that you can see the individual pixel edge. And in reality the upscale will be something cleverer than a bilinear filter so the softness will be even less.
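
(To make the 'alternating true pixel and half-blend' point concrete, here is a tiny 1D sketch of a 2x upscale, nearest-neighbour versus a naive bilinear-style blend. Real TV scalers use far cleverer filters; this only illustrates where the slight softness comes from.)

def upscale_2x_nearest(row):
    # Each source pixel is simply duplicated: hard edges stay hard.
    out = []
    for p in row:
        out += [p, p]
    return out

def upscale_2x_blend(row):
    # Alternates true source values with half-blends of neighbours.
    out = []
    for i, p in enumerate(row):
        out.append(p)                        # true pixel value
        nxt = row[min(i + 1, len(row) - 1)]
        out.append((p + nxt) / 2)            # half-blend -> perceived softness
    return out

edge = [0, 0, 255, 255]                      # a hard edge in the 1080p source
print(upscale_2x_nearest(edge))              # [0, 0, 0, 0, 255, 255, 255, 255]
print(upscale_2x_blend(edge))                # [0, 0.0, 0, 127.5, 255, 255.0, 255, 255.0]

The blended version spreads the edge over one extra output pixel, but that pixel only occupies half the width of an original 1080p pixel, which is why the softening is hard to notice at normal viewing distances.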
 
And regularly beams the light level and viewing hours back to Samsung?

LOL. More than likely they're getting the data from market research companies.

Comments below not directed at you.

Regardless, unless the light switch in your living room is broken to "on" or you live with a spotlight spilling through your windows every night, a dark room is an option for almost everyone during the prime times when TVs are being heavily used.

That being said, I find it hard to believe that only 20% of the population ever bothers to pop some popcorn and flick the light switch off to spend an evening entertaining themselves with a movie-theater-like experience. I'd bet that number is a product of people spending a lot of time watching TV casually, where a host of TV tech isn't readily enjoyed or even necessary. No one is buying 55-70 inch HDTVs to watch soap operas, morning shows, Judge Judy, the news or late-night TV, even though these TVs are readily used for those purposes.

So what if HDR requires darkness to be readily enjoyed? I bet a lot of gamers are going to be playing more in the dark to experience what HDR offers.
 
- HDR is a buzzword. You need either an OLED screen or an expensive LCD with local dimming, otherwise it makes no sense. I guess 4K TV broadcasts will come in 2030 here. The switch to HD took ages here. I think I'll wait at least 2 years, unless my TV stops working.
Considering that broadcasters are paying attention to the HLG (Hybrid Log-Gamma) specification, we should see it from major TV manufacturers within two years. The production pipeline and media content use a dual format where the SDR and HDR color schemes are broadcast in the same DVB signal, which lets both older sets and new HLG-enabled TVs work properly.
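
(For reference, HLG's backwards compatibility comes from its transfer curve: below roughly mid-grey it behaves like a conventional square-root/gamma curve, with a logarithmic segment only for the highlights. Below is a minimal sketch of the BT.2100 HLG OETF using the published constants, purely for illustration, not a production implementation.)

import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    # Map normalised scene-linear light e in [0, 1] to an HLG signal value in [0, 1].
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # square-root segment, SDR/gamma-like
    return A * math.log(12 * e - B) + C   # log segment compresses highlights

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"{e:.4f} -> {hlg_oetf(e):.4f}")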
 
- HDR is a buzzword. You need either an OLED screen or an expensive LCD with local dimming, otherwise it makes no sense.

This isn't entirely true. You need a very expensive TV to see a significant benefit from HDR. You need a very expensive TV to see a benefit at all in a bright room. However, even a modest TV (like mine), sees a clear benefit from HDR content as long as the room is dark enough (at night, basically). The average picture brightness goes down noticeably in HDR mode on my TV, but this does allow the peaks to appear suitably bright in comparison and, since it handles dark content well by LCD standards, there is a noticeable improvement when viewing HDR content over non-HDR content. I A/B'd the Blu-ray and UHD Blu-ray of the original J.J. Abrams Star Trek reboot and there's a clear improvement.

I think I'll wait at least 2 years, unless my TV stops working.

Can't say this is a bad idea, though.
 
I think it's good that HDR is being coupled with 4K. Makes it worthwhile.

If one had to choose, I would have agreed with you. It will increase the pressure on GPU performance but it will be a worthy target to strive for.

OTOH, I just feel HDR should be standard at *all* resolutions as fast as possible.
 
If one had to choose, I would have agreed with you. It will increase the pressure on GPU performance but it will be a worthy target to strive for.

OTOH, I just feel HDR should be standard at *all* resolutions as fast as possible.
After seeing things like this, I fully agree with you.

http://www.eurogamer.net/articles/digitalfoundry-2016-panasonic-dx750-4k-hdr-ultra-hd-tv-review

Non HDR

[SDR.png]

HDR

[HDR.png]
 