Is a 1080p HDR image better than 4K non-HDR?

Depends. SCART can't do 480p, just 240p and 480i. For the 16-bit consoles it's the best you can get. For the PS2 and newer consoles that can do progressive scan, component is better.
 
Depends. SCART can't do 480p, just 240p and 480i. For the 16-bit consoles it's the best you can get. For the PS2 and newer consoles that can do progressive scan, component is better.
Only some games supported progressive scan, if I recall. One of the biggest omissions on the PS2 was not offering a choice between 50 Hz/60 Hz and progressive scan as standard, or at least for most games.
 
Now YouTube officially supports HDR. The article features a picture to show the differences.


Simulated SDR vs HDR comparison (seeing true HDR requires an HDR display)

"Simply put, HDR unlocks the most spectacular image quality we've ever streamed."


https://youtube.googleblog.com/2016/11/true-colors-adding-support-for-hdr.html
 
According to DF, 1080p upscaled to 4K consistently looks worse than non-upscaled 1080p on a 1080p display.

http://www.eurogamer.net/articles/digitalfoundry-2016-samsung-ku6400-4k-tv-review : "4K content looks stunning on the KU6400, but when it comes to 1080p and lower resolutions it's worth pointing out that 4K screens in general deliver a softer presentation than a native 1080p display."
That's about as entry-level a 4K HDR model as can be found today. They should repeat their test with some more competent TVs. Sony is well known for having the best upscalers in the TV business.
 
LOL, heise.de was told by AMD that the PS4 Pro outputs only YCbCr 4:2:2 when HDR is enabled at 4K:

https://translate.google.com/transl...eber-HDMI-nur-mit-8-statt-10-Bit-3488970.html

;)

They say that's because HDMI 2.0 cannot do full YCbCr 4:4:4, 4K, and 10-bit simultaneously.

For 4K HDR over HDMI 2.0 you apparently have to make a tradeoff: go down to 8-bit if you want to preserve full 4:4:4 (which they say Polaris is doing on PC), or go down to 4:2:2 if you want to preserve 10-bit (which they say Polaris is doing on the PS4 Pro).
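
If anyone wants to sanity-check that tradeoff, here's a rough back-of-the-envelope sketch I threw together (my own numbers and simplifications, not anything from the heise article; the 4K60 timing constants and the ~14.4 Gbit/s usable data rate are assumptions, and HDMI's exact 4:2:2 packing is ignored):

```python
# Rough check of which 4K60 formats fit into HDMI 2.0's usable data rate.
# Simplification: compares average video bit rates only, ignoring HDMI's
# fixed 4:2:2 container, audio, and metadata overhead.

H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60       # assumed 4K60 timing incl. blanking
PIXEL_CLOCK = H_TOTAL * V_TOTAL * FPS        # ~594 MHz

HDMI20_DATA_RATE = 14.4e9                    # 18 Gbit/s TMDS minus 8b/10b overhead

# Average bits carried per pixel position for each chroma format
BPP_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for chroma, factor in BPP_FACTOR.items():
    for bits in (8, 10, 12):
        rate = PIXEL_CLOCK * factor * bits   # bits per second of video data
        verdict = "fits" if rate <= HDMI20_DATA_RATE else "too much"
        print(f"4K60 {chroma} {bits}-bit: {rate / 1e9:5.2f} Gbit/s -> {verdict}")
```

With those assumptions, 4K60 4:4:4 only fits at 8-bit, while 10-bit needs 4:2:2 or 4:2:0, which matches what heise describes.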

If anyone is wondering what going down from 4:4:4 to 4:2:2 or even 4:2:0 means, see this:

4:4:4:
[image: full-resolution chroma sample]

4:2:0:
[image: chroma-subsampled sample]


and read this:

https://en.wikipedia.org/wiki/Chroma_subsampling

Going down from 4:4:4 to 4:2:2 in this case means reducing the chroma (color) resolution from 3840x2160 to 1920x2160...
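
As a toy illustration of that halving (my own sketch, not how any console or TV actually does it), here's what dropping the chroma planes from 4:4:4 to 4:2:2 looks like on a 4K frame:

```python
import numpy as np

# Toy 4K frame in planar YCbCr: luma plus two full-resolution chroma planes.
height, width = 2160, 3840
y  = np.random.randint(0, 256, (height, width), dtype=np.uint8)   # luma, full res
cb = np.random.randint(0, 256, (height, width), dtype=np.uint8)   # chroma, full res
cr = np.random.randint(0, 256, (height, width), dtype=np.uint8)

# 4:2:2: average each horizontal pair of chroma samples -> 1920x2160 chroma planes,
# while the luma plane stays at 3840x2160.
cb_422 = cb.reshape(height, width // 2, 2).mean(axis=2).astype(np.uint8)
cr_422 = cr.reshape(height, width // 2, 2).mean(axis=2).astype(np.uint8)

print(y.shape, cb_422.shape)   # (2160, 3840) (2160, 1920)
```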

;)
 
They say that's because HDMI 2.0 cannot do full YCbCr 4:4:4, 4K, and 10-bit simultaneously.

For 4K HDR over HDMI 2.0 you apparently have to make a tradeoff: go down to 8-bit if you want to preserve full 4:4:4 (which they say Polaris is doing on PC), or go down to 4:2:2 if you want to preserve 10-bit (which they say Polaris is doing on the PS4 Pro).

Can't it do it at all, or just not above a certain frame rate?

HDMI 2.0 supports 4K RGB and 4:4:4 at up to 16 bits, but only up to 30 fps. At 50 or 60 fps, anything above 8 bits is only supported with 4:2:2 or 4:2:0 subsampling.
 
That's about as entry-level a 4K HDR model as can be found today. They should repeat their test with some more competent TVs. Sony is well known for having the best upscalers in the TV business.
I wonder, can the PS4 be updated again via software to introduce 4K output?

Like Microsoft did ages ago when it added various "PC" resolutions to the Xbox 360, including one higher than 1080p (I think it was 1200p or something).
 
Now YouTube officially supports HDR. The article features a picture to show the differences.


Simulated SDR vs HDR comparison (seeing true HDR requires an HDR display)

"Simply put, HDR unlocks the most spectacular image quality we've ever streamed."


https://youtube.googleblog.com/2016/11/true-colors-adding-support-for-hdr.html

This also comes with a huge asterisk in reality:
*it only looks that marvellous when you move from a shitty TV to a high-end HDR TV.

For example, I moved from a 1080p SDR HDTV with rather good blacks to a 4K HDR TV with a shitty black level and shitty HDR, and it looks worse, be it in SDR or in HDR (HDR vs. SDR also shows no discernible difference except the increased input lag).
 
I'm out for an Xbox One S and a new TV.
I don't really care about resolution, but all this HDR talk got to me; then I discovered that my budget only stretches to an 8-bit HDR TV.
On top of that, I've read (to my great surprise) that if I want to watch over-the-air TV, the best solution is a Full HD panel due to upscaling quality problems.
Oh come on! The TV just has to display one input pixel on four pixels! How hard can it be?
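
Something like this is all I mean, if I understand integer scaling right (a toy sketch, not what any real TV scaler actually runs; real scalers do filtered upscaling, which is where the softness comes from):

```python
import numpy as np

# Nearest-neighbour 2x upscale: each 1080p pixel becomes a 2x2 block of 4K pixels.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)   # placeholder RGB frame
frame_4k = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)
print(frame_4k.shape)   # (2160, 3840, 3)
```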

So... is 8-bit HDR still relevant and noticeable compared to SDR in your opinion, or is it almost the same as standard 1080p?
 
Almost all sub-1000€ UHD TVs are listed as 8-bit HDR, and even the One S has a setting for 8-bit (256 shades per color instead of 1024) HDR, but my knowledge ends here.
Are you saying that isn't a standard, or that it's the same as SDR?
 
My understanding is that HDR means a 10-bit panel. But to be honest, there are so many contradicting stories online that it's hard to know what's real and what's not.
 
I remember reading about that too; it seems there can be some processing inside the TV to increase the backlighting to give HDR effects, i.e. no need for 10-bit panels.

Honestly I'm very fuzzy on the whole thing; I just ended up resolving to get an OLED or Quantum Dot TV when available and not to bother with anything else.
 
I remember reading about that too; it seems there can be some processing inside the TV to increase the backlighting to give HDR effects, i.e. no need for 10-bit panels.

Honestly I'm very fuzzy on the whole thing; I just ended up resolving to get an OLED or Quantum Dot TV when available and not to bother with anything else.
I think in that case you get the very bright peaks that come with HDR brightness, but you lose the wide colour gamut, grading, and contrast due to the 8-bit panel, which basically cuts the image off from a good part of the advantages of HDR.
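
A quick sketch of what losing grading means, just my own illustration: a 10-bit ramp has 1024 distinct steps, and squeezing the same ramp into 8 bits leaves only 256, so smooth gradients turn into visible bands.

```python
import numpy as np

# 10-bit ramp vs. the same ramp quantised to 8 bits.
ramp_10bit = np.arange(1024)          # 0..1023, one step per code value
ramp_8bit = ramp_10bit >> 2           # drop the two lowest bits -> 0..255
print(len(np.unique(ramp_10bit)), len(np.unique(ramp_8bit)))   # 1024 vs 256
```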
 
I think the quality of the panel itself matters much more than the spec war over 8 bit vs. 10 bit, etc.

My LG UH6100 4K TV does have HDR and a 10-bit panel, but it looks worse than my Samsung EH6030 with an 8-bit panel and no HDR.

Edit: more details.
My Samsung does have a way better contrast ratio, at almost 3000:1, while my LG is only around 1000:1.

My Samsung also has much more vibrant colors and much better brightness uniformity, and the color stays the same from edge to edge (my LG is yellowish in the upper right corner).

Basically, look for these for a better picture:
* contrast ratio
* color gamut
* uniformity

A screen with fewer bits, lower resolution, and no HDR can look much better.

Edit: I think I basically just got buyer's remorse.
 