Best HDMI 2.1 4K+ HDR TV for Consoles [2020]

Yes, but which setting is right for which range?
High + Limited
Low + Full

Everywhere I've looked recommends either setting both to Automatic (if the TV has an auto option), or Black Level Low on the TV and Limited on the PS5.
Limited does give me better shadow detail without washing out the picture much. Full looks to have more contrast.
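
For reference, the two ranges are just a linear remapping of each other. A quick sketch of the 8-bit case (assuming the usual 16-235 limited range for video levels):

```python
# Quick sketch of 8-bit video level ranges. Assumes the usual mapping of
# limited/"Low" range (16-235) onto full/"High" range (0-255); actual
# clipping behaviour varies by TV and source.

def full_to_limited(v: int) -> int:
    """Map a full-range level (0-255) onto limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map a limited-range level (16-235) back onto full range (0-255)."""
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255
```

The crushed or washed-out look comes from mismatching the two sides: a limited-range black of 16 shown on a TV expecting full range displays as dark grey, and a full-range signal interpreted as limited gets its shadows and highlights clipped. Matching the settings (or using Automatic) just makes both ends agree on which mapping is in use.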
 
I was rewatching Vincent's PS5 32 Gbps video and looking at the Video Output Information screen around 1:11. Why is 120Hz listed as YUV422 under the non-HDR available frequencies, when 8-bit RGB at 120Hz is 25.82 Gbit/s, well within the bandwidth limit?
 
There's a handy chart here which gives effective data rates for various resolutions, refresh rates, color depths, and chroma formats.

Is there any TV that support 4K@120hz with 10-bit HDR + RGB 4:4:4(from both internal and external source)? | AVForums

4K@120Hz with 4:4:4 chroma at 8-bit color depth effectively requires 32.08 Gbps of bandwidth. It might be possible to get it down to ~25 Gbps with 4:2:0 or maybe 4:2:2, but I think the HDMI chips in TVs may not allow 4K@120Hz without a negotiated data rate of at least 32.08 Gbps between the TV and source.
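
For what it's worth, here's a back-of-the-envelope that lands on the same ~32 Gbps number. The full CTA-861 4400x2250 raster and the FRL 16b/18b overhead are my assumptions about how the chart computes its effective rates, so take it as a sketch:

```python
# Back-of-the-envelope HDMI 2.1 data rate for 4K @ 120 Hz, 8-bit RGB / 4:4:4.
# Assumptions (mine, not from the chart): CTA-861 timing with a 4400 x 2250
# total raster per frame, and FRL 16b/18b line-encoding overhead.

H_TOTAL, V_TOTAL = 4400, 2250   # active 3840 x 2160 plus blanking
REFRESH_HZ = 120
BITS_PER_PIXEL = 24             # 8 bits per component, 4:4:4

pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ      # ~1.188 GHz
video_rate = pixel_clock * BITS_PER_PIXEL         # ~28.5 Gbit/s of video
frl_rate = video_rate * 18 / 16                   # 16b/18b overhead -> ~32.1 Gbit/s

print(f"pixel clock:   {pixel_clock / 1e6:.0f} MHz")
print(f"video payload: {video_rate / 1e9:.2f} Gbit/s")
print(f"with FRL 16b/18b overhead: {frl_rate / 1e9:.2f} Gbit/s")
```

That comes out to about 32.08 Gbps, which matches the chart's figure for this mode.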

Someone else here might know more about the nitty-gritty details of the HDMI chips in recent HDTVs.

Regards,
SB
 
The 25.82 Gbps figure for 8-bit 4:4:4 4K/120 came from https://en.wikipedia.org/wiki/HDMI. Some bandwidth is taken by audio, so naturally it's more than 25.82 Gbps, but I didn't expect it to go all the way up to 32.08 Gbps.

But yeah, that would explain why the PS5 can't do 4:4:4 at 120 Hz even without HDR.

I hope they can make it work via a system software update, but I'm not holding my breath; maybe in the next PS5 revision, along with bigger internal storage ;-)
 
Yeah, remember that the HDMI link also has to have the capacity to carry uncompressed audio, which can get pretty large. There's likely other stuff as well, like error correction, etc., and possibly even a bit of a buffer to account for sub-par cable quality or cable degradation.
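
For a sense of scale on the audio side, a quick estimate assuming 8 channels of 24-bit LPCM at 192 kHz (roughly the heaviest uncompressed audio HDMI carries):

```python
# Rough estimate of uncompressed LPCM audio bandwidth over HDMI.
# Assumption: 8 channels of 24-bit PCM at 192 kHz, roughly the worst case.

CHANNELS = 8
SAMPLE_RATE_HZ = 192_000
BITS_PER_SAMPLE = 24

audio_rate = CHANNELS * SAMPLE_RATE_HZ * BITS_PER_SAMPLE
print(f"raw LPCM payload: {audio_rate / 1e6:.1f} Mbit/s")   # ~36.9 Mbit/s
```

So the audio payload itself is only a few tens of Mbit/s; if the sketch above is right, most of the gap between 25.82 and 32.08 Gbps would come from the blanking intervals and the FRL encoding overhead rather than audio, though I may be missing something.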

Regards,
SB
 
I've been resisting the urge to just pick up an LG OLED 65CX too. I'm trying to hold out until the C11 or whatever they roll out next.
 
Yeah, I passed up the BF sales, hoping there would be improvements to the panels and to the HDMI 2.1 implementation.

Now, though, with the news of Warner's movies coming to HBO Max in 4K and HDR for the next year, I may be tempted if I can find the BF prices again.

It might be the start of UHD HDR streams of Game of Thrones and other HBO content.
 
Does anyone have experience with VRR and how it compares to true 60fps?

I was just looking at benchmarks for my GPU, and at the settings I'd like to play at, it averages around 45 fps, which is within the VRR range of my C9 (the lowest is 40 Hz). Would that feel pretty close to 60, or would a locked 30 be better?
 
On my work Asus laptop with VRR, 45 fps feels more like 60.

For an easier comparison: did you watch The Hobbit at 48 fps in the theater? To my eyes it looked similar to 60 fps.
 
Does anyone have experience with VRR and how it compares to true 60fps?

I was just looking at benchmarks for my GPU, and at the settings I'd like to play at, it averages around 45 fps, which is within the VRR range of my C9 (the lowest is 40 Hz). Would that feel pretty close to 60, or would a locked 30 be better?

Depends on a couple of things.
  • How sensitive are you to swings in framerate WRT the feel of the control feedback loop?
  • How large is the variance in framerate?
If the answer to the first is "not much", then it doesn't matter too much. If you are sensitive, but the answer to the second is "not much", then it won't be that noticeable.

So, if that is the case, then VRR will provide a better experience: you'll get better control feedback response, the variability in that response won't be that noticeable to you, and the presentation will "look smooth".

If that isn't the case, i.e. you are sensitive (like me) and the variance is large, then aim for as close to a locked framerate as you can get.

Speaking of locked framerates, you can also attempt to lock the framerate within the VRR zone of your display by using a combination of a max-framerate lock and adjusting the game's settings so that the minimum drop is the same as, or very close to, the max-framerate lock. I.e., you could attempt to lock the framerate to 45, 50, 55, 59, or whatever value you want (see the sketch below).
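
As a rough illustration of that last point, a frame-rate cap is conceptually just a sleep-until-deadline in the render loop. A minimal sketch in Python, with placeholder names rather than any real engine's API:

```python
import time

# Minimal sketch of a frame-rate cap, e.g. capping at 45 fps so frame delivery
# stays inside a display's VRR window (the C9's is roughly 40-120 Hz).
# run_frame() is a placeholder, not a real engine API.

TARGET_FPS = 45
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_frame() -> None:
    """Placeholder for the game's update + render work."""
    time.sleep(0.01)  # pretend the frame took ~10 ms

def game_loop(num_frames: int = 300) -> None:
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        run_frame()
        next_deadline += FRAME_BUDGET
        # Sleep off whatever is left of the 1/45 s budget so frames come out
        # at a steady cadence the VRR display can follow.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            next_deadline = time.perf_counter()  # missed the budget; resync

if __name__ == "__main__":
    game_loop()
```

In practice you'd use an in-game cap or an external limiter like RTSS rather than rolling your own, but the idea is the same: keep frame delivery at a steady cadence inside the panel's VRR window.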

Regards,
SB
 
Hiya, can anyone with a modern TV capable of passing through Dolby Atmos (AFAIK all LG OLEDs since the 8 series) share their EDID info? I just need the info shown in this video.


You can grab CRU from here: https://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

EDIT:
I want to manually add Dolby Atmos (DD) to my TV's EDID and see whether it works or not. If it does, hopefully Cyberpunk can output surround audio without the modded DLL and eliminate the random audio bugs I've been experiencing.
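
If it helps anyone poking at the same thing, here's a rough sketch of how to list the audio capabilities in an EDID dump (e.g. a .bin exported from CRU). The CTA-861 block layout below is my best understanding of the spec, so double-check against the real thing before editing anything:

```python
# Rough sketch: list the Short Audio Descriptors (SADs) from an EDID dump.
# Byte-level details reflect my understanding of CTA-861; verify before use.

import sys

# CTA-861 audio format codes (bits 3-6 of the first SAD byte)
AUDIO_FORMATS = {
    1: "LPCM", 2: "AC-3", 6: "AAC", 7: "DTS",
    10: "E-AC-3 (DD+)", 11: "DTS-HD", 12: "MAT (Dolby TrueHD)",
}

def parse_audio_sads(edid: bytes):
    """Yield (format_name, max_channels, raw_sad_bytes) for each SAD found."""
    # Extension blocks are 128 bytes each, starting at offset 128;
    # a CTA-861 extension is tagged 0x02 in its first byte.
    for ext_start in range(128, len(edid), 128):
        block = edid[ext_start:ext_start + 128]
        if len(block) < 4 or block[0] != 0x02:
            continue
        dtd_offset = block[2]          # data blocks occupy bytes 4 .. dtd_offset-1
        i = 4
        while i < dtd_offset:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 1:               # Audio Data Block: a run of 3-byte SADs
                for j in range(i + 1, i + 1 + length, 3):
                    sad = block[j:j + 3]
                    fmt = (sad[0] >> 3) & 0x0F
                    channels = (sad[0] & 0x07) + 1
                    yield AUDIO_FORMATS.get(fmt, f"format {fmt}"), channels, sad
            i += 1 + length

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:     # e.g. python edid_audio.py tv_edid.bin
        data = f.read()
    for name, channels, raw in parse_audio_sads(data):
        print(f"{name:20s} up to {channels} ch   raw bytes: {raw.hex()}")
```

As far as I understand it, Atmos support is flagged inside the E-AC-3 / MAT descriptors rather than as its own format code, so that's the byte you'd be editing in CRU.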
 
Just an FYI, not sure that I ever mentioned it: stay the hell away from the Vizio. Not only is it broken as hell, but customer/tech support spent an hour and a half trying to blame the video signal switching and detection problems on the Xbox or the PC, separately. Then they ignored the power issues, which required me to unplug it from the wall and plug it back in to get it to boot. Literally the worst support experience I have had since the early internet days. If you remember, back then they had some person with a checklist; this was like that. They had no idea what they were talking about.
 
Modern TVs seem pretty buggy. My XH9005 just decided to drop audio from my PS5 today, but switching the HDMI input between HDMI 2.0 and 2.1 seemed to get the audio back. Weird as hell.
 
Didn't know the PS5 (and XBSX?) outputs 120Hz games in the YCbCr color space instead of full RGB, apparently because of bandwidth limitations.
That's why you should not force your TV to High/Full black level, but use Automatic instead.
 
I imagine the reason consoles support the YCbCr format is that, unlike RGB, YCbCr supports chroma subsampling. And a lot of TVs have a habit of only supporting RGB in PC mode.
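
A quick illustration of why the subsampling matters for bandwidth (assuming 8-bit components and counting only active pixels at 4K120):

```python
# Average bits per pixel for 8-bit pixel formats, and the resulting
# active-video rate at 4K @ 120 Hz (blanking and link overhead not included).
# 4:2:2 shares one Cb/Cr pair between 2 pixels, 4:2:0 between 4 pixels.

BIT_DEPTH = 8
COMPONENTS_PER_PIXEL = {
    "RGB / YCbCr 4:4:4": 3.0,
    "YCbCr 4:2:2":       2.0,
    "YCbCr 4:2:0":       1.5,
}
ACTIVE_PIXEL_RATE = 3840 * 2160 * 120   # active pixels per second

for name, components in COMPONENTS_PER_PIXEL.items():
    bpp = components * BIT_DEPTH
    rate = ACTIVE_PIXEL_RATE * bpp / 1e9
    print(f"{name:18s} {bpp:4.0f} bpp  ->  {rate:5.1f} Gbit/s of active video")
```

So dropping from 4:4:4 to 4:2:2 saves a third of the video payload, which is presumably why the consoles fall back to YCbCr 4:2:2 at 120 Hz; RGB has no equivalent trick.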
 
Just an FYI, not sure that I ever mentioned it: stay the hell away from the Vizio. Not only is it broken as hell, but customer/tech support spent an hour and a half trying to blame the video signal switching and detection problems on the Xbox or the PC, separately. Then they ignored the power issues, which required me to unplug it from the wall and plug it back in to get it to boot. Literally the worst support experience I have had since the early internet days. If you remember, back then they had some person with a checklist; this was like that. They had no idea what they were talking about.

Vizio is basically just an importer; they have no clue what hardware is inside the TVs they sell.
Vizio support confirmed this for me over Twitter when I asked them about one of their panels a year or so ago.
 