Best HDMI 2.1 4K+ HDR AVR (Audio Video Receivers) for Consoles [2020]

Information has started being released about the 2020 Denons: one HDMI 2.1 input. Maybe more next year. Also, it's only 40Gbps capable instead of the full 48Gbps allowed by the spec. This means you can "only" get 4K/120Hz with 4:4:4 chroma at 10-bit instead of 12-bit color. FWIW, the new LG OLEDs also adopted this lower spec, so it may become a common choice. :???:

https://www.avsforum.com/forum/90-r...denon-avr-owner-s-thread-faq-posts-1-8-a.html
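The 40Gbps-vs-48Gbps claim above checks out with some back-of-the-envelope arithmetic. A minimal sketch, counting active pixels only (real HDMI timings add blanking intervals, so actual link requirements are somewhat higher) and assuming HDMI 2.1's 16b/18b FRL line coding for the usable payload:

```python
def raw_rate_gbps(width, height, fps, bits_per_component, components=3.0):
    """Raw video payload in Gbps for full 4:4:4 chroma (3 components/pixel)."""
    return width * height * fps * bits_per_component * components / 1e9

# Usable payload after HDMI 2.1's 16b/18b FRL line coding:
frl40 = 40 * 16 / 18   # ~35.6 Gbps usable on a 40Gbps link
frl48 = 48 * 16 / 18   # ~42.7 Gbps usable on a 48Gbps link

ten_bit    = raw_rate_gbps(3840, 2160, 120, 10)  # ~29.9 Gbps
twelve_bit = raw_rate_gbps(3840, 2160, 120, 12)  # ~35.8 Gbps

# 4K/120 4:4:4 at 10-bit fits a 40Gbps link; at 12-bit it already exceeds
# the 40Gbps payload even before blanking, so it needs the full 48Gbps.
```

So the "only 10-bit at 4K/120 4:4:4" limitation follows directly from the 40Gbps ceiling, not from anything Denon-specific.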
 
Damnit. What's the point of specs if everyone is going to half-ass it.

Looks like 2021 at earliest for broader spread of HDMI 2.1?
 
Are there any commercial 12-bit panels out?

I don't see how it makes a difference if the panel only supports 10-bit anyway.
Amps are a bit different, as I guess it would be nice to support whatever display it's hooked up to.
Also, I think 40Gbps can support 12-bit with Display Stream Compression (DSC).
 

Yep. It's a non-issue. LG limited the bandwidth because the panel capabilities are limited anyway.
 
Nvidia consumer graphics cards can (up until now, at least) only output 8-bit or 12-bit. So my concern would be for a future where I was connecting a PC and had to accept compressed color information to get >8-bit color. An extreme corner case, and one that goes away if Nvidia removes this limitation, but one that would not exist if the full capabilities of the spec were supported.
 


HDMI org, in an effort to gain support of as many manufacturers as possible, allowed all these different modes.

That means some devices and cables can support the full bandwidth of the spec while others don't.

We really would be better off if DP had gotten traction with AV equipment but people are locked into that HDMI plug.

USB4 could be an alternative, but it wouldn't surprise me if they did the same dumb thing and allowed multiple specs.
 
I think this has always been a thing with HDMI. There is an overall maximum bandwidth for the interface (HDMI in/out) and cable, and a whole bunch of data streams in a variety of formats that can consume up to that maximum bandwidth, including some capacity (albeit not much) for things that are almost never used, like Ethernet over HDMI. When you connect one device to another, they negotiate based on the standards supported at both ends and the effective supported bandwidth, and off you go.

When you do hit a combination where the data you want to send will not cram into the available bandwidth, one or more data streams will drop to a lower-bandwidth standard, with lower bandwidth generally equating to lower quality to some degree.
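The negotiation described above can be sketched as a simple "pick the best mode that fits" loop. This is a hypothetical illustration, not the actual HDMI handshake (which exchanges EDID capability data over DDC); the mode list and Gbps figures are illustrative, counting raw video payload only:

```python
# Candidate modes, best quality first. Rates are raw active-pixel payload
# in Gbps (illustrative numbers, no blanking overhead).
MODES = [
    ("4K120 4:4:4 12-bit", 35.8),
    ("4K120 4:4:4 10-bit", 29.9),
    ("4K120 4:2:2 12-bit", 23.9),
    ("4K120 4:2:0 10-bit", 14.9),
]

def negotiate(link_payload_gbps):
    """Return the highest-quality mode that fits the link, or None."""
    for label, rate in MODES:
        if rate <= link_payload_gbps:
            return label
    return None
```

On a 40Gbps link (~35.6 Gbps usable payload) this sketch settles on 10-bit, while the full 48Gbps link (~42.7 Gbps usable) admits the 12-bit mode, matching the drop-to-lower-quality behavior described above.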
 
At least they have the rest of the goodies like Variable Refresh Rate (VRR), Auto Low Latency Mode (ALLM), Quick Media Switching (QMS), and Quick Frame Transport (QFT).

But ... Would it really have been that difficult to go from 40 to 48? It's just 2 more bits. ;)

Most modern displays (TVs or projectors) are not capable of fully utilizing the maximum potential picture quality that could be provided by HDMI 2.1's 48 Gbps of bandwidth. Why? Currently the LCD and OLED panels used in higher-end consumer TVs are native 10-bit panels. To deliver 4K video at 120 fps with chroma 4:4:4 and 10-bit color requires 40 Gbps. To deliver 8K at 60 fps with 10-bit color would also require a max of 40 Gbps, uncompressed. (If DSC compression is applied, the necessary bandwidth drops to 18 Gbps.) As an example, all of LG's 2020 OLEDs have HDMI 2.1 ports, yet they are limited to 40 Gbps, rather than 48 Gbps of bandwidth. But because they use native 10-bit panels, 40 Gbps is enough to max out the panels' capabilities. Owing to these panel limitations, 40 Gbps can be considered the maximum bandwidth required for real-world scenarios — for now anyway.
https://www.audioholics.com/audio-video-cables/hdmi-2.1
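The quoted article's 8K numbers make more sense once chroma subsampling and DSC are factored in. A sketch of the arithmetic, counting active pixels only; the 3:1 DSC ratio here is an assumption for illustration, not a figure from the article:

```python
# Effective samples per pixel for each chroma subsampling scheme
SUBSAMPLING = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def raw_rate_gbps(width, height, fps, bits_per_component, chroma="4:4:4"):
    """Raw video payload in Gbps (active pixels only, no blanking)."""
    return width * height * fps * bits_per_component * SUBSAMPLING[chroma] / 1e9

full = raw_rate_gbps(7680, 4320, 60, 10)           # ~59.7 Gbps: exceeds even 48G
sub  = raw_rate_gbps(7680, 4320, 60, 10, "4:2:0")  # ~29.9 Gbps: fits a 40G link
dsc  = full / 3.0                                   # ~19.9 Gbps at a nominal 3:1
```

In other words, 8K60 10-bit at full 4:4:4 blows past even 48 Gbps; it only lands near the article's 40 Gbps figure with chroma subsampling, and DSC can bring full-chroma 8K60 down to the ~18-20 Gbps range.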
 
Yeah I've been holding off on upgrading my old Denon until at least 2.1 is sorted out.

So no 4K for me either. However, I may be more inclined to jump later this year, though there isn't any 4K content that I feel I'm missing.

As more and more video content goes to streaming, we're going from DD 5.1 sound as a baseline to stereo in a lot of cases.

So no hurry on upgrading the AVR. In fact, I'll be curious about whether an Atmos sound bar might be a viable replacement, especially if they have at least a couple of HDMI 2.1 inputs.


I'm not too optimistic about expansion of ANY 4K HDR content outside of Netflix and maybe Amazon. Well, Apple TV+ and Disney+ also support it, but they don't have a lot of original content yet.

ATSC 3.0 is out but there's really no roadmap for them putting on 4K HDR video any time soon. Local stations would have to upgrade their infrastructure and they don't have the money. Nor are networks in a hurry to send 4K HDR content.

If they put all NFL, NBA, MLB and NHL broadcasts in 4K HDR, that would make a big difference but it wouldn't necessarily bring them more viewers than they already have or more revenues.

In short, if you want 4K or even 8K, move to Europe or Japan or South Korea. Maybe even China and some other Asian countries.
 
Most broadcast TV uses 720p, and most cable companies are actually downsampling to an even lower-bitrate 720p signal even on 1080i content. Very little 4K broadcast content even exists; most live-sports 4K content was merely 1080i deinterlaced and upsampled on a special 4K streaming app (NBC), and most cameras used for live sports are not even 4K HDR quality. So I am not convinced ATSC 3.0 will matter for the next 5 to 10 years.

As you said, too much of the old infrastructure left in place, which includes the current owners and major shareholders of said companies.

One of the reasons I use YouTube TV is that I know they will be able to migrate to 4K HDR signals and content substantially faster than the cable companies. If nothing else, they will likely offer 4K HDR VOD for programs that have it. Now if they switched over to at least 5.1 / EAC3 for broadcasts, that would be great. In the meantime, I am enjoying the AVR's stereo-to-surround adjustments that Dolby and DTS provide (can't remember which mode I have selected).
 
This means you can "only" get 4K/120Hz with 4:4:4 chroma at 10-bit instead of 12-bit color. FWIW, the new LG OLEDs also adopted this lower spec, so it may become a common choice. :???:

I wonder if anyone would ever notice any difference in those panels, though. Aren't they 10-bit panels?
 
I'll probably keep my Xbox One S as a UHD player, if only because it is so much quieter than my launch PS4 Pro. While I suspect the PS5 won't be as loud as that, I doubt it will hit the very impressive noise levels of the XB1S.

Yeah, unfortunately HDMI 2.1 Auto Low Latency Mode (ALLM) support is very wonky right now. I'm still awaiting the launch of an AV amp that supports it, because ALLM and Variable Refresh Rate are optional features of the 2.1 spec rather than mandatory ones, so it is quite possible to get "HDMI 2.1" devices without either.
 

The recently revealed AVRs from Denon and Marantz have ALLM and VRR and will soon be available. New AVRs from Yamaha should be revealed around the end of this month, according to rumor. Be aware that the Marantz and Denon models have only one "full-spec" HDMI 2.1 input (8K/60Hz, 4K/120Hz); unknown for Yamaha.
 

The Denons are available now... not sure on the Marantz.
Only 1 40Gbps port tho.
 
Yes. I was quite surprised at how economical they are. Well at least compared to the past where most new models with new features started at $1400 and up.
 
Someone on AVS downloaded the manual and looked (post #316):

More HDMI 2.1 inputs are coming via a firmware update. On the low-end model, all 4 inputs will have 2.1 after the firmware update. On the higher-end model, it's 3 out of 7 inputs that will support 2.1.

Kind of perfect for what I'll need in the short to mid-term and the price is a pleasant surprise.
 