Best HDMI 2.1 4K+ HDR TV for Consoles [2020]

Discussion in 'Console Industry' started by AzBat, Aug 21, 2020.

  1. AzBat

    AzBat Agent of the Bat
    Legend Veteran

    Joined:
    Apr 1, 2002
    Messages:
    7,499
    Likes Received:
    4,373
    Location:
    Alma, AR
    TLDW?

    Tommy McClain
     
    DSoup likes this.
  2. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    18,905
    Likes Received:
    21,327
    For those who use physical media, PS5 Blu-ray playback is currently better.

    At launch the Xbox Series X had other issues that have since been fixed, so it's possible for Microsoft to fix these new issues as well.
     
    AzBat likes this.
  3. wco81

    Legend

    Joined:
    Mar 20, 2004
    Messages:
    6,674
    Likes Received:
    538
    Location:
    West Coast
    No Dolby Vision for either.

    HDTVtest also did a comparison.
     
    AzBat likes this.
  4. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    118
    Likes Received:
    72
    Location:
    Melbourne Aus.
    No YUV 12-bit output on Xbox, just RGB 10-bit (both 4:4:4).
    In theory the YUV 12-bit is a better 1:1 match for the data on the discs.
    There's also very slightly broken playback for pure 24 fps Blu-rays, vs the more common 23.976 fps.

    Would be nice to see MS update this in future. They fixed the black levels he made a video about a few months back, so hopefully they get onto this too.
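    As a rough illustration of why a 24.000 vs 23.976 fps mismatch only shows up as an occasional hiccup, here's some back-of-the-envelope clock arithmetic (my own sketch, not the specific mechanism of the Xbox bug):

```python
# If a 24.000 fps disc is played out on a 23.976 fps (24/1.001) clock,
# the two drift apart and a frame has to be repeated or dropped
# periodically. Illustrative numbers only.

source_fps = 24.0
output_fps = 24.0 / 1.001                    # ~23.976 fps

drift_per_second = source_fps - output_fps   # frames of drift per second
seconds_per_hiccup = 1.0 / drift_per_second  # ~41.7 s between stutters

print(f"roughly one repeated/dropped frame every {seconds_per_hiccup:.1f} s")
```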
     
    BRiT likes this.
  5. Silenti

    Regular

    Joined:
    May 25, 2005
    Messages:
    689
    Likes Received:
    398
    Wait a sec. I thought you needed a 12-bit panel to actually use 12-bit color? And all the displays, now and for the near term foreseeable future, are 10-bit panels.
     
  6. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    118
    Likes Received:
    72
    Location:
    Melbourne Aus.
    Well, sort of...
    To fully make use of a 12-bit signal, you would need a panel that has 12-bit precision.
    But the issue here is that the Xbox is converting the YUV data on the disc into RGB and sending it down the HDMI cable as RGB.
    Transmitting it as YUV is a better match for the original data.

    In terms of the actual result on the panel, they could probably send 10-bit YUV data and it would likely be visually indistinguishable from 12-bit YUV data on anything but the absolute highest quality panels. But this is as much about the raw data values sent on the wire, and in that respect 12-bit will always be better than 10-bit.
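    To put rough numbers on the precision difference, here's a back-of-the-envelope sketch (my own illustration, not from the video):

```python
# Code values per channel at 10-bit vs 12-bit, and the normalized step
# between adjacent codes. Simplified: ignores limited-range video levels
# and the non-linear PQ transfer function.

for bits in (10, 12):
    levels = 2 ** bits              # 1024 vs 4096 code values per channel
    step = 1.0 / (levels - 1)       # distance between adjacent codes (0..1 scale)
    print(f"{bits}-bit: {levels} levels, step = {step:.6f}")

# 12-bit carries 4x the code values, so any rounding introduced along the
# way (e.g. a YUV -> RGB conversion) is correspondingly smaller on the wire.
```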
     
    tuna, Silenti and BRiT like this.
  7. AzBat

    AzBat Agent of the Bat
    Legend Veteran

    Joined:
    Apr 1, 2002
    Messages:
    7,499
    Likes Received:
    4,373
    Location:
    Alma, AR
    Thought some Series X owners might find this useful if, like me, you don't have an HDMI 2.1 TV...

    Here's the cool calculator he uses in the video...

    https://www.murideo.com/cody.html
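    The gist of what it computes is just resolution x refresh x bits per pixel; a simplified sketch of that math (it ignores blanking intervals and TMDS/FRL encoding overhead, so the calculator's real numbers come out higher):

```python
# Rough uncompressed video data rate from resolution, refresh rate,
# bit depth and chroma subsampling. Simplified: no blanking, no link
# encoding overhead, no audio.

CHROMA_SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, refresh_hz, bit_depth, chroma="4:4:4"):
    bits_per_pixel = bit_depth * CHROMA_SAMPLES_PER_PIXEL[chroma]
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K120 at 10-bit RGB/4:4:4, the headline HDMI 2.1 console case
print(f"{data_rate_gbps(3840, 2160, 120, 10):.1f} Gbps before overhead")
```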

    Tommy McClain
     
    BRiT likes this.
  8. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    15,006
    Likes Received:
    11,112
    Location:
    London, UK
    So... on or off?
     
  9. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    18,905
    Likes Received:
    21,327
    Yes.
     
    Rangers, Nesh, DSoup and 1 other person like this.
  10. AzBat

    AzBat Agent of the Bat
    Legend Veteran

    Joined:
    Apr 1, 2002
    Messages:
    7,499
    Likes Received:
    4,373
    Location:
    Alma, AR
    The answer is it depends on the individual TV. Some TVs handle 10-bit better, some handle 12-bit better, so you will have to test it yourself to see how it affects skies or backgrounds with gradients in HDR games. Whichever looks better is the one you should stick with. If anybody has game suggestions for checking this, let me know.

    Tommy McClain
     
  11. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,962
    Likes Received:
    3,106
    I have a question about HDR. The video from that Asian guy suggests setting Dynamic Tone Mapping to HGIG, but I don't see any difference between HGIG and Off. Dynamic Tone Mapping set to On instead of HGIG looks better. HGIG is too dark.
    I don't get the point.
     
  12. Reynaldo

    Newcomer

    Joined:
    Jan 24, 2008
    Messages:
    102
    Likes Received:
    105
    Both dynamic tone mapping and off will clip whites and you'll lose bright details. Dynamic also brightens shadows a lot. If you prefer a “flashier” image, go with dynamic. HGIG is more “correct”, but yeah it can look a bit dark sometimes, especially if the room is bright.
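    A toy illustration of the difference (made-up numbers, not measurements): with plain clipping everything above the panel's peak collapses to the same value, while with an HGIG-style setup the game maps its own range into the panel's range, so highlight steps survive but the overall picture is darker.

```python
# Toy numbers only: a hypothetical 800-nit panel and a game that wants to
# output up to 1000 nits. Real games use roll-off curves rather than this
# simple linear scaling.

PANEL_PEAK = 800     # nits the panel can actually show
GAME_MAX = 1000      # nits the game intends to output

def clipped(nits):
    # TV tone mapping off (or clipping): all highlights above the panel's
    # peak end up at the same value, so bright detail is lost.
    return min(nits, PANEL_PEAK)

def hgig_style(nits):
    # HGIG idea: the game is told the panel's peak and maps 0..GAME_MAX
    # into 0..PANEL_PEAK itself, so the TV doesn't tone map at all.
    return min(nits, GAME_MAX) * PANEL_PEAK / GAME_MAX

for n in (700, 850, 1000):
    print(f"{n} nits -> clipped {clipped(n)}, HGIG-style {hgig_style(n):.0f}")

# 850 and 1000 become identical after clipping (detail lost) but stay
# distinct HGIG-style -- at the cost of a darker overall image.
```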
     
    BRiT and orangpelupa like this.
  13. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    9,311
    Likes Received:
    2,578
    Can confirm that is indeed the case for me on an LG CX.

    For a bright room, "dynamic" is a must though. Otherwise dark areas are simply too hard to discern.
     