GPUs with full 4k@60 4:4:4 support?

Discussion in 'Architecture and Products' started by green.pixel, Sep 3, 2016.

  1. green.pixel

    Veteran

    Joined:
    Dec 19, 2008
    Messages:
    1,831
    Likes Received:
    295
    Location:
    Europe
    Doesn't matter if it's AMD or Nvidia. Thanks in advance!
     
  2. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Intel, Nvidia, and AMD have supported 4K@60Hz with 24-bit RGB over DisplayPort for years.
    You need to be more specific.
     
  3. Wynix

    Veteran Regular

    Joined:
    Feb 23, 2013
    Messages:
    1,052
    Likes Received:
    57
    Some searching got me to >this< page, which indicates that AMD will be bringing 4K@60Hz 4:4:4 to the RX 480/470 cards in an upcoming driver update.
    But if you need a card ASAP, the GTX 1060 supports it.
     
  4. green.pixel

    Veteran

    Joined:
    Dec 19, 2008
    Messages:
    1,831
    Likes Received:
    295
    Location:
    Europe
    It's for HTPC duties over HDMI. This slide says it supports 4:2:2 at 4K over HDMI.

    Will it come to the RX 460 as well?

    What about Nvidia, are the GTX 9xx cards compatible? And Intel's integrated bunch?
     
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,158
    Likes Received:
    5,096
    On the card side, all that's required for HDMI is an HDMI 2.0 (or greater) output. That alone doesn't guarantee 4:4:4 at 4K, however: some HDTVs that advertise HDMI 2.0 can't even accept 4:2:2 at 4K, let alone 4:4:4.

    However, for 4:4:4 at 4K on HDMI 2.0 you are restricted to 8-bit output. If you set the output to 10-bit, you will be limited to 4:2:2. If you set the output to 12-bit, you will be limited to 4:2:0.

    For 10-bit and 12-bit color output at 4:4:4 you will need both an HDMI 2.1 output on the card and a display that supports HDMI 2.1.

    This is the same for both Nvidia and AMD, as it's a limitation of the HDMI standards.
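
    A rough back-of-the-envelope check makes the limit concrete. The sketch below assumes HDMI 2.0's nominal 18 Gbps link (roughly 14.4 Gbps of video payload after 8b/10b coding) and the standard CTA-861 594 MHz pixel clock for 4K@60; the accounting is simplified and the figures are illustrative, not spec-exact:

```python
# Rough data-rate check for 4K@60 over HDMI 2.0 (illustrative figures).
# HDMI 2.0 signals at 18 Gbps; after 8b/10b coding roughly 14.4 Gbps
# is left for video payload.
HDMI_20_PAYLOAD_GBPS = 18.0 * 8 / 10  # ~14.4 Gbps

# Standard CTA-861 4K@60 timing: 4400 x 2250 total pixels (incl. blanking)
# at 60 Hz, i.e. a 594 MHz pixel clock.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60

def payload_gbps(bits_per_component: int, components: int = 3) -> float:
    """Video payload rate in Gbps for full-resolution (4:4:4 / RGB) output."""
    return PIXEL_CLOCK_HZ * bits_per_component * components / 1e9

for depth in (8, 10, 12):
    needed = payload_gbps(depth)
    verdict = "fits" if needed <= HDMI_20_PAYLOAD_GBPS else "does NOT fit"
    print(f"4:4:4 {depth:2d}-bit: {needed:4.1f} Gbps -> {verdict}")

# Approximate output:
# 4:4:4  8-bit: 14.3 Gbps -> fits
# 4:4:4 10-bit: 17.8 Gbps -> does NOT fit
# 4:4:4 12-bit: 21.4 Gbps -> does NOT fit
```

    Only the 8-bit case squeezes under the limit, which is why higher bit depths force chroma subsampling on HDMI 2.0.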

    Regards,
    SB
     
  6. green.pixel

    Veteran

    Joined:
    Dec 19, 2008
    Messages:
    1,831
    Likes Received:
    295
    Location:
    Europe
    What are the advantages of 10- and 12-bit output? Reduced banding?
     
  7. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    Yeah, although they can also improve compression efficiency for certain sources (anime, for example). That doesn't necessarily help output to a display, but if you start with 10/12-bit content you might as well send it.
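
    A minimal sketch of the banding point, just quantizing the same dark gradient at a few bit depths (assumes NumPy; the numbers are illustrative only):

```python
# Minimal sketch of the banding argument: quantize the same dark gradient
# at different bit depths and count how many distinct steps survive.
import numpy as np

ramp = np.linspace(0.0, 0.1, 4096)  # smooth ramp over the darkest 10% of range

for bits in (8, 10, 12):
    levels = 2 ** bits - 1
    quantized = np.round(ramp * levels) / levels
    print(f"{bits:2d}-bit: {np.unique(quantized).size} distinct steps")

# Roughly: 8-bit leaves ~27 steps across this gradient (visible banding),
# 10-bit ~103 and 12-bit ~410, i.e. four/sixteen times the code values
# spread over the same range.
```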
     
  8. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Yes, in theory. But you'd need source data that has it, and a GPU that's willing to send it. (Nvidia at some point only allowed this for Quadros.)

    And since you're talking 4K at 60Hz: 10- or 12-bit output requires 25% or 50% more bandwidth. I'm not sure HDMI 2.0 has enough bandwidth to even support that.
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,158
    Likes Received:
    5,096
    It doesn't, which is why it drops to 4:2:2 at 10-bit and 4:2:0 at 12-bit. HDMI 2.1 is required for 4:4:4 at 10/12-bit.

    Regards,
    SB
     
    Kej and silent_guy like this.
  10. green.pixel

    Veteran

    Joined:
    Dec 19, 2008
    Messages:
    1,831
    Likes Received:
    295
    Location:
    Europe
    So that basically means that for UHD HDR Blu-ray or streaming with Dolby Vision or HDR10, there is currently no PC card that supports them over HDMI?

    And what's the situation with Intel's integrated GPUs, are there any that can output 8-bit 4K 60Hz 4:4:4?
     
  11. Ryan Smith

    Regular

    Joined:
    Mar 26, 2010
    Messages:
    611
    Likes Received:
    1,052
    Location:
    PCIe x16_1
    Keep in mind that all of this stuff is encoded 4:2:2 or 4:2:0 to begin with. No consumer content comes in at 4:4:4. Full chroma (4:4:4) is only used for mastering and for desktop work, as the latter requires full-resolution chroma for sub-pixel text rendering.

    In any case, I don't believe there's any kind of consumer content accessible on the PC with HDR10 right now. But if there were, it would just be displayed at 4:2:2 on the latest-generation GPUs, which given the above would work just fine.
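
    For anyone unfamiliar with the notation, here's a minimal sketch of how much chroma information each subsampling mode actually keeps for a 3840x2160 Y'CbCr frame (plane sizes only, illustrative):

```python
# Minimal sketch of what the subsampling labels mean: relative sample
# counts for a 3840x2160 Y'CbCr frame.
WIDTH, HEIGHT = 3840, 2160

formats = {
    "4:4:4": (1, 1),  # chroma at full horizontal and vertical resolution
    "4:2:2": (2, 1),  # chroma halved horizontally
    "4:2:0": (2, 2),  # chroma halved horizontally and vertically
}

luma_samples = WIDTH * HEIGHT
for name, (hdiv, vdiv) in formats.items():
    chroma_samples = 2 * (WIDTH // hdiv) * (HEIGHT // vdiv)  # Cb + Cr planes
    total = luma_samples + chroma_samples
    print(f"{name}: {total / (3 * luma_samples):.0%} of the samples of full 4:4:4")

# 4:4:4 -> 100%, 4:2:2 -> 67%, 4:2:0 -> 50%. Video sources are mastered
# and encoded that way anyway, but one-pixel-wide colored desktop text
# loses chroma detail, which is why desktop use wants 4:4:4.
```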
     
  12. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,733
    Likes Received:
    5,825
    Location:
    ಠ_ಠ
    What does Alien: Isolation's 10-bit mode do?
     
  13. green.pixel

    Veteran

    Joined:
    Dec 19, 2008
    Messages:
    1,831
    Likes Received:
    295
    Location:
    Europe
    Yeah, I know, desktop use is why I asked about 4:4:4 in the first place. The goal is to have the lowest power consumption possible while meeting those requirements, hence the Intel question. Can APUs also do it?
     
  14. Ryan Smith

    Regular

    Joined:
    Mar 26, 2010
    Messages:
    611
    Likes Received:
    1,052
    Location:
    PCIe x16_1
    They would have AMD's current generation display controller, so yes.
     
  15. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    416
    Likes Received:
    2
    Is there any video player that supports HDR metadata output?
     