HDMI/Displayport - display specs confusion for HDR gaming

Discussion in 'Architecture and Products' started by gongo, Jul 8, 2018.

  1. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12

    But those 27" ROG HDR panels are only 8-bit natively, IIRC.

    The ProArt 32UC is a true 10-bit panel, without FRC, but it inputs at DisplayPort 1.2a.

    4K/60 Hz/4:4:4 HDR requires around 23 Gbps of bandwidth, I think. The 1.2a spec is limited to 21.6 Gbps, hence I am confused whether Asus drops down the chroma subsampling to make HDR work on the ProArt.

    By the way, the ProArt does not have quantum dot: the original prototype listed it, but the final product quietly removed it, and the model number now ends in -UC instead of the original -U. A sneaky way to extend the product line.
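    The ~23 Gbps figure can be reproduced with back-of-envelope math. A sketch below, using the published spec constants; the CVT-R2 blanking totals are an approximation I am assuming, not something from this thread:

    ```python
    # Link budget for 4K60 10-bit RGB (4:4:4).

    def data_rate_gbps(htotal, vtotal, hz, bpp):
        """Uncompressed pixel data rate in Gbit/s, including blanking."""
        return htotal * vtotal * hz * bpp / 1e9

    # CTA-861 4K60 timing used by HDMI: 4400 x 2250 total, 594 MHz clock
    cta = data_rate_gbps(4400, 2250, 60, 30)        # ~17.82 Gbps of data
    # With DP's 8b/10b line coding that needs 25% more on the wire:
    cta_coded = cta * 10 / 8                        # ~22.3 Gbps -- the "23"

    # DP 1.2 (HBR2): 4 lanes x 5.4 Gbps raw = 21.6; payload after 8b/10b:
    dp12_payload = 4 * 5.4 * 8 / 10                 # 17.28 Gbps

    # CVT-R2 reduced-blanking timing: roughly 3920 x 2222 total for 4K60
    cvt_r2 = data_rate_gbps(3920, 2222, 60, 30)     # ~15.68 Gbps

    print(f"CTA-861: {cta:.2f} Gbps data, ~{cta_coded:.1f} Gbps coded")
    print(f"CVT-R2:  {cvt_r2:.2f} Gbps vs DP 1.2 payload {dp12_payload:.2f}")
    ```

    So the "23 Gbps" is the CTA-861 timing plus coding overhead; with DisplayPort's leaner reduced-blanking timing, 10-bit 4K60 4:4:4 actually fits inside the HBR2 payload without subsampling.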
     
  2. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    VA is better for content consumption imo!

    This model reaches 1000 nits, surprisingly; from the reviews I read, the claim is that it is better than the 384-zone ProArt in HDR. I am conflicted! I want a good HDR model, but 43 inches is rather large for a desktop; then again, it is also half the price of the ProArt! Neither the ProArt nor this one uses DisplayPort 1.4... should I save the money and go for an intermediary HDR monitor solution now, and still get good VA contrast and a 1000-nit HDR experience?

    What does 1000 nits look like at 2.5 feet from my retinas... :strokechin
    IIRC only the highest-end TVs can do 1000-nit HDR!
     
  3. snarfbot

    Regular Newcomer

    Joined:
    Apr 23, 2007
    Messages:
    418
    Likes Received:
    150
    The Momentum has 3 settings for HDR; it only reaches 1000 nits on the highest setting, at the expense of black level in large areas around the light source. I believe it was pcmonitors.info who reviewed it, and based on their HDR testing methodology it is much, much better on the medium setting, where it hit a little over 500 nits in the brightest parts of an image but did not elevate the black level of surrounding pixels. I believe the term used for the artifact is flashlighting, lol. Anyway, if it bothers you, making it unnoticeable means you only get 500 nits, which is still plenty good imo, but not for 1000 bucks.

    You are still missing out on the high refresh rate, though, but at least it isn't $2,000 either.
     
    pharma and DavidGraham like this.
  4. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,571
    Likes Received:
    2,121
    I disagree; it makes for a much better black, especially in scenes like a night sky with a single source of light such as the moon or a star. An edge-lit display will show halos around the light source; a FALD will minimize that halo significantly. An OLED will not show any halos at all.
     
    pharma and BRiT like this.
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,491
    Likes Received:
    4,405
    Yeah, getting close, but not quite there yet. The Samsung 2018 QLED TVs support 4k, HDR, 120 Hz, and Freesync (AdaptiveSync), but not 4k and 120 Hz at the same time. That's likely due to the set not supporting full bandwidth HDMI 2.1. I imagine if they had full bandwidth HDMI 2.1 that they'd also support 120 Hz at 4k.

    Of course, they're also significantly cheaper than those Gsync displays (the 49" sets start at around 1k USD). I'd likely get one right now to replace my 49" monitor if they supported 4k and 120 Hz simultaneously.

    It'll be interesting to see what Samsung does with their next round of PC displays as their QFxN series of TVs make for great gaming displays.

    Once there are 4k, HDR, 120 Hz freesync displays, it's goodbye NVidia for me unless they support adaptive sync. There is no way in hell I'm paying the Gsync tax.

    Regards,
    SB
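    SB's guess that the missing piece is full-bandwidth HDMI 2.1 checks out on paper. A rough sketch using the published spec figures and the CTA-861 4K120 timing (4400 x 2250 total, 1188 MHz pixel clock):

    ```python
    # Can 4K120 10-bit HDR fit in HDMI 2.0 vs full HDMI 2.1?

    pixel_data = 4400 * 2250 * 120 * 30 / 1e9   # 10 bpc RGB -> ~35.6 Gbps

    hdmi20_payload = 18.0 * 8 / 10              # TMDS 8b/10b  -> 14.4 Gbps
    hdmi21_payload = 48.0 * 16 / 18             # FRL 16b/18b  -> ~42.7 Gbps

    print(f"4K120 10-bit needs ~{pixel_data:.1f} Gbps of pixel data")
    print(f"HDMI 2.0 carries {hdmi20_payload:.1f}, HDMI 2.1 ~{hdmi21_payload:.1f} Gbps")
    ```

    HDMI 2.0 is not even close, so a 2018 set without full FRL has no way to do 4K and 120 Hz simultaneously at 4:4:4.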
     
    #25 Silent_Buddha, Jul 16, 2018
    Last edited: Jul 16, 2018
    snarfbot likes this.
  6. snarfbot

    Regular Newcomer

    Joined:
    Apr 23, 2007
    Messages:
    418
    Likes Received:
    150
    The Samsungs can do 1440p at 120 Hz as well, though only the 55" models and up, I believe. Next year, hopefully, for the first HDMI 2.1 compliant devices.
     
  7. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    607
    Likes Received:
    411
    Location:
    55°38′33″ N, 37°28′37″ E
    I still doubt there would be any sizeable difference on a 4K display; unfortunately I cannot find a way to change framebuffer and display link formats in Windows.

    How can you possibly accommodate a 4:2:0 display link with game assets, unless you are using sprite graphics and bitmap fonts? Not to mention that internal rendering resolutions often do not match the final display resolution, for performance reasons.

    I guess these monitors set a few fixed timings that do not strictly follow VESA CVT formula. You are correct that for 10 bit 4K using CVT-R2 timings it is possible to set the refresh to 96/97 Hz at max, but not 98 Hz.
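    The 96/97 Hz ceiling can be sanity-checked numerically. The sketch below uses a simplified CVT-R2 model (assumed: fixed 80-pixel horizontal blanking, minimum 460 µs vertical blanking) against the DP 1.4 HBR3 payload:

    ```python
    # Max CVT-R2 refresh for 10-bit 4K RGB over DP 1.4 (HBR3).
    import math

    def cvt_r2_data_rate(hz, h=3840, v=2160, bpp=30):
        htotal = h + 80                            # CVT-R2 fixed hblank
        vtotal = math.ceil(v / (1 - 460e-6 * hz))  # min 460 us vblank
        return htotal * vtotal * hz * bpp / 1e9    # Gbit/s

    hbr3_payload = 4 * 8.1 * 8 / 10                # 25.92 Gbps after 8b/10b

    for hz in (96, 97, 98):
        rate = cvt_r2_data_rate(hz)
        fits = "fits" if rate <= hbr3_payload else "too much"
        print(f"{hz} Hz: {rate:.2f} Gbps -> {fits}")
    ```

    97 Hz squeaks in just under the 25.92 Gbps payload; 98 Hz goes over, matching the limit described above.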

    No, it doesn't. Neither with CVT timings used by DisplayPort monitors, nor with CTA-861 timings used by HDMI displays.
     
    #27 DmitryKo, Jul 18, 2018
    Last edited: Sep 27, 2018
  8. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,491
    Likes Received:
    4,405
    https://www.rtings.com/tv/learn/chroma-subsampling

    Rtings goes into it a little bit. On AMD drivers, back when I had the 290, it was easy to change via a drop-down box in the control panel. I can't find anything similar in the NVidia drivers.

    Choice of text size, color of text and background color are chosen to minimize the potential effect of 4:2:0 chroma subsampling if a user has a display that is using it.

    I became aware of this years ago when a friend got a 4K "HDMI 2.0 Ready" TV to use as a desktop monitor. It only supported 4:2:0 chroma subsampling at 4K, and his desktop and many applications featured text that either looked wrong or was just plain unreadable.

    Regards,
    SB
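    The mangled text comes from chroma being stored at half resolution in both axes, so a one-pixel-wide colored stroke ends up sharing its color sample with neighboring background pixels. A toy sketch in plain Python, with illustrative values only:

    ```python
    # 4:2:0 subsampling: average each 2x2 block of a chroma plane
    # down to a single sample.

    def subsample_420(chroma_rows):
        """Collapse each 2x2 block of chroma values into one sample."""
        out = []
        for y in range(0, len(chroma_rows), 2):
            row = []
            for x in range(0, len(chroma_rows[0]), 2):
                block = (chroma_rows[y][x] + chroma_rows[y][x + 1] +
                         chroma_rows[y + 1][x] + chroma_rows[y + 1][x + 1])
                row.append(block / 4)
            out.append(row)
        return out

    # A 1-pixel red stroke (Cr = +0.5) on a neutral background (Cr = 0):
    cr = [[0.0, 0.5, 0.0, 0.0],
          [0.0, 0.5, 0.0, 0.0]]
    print(subsample_420(cr))   # [[0.25, 0.0]] -- stroke color smeared to half
    ```

    Luma survives at full resolution, which is why shapes stay legible while colored fringes and thin colored text look wrong.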
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,948
    Likes Received:
    1,655
    Location:
    Finland
    The subsampling issues of the Asus/Acer 144 Hz 4K displays were already mentioned, but I didn't spot anyone saying that at least the Asus, and I assume the Acer too, actually has a damn fan inside to cool the FPGA G-Sync module. That's right: you just built your dream PC, all silent, and then your display has a damn fan that's audible, too.
     
  10. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    Sorry, I don't get this.

    What is the display bandwidth required for 4K HDR 60 Hz 4:4:4? Is it below DisplayPort 1.2 but above HDMI 2.0?

    This is so confusing.
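    The question can be put in numbers. A sketch using the CTA-861 4K60 timing (594 MHz pixel clock, 4400 x 2250 total) and the payload capacities of each link after line coding:

    ```python
    # Which 4K60 formats fit HDMI 2.0 vs DisplayPort 1.2?

    CLOCK = 4400 * 2250 * 60 / 1e9        # 0.594 Gpixel/s (CTA-861 timing)

    hdmi20 = 18.0 * 8 / 10                # 14.40 Gbps payload (TMDS 8b/10b)
    dp12   = 4 * 5.4 * 8 / 10             # 17.28 Gbps payload (8b/10b)

    modes = {
        "8-bit 4:4:4":  24,               # bits per pixel
        "10-bit 4:4:4": 30,
        "10-bit 4:2:2": 20,               # full luma, half-rate chroma
    }
    for name, bpp in modes.items():
        need = CLOCK * bpp
        print(f"{name}: {need:5.2f} Gbps"
              f"  HDMI 2.0 {'ok' if need <= hdmi20 else 'no'}"
              f"  DP 1.2 {'ok' if need <= dp12 else 'no'}")
    ```

    With this timing, 10-bit 4:4:4 exceeds both links, which is why HDMI 2.0 sets fall back to 8-bit or subsampling; DisplayPort monitors can additionally use reduced-blanking timings with a smaller pixel clock to bring 10-bit 4:4:4 under the 17.28 Gbps budget.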
     
  11. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,734
    Likes Received:
    1,467
    https://www.pcper.com/reviews/Graph...144Hz-G-SYNC-Monitor-True-HDR-Arrives-Desktop
     
  12. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,948
    Likes Received:
    1,655
    Location:
    Finland
  13. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,674
    Likes Received:
    2,711
    Location:
    Pennsylvania
    You shouldn't have to turn your monitor off. How long has it been since we've had to do that? 2-3 decades?

    So the G-Sync module doesn't have a sleep state for low power during idle? Or is it just a design flaw, not having the fan linked to the power state?
     
    BRiT likes this.
  14. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    7,948
    Likes Received:
    1,655
    Location:
    Finland
    snarfbot, BRiT and Lightman like this.
  15. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,491
    Likes Received:
    4,405
    Thank goodness I don't have an HDR monitor...yet. I wonder if it's a driver issue that can be fixed, or if it's intrinsic to how the hardware does HDR.

    Well, I guess that's one way to make Gsync useful. Enable HDR, get lower framerates, and cover it up with variable refresh. :p

    Also, I'm surprised that their Gsync modules require active cooling. That'd be a no-go for me. My PC when not gaming is 100% passively cooled and much farther from my ears than my monitor, not to mention behind a sound baffle to further reduce noise it produces when gaming.

    Regards,
    SB
     
    #35 Silent_Buddha, Jul 21, 2018
    Last edited: Jul 21, 2018
    Lightman likes this.
  16. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,571
    Likes Received:
    2,121
    Not next to nothing; there are cases where the drops are noticeable too. Destiny 2 drops 9% on Vega 64, and Far Cry 5 drops 6% on it as well. NVIDIA drops more, but it shouldn't be like this when consoles do it with practically a 0% hit.

    Right now I am hearing it's a Windows 10 bug in the latest build that causes a lot of tonemapping to be done unnecessarily.
     
  17. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,753
    Likes Received:
    1,376
    Ignorant question of the day: how would you measure that on a console?

    Do console games have frame rate counters?
     
  18. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    11,358
    Likes Received:
    7,157
    Location:
    Cleveland
    Some comparisons can be made via several controlled tests, considering the Xbox One cannot do HDR while the One S can toggle HDR on or off depending on output.
    1. Compare a game on Xbox One vs Xbox One S without HDR.
    2. Compare a game on Xbox One vs Xbox One S with HDR.
    3. Compare a game on Xbox One S without HDR enabled vs Xbox One S with HDR enabled.
    The first test might indicate whether the game is already artificially capped by lower GPU settings. It might be useful to toss in tests on the Xbox One X as well.


    Yes, the Xbox One X Developer Kit console does have frame rate counters.

     
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,571
    Likes Received:
    2,121
    In addition to the developer kits, Digital Foundry also measures performance to a fairly accurate degree, and they always find no difference fps wise between HDR on and HDR off on consoles.
     
  20. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    I am still searching for DP 1.2's max capabilities and came across this EIZO document:

    https://www.eizoglobal.com/support/db/files/manuals/03V27068A1/UM-03V27068A1-AL.pdf

    DP 1.2 can support 4K 10-bit RGB 4:4:4 at 59.997 Hz but not 60 Hz?

    Can one calculate why a few more decimals drop it to 8-bit at 4:4:4?
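    One plausible reading (an assumption on my part, not something stated in the EIZO doc): the two entries use different timings, and the bandwidth gap comes from the blanking, not from the 0.003 Hz. "60.000 Hz" would be the CTA-861 timing with its large blanking interval (594 MHz clock), while "59.997 Hz" would be a reduced-blanking timing whose integer pixel clock and line totals happen to divide out to just under 60. A sketch:

    ```python
    # Same ~60 Hz refresh, very different bandwidth, depending on timing.

    DP12_PAYLOAD = 4 * 5.4 * 8 / 10         # 17.28 Gbps after 8b/10b

    def rate(htotal, vtotal, hz, bpp=30):   # 10-bit RGB
        return htotal * vtotal * hz * bpp / 1e9

    cta   = rate(4400, 2250, 60.000)        # CTA-861 blanking  -> ~17.82 Gbps
    cvt_r = rate(3920, 2222, 59.997)        # reduced blanking  -> ~15.68 Gbps

    for name, r in (("CTA-861 @ 60.000 Hz", cta),
                    ("reduced-blank @ 59.997 Hz", cvt_r)):
        verdict = "fits" if r <= DP12_PAYLOAD else "exceeds"
        print(f"{name}: {r:.2f} Gbps, {verdict} DP 1.2")
    ```

    So at the standard 60.000 Hz timing 10-bit overshoots the DP 1.2 payload and the monitor falls back to 8-bit, while the 59.997 Hz reduced-blanking mode keeps 10-bit 4:4:4 within budget.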

     