HDMI/Displayport - display specs confusion for HDR gaming

Discussion in 'Architecture and Products' started by gongo, Jul 8, 2018.

  1. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
Until HDMI 2.1 is out, we have to live with confusion over whether our HDTV or PC monitor fully supports baseline HDR gaming.

HDR / 10-bit / 4:4:4 / 4K / 60 Hz - this being the base target to hit.

There is a new Asus ProArt FALD monitor which I plan to buy for PC gaming, but it is only DisplayPort 1.2a... why, Asus? You want $2K but do not use DisplayPort 1.4? Am I better off sticking with my OLED C7? (I am looking for desk-bound HDR PC gaming along with good PPI for desktop use!) I love my OLED, but not for long-term PC use!

I hear conflicting posts that some HDTVs can support 4:4:4 / 60 Hz / 4K HDR through HDMI 2.0, which is only 18 Gbps..? How? Native downsampling behind our backs...? DisplayPort 1.2a has 21.6 Gbps, so the Asus will work..?
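For anyone who wants to sanity-check these numbers, here is a rough link-budget sketch in Python. The blanking totals (~4000 × 2222 for 4K with reduced blanking) are my assumption, so treat the results as approximate rather than authoritative:

```python
# Back-of-envelope link-budget check for the formats discussed above.
# Link rates are the published figures; the timing totals (blanking)
# are assumed CVT-R2-style reduced-blanking values, so treat as rough.

def required_gbps(h_total, v_total, hz, bits_per_channel, chroma="4:4:4"):
    """Uncompressed video data rate in Gbit/s for a given display timing."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return h_total * v_total * hz * bits_per_channel * samples_per_pixel / 1e9

# Effective payload after 8b/10b line coding (both links lose 20%).
hdmi20_payload = 18.0 * 0.8    # 14.4 Gbit/s
dp12_payload   = 21.6 * 0.8    # 17.28 Gbit/s

# 4K60 with assumed ~4000 x 2222 reduced-blanking timing totals:
need_10bit_444 = required_gbps(4000, 2222, 60, 10, "4:4:4")  # ~16.0
need_8bit_444  = required_gbps(4000, 2222, 60, 8,  "4:4:4")  # ~12.8
need_10bit_420 = required_gbps(4000, 2222, 60, 10, "4:2:0")  # ~8.0

print(f"HDMI 2.0 payload: {hdmi20_payload:.1f} Gbit/s")
print(f"DP 1.2a payload:  {dp12_payload:.2f} Gbit/s")
print(f"4K60 10-bit 4:4:4 needs ~{need_10bit_444:.1f} Gbit/s")
# 10-bit 4:4:4 fits DP 1.2a but NOT HDMI 2.0 -- which is why HDMI 2.0
# HDR sets quietly drop to 10-bit 4:2:0/4:2:2 or 8-bit 4:4:4 instead.
```

So if that arithmetic holds, the answer to both questions is yes: HDMI 2.0 TVs really are subsampling (or dropping to 8-bit) behind your back, and DP 1.2a really does have just enough headroom for the full format at 60 Hz.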

Then there are the even more expensive 27" Asus/Acer 144 Hz HDR FALD monitors with DisplayPort 1.4, but using 8-bit panels?

    I am confused, anyone following?
     
    #1 gongo, Jul 8, 2018
    Last edited: Jul 8, 2018
  2. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,375
    Likes Received:
    4,283
    If you don't need G-sync or smaller displays, you're far better off going with a good HDR TV, IMO.

    NVidia is doing everything it can to make PC gaming irrelevant by pushing proprietary shite.

    Regards,
    SB
     
  3. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    545
    Likes Received:
    337
    Location:
    55°38′33″ N, 37°28′37″ E
Why exactly do you think DisplayPort 1.2a is a limit, and what does HDMI 2.1 have to do with HDR computer monitors at all?

The PA32UC, as well as the PA329Q and PA34V, use 60 Hz panels, so there is no need for the additional bandwidth provided by the HBR3 mode of DisplayPort 1.3+ or by HDMI 2.1. Their HDR capabilities (both DisplayHDR and FreeSync 2 HDR) are exposed through industry-standard EDID/DisplayID structures, not through some protocol-specific embedded message like the HDMI InfoFrame or DisplayPort 1.4 metadata.
     
    Silent_Buddha likes this.
  4. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,365
    Likes Received:
    215
    Location:
    NY
  5. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    545
    Likes Received:
    337
    Location:
    55°38′33″ N, 37°28′37″ E
The original poster's performance target was "HDR 10bit/60hz/4:4:4/60fps/4K" - that's absolutely possible with DP 1.2a on the Asus ProArt PA32UC monitor.

    A gaming monitor with 100 Hz refresh does require at least DisplayPort 1.3 (HBR3 mode); further, 120 Hz would have to use 4:2:2 chroma subsampling and 144 Hz would have to use 4:2:0 subsampling.
    That said, 10-bit 4:2:0 format is used in UHD Blu-ray releases, so I'd think the practical impact of this subsampling would be minimal.
     
  6. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,375
    Likes Received:
    4,283
4:2:0 chroma subsampling is mostly fine for motion video, but it shows artifacts in general PC use. This is most easily and noticeably seen with text on a flat color, which is why it is undesirable in a PC monitor.

Games that feature a lot of text can also suffer from this to a greater or lesser degree, depending on how much text is displayed and how often the user needs to read it. That said, most AAA games are ported over from consoles, where 4:2:0 displays need to be accommodated, so the developers take that into consideration and design around it at the game level.
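To illustrate why a thin colored stroke (like text) suffers, here is a toy sketch: in 4:2:0 each 2×2 pixel block shares one chroma sample, so a one-pixel-wide colored edge gets averaged with its background. This is a deliberate simplification (a real pipeline filters and works on full YCbCr planes); it just box-averages a chroma plane over 2×2 blocks:

```python
# Toy illustration of 4:2:0 chroma subsampling on a hard chroma edge.
# Chroma is stored once per 2x2 block; a one-pixel-wide colored stroke
# therefore gets blended with the background around it.

def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane, then replicate back up."""
    h, w = len(chroma), len(chroma[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            avg = (chroma[y][x] + chroma[y][x + 1] +
                   chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4
            for dy in (0, 1):
                for dx in (0, 1):
                    out[y + dy][x + dx] = avg
    return out

# A 1-pixel-wide "colored text stroke" (chroma 1.0) on a neutral
# background (chroma 0.0), four rows tall:
stroke = [[0.0, 1.0, 0.0, 0.0] for _ in range(4)]
blurred = subsample_420(stroke)
print(blurred[0])  # [0.5, 0.5, 0.0, 0.0] -- the crisp edge is smeared
```

The luma (brightness) channel stays at full resolution, which is why video still looks fine; it's the saturated color fringing on fine detail that gives 4:2:0 away on a desktop.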

    Regards,
    SB
     
    Kej, Lightman and DavidGraham like this.
  7. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,763
    Likes Received:
    131
    Location:
    New York, NY
    That Asus monitor does indeed look tasty, but $2000 is more than I can justify spending for such a small improvement over my current 1440p, 144Hz G-Sync display. Here's hoping prices for these kinds of displays get a lot more reasonable within a year or two.
     
  8. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    261
    Likes Received:
    60
Your problem here is, unfortunately, Nvidia. There's a similar FreeSync 2 monitor out now for half the price, but Nvidia refuses to support FreeSync (it gets a cut of every G-Sync monitor sold).

Unfortunately it doesn't look like this will change anytime soon, nor is there likely any other HDR monitor with adaptive-sync support on the horizon this year. Maybe you'll have better luck next year : /
     
    BRiT likes this.
  9. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,730
    Likes Received:
    1,957
    Location:
    Germany
    Which one is that? Genuinely interested on behalf of my Vega 56!
     
    pharma, Lightman and BRiT like this.
  10. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,365
    Likes Received:
    215
    Location:
    NY
I was just answering the "Then there are now the even more expensive 27" Asus/Acer 144Hz HDR FALD monitors, displayport 1.4, but using 8 bit panels?" part of his post. That was (slightly) incorrect. The panel only drops down when the user (stupidly?) sets the refresh rate above 98 Hz (I thought I did the math long ago and came to 96 Hz for DP 1.3/1.4, but reviews say 98 Hz, so I guess I can't do math).
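For what it's worth, the 96-vs-98 Hz discrepancy probably comes down to blanking assumptions rather than bad math. A rough sketch (the 4000 × 2222 reduced-blanking timing total is my assumption, which is why this lands at ~97 rather than exactly either figure):

```python
# Max refresh rate = usable link payload / bits per frame.
# The exact answer depends on the assumed blanking intervals, which is
# probably why published estimates land anywhere from 96 to 98 Hz.

def max_refresh_hz(link_gbps, coding_eff, h_total, v_total, bpc, samples=3):
    """Upper bound on refresh rate for an uncompressed 4:4:4 signal."""
    payload = link_gbps * 1e9 * coding_eff        # usable bit/s on the link
    bits_per_frame = h_total * v_total * bpc * samples
    return payload / bits_per_frame

# HBR3 (DP 1.3/1.4): 32.4 Gbit/s raw, 8b/10b coding -> 80% efficiency.
hz = max_refresh_hz(32.4, 0.8, 4000, 2222, 10)
print(f"4K 10-bit 4:4:4 over HBR3: ~{hz:.0f} Hz")  # ~97 Hz
```

Above that ceiling the monitor has to give something up: bit depth (down to 8-bit) or chroma (4:2:2/4:2:0), which is exactly the trade-off those 144 Hz panels make.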

There isn't one. I'd still take those two monitors (sans FreeSync/G-Sync) over any other monitor currently on the market. My guess is that the reason it's $2K is not G-Sync but rather the panel (4K, DisplayHDR 1000, high refresh rate, etc.). But yeah, these monitors aren't cheap...
     
    DavidGraham likes this.
  11. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,597
    Likes Received:
    1,338
Nice. Those support G-Sync HDR, which I believe requires the VESA DisplayHDR 1000 standard, i.e. 1000 cd/m² peak brightness.
     
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,341
    Likes Received:
    1,588
You are partially right: HDR1000 is a rarity among PC monitors, and 4K at 144 Hz/120 Hz is also a rarity. The Asus is also a quantum-dot IPS with a full-array local-dimming backlight (384 zones), so it's a high-quality panel already. However, the G-Sync module alone costs a significant $500! The module is powered by an Altera FPGA and is equipped with 3 GB of DDR4 RAM, which increases the cost even further.
    https://www.techpowerup.com/245463/nvidia-g-sync-hdr-module-adds-usd-500-to-monitor-pricing
     
    Lightman, willardjuice and BRiT like this.
  13. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,365
    Likes Received:
    215
    Location:
    NY
    Ah interesting! Thank you for the information. That's actually borderline insane! Honestly not even remotely close to being worth it. I'm really only in for the 4k/DisplayHDR 1000/high refresh rate. :razz: :mrgreen:
     
  14. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    261
    Likes Received:
    60
  15. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,763
    Likes Received:
    131
    Location:
    New York, NY
That's not really comparable. I'm sure it's an amazing display, but yes, 144 Hz matters. Even if the most graphically intensive games have a hard time going above 60 fps at 4K, many older games have no problem whatsoever.
     
    DavidGraham likes this.
  16. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,730
    Likes Received:
    1,957
    Location:
    Germany
Thank you, but that is not a 120 Hz+ display, which is what I guess makes the super-expensive ones so special. I am looking for that combination (4K + 120 Hz), but apparently I'll have to wait just a little longer yet. HDR? Meh, I really don't care much about it.
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,763
    Likes Received:
    131
    Location:
    New York, NY
    HDR really is amazing as well (I have an HDR TV). In terms of quality, I'd definitely choose HDR over higher resolution (though not over high refresh rate).

    The only problem is: very few PC games these days take advantage of it. Eventually that will change. But for now it's one of those things that is rarely beneficial.

    Hopefully by the time HDR support is common, monitors that let you have it all will have become much cheaper.
     
  18. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,365
    Likes Received:
    215
    Location:
    NY
    I agree, to me HDR > 4k on my OLED.
     
    DavidGraham, Lightman and BRiT like this.
  19. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,341
    Likes Received:
    1,588
Yup, me too! It's a shame that getting almost all the high-end monitor features at once currently means a G-Sync display at such a commanding price, but it is what it is.
It's VA and it's edge-lit, not really comparable to an IPS FALD, especially when it comes to HDR. Worse yet, its FreeSync doesn't support LFC (low framerate compensation). Lacking this feature makes it really overpriced at $1000.
     
    #19 DavidGraham, Jul 15, 2018
    Last edited: Jul 15, 2018
    pharma likes this.
  20. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    261
    Likes Received:
    60
Right now FALD is pure marketing BS for HDR. Human perception of brightness is roughly logarithmic, on a log2 basis: 2 nits looks a step brighter than 1 nit, but 510 nits is barely distinguishable from 500 nits. When you combine this with screen reflectance, which is above 5% even on an excellent screen, the difference between a 3,000:1 contrast ratio and a 20,000:1 contrast ratio becomes nonexistent. Your minimum black level is going to be around 5 nits even in a relatively dark room; doubling out from there, you get a little over 7 levels or "stops" of brightness from a 1,000-nit display, which is less than a 3,000:1 contrast ratio ideally gives you, and obviously less than 20,000:1. You literally cannot see enough of the darks to tell the difference between the two displays unless you're in a totally blacked-out room.

That being said, the lack of low framerate compensation does hurt : (
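The stops arithmetic above can be sketched out like this. The 5-nit reflected-light floor (roughly 5% reflectance in ordinary room lighting) is the assumption doing all the work; change it and the conclusion changes:

```python
# Sketch of the "stops" argument: once ambient reflections set the black
# floor, extra native panel contrast stops being visible.
import math

def visible_stops(peak_nits, native_contrast, reflected_nits):
    """Usable log2 brightness range given an ambient-raised black floor."""
    native_black = peak_nits / native_contrast   # panel's own black level
    black_floor = native_black + reflected_nits  # what you actually see
    return math.log2(peak_nits / black_floor)

# 1,000-nit display, assumed 5 nits of reflected ambient light:
for contrast in (3_000, 20_000):
    stops = visible_stops(1000, contrast, 5.0)
    print(f"{contrast:>6}:1 panel -> {stops:.2f} stops")
# Both land around 7.6 stops -- the reflected floor swamps the
# difference between the two native contrast ratios.
```

With `reflected_nits` near zero (a blacked-out room), the gap between the two panels reappears, which is the caveat at the end of the argument above.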
     
    #20 Frenetic Pony, Jul 15, 2018
    Last edited: Jul 15, 2018
