HDMI/DisplayPort - display spec confusion for HDR gaming

I was just answering the "Then there are now the even more expensive 27" Asus/Acer 144Hz HDR FALD monitors, DisplayPort 1.4, but using 8 bit panels?" part of his post. That was (slightly) incorrect. It only drops down when the user (stupidly?) sets the refresh rate above 98 Hz (I thought I did the math long ago and came to 96 Hz for DP 1.3/1.4, but reviews say 98 Hz, so I guess I can't do math).
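For a rough sense of why the refresh rate has to drop, here is a back-of-envelope check. It ignores blanking entirely (which only makes things worse) and assumes 4 lanes of HBR3 with 8b/10b line coding; a sketch, not the monitor's exact timings.

```python
# Rough check: can DP 1.3/1.4 carry 4K 144 Hz 10-bit RGB without subsampling?
# Assumes 4 lanes of HBR3 (8.1 Gbit/s per lane) and 8b/10b line coding.
lanes, lane_rate_gbps, coding_efficiency = 4, 8.1, 0.8
dp14_payload_gbps = lanes * lane_rate_gbps * coding_efficiency  # ~25.92 Gbit/s

pixels_per_second = 3840 * 2160 * 144             # active pixels only, no blanking
bits_per_pixel = 3 * 10                           # RGB, 10 bits per component
required_gbps = pixels_per_second * bits_per_pixel / 1e9  # ~35.8 Gbit/s

print(dp14_payload_gbps, required_gbps)           # 25.92 vs ~35.83: does not fit
```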



There isn't one. I'd still take those two monitors (sans FreeSync/G-Sync) over any other monitor currently on the market. My guess is the reason it's $2k is not G-Sync, but rather the panel (4K, DisplayHDR 1000, high refresh rate, etc.). But yeah, these monitors aren't cheap...


But those 27" ROG HDR panels are only 8-bit natively, IIRC...

The ProArt 32UC is a true 10-bit panel, without FRC, but its input is DisplayPort 1.2a.
4K/60Hz/4:4:4 HDR requires around 23 Gbps of bandwidth, I think. The 1.2a spec is limited to 21 Gbps, hence I am confused whether Asus drops down to chroma subsampling to make HDR work on the ProArt. By the way, the ProArt does not have quantum dot: the original prototype model listed it, but the final product quietly removed it, and its model number now ends in -UC instead of the original -U. Sneaky way to extend the product line.
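For what it's worth, here is a rough sketch of that bandwidth question, assuming CVT-R2 reduced-blanking timings (80-pixel horizontal blank, at least a 460 µs vertical blank) and a 4-lane HBR2 link with 8b/10b coding; the monitor's actual timings may differ a little.

```python
# Rough estimate: 4K 60 Hz, 10-bit RGB (4:4:4) over DisplayPort 1.2 (HBR2).
# Assumes CVT-R2 reduced blanking: 80-pixel horizontal blank, >= 460 us vertical blank.
h_active, v_active, refresh = 3840, 2160, 60
h_total = h_active + 80                           # CVT-R2 fixed horizontal blanking
frame_time = 1 / refresh                          # ~16.67 ms
line_time = (frame_time - 460e-6) / v_active      # active lines share the remainder
v_total = round(frame_time / line_time)           # ~2221 lines including blanking

pixel_clock_hz = h_total * v_total * refresh      # ~522 MHz
required_gbps = pixel_clock_hz * 30 / 1e9         # ~15.7 Gbit/s for 10-bit RGB

hbr2_payload_gbps = 4 * 5.4 * 0.8                 # ~17.28 Gbit/s after 8b/10b coding
print(required_gbps, hbr2_payload_gbps)           # fits without chroma subsampling
```

If those numbers are roughly right, DP 1.2a has just enough headroom for 4K/60 Hz/10-bit 4:4:4 without dropping to chroma subsampling.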
 
Yup, me too! It's a shame that getting almost every high-end monitor feature currently means a G-Sync display commanding such a high price, but it is what it is.

It's VA and it's edge-lit, not really comparable to an IPS FALD, especially when it comes to HDR. Worse yet, its FreeSync doesn't support LFC. Lacking that feature makes it really overpriced at $1,000.

VA is better for content consumption imo!

Surprisingly, this model reaches 1000 nits, and the reviews I read claim it is better in HDR than the 384-zone ProArt. I am conflicted! I want a good HDR model, but 43 inches is rather large for a desktop; then again, it is also half the price of the ProArt! Neither the ProArt nor this one uses DisplayPort 1.4... so do I save the money and go for an intermediate HDR monitor solution now, but still get good VA contrast and a 1000-nit HDR experience?

What does 1000 nits look like at 2.5 feet from my retinas... :strokechin
IIRC, only the highest-end TVs can do 1000-nit HDR!
 
The Momentum has 3 settings for HDR; it only reaches 1000 nits on the highest setting, at the expense of black level in large areas around the light source. I believe it was pcmonitors.info who reviewed it, and based on their HDR testing methodology it is much, much better on the medium setting, where it hit a little over 500 nits in the brightest parts of an image but did not elevate the black level of the surrounding pixels. I believe the term used for the elevated blacks is flashlighting, lol. Anyway, if that bothers you, making it unnoticeable means you only get 500 nits, which is still plenty good IMO, but not for 1000 bucks.

You are still missing out on the high refresh rate, though, but at least it isn't $2000 either.
 
Thank you, but that is not a 120 Hz+ display, which is what I guess makes the super-expensive ones so special. I am looking for that combination (4K + 120 Hz), but apparently I'll have to wait just a little longer yet. HDR, meh, I really don't care much about it.

Yeah, getting close, but not quite there yet. The Samsung 2018 QLED TVs support 4K, HDR, 120 Hz, and FreeSync (Adaptive-Sync), but not 4K and 120 Hz at the same time. That's likely because the set doesn't support full-bandwidth HDMI 2.1. I imagine if they had full-bandwidth HDMI 2.1, they'd also support 120 Hz at 4K.
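For a rough sense of the numbers involved (assuming the standard CTA-861 4K timing of 4400 x 2250 total pixels, 10-bit RGB, and nominal link rates; a sketch, not the sets' actual modes):

```python
# Why 4K + 120 Hz likely needs full-bandwidth HDMI 2.1 (back-of-envelope numbers).
# Assumes the standard CTA-861 4K timing (4400 x 2250 total) and 10-bit RGB.
pixel_clock_mhz = 4400 * 2250 * 120 / 1e6         # 1188 MHz
required_gbps = pixel_clock_mhz * 30 / 1e3        # ~35.6 Gbit/s of pixel data

hdmi20_payload_gbps = 3 * 6.0 * 0.8               # 3 TMDS channels, 8b/10b -> ~14.4
hdmi21_payload_gbps = 4 * 12.0 * (16 / 18)        # 4 FRL lanes, 16b/18b -> ~42.7

print(required_gbps, hdmi20_payload_gbps, hdmi21_payload_gbps)  # only 2.1 has room
```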

Of course, they're also significantly cheaper than those G-Sync displays (the 49" sets start at around 1k USD). I'd likely get one right now to replace my 49" monitor if they supported 4K and 120 Hz simultaneously.

It'll be interesting to see what Samsung does with their next round of PC displays, as their QFxN series of TVs makes for great gaming displays.

Once there are 4K, HDR, 120 Hz FreeSync displays, it's goodbye Nvidia for me unless they support Adaptive-Sync. There is no way in hell I'm paying the G-Sync tax.

Regards,
SB
 
The Samsungs can do 1440p at 120 Hz as well, though only on the 55" models and up, I believe. Hopefully next year we'll see the first HDMI 2.1 compliant devices.
 
4:2:0 chroma subsampling is mostly fine for motion video, but shows artifacts in general PC use. This is most easily and noticeably seen with text on a flat color, which is why it is undesirable in a PC monitor.
I still doubt there would be any sizeable difference on a 4K display; unfortunately I cannot find a way to change framebuffer and display link formats in Windows.
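As a rough illustration of why subsampling is attractive for link bandwidth but rough on text, here is a minimal sketch of how many samples each scheme carries per 2x2 block of pixels (8-bit components assumed):

```python
# Average bytes per pixel for each chroma scheme, assuming 8-bit samples.
# Y is always full resolution; 4:2:0 shares one Cb/Cr pair across a 2x2 block,
# which is why single-pixel color edges (e.g. small text) smear.
def bytes_per_pixel(scheme: str) -> float:
    luma_samples = 4                       # one Y per pixel in a 2x2 block
    chroma_samples = {"4:4:4": 8,          # Cb + Cr for every pixel
                      "4:2:2": 4,          # Cb + Cr for every other column
                      "4:2:0": 2}[scheme]  # one Cb + Cr pair for the whole block
    return (luma_samples + chroma_samples) / 4

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, bytes_per_pixel(scheme))  # 3.0, 2.0, 1.5 bytes per pixel
```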

Most AAA games are ported over from consoles, where 4:2:0 displays need to be accommodated, so they take that into consideration at the game level and design around it.
How can you possibly accommodate a 4:2:0 display link with game assets, unless you are using sprite graphics and bitmap fonts? Not to mention that internal rendering resolutions often do not match the final display for performance reasons.

I thought I did the math long ago and came to 96 Hz for DP 1.3/1.4, but reviews say 98 Hz, so I guess I can't do math
I guess these monitors use a few fixed timings that do not strictly follow the VESA CVT formula. You are correct that for 10-bit 4K using CVT-R2 timings, it is possible to set the refresh rate to 96/97 Hz at most, but not 98 Hz.
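A quick sanity check of that, assuming CVT-R2 timings (80-pixel horizontal blank, at least 460 µs of vertical blank) and a 4-lane HBR3 link with 8b/10b coding; actual monitor timings can differ slightly:

```python
# Highest 4K refresh that fits 10-bit RGB over DP 1.3/1.4 (HBR3),
# assuming CVT-R2 reduced blanking and 8b/10b line coding.
def required_gbps(refresh_hz: int) -> float:
    h_total = 3840 + 80                               # CVT-R2 horizontal blanking
    line_time = (1 / refresh_hz - 460e-6) / 2160      # seconds per active line
    v_total = round((1 / refresh_hz) / line_time)     # lines incl. vertical blanking
    return h_total * v_total * refresh_hz * 30 / 1e9  # 30 bits per pixel (10 bpc RGB)

hbr3_payload_gbps = 4 * 8.1 * 0.8                     # ~25.92 Gbit/s
for hz in (96, 97, 98):
    print(hz, round(required_gbps(hz), 2), required_gbps(hz) <= hbr3_payload_gbps)
# 96 and 97 Hz just fit (~25.5/25.8 Gbit/s); 98 Hz does not (~26.1 Gbit/s)
```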

4K/60Hz/4:4:4 HDR requires around 23 Gbps of bandwidth, I think
No, it doesn't. Neither with CVT timings used by DisplayPort monitors, nor with CTA-861 timings used by HDMI displays.
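To put rough numbers on it, assuming CVT-R2 reduced blanking on the DisplayPort side and the standard CTA-861 4K timing (4400 x 2250 total, 594 MHz) on the HDMI side; a sketch, not an exact spec calculation:

```python
# Data rate for 4K 60 Hz, 10-bit RGB (4:4:4) with the two common timing sets.
bits_per_pixel = 30                            # 10 bits per RGB component

cvt_r2_clock_hz = 3920 * 2221 * 60             # CVT-R2 reduced blanking, ~522 MHz
cta861_clock_hz = 4400 * 2250 * 60             # CTA-861 4K timing, 594 MHz

print(cvt_r2_clock_hz * bits_per_pixel / 1e9)  # ~15.7 Gbit/s (DisplayPort monitors)
print(cta861_clock_hz * bits_per_pixel / 1e9)  # ~17.8 Gbit/s (HDMI displays)
# Both are well under 23 Gbit/s. DP 1.2 (HBR2) offers ~17.28 Gbit/s of payload,
# so the CVT-R2 case fits; HDMI 2.0's ~14.4 Gbit/s payload does not cover 10-bit 4:4:4.
```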
 
I still doubt there would be any sizeable difference on a 4K display; unfortunately I cannot find a way to change framebuffer and display link formats in Windows.

https://www.rtings.com/tv/learn/chroma-subsampling

Rtings goes into it a little bit. On the AMD drivers, when I had the 290, it was easy to change via a drop-down box in the control panel. I can't find anything similar in the Nvidia drivers.

How can you possibly accommodate a 4:2:0 display link with game assets, unless you are using sprite graphics and bitmap fonts? Not to mention that internal rendering resolutions often do not match the final display for performance reasons.

Text size, text color, and background color are chosen to minimize the potential effect of 4:2:0 chroma subsampling if a user has a display that is using it.

I became aware of this a few years ago when a friend got a 4K "HDMI 2.0 Ready" TV to use as a desktop monitor. It only supported 4:2:0 chroma subsampling at 4K, and his desktop and many applications featured text that either looked wrong or was just plain unreadable.

Regards,
SB
 
The subsampling issues required by the Asus/Acer 144 Hz 4K displays were already mentioned, but I didn't spot anyone saying that at least the Asus (and I assume the Acer too) actually has a damn fan inside to cool down the FPGA G-Sync module. That's right: you just built your dream PC, all silent, and then your display has a damn fan that's audible, too.
 
No, it doesn't. Neither with CVT timings used by DisplayPort monitors, nor with CTA-861 timings used by HDMI displays.

Sorry, I don't get this?

What is the display bandwidth required for 4K HDR 60 Hz 4:4:4? Is it below DisplayPort 1.2 but above HDMI 2.0?

This is so confusing.
 
It's worth noting that despite the monitor having an always-on active cooling fan, it was only noticeable with no other ambient noise in the room, and in close proximity to the display. With any other noise in the room, such as our test PC turned on, the fan noise was drowned out.

Still, if you shut down your PC when not using it, and keep your PC setup in a quiet room, the best course of action would be to completely turn off the monitor when not using it.
https://www.pcper.com/reviews/Graph...144Hz-G-SYNC-Monitor-True-HDR-Arrives-Desktop
 
You shouldn't have to turn your monitor off. How long has it been since we've done that? 2-3 decades?

So the G-Sync module doesn't have a low-power sleep state for idle? Or is it just a design flaw that the fan isn't linked to the power state?
 
Thank goodness I don't have an HDR monitor...yet. I wonder if it's a driver issue that can be fixed, or if it's intrinsic to how the hardware does HDR.

Well, I guess that's one way to make Gsync useful. Enable HDR, get lower framerates, and cover it up with variable refresh. :p

Also, I'm surprised that their G-Sync modules require active cooling. That'd be a no-go for me. My PC, when not gaming, is 100% passively cooled and much farther from my ears than my monitor, not to mention behind a sound baffle to further reduce the noise it produces when gaming.

Regards,
SB
 
While we're on the subject of HDR gaming, it looks like NVIDIA takes a big hit from HDR rendering, while AMD takes next to none
Not next to nothing; there are cases where the drops are noticeable too. Destiny 2 drops 9% on Vega 64, and Far Cry 5 drops 6% on it as well. NVIDIA drops more, but it shouldn't be like this when consoles do it with practically a 0% hit.

Right now I am hearing it's a Windows 10 bug in the latest build that causes a lot of tonemapping to be done unnecessarily.
 
Ignorant question of the day: how would you measure that on a console?

Do console games have frame rate counters?

Some comparisons can be made through a few controlled tests, considering the Xbox One cannot do HDR while the One S can be toggled to output HDR or not.
  1. Compare game on Xbox One vs Xbox One S without HDR
  2. Compare game on Xbox One vs Xbox One S with HDR
  3. Compare game on Xbox One S without HDR enabled and Xbox One S with HDR enabled.
The first test might indicate whether the game is already artificially capped by lower GPU settings. It might be useful to toss in tests on the Xbox One X as well.


Yes, the Xbox One X Developer Kit console does have frame rate counters.

 