GPUs with full 4k@60 4:4:4 support?

Some searching got me to this page, which indicates that AMD will be bringing 4K@60Hz 4:4:4 to the RX480/470 cards in a driver update soon.
But if you need a card ASAP then the GTX 1060 supports it.
 
Intel, Nvidia, and AMD have all supported 4K@60Hz with 24-bit RGB over DisplayPort for years.
You need to be more specific.

It's for HTPC duties over HDMI. This slide says it supports 4:2:2 at 4k over HDMI.

Will that driver update come to the RX460 as well?

What about Nvidia: are the GTX 9xx cards compatible? And Intel's integrated bunch?
 

For HDMI, all that's required is for the card to have HDMI 2.0 or greater. That alone doesn't guarantee 4:4:4 at 4K, however: some HDTVs that advertise HDMI 2.0 don't even do 4:2:2 at 4K.

However, for 4:4:4 at 4K on HDMI 2.0 you are restricted to 8-bit output. If you set the output to 10 bit, you will be limited to 4:2:2. If you set the output to 12 bit, you will be limited to 4:2:0.

For 10 bit and 12 bit color output at 4:4:4 you will need both an HDMI 2.1 output and a display that supports HDMI 2.1.

This is the same for both Nvidia and AMD as it's a limitation of the HDMI standards.

Regards,
SB
 
What are the advantages of 10 and 12 bit output, reduced banding?
Yes, in theory. But you'd need source data that has it, and a GPU that's willing to send it. (Nvidia at some point only allowed this for Quadros.)

And since you're talking 4K at 60Hz: 10 or 12 bit requires 25% or 50% more BW. I'm not sure HDMI 2.0 has enough BW to even support that.
 

It doesn't. Hence why it drops to 4:2:2 at 10 bit and 4:2:0 at 12 bit. HDMI 2.1 is required for 4:4:4 at 10/12 bit.

Regards,
SB
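
Running the same back-of-the-envelope model the other way around shows what refresh rate full 4:4:4 would top out at if you refuse to subsample (same assumptions as the sketch above, plus the simplification that the blanking interval scales with refresh rate):

    # Highest 4K 4:4:4 refresh rate that fits HDMI 2.0's 14.4 Gbps payload.
    HDMI20_PAYLOAD = 600e6 * 3 * 8

    def frame_bits(depth):
        # One 4:4:4 frame, including blanking, at the given bit depth.
        return 4400 * 2250 * depth * 3

    for depth in (8, 10, 12):
        print(f"{depth}-bit 4:4:4 tops out near "
              f"{HDMI20_PAYLOAD / frame_bits(depth):.0f} Hz")

That comes out to roughly 61 Hz at 8 bit, 48 Hz at 10 bit, and 40 Hz at 12 bit, consistent with 10/12 bit 4:4:4 being off the table at 4K@60 over HDMI 2.0.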
 
So that basically means that for UHD HDR Blu-ray, or streaming with Dolby Vision or HDR10, there is currently no PC card that supports them over HDMI?

And what's the situation with Intel's integrated GPUs: are there any that can output 8-bit 4K@60Hz 4:4:4?
 
Keep in mind that all of this content is encoded 4:2:2 or 4:2:0 to begin with. No consumer content is delivered at 4:4:4; full-resolution chroma is only used for mastering and for desktop work, as the latter requires sub-pixel precision for text.

In any case, I don't believe there's any consumer HDR10 content accessible on the PC right now. But if there were, it would just be displayed at 4:2:2 on the latest generation of GPUs, which, given the above, would work just fine.
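
To make the desktop-text point concrete, here is a toy Python/NumPy sketch that simulates 4:2:0 on a hard red/blue edge, the kind of transition colored text and UI elements are full of. The conversion coefficients and nearest-neighbor resampling are simplifying assumptions; real scalers filter the chroma, but the information loss is the same.

    import numpy as np

    def rgb_to_ycbcr(rgb):
        # Full-range BT.601-style conversion; real pipelines vary.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y  =  0.299 * r + 0.587 * g + 0.114 * b
        cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128
        cr =  0.5 * r - 0.4187 * g - 0.0813 * b + 128
        return np.stack([y, cb, cr], axis=-1)

    def ycbcr_to_rgb(ycc):
        y, cb, cr = ycc[..., 0], ycc[..., 1] - 128, ycc[..., 2] - 128
        r = y + 1.402 * cr
        g = y - 0.344136 * cb - 0.714136 * cr
        b = y + 1.772 * cb
        return np.clip(np.stack([r, g, b], axis=-1), 0, 255)

    def simulate_420(rgb):
        # Keep luma at full resolution; decimate chroma to one sample
        # per 2x2 block and nearest-neighbor upsample it back.
        ycc = rgb_to_ycbcr(rgb.astype(np.float64))
        for c in (1, 2):
            chroma = ycc[::2, ::2, c]
            ycc[..., c] = chroma.repeat(2, axis=0).repeat(2, axis=1)
        return ycbcr_to_rgb(ycc)

    # Hard red/blue vertical edge, deliberately landing mid-block (column 3).
    img = np.zeros((8, 8, 3), np.uint8)
    img[:, :3] = (255, 0, 0)
    img[:, 3:] = (0, 0, 255)

    err = np.abs(simulate_420(img) - img.astype(np.float64))
    print("worst per-channel error after 4:2:0:", err.max())

The blue pixel that lands on the wrong side of a 2x2 block comes back with red chroma under blue luma, a full-strength color fringe (worst-case error of 255 out of 255). That is exactly the artifact sub-pixel-rendered text would show at 4:2:0, and why the desktop wants 4:4:4.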
 

Yeah, I know; desktop use is why I asked about 4:4:4 in the first place. The goal is the lowest possible power consumption while meeting those requirements, hence the Intel question. Can APUs also do it?
 