AMD: Speculation, Rumors, and Discussion (Archive)

Correct me if I'm wrong, but I think the biggest challenge in getting HDR mainstream is not with engines, but rather with artists and the monitors they use. I really want to play your next game in HDR, but I'm worried there will be a big hassle supporting both 8-bit SDR + sRGB and 10-bit HDR + DCI-P3 displays.
Yes. Engine support is trivial. Everybody nowadays has an HDR (usually 16-bit float) pipeline from start to end. At the very end there is tonemapping + color grading. Color grading is usually done in LDR space (after tonemapping), so it needs to be changed, but that's pretty much it. Displays are a big problem right now. All artists and rendering programmers need (properly calibrated) HDR displays to author HDR content. If I understood correctly, all certified HDR displays will be 4K, so it will take time until reasonably priced models are widely available. HDR also adds testing costs, as it is one extra hardware configuration for the test matrix. Plus some testers also need new expensive displays.
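To make the "color grading is done in LDR space" point concrete, here is a minimal sketch of the tail end of such a pipeline (not from sebbi's post; the function names and the Reinhard/pow steps are illustrative stand-ins):

```c
/* Sketch of the SDR finishing path: render in linear HDR, tonemap to LDR,
 * then apply the grading in LDR space. The grading step is the part that
 * has to move/change for HDR output. All names here are hypothetical. */
#include <math.h>

typedef struct { float r, g, b; } float3;

/* Simple Reinhard operator standing in for a real tonemapper. */
static float3 tonemap_reinhard(float3 hdr)
{
    float3 ldr = { hdr.r / (1.0f + hdr.r),
                   hdr.g / (1.0f + hdr.g),
                   hdr.b / (1.0f + hdr.b) };
    return ldr;                            /* now in [0,1] */
}

/* Hypothetical LDR color-grading step (in practice often a 3D LUT lookup). */
static float3 color_grade_ldr(float3 ldr)
{
    float3 graded = { powf(ldr.r, 1.05f),  /* tiny contrast/tint tweak */
                      ldr.g,
                      powf(ldr.b, 0.95f) };
    return graded;
}

/* SDR path: tonemap first, then grade in LDR.
 * An HDR-display path would instead grade in HDR (or fold the grade into
 * the tonemapper) and output a wide-range signal, e.g. PQ-encoded. */
float3 finalize_sdr(float3 scene_hdr)
{
    return color_grade_ldr(tonemap_reinhard(scene_hdr));
}
```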
 
I don't suppose devs could include a better in-game calibrator (brightness + colour?), knowing that every consumer has a different display type, let alone settings.
 
Yes. Engine support is trivial. Everybody nowadays has an HDR (usually 16-bit float) pipeline from start to end. At the very end there is tonemapping + color grading. Color grading is usually done in LDR space (after tonemapping), so it needs to be changed, but that's pretty much it. Displays are a big problem right now. All artists and rendering programmers need (properly calibrated) HDR displays to author HDR content. If I understood correctly, all certified HDR displays will be 4K, so it will take time until reasonably priced models are widely available. HDR also adds testing costs, as it is one extra hardware configuration for the test matrix. Plus some testers also need new expensive displays.
How long until 10-bit texture compression and hardware decoding? I'm not sure if BC6H is enough...
 
How long until 10-bit texture compression and hardware decoding? I'm not sure if BC6H is enough...
Why isn't it good enough? In any case, Intel Skylake already supports the ASTC HDR profile (albeit I don't know if that would be good enough either...)
 
From MSDN: "BC6H uses a fixed block size of 16 bytes (128 bits) and a fixed tile size of 4x4 texels", which means 1 byte per texel.
 
I know. But all current texture compression formats are designed for 8-bit input textures. "Standardized" 10-bit "HDR" monitors are a great occasion to move to 10-bit source/input textures.
 
You don't really need more than 8 bit for most source textures. Roughness (specular exponent) would need more for water, etc.

If the output needs to be 32 bits, RGB10 is not the best choice (it has 2 unused bits). R11G11B10 would be much better (one less blue bit doesn't really matter, since blue is perceptually the least important color). RGB9E5 (shared exponent) would be an even better output format for display. It has 32 stops of dynamic range (four-billion-to-one contrast, assuming we only count normalized numbers).
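As a quick sanity check on that 32-stop figure (my own arithmetic, assuming the RGB9E5 layout of 9-bit mantissas with a 5-bit shared exponent, and counting only values whose largest mantissa has its top bit set):

$$\frac{L_{\max}}{L_{\min}} \approx \frac{\tfrac{511}{512}\cdot 2^{31-15}}{\tfrac{256}{512}\cdot 2^{0-15}} = \frac{511}{256}\cdot 2^{31} \approx 2^{32} \approx 4.3\times 10^{9}:1$$

i.e. roughly 32 stops, matching the four-billion-to-one contrast quoted above.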
 
For mostly or completely hand-painted textures I completely agree: 8 bits per channel are enough. However, a lot of textures of "natural" things (e.g. plants) always feel kind of fake. Maybe it's just me, but it's like seeing an 8-bit JPEG-compressed photo on a common 8-bit TN screen vs a RAW 10/12/14-bit photo on a professional screen. 10-bit textures would be amazing, in my opinion, for skin too (I know it is actually mostly hand-painted!).

And I also recognize we are digressing too much from the OP :p
 
If the output needs to be 32 bits, RGB10 is not the best choice (it has 2 unused bits). R11G11B10 would be much better (one less blue bit doesn't really matter, since blue is perceptually the least important color). RGB9E5 (shared exponent) would be an even better output format for display. It has 32 stops of dynamic range (four-billion-to-one contrast, assuming we only count normalized numbers).
There aren't really any normalized/denormalized numbers with that format (as it has no implicit leading one). Albeit of course you still start losing mantissa bits once you approach 2^-15. Can't see it as a viable output format though (the shared exponent makes it too problematic to be renderable). R11G11B10 has pretty much the same range in any case, though I suppose the accuracy isn't all that great.
 
To gongo:

I'd say it's a collaborative effort. Yes, like you said, we had bits and pieces that could have brought the HDR revolution sooner, but the momentum to push new technology simply wasn't there IMO. For a new tech to take off, we need capital, and a means to capitalize on it. We had an HDR display as early as 2004 (BrightSide, which was later acquired by Dolby), but lack of experience led to an image that was simply too bright, and Kuro owners weren't impressed. In 2000, the Digital Cinema Initiatives (DCI) group standardized theaters on 12-bit grayscale and, of course, the DCI-P3 gamut, but as you know, projectors only went up to 48 cd/m2 at that time. So we had the technologies already; we just didn't get to combine the best of them.

Legacy was also holding things back. We've been with gamma and 8-bit grayscale for so long that nobody dared to bring a change without a good reason. Any attempt at technological advance also runs into compatibility issues, so those had to be addressed too.

UHD broadcast (ATSC, DVB) was the catalyst for all this. We could have had an improvement at the digital broadcast transition like Joe Kane said, but at that time just moving from analog to digital was a tremendous effort, so industry people didn't want any more headaches. Then simulcasting in both analog and digital ended with the death of analog broadcasting, and broadcast companies all over the world got leftover spectrum. They thought, "Hey, since we've got extra broadcast spectrum, how about we combine it into the current HD stream to increase bitrate, use the latest HEVC codec instead of MPEG-2, then go for 4K broadcasting?"

Display manufacturers thought this was a perfect opportunity to capitalize, and with a few broadcast companies like NHK and the BBC prepping for UHD, we all knew we would be going to a higher resolution than HD, but nobody knew what shape it would take. In the beginning it was simply a resolution increase, no different from going from DVD to Blu-ray. But other people argued that if we're going for another increase in resolution, there has to be a good justification, so people thought about adding a value proposition to a simple resolution increase. They looked at Hollywood's DCI spec and wanted to add it to the UHD specification. CE manufacturers didn't care in the beginning: consumers are already trained to know higher resolution means superior image quality, so all they needed was 4K-resolution TVs. For UHD Blu-ray, there may not have been an external format war like HD DVD vs Blu-ray, but its internal battles were fierce nonetheless. From HEVC royalties to the exact UHD spec, those differences delayed the launch of UHD Blu-ray by two years. CE manufacturers wanted a simple resolution increase over current Blu-ray, arguing it would simplify the encoding procedure. Hollywood wanted to add features they already use in theaters, and Hollywood won in the end: the DCI color gamut and 10-bit grayscale were standardized into UHD BD. Then industry people finally sat down for a serious UHD standardization, and thus Rec.2020 was born.

However, one piece of the puzzle was missing from Rec.2020, and that was HDR. It may sound funny, but HDR was a byproduct of an attempt to replace gamma. Gamma is very wasteful at higher bit depths; it was actually a compromise born to support 8-bit, 256-level grayscale back in the day. Digital Cinema had to use up to 12 bits because they kept using gamma and elimination of banding was a priority. People thought, "We no longer use CRTs, so why should we use a gamma designed for CRTs?" Other people chimed in: "The use of CRTs and gamma also restricted dynamic range to only 0.1~100 cd/m2. Nowadays people watch LCDs at 200 cd/m2 and over." So the decision was made to finally put the CRT's legacy to rest, and since Dolby was the strongest proponent of this change, HDMI and UHD BD accepted the PQ EOTF curve as a replacement for gamma. Unfortunately, broadcast stations couldn't afford such luxury, so some stations like NHK and the BBC decided to keep gamma for their OETF solution, which is called Hybrid Log-Gamma.
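For reference, the PQ curve mentioned here is the SMPTE ST 2084 EOTF, which maps a normalized 10/12-bit code value to absolute luminance up to 10000 cd/m2. A small sketch using the published constants (the function name is mine, not from any particular API):

```c
#include <math.h>

/* SMPTE ST 2084 (PQ) EOTF: maps a normalized code value N in [0,1]
 * to absolute luminance in cd/m2, with 10000 cd/m2 as the peak. */
double pq_eotf(double N)
{
    const double m1 = 2610.0 / 16384.0;          /* 0.1593017578125 */
    const double m2 = 2523.0 / 4096.0 * 128.0;   /* 78.84375        */
    const double c1 = 3424.0 / 4096.0;           /* 0.8359375       */
    const double c2 = 2413.0 / 4096.0 * 32.0;    /* 18.8515625      */
    const double c3 = 2392.0 / 4096.0 * 32.0;    /* 18.6875         */

    double p   = pow(N, 1.0 / m2);
    double num = fmax(p - c1, 0.0);
    return 10000.0 * pow(num / (c2 - c3 * p), 1.0 / m1);
}
```

Feeding it N = 1.0 returns 10000 cd/m2 and N = 0.0 returns 0, so the curve spans the full range the spec advertises.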

The best thing about the current PQ-based HDR system is that it's made to be very future-proof. First, it uses the largest base spec, which is 10,000 cd/m2. Then each year new metadata is released to improve tone mapping and add new features such as dynamic metadata or color volume.

So yeah. Each of these components alone wasn't enough to bring a revolution, but combined, they do. I used to be a resolution skeptic and didn't really care for higher grayscale or the DCI gamut in theaters, but combine them all together with HDR and that's what really brings the magic; some enthusiasts no longer call this "Ultra HD", instead opting for "Immersive Video". True, because when I first saw it on an OLED, for the first time I felt the sense of "being there", just like hi-fi audio.
 
There aren't really any normalized/denormalized numbers with that format (as it has no implicit leading one). Albeit of course you still start losing mantissa bits once you approach 2^-15. Can't see it as a viable output format though (the shared exponent makes it too problematic to be renderable). R11G11B10 has pretty much the same range in any case, though I suppose the accuracy isn't all that great.
With "normalized numbers" I meant colors where the largest RGB mantissa doesn't have any leading zeros. You can store more dynamic range by having leading zeros in all mantissas, but that loses precision.

I don't think R9G9B9E5 is hard to write. RGBA16f output only needs these changes: pick the largest channel and take its exponent as the shared exponent (both formats have 5-bit exponents). Write the implicit 1 to the mantissa high bit of that channel, and copy the 8 high bits of the 16f mantissa to fill the rest of that mantissa. No math so far, just copying bits. Then calculate N[0,1] = takenExponent - exponentOther[0,1], shift the other two mantissas down by N[0,1] and keep the high bits. Again only bit shifts and masks (= shuffling bits around).
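For anyone who wants to try it, here is a minimal CPU-side sketch of that packing (my own code; it uses float math instead of sebbi's pure bit-shuffling of the 16f bit pattern, and the bit layout with the shared exponent in the top 5 bits is an assumption):

```c
#include <math.h>
#include <stdint.h>

#define RGB9E5_EXP_BIAS       15
#define RGB9E5_MANTISSA_BITS  9
/* Largest representable value: (511/512) * 2^(31-15) */
#define RGB9E5_MAX            ((511.0f / 512.0f) * 65536.0f)

static float clamp_rgb9e5(float x)
{
    if (!(x > 0.0f)) return 0.0f;                 /* negatives and NaN -> 0 */
    return x > RGB9E5_MAX ? RGB9E5_MAX : x;
}

/* Pack three linear floats into one R9G9B9E5 word
 * (assumed layout: R in bits 0..8, G 9..17, B 18..26, E 27..31). */
uint32_t pack_rgb9e5(float r, float g, float b)
{
    r = clamp_rgb9e5(r);
    g = clamp_rgb9e5(g);
    b = clamp_rgb9e5(b);

    float maxc = fmaxf(r, fmaxf(g, b));
    if (maxc == 0.0f) return 0;

    /* Shared exponent comes from the largest channel. */
    int exp_shared = (int)floorf(log2f(maxc)) + 1 + RGB9E5_EXP_BIAS;
    if (exp_shared < 0) exp_shared = 0;

    /* Scale so mantissas land in [0, 511]. */
    float scale = exp2f((float)(RGB9E5_MANTISSA_BITS + RGB9E5_EXP_BIAS - exp_shared));

    /* Rounding of the largest channel can overflow to 512 -> bump the exponent. */
    if ((uint32_t)(maxc * scale + 0.5f) == (1u << RGB9E5_MANTISSA_BITS)) {
        exp_shared++;
        scale *= 0.5f;
    }

    uint32_t rm = (uint32_t)(r * scale + 0.5f);
    uint32_t gm = (uint32_t)(g * scale + 0.5f);
    uint32_t bm = (uint32_t)(b * scale + 0.5f);

    return rm | (gm << 9) | (bm << 18) | ((uint32_t)exp_shared << 27);
}
```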

Conversion from R9G9B9E5 -> 16f requires converting unnormalized numbers to normalized numbers. That requires bit scans (how many leading zeros). GPUs already support sampling R9G9B9E5, so this is already supported by the hardware. Blending of course could be problematic, as the R9G9B9E5 -> 16f conversion is slow. But there are other formats that do not support blending either, making this a moot point. We would only need to output to R9G9B9E5 once at the end of the pipeline -> HDR display.
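And the reverse direction for completeness (same assumed bit layout; on the GPU the texture sampler does this for you, as noted above):

```c
#include <math.h>
#include <stdint.h>

/* Unpack an R9G9B9E5 word back to three linear floats
 * (same assumed layout as in the pack sketch above:
 *  R 0..8, G 9..17, B 18..26, E 27..31; exponent bias 15). */
void unpack_rgb9e5(uint32_t v, float rgb[3])
{
    int   exp_shared = (int)(v >> 27);
    float scale = exp2f((float)(exp_shared - 15 - 9));  /* bias 15, 9 mantissa bits */

    rgb[0] = (float)( v        & 0x1FF) * scale;
    rgb[1] = (float)((v >> 9)  & 0x1FF) * scale;
    rgb[2] = (float)((v >> 18) & 0x1FF) * scale;
}
```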
 
Really really cool info KOF! Are you the AMD guy in the video? :D

I'd like to seek your expertise about these new standards... did the industry just come up with all this, Rec.2020, 12-bit PQ EOTF, literally..? I mean they are all just numbers, but behind them what we want to see are displays able to output the darkest blacks, the brightest whites and the most saturated hues. Why did it take them soooo long to come up with what seems like it will be the biggest leap in motion pictures since... motion pictures itself? Was the technology just not feasible back when we were migrating to Blu-ray/FHD 10 years ago?

But I remember back then we already had zone-backlit CCFL LCDs, my 5-year-old laptop's FirePro can output 10-bit, we had plasmas that could do 0.001 cd/m2 and also output billions of colors, Sony had Triluminos displays years back, heck some old Sony FHD TVs had 10-bit panels... all of these were available 5-10 years ago, so why did it take us so long to move forward? Was it a breakthrough in some field like QD that made this tech better? OLEDs still seem some years off..

How will these new displays measure up against the Kuro? Will the 10-bit Triluminos panels of yesteryear be able to display the same level of colour fidelity as the new ones? What was the industry doing for the last decade? These new standards look like they were feasible back then, so why only now?

Sorry, long post so I had to cut it in half and somehow I messed up

(continues)


I wouldn't truly call it immersive video without yet another crucial piece of the puzzle: OLED. You simply can't have a true-to-life reproduction with an LCD, local dimming or not. I'll explain why.

Have you ever had a moment where you heard very finely produced music on a high-end speaker, and when the guitarist plucked the strings you went "I swear I felt the string being plucked!"? I did, and it was an amazing experience. Audio is already at the "being there" level, and on a very good system you can feel the texture of the instrument despite only using your ears.

Same with our eyes. Our eyes are very complex pieces of optical machinery. There are two types of light-sensitive cells in the retina: cones and rods. Cones are further divided into three types, each receiving red, green or blue, so they handle color. Rods accept luma, so they handle black-and-white. However, there are roughly 120 million rods while there are only about 6 million cones, meaning our eyes are much more sensitive to changes in luminance than in color; that's why chroma subsampling is an accepted practice in broadcast/Blu-ray. Sebbi prefers more luminance bits for the very same reason.

Light travels to our eyes in this manner: light source -> surface of object -> observer. What we finally see is the light from the source reflected off the object. Any change in these three will change the final output. Light is a very complex beast: depending on the surface and medium it can reflect, refract, be absorbed, etc. And the surface matters a lot for these events as well. Is it uniform? Or rough? Glossy? Diffuse? Just as with audio, our eyes can tell these various surface textures apart with great precision without even touching them.

What are the two most famous devices for emulating the human optical system? The camera and the display. The camera acts as the observer in this case and accepts light (photons) during the exposure. Photons enter the CCD/CMOS sensor's photosites (a cavity array) and are counted by how many have fallen into each cavity. If numerous photons are collected in a cavity, it is recorded as a high luminance intensity; if there are only a few, a low luminance intensity is recorded. Between two pixels, if one has collected one photon and the other has collected 1000 photons, then the dynamic range between the two pixels is 1000:1.

For scenes with strong incident light and a uniform surface, the number of photons on each pixel should be fairly uniform. However, for surfaces with high variation in reflectivity, like rough rock, water waves, etc., the dynamic range between adjacent pixels will be much greater. And how much this dynamic range varies depends on the reflecting surface, of which there is enormous variety. This is how our roughly 120 million rods distinguish surface textures.

On the display side, OLEDs faithfully reproduce these variations with a 0.00001 cd/m2 black level (the lowest recorded). Samsung's top-of-the-line SUHD LCD, the JS9500, only manages 0.05 cd/m2 (at 120 cd/m2 backlight luminance), which means an OLED TV has at least 5000 times the precision when it comes to resolving luminance variation. However, with HDR content the average luminance has moved from 100~120 cd/m2 to 400 cd/m2, and LCDs, being transmissive, have to take a black-level hit. So the Samsung JS9500's black level is now 0.11 cd/m2 at 400 cd/m2, and that means 11,000 times less precision than the OLED TV. This is why, next to plasmas and OLEDs, LCDs look flat and have no depth: light is resolved in a flatter manner than it should be. I own a Panasonic plasma with a 0.001 cd/m2 black level and I'm still amazed by its life-like presentation. Despite being only 8-bit, I still like it better than the top-of-the-line HDR LCDs from Samsung and Sony. Some very well made Blu-ray movies have really good dynamic range, and in one movie I was looking at windows and was very surprised that I could actually feel the Fresnel effect (reflection and refraction occurring at the same time) in a really convincing manner. I've never seen anything like that on LCDs.

This behaviour is also why even local-dimming LCDs are ineffective at producing a convincing, life-like image. Local dimming only helps with painting black areas black; lit zones are subject to the panel's native contrast ratio and will still look flat, and this will get worse as brightness increases further towards 1000 cd/m2. Which is too bad, as real-life lava not only has great luminance but also high variation in reflectivity.

Pioneer was correct in saying black is the canvas for displays. A zero black level means no unwanted light is added to any pixel, while a higher black level robs saturation. This was painfully evident on Samsung's SUHD quantum-dot LCDs, as their 91% DCI coverage wasn't as saturated as I would have liked. The LG OLED, on the other hand, really brings saturation and transparency to the colors. The net result of OLED + 10-bit HDR + P3 gamut + 4K? The most harmonized picture I've ever seen in my life. All 8 million pixels, full of very colorful colors, moving smoothly! In one scene there were three water fountains, one in the back, one in the middle, one in the front, all spouting water at different times, and I could actually feel their distance! No 3D glasses needed!
 
Yes. Engine support is trivial. Everybody nowadays has an HDR (usually 16-bit float) pipeline from start to end. At the very end there is tonemapping + color grading. Color grading is usually done in LDR space (after tonemapping), so it needs to be changed, but that's pretty much it. Displays are a big problem right now. All artists and rendering programmers need (properly calibrated) HDR displays to author HDR content. If I understood correctly, all certified HDR displays will be 4K, so it will take time until reasonably priced models are widely available. HDR also adds testing costs, as it is one extra hardware configuration for the test matrix. Plus some testers also need new expensive displays.

I understand. Do publishers see a value proposition in adding HDR, though? PC game publishers might be more reluctant, as HDR is really about highlights, no consumer monitors are local-dimming variants, and the cost of making one is astronomical; I don't see monitor manufacturers bothering as their margins are already thin. It may gain traction on consoles, though, as Sony Pictures is one of the pioneers in making HDR possible, and Microsoft has already signed up with Dolby Vision (they recently sent their multimedia team to Dolby Vision events).
 
I don't think R9G9B9E5 is hard to write. RGBA16f output only needs these changes: pick the largest channel and take its exponent as the shared exponent (both formats have 5-bit exponents). Write the implicit 1 to the mantissa high bit of that channel, and copy the 8 high bits of the 16f mantissa to fill the rest of that mantissa. No math so far, just copying bits. Then calculate N[0,1] = takenExponent - exponentOther[0,1], shift the other two mantissas down by N[0,1] and keep the high bits. Again only bit shifts and masks (= shuffling bits around).
Thinking about this, you are right, it shouldn't be too much of a problem. The GL extension (GL_EXT_texture_shared_exponent) even mentions it could potentially be supported as a framebuffer format (by another extension). Haven't seen such an extension, though :).

Conversion from R9G9B9E5 -> 16f requires converting unnormalized numbers to normalized numbers. That requires bit scans (how many leading zeros). GPUs already support sampling R9G9B9E5, so this is already supported by the hardware.
Conversion shouldn't be much of a problem I think. Pretty easy to do in software too...
 
For mostly or completely hand-painted textures I completely agree: 8 bits per channel are enough. However, a lot of textures of "natural" things (e.g. plants) always feel kind of fake. Maybe it's just me, but it's like seeing an 8-bit JPEG-compressed photo on a common 8-bit TN screen vs a RAW 10/12/14-bit photo on a professional screen. 10-bit textures would be amazing, in my opinion, for skin too (I know it is actually mostly hand-painted!).

And I also recognize we are digressing too much from the OP :p

Well, JPEG isn't a very good compression method. BC7 or minimal compression ASTC might be enough for higher color gamut albedo textures. It'd be interesting to experiment a bit and see.

I understand. Do publishers see a value proposition in adding HDR, though? PC game publishers might be more reluctant, as HDR is really about highlights, no consumer monitors are local-dimming variants, and the cost of making one is astronomical; I don't see monitor manufacturers bothering as their margins are already thin. It may gain traction on consoles, though, as Sony Pictures is one of the pioneers in making HDR possible, and Microsoft has already signed up with Dolby Vision (they recently sent their multimedia team to Dolby Vision events).

Monitor manufacturers, TVs, movie theaters, phones: everything will support HDR if it sells. Screens are a super-high-competition industry, and as you pointed out, margins are razor thin as it is. Anything that slows sales could tip a manufacturer into bankruptcy, and if everyone is suddenly buying HDR screens and yours isn't one of them, you can shut down right quick. So really it'll be up to consumers whether they think the whole "HDR!" thing is worth it.

Personally I love it and will buy an "HDR!" monitor as soon as I find a good one I can afford. I've already got an Adobe RGB pro monitor and edit photos on it just because it looks better, even if no one else can really see the difference. And a much higher luminance range will be a godsend for certain types of art, photography included. I hate how much you have to squeeze the range of each photo; an overcast day usually needs to use up most of the available luminance range to get good contrast, same as a bright sunny day. Of course the same goes for games, movies, etc.
 
AMD more or less confirmed that Polaris (11?) will replace the R9 290X and will be cheaper than $349:
At 16:00 Roy Taylor defined the minimum specs for VR (R9 290X or GTX 970, both $349) and then described the new 14nm product which will replace them:
"We can now produce GPUs which will run at minimum specs of VR at lower cost, in larger volume, consuming less power and running faster" (18:00-18:15)
 
Aaaaww yeah, I called it first in this forum (I deserve extra internet points and self-pattings). The two Polaris chips are set to replace Bonaire+Pitcairn and Hawaii.
This is an excellent strategy IMO.

Lower cost might mean manufacturing cost, not necessarily selling price, though I hope it's both.

His point during the video is that the userbase for cards that meet or exceed the minimum spec for VR is still rather small, at 7.5 million users. So for VR to take off and increase the number of possible VR adopters, they need to come up with a cheaper replacement for cards in that performance range, which is where Polaris comes in.
In this case, I think he definitely means lower cost to the consumer (meaning lower than $350), otherwise his whole point wouldn't make sense.

 