AMD: Speculation, Rumors, and Discussion (Archive)

So, that Polaris chip that was shown off is smaller than Cape Verde and roughly the speed of the console GPUs (so around Bonaire or higher) with low power consumption. What I take away from this is that the next generation is raising the minimum bar for graphics quite significantly. That's fairly exciting for a budget-level discrete graphics card.

I look forward to potentially putting this into my HTPC to replace my aging 5450, depending on what the comparable product from Nvidia does.

Regards,
SB
 
@KOF So AMD's choice of 10bit will probably be a good compromise for quite a while, and the P3 colour gamut, which I was not aware of, should be available on displays sooner rather than later. Thanks for the summary.

Oh yes! 10bit/1000 cd/m2/P3 is actually the baseline standard for UHD Blu-ray and HDMI 2.0a, so AMD is simply following the already established SMPTE standard, which is a good thing.

Improvements to P3 color will come in two forms. The first is enlarging the color gamut from P3 to Rec.2020, which most people are familiar with. There's still more room for improvement beyond Rec.2020, and that's where color volume (AKA 3D color gamut) comes in.
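For a rough sense of how much ground that first step covers in 2D terms, here's a quick Python sketch comparing the chromaticity triangle areas of Rec.709, DCI-P3 and Rec.2020 using their published CIE 1931 xy primaries. Triangle area in the xy plane is only a crude comparison metric, but it gets the relative sizes across:

```python
# Rough illustration: 2D (xy chromaticity) gamut triangle areas for
# Rec.709, DCI-P3 and Rec.2020, using their published primaries.
# This only captures the flat "color gamut" picture; the color-volume
# (3D) discussion adds luminance as a third axis.

def triangle_area(p):
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

primaries = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = triangle_area(primaries["Rec.709"])
for name, prim in primaries.items():
    area = triangle_area(prim)
    print(f"{name:9s} area = {area:.4f}  ({area / ref:.2f}x Rec.709)")
```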



Simply enlarging the color primaries at the color gamut level (2D) isn't enough. We also need to factor in dynamic range/contrast ratio to truly make colors independent. Without a 3D gamut, colors veering towards very bright whites (which will be common in HDR) get absorbed by that white, so saturation is lost. The same is true for black. HDR also improves black level (current 8bit SDR source = 0.1 cd/m2, 10bit HDR source = 0.01 cd/m2, 12bit Dolby Vision = 0.005 cd/m2), so colors can get absorbed by black as well. Employing a 3D color gamut allows colors to keep their saturation from the blackest blacks to the whitest whites. Currently, only Dolby Vision supports a 3D gamut, but the HDR10 baseline standard will also get it starting with HDMI 2.1. Obviously, for Rec.2020, 12bit color depth is more ideal than 10bit, because to get 100% 3D gamut coverage of Rec.2020, Dolby argues you really need 10000 cd/m2 of dynamic range, which can only be provided by the 12bit PQ EOTF.
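To put some toy numbers on the "colors get absorbed by white" point, the sketch below estimates the most saturated red a linear-light RGB display can show at a given highlight luminance. The Rec.709 luminance weights and the HSV-style saturation measure are illustrative assumptions only, not part of any standard's definition of color volume:

```python
# Toy illustration of why bright highlights lose saturation on a
# low-peak-luminance display. Assumes a linear-light RGB display,
# Rec.709 luminance weights, and HSV-style saturation (1 - min/max).

R_W, G_W, B_W = 0.2126, 0.7152, 0.0722  # Rec.709 luminance weights

def max_red_saturation(target_nits, display_peak_nits):
    """Max HSV-style saturation of a red-hued color at the given luminance."""
    y = min(target_nits / display_peak_nits, 1.0)  # relative luminance (clipped)
    if y <= R_W:
        return 1.0              # pure red alone can reach this luminance
    # Beyond that, green and blue must be raised, desaturating towards white:
    # luminance of (1, g, g) is R_W + (G_W + B_W) * g
    g = (y - R_W) / (G_W + B_W)
    return 1.0 - g              # HSV-style saturation of (1, g, g)

for peak in (100, 1000, 10000):  # SDR-ish, HDR10-ish, Dolby Vision target
    sat = max_red_saturation(500, peak)  # a 500-nit red highlight
    print(f"{peak:5d} cd/m2 peak: max saturation of a 500-nit red = {sat:.2f}")
```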

Speaking of the PQ (Perceptual Quantizer) EOTF (Electro-Optical Transfer Function), this is perhaps the most important factor in making HDR a reality, because the current CRT-based gamma (which is now more than 70 years old) is very inefficient at increasing dynamic range. 8bit's 256 steps can only provide a 1000:1 (0.1~100 cd/m2) dynamic range, and simply increasing the color depth to 10bit (1024 steps) would have only increased the dynamic range by 4 times, 12bit (4096 steps) by another 4 times, and so on. Think of the 12bit PQ EOTF as a lossy compression that squeezes 15~16bit, 20 f-stops of dynamic range down to 12bit. A Hollywood colorist named Joe Kane also argues you pretty much need 16bit color depth to use the entire Rec.2020 space, which is why the 12bit PQ curve is pretty much required for full Rec.2020 coverage. If we had stayed with gamma (CRT), scaling the data infrastructure up to 16bit would have been a colossal challenge.
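For reference, here is the PQ curve itself as a small Python sketch, using the constants published in SMPTE ST 2084. The loop at the end shows how the 10-bit code values are spent across the 0.005~10,000 cd/m2 range, most of them on the darker end:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear signal value in [0, 1] to an
# absolute luminance in cd/m2, with a fixed 0..10,000 cd/m2 range.
# Constants are the ones published in ST 2084.

M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875
PEAK = 10000.0               # cd/m2

def pq_eotf(signal):
    """Nonlinear PQ signal [0,1] -> luminance in cd/m2."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return PEAK * y

def pq_inverse_eotf(nits):
    """Luminance in cd/m2 -> nonlinear PQ signal [0,1]."""
    y = (nits / PEAK) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Most of the 10-bit code values land below a few hundred nits, which is
# why PQ behaves like a perceptual "compression" of a much wider linear range.
for nits in (0.005, 0.01, 0.1, 100, 1000, 10000):
    code = round(pq_inverse_eotf(nits) * 1023)
    print(f"{nits:>8} cd/m2 -> 10-bit code {code}")
```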

AMD is going to officially support the 10bit PQ EOTF, so that's a good thing, but obviously game creators now need to utilize the PQ EOTF before we start seeing HDR games being released. We all know today's game engines have no problem handling dynamic range in excess of 20 f-stops, but they have deliberately tone mapped it down to 8bit standard dynamic range because the monitors the majority of artists currently use are... 8bit SDR. I asked one console developer if it's possible to convert Xbox360/PS3-era games into HDR by intercepting the tone mapping call, and he said that would be extremely challenging, as the gamma and PQ EOTF curves are simply not compatible. (The artists have already done their work in tone-mapped, gamma-based graphics.)
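As a rough sketch of the difference between the two output paths (tone-mapped gamma SDR vs. PQ-encoded HDR10), here's a minimal Python example. The Reinhard operator and the 200-nit paper white / 1000-nit peak figures are placeholder assumptions, not any particular engine's or console's pipeline:

```python
# Minimal sketch contrasting the two output paths for a single
# scene-referred linear value.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128            # ST 2084 constants
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Luminance in cd/m2 -> nonlinear PQ signal [0,1] (inverse EOTF)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def sdr_output(scene_linear, gamma=2.2):
    """SDR path: tone map into [0,1], apply display gamma -> 8-bit code."""
    tonemapped = scene_linear / (1.0 + scene_linear)    # simple Reinhard
    return round((tonemapped ** (1 / gamma)) * 255)

def hdr10_output(scene_linear, paper_white=200.0, peak=1000.0):
    """HDR10 path: map to absolute nits, PQ-encode -> 10-bit code."""
    nits = min(scene_linear * paper_white, peak)        # naive clip at display peak
    return round(pq_encode(nits) * 1023)

for v in (0.18, 1.0, 4.0, 16.0):   # mid grey, diffuse white, two highlights
    print(f"scene {v:5.2f}: SDR 8-bit {sdr_output(v):3d} | HDR10 10-bit {hdr10_output(v):4d}")
```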
 
Really really cool info KOF! Are you the AMD guy in the video? :D

I'd like to seek your expertise about these new standards... did the industry just come up with all this Rec.2020, 12bit PQ EOTF stuff, literally..? I mean, they are all just numbers, but behind them what we want to see are displays able to output the darkest blacks, the brightest whites and the most saturated hues. Why did it take them soooo long to come up with what seems like it will be the biggest leap in motion pictures since... motion pictures itself? Was the technology just not feasible back when we were migrating to BR/FHD 10 years ago?

But I remember back then we already had zone-backlit CCFL LCDs, my 5-year-old laptop FirePro could output 10-bit, we had plasmas that could do 0.001 cd/m2 and also output billions of colors, Sony had Triluminos displays years back, heck, some old Sony FHD TVs had 10-bit panels... all of these were available 5-10 years ago, so why did it take us so long to move forward? Was it a breakthrough in some field like QD that made this tech better? OLEDs still seem some years off...

How will these new displays measure up against the Kuro? Will the 10-bit Triluminos panels of yesteryear be able to display the same level of colour fidelity as the new ones? What was the industry doing for the last decade? These new standards look like they were feasible back then, so why only now?
 
That entry is for the GL44 Hawaii variant (FirePro W9100), thus the 16GB of GDDR5 and 6 mDP ports noted.

That makes sense (though that would probably cost a fair bit more than a 290X, for comparison's sake).

Any idea on how the poster teased out that the other two entries belonged to Polaris?
 
Kind of just by hoping? The dates potentially seem to match, as does the amount of money (which translates to roughly $716). Pretty thin evidence, but it might match up to some $650-700 type high-end consumer card. Not that this is exactly surprising, considering the Fury X and 980 Ti launched at about that price last year and inflation has been pretty low.
 
Are we confident that it's even an AMD GPU? I'm not familiar with the site - would it have Nvidia boards as well?

Also, would there be any way to tell that this isn't Gemini? Perhaps it's a little cheap for Gemini, but anything else?
 
It did look like AMD was holding back the chip displayed at CES till mid-2016, with bigger dies being on the way as well.

Fury samples were flying around by November 2014; hopefully the gap to release isn't that big this time.
 
Half a year is standard time.
 
HDR monitors? 100% sRGB first on entry-level TNs, please. Moreover, we need 10-bit support in the compositor for presentation to the desktop; the actual buffers are always represented as 8-bit.
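A quick way to see that 8-bit compositor bottleneck in numbers: quantize a smooth ramp to 8 and to 10 bits and count the distinct levels that survive (just a toy numpy illustration):

```python
# Quantizing a smooth gradient to 8 bits leaves far fewer distinct steps
# (visible banding) than quantizing the same gradient to 10 bits.
import numpy as np

gradient = np.linspace(0.0, 1.0, 4096)          # a smooth ramp

steps_8bit = np.unique(np.round(gradient * 255)).size
steps_10bit = np.unique(np.round(gradient * 1023)).size

print(f"distinct levels across the ramp: 8-bit = {steps_8bit}, 10-bit = {steps_10bit}")
# Even if the application renders in 10-bit, an 8-bit desktop compositor
# collapses the ramp back to 256 levels before it reaches the display.
```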
 
An engineering sample of a high-end Polaris GPU is apparently up and running and was shown off at CES behind closed doors: http://wccftech.com/amd-shows-enthusiast-polaris-ces/

Seems like AMD is being really cagey about what exactly their GPU specs are this time around, to the point of not even wanting to show it off in public.

Not really out of the ordinary with some of the first silicon they get back from the fabs.

With the news of Nano dropping to $499USD, I think that hints at what performance range they are probably aiming at.
 
Unless I'm missing something, neither the WTFtech article nor the Tweaktown source article it is using mentions anything about a high-end Polaris GPU being "up and running".

All the TT writer mentions is seeing an undetermined AMD next-gen GPU/card, but being completely unable to find out any information about it - something any self-respecting enthusiast could likely have bettered (a GPU size approximation if an actual GPU was shown, card size and cooling arrangement if covered, output arrangement, etc.). If it were a black-box demonstration, what are the chances that AMD was indulging in its favoured strategy of allowing the press to jump to conclusions? Show a high-end card, then show a demo stating that a Polaris-architecture card is running it, and allow the writer to fill in the blanks via assumption?
 
Well, we just need to ask him... It's not the first time AMD has shown things to a limited audience behind closed doors at CES or other trade shows.
 