AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
AMD told us that they are seeing a real-world density increase like one would expect going from 28 nm to 20 nm, due to FinFET making transistors bulkier than the planar technique does.
 
AMD told us that they are seeing a real-world density increase like one would expect going from 28 nm to 20 nm, due to FinFET making transistors bulkier than the planar technique does.
That makes no sense. The compromise in density largely lies in the modest metal layer scaling. Either the AMD you talked to wasn't really into the finer details of lithography, or he/she assumed that you weren't.
 
That makes no sense. The compromise in density largely lies in the modest metal layer scaling. Either the AMD you talked to wasn't really into the finer details of lithography, or he/she assumed that you weren't.
Probably the latter, yeah.

To expand on that: what he specifically said was that they won't get as many transistors as they normally would, because they'd cost a little more. There would be differences between their rather T-shaped transistors (in the metal stack) and Intel's V-shaped ones, because theirs wouldn't be as dense. They would stay thin through the metal layers until they reached the redistribution layer. AMD would thus get higher density and better power efficiency in exchange for a little peak frequency. From what was said, he was comparing their FinFET technique to Intel's 22 nm.
 
Maybe the truth is that AMD won't be as aggressive in squeezing out transistor density on FinFET as they were with their latest 28 nm GPUs, namely Fiji and Tonga.
They now have a lot more breathing room to focus on reducing leakage currents and perhaps on longer pipelines for higher clocks.
 
Reducing leakage is one of the things FinFETs do best.
The change in the ratio between static and dynamic consumption could figure into the "designed for FinFET" claim, since implementation and power management would be adapted to a different environment from 28nm.
 
Yes, I want one of those. That decides my next monitor purchase! I won't be upgrading until I can get an HDR display in 21:9 curved format at around 3K res in the 30+ inch range.
 
Yes, I want one of those. That decides my next monitor purchase! I won't be upgrading until I can get an HDR display in 21:9 curved format at around 3K res in the 30+ inch range.

Make sure to wait until you can get a full BT 2020/12-bit monitor and not one of these bullshit P3/10-bit things that "officially" meet the new UHD group standard. The standards were lowered mostly because a few screen makers pushed this stuff out as early as possible to make a quick buck, and they now don't want their inferior products shut out of the new standards (even though they should be).

Fortunately the standard also requires content to be mastered in BT 2020, so getting a screen that supports it is still a good idea.
 
I'm way more interested in HDR and Rec.2020 than I am in 4K, but how long is it going to be before there are panels that can actually display that full colour space? How useful is this feature on these video cards going to be? Seems more like a future-proofing type of feature.
 
Yes, I want one of those. That decides my next monitor purchase! I won't be upgrading until I can get an HDR display in 21:9 curved format at around 3K res in the 30+ inch range.

+ Adaptive Sync.
 
Until my monitors decide to die, I think I will wait for those types of monitors too. I'd like to see a comparison with a PLS (or IPS) monitor, as I own one... I don't know if you have already seen one, but the effect is a bit similar to comparing them side by side with a standard LCD panel.
 
I'm way more interested in HDR and Rec.2020 than I am in 4K, but how long is it going to be before there are panels that can actually display that full colour space? How useful is this feature on these video cards going to be? Seems more like a future-proofing type of feature.

Rec.2020 is more of a container now than a ready-to-use color gamut, so for the moment we're going to use its sub-container, which is DCI-P3. In fact, the whole UHD TV specification revolves around this container approach.

1. The maximum resolution for UHD is 8K, but we also have 4K.
2. The maximum frame rate specified is 240 Hz, but we're also using 60 Hz and 120 Hz (native input will be allowed over HDMI, starting with version 2.1).
3. The maximum color gamut is Rec.2020, but we're using P3 for now.
4. The maximum color depth, along with the photopic dynamic range, is 12-bit color at 10,000 cd/m2 (the maximum luminance of the PQ EOTF curve), but obviously that's way too far into the future, so we've settled on 10-bit @ 1000 cd/m2 (HDR10) for now, with current Dolby Vision content being graded at 12-bit @ 4000 cd/m2 (limited by Dolby's mastering monitor, not by PQ).
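For anyone curious how those luminance numbers in point 4 relate to actual signal values, the PQ (SMPTE ST 2084) transfer function is easy to play with. A minimal Python sketch, using the constants from the public ST 2084 spec (the function names here are just mine):

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized 0..1 code value to
# absolute luminance in cd/m2, peaking at exactly 10,000 cd/m2.
# Constants are the exact rationals given in the ST 2084 specification.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_eotf(v: float) -> float:
    """Decode a normalized PQ signal value v (0..1) to luminance in cd/m2."""
    vp = v ** (1 / M2)
    num = max(vp - C1, 0.0)
    den = C2 - C3 * vp
    return 10000.0 * (num / den) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Encode luminance in cd/m2 (0..10000) back to a normalized PQ value."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(round(pq_eotf(1.0)))        # full signal = 10000 cd/m2
print(pq_inverse_eotf(1000.0))    # a 1000-nit grade uses only ~3/4 of the code range
```

That last line is why HDR10's 1000 cd/m2 grades leave headroom in the 10-bit signal: roughly the top quarter of PQ code values is reserved for luminance levels no current consumer panel can reach.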

No consumer TVs and monitors have 100% coverage for DCI-P3 yet, let alone Rec.2020. The closest is the 2016 LG G6 Signature OLED TV, which is claimed to have 99% DCI coverage. Laser projectors currently come closest to Rec.2020, at 98%. The next best was supposed to be quantum dots, but the removal of cadmium has seriously hampered their ability to cover a wider color gamut. And while some of Sony's Hollywood-grade cameras (F35, F65) capture an even wider colorspace than Rec.2020 (S-Gamut), until professional displays catch up, Rec.2020 support will remain limited even among Hollywood movie studios. P3 will be with us for a very long time.
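As a rough sanity check on those coverage figures, you can compare the gamut triangles yourself. A sketch using the published CIE 1931 xy primaries of each standard and plain triangle areas (a crude metric; marketing "coverage" numbers are often computed in other spaces, but the ordering holds):

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries,
# as published in BT.709, DCI-P3 (SMPTE RP 431-2) and BT.2020.
REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(primaries):
    """Area of the gamut triangle in xy space (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

for name, prim in [("Rec.709", REC_709), ("DCI-P3", DCI_P3), ("Rec.2020", REC_2020)]:
    pct = 100 * gamut_area(prim) / gamut_area(REC_2020)
    print(f"{name}: {pct:.0f}% of Rec.2020 xy area")
```

By this measure P3 sits a bit over 70% of the Rec.2020 triangle, which is why it works as a practical intermediate step inside the bigger container.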

HDR and WCG will be very useful in videogames. I've been waiting for HDR games for a long time, and when I saw HDR10 & P3 on an HDR-capable LG OLED TV, it was very sick. P3 colors were really popping out... kind of like anime... only better, because anime never had this kind of color palette before. And with that contrast ratio, scenes really do come alive now. You can practically feel the sun shining in a gradual sweep, fire crackling, neon lights blaring up and down. In fact, after seeing the HDR OLED, I was driving home, saw flashing car lights in the dark, and thought, "Wow, that's HDR."
 
No consumer TVs and monitors have 100% coverage for DCI-P3 yet, let alone Rec.2020

Sorry, just a minor correction: the Vizio Reference does support 120% of P3, my bad! It was released at just the right time, though, before the cadmium restrictions became so severe.
 
Good point, and yes, for sure.

I think we can assume that any monitor supporting DP 1.3 will be Adaptive-Sync capable (even if it isn't enabled)? Unless a manufacturer decides to stay on DP 1.2, I don't see any reason to believe that new monitors (with DP 1.3) won't be FreeSync/Adaptive-Sync capable.

Today there are already 25 monitors on the market that support FreeSync, plus some TVs. That doesn't count the monitors announced but not yet released, and it's already three times as many monitors as G-Sync has.

With the adoption of DP 1.3, the number of capable monitors will grow a lot faster.
 
Pretty pretty pixels.


Wow...this man...he is awesome!
He articulates well, knows his stuff, and is just very passionate about display quality. Yes, it is time for better pixels!

I am sick of waiting for widespread 10-bit color support in software, and now we get HDR as a bonus, which means monitors are no longer going to be just thin frames with shitty picture quality!

Thank you, AMD, for pushing image quality. The Dell 4K OLED is sick, but at $5K it is TOO sick for me...

I hope AMD sits down with the industry players and works on making really good, affordable 27-35" VA monitors that are compatible with these new display standards!

I am excited for 2016! Everything needs to go full 10-bit and back-lit HDR goodness!
 
@KOF So AMD's choice of 10-bit will probably be a good compromise for quite a while, and the P3 colour gamut, which I was not aware of, should be available on displays sooner rather than later. Thanks for the summary.
 