Intel 22nm 3-D Tri-Gate Transistor Technology

http://newsroom.intel.com/docs/DOC-2032

Intel announces a major breakthrough and historic innovation in microchips: the world's first 3-D transistors in mass production

The transition to 3-D continues the pace of technology advancement, fueling Moore's Law for years to come.

An unprecedented combination of performance improvement and power reduction to enable new innovations across a range of future 22nm-based devices from the smallest handhelds to powerful cloud-based servers.

Intel demonstrates a 22nm microprocessor -- code-named Ivy Bridge -- that will be the first high-volume chip to use 3-D transistors.

http://www.youtube.com/watch?v=SB706hhCDZc

http://www.youtube.com/watch?v=YIkMaQJSyP8
 
The purported power scaling for the process seems to indicate a nice drop in voltage from 32nm to 22nm, enough for a halving of power at low voltages. That would be a bump over the 20-30% improvement that would have been expected otherwise.
I'm curious whether this is a one-time boost, similar to the one-time improvement in gate leakage delivered by HKMG. If it weren't for tri-gate, leakage would have worsened at 22nm.
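
As a back-of-the-envelope check (the ~0.7x capacitance scaling per node and the ~15% voltage drop below are my illustrative assumptions, not Intel's figures), the usual dynamic-power model shows why a modest voltage drop is enough to halve active power at fixed frequency:

% Standard dynamic-power model; the scaling factors are illustrative assumptions.
P_{dyn} = \alpha \, C \, V_{dd}^{2} \, f
\frac{P_{22}}{P_{32}} = \frac{C_{22}}{C_{32}} \left( \frac{V_{22}}{V_{32}} \right)^{2} \approx 0.7 \times (0.85)^{2} \approx 0.5

So roughly a 0.15V drop from ~1.0V, on top of typical per-node capacitance scaling, gets you to half the dynamic power; leakage is accounted separately.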

Semiaccurate claimed this was going to be applied to the caches first, and that the logic was still planar. Was this expounded upon?
(Edit: it makes sense, since one of the challenges was variability for the fins. The caches are highly regular, and Intel has already disclosed tuning transistors used to fine-tune cache banks in SB, which would help compensate for any difficulties in IB.)

Unless any of Intel's fab competitors shows something better than merely catching up to what Intel was doing two generations ago, it is going to lap the field again (again again).
 
FD-SOI is a realistic alternative to FinFETs... and I still think it's likely that that's what the rest of the industry will go with (going non-planar is a bigger departure from what the fabless companies are comfortable with than SOI, AFAICS).

I don't think this is the technology that will give Intel the edge as it scales downward; I think its patents in computational lithography will be.
 
Is there any indication that FD-SOI is going to make an appearance any time soon? I think the hope for Intel's competitors is FD-SOI at 15nm or below.
What are they going to do in the meantime?

Perhaps by the time FD-SOI is realized, Intel will be ready to move from FinFET to something new within 8 months.
 
3dilettante: In the Q&A, Mark Bohr said that they plan to use Tri-Gate technology on all of the transistor types, not just SRAM. It's also fully depleted, but not FD-SOI.
 
TSMC is definitely going FinFET on 14nm, but Intel beating them by 4 years and a full process node is very impressive - typical Intel, really, but still! :) As for FD-SOI, I wouldn't exclude the possibility that IBM/GF/Samsung goes down that road on 14nm. We'll see.

28nm at TSMC/UMC and IBM/GF/Samsung are already extremely different (gate-last vs gate-first, RDR vs not, etc.) whereas at 65nm things were quite similar for everyone, to facilitate multi-sourcing. It would certainly be interesting if things got even more different over time. And this news does make the 'Apple using Intel 22nm as a foundry service' theory even more interesting.
 
And this news does make the 'Apple using Intel 22nm as a foundry service' theory even more interesting.

I can't see how this makes sense for Intel at all, if they are pursuing it. If Atom needs to be on Intel's own process technology to maintain a lead, taking Apple on as a foundry customer would basically sound the death knell for the Atom line. They have also said that they don't want to go the route of a major foundry service, which makes me even more convinced that it's nothing more than a baseless rumor.
 
How will AMD survive this, seriously?
The latest Core i3 xxxK parts are already doing as well in most cases as AMD's new Phenom X4 980BE, while power consumption is not in the same ballpark... at all. From what we've heard about Bulldozer... it's not good at all.

EDIT

Do you think this news and this technology advance may have an impact on the timeline Intel has set (internally at least) for its Larrabee project, for example?
 
How will AMD survive this, seriously?
The latest Core i3 xxxK parts are already doing as well in most cases as AMD's new Phenom X4 980BE, while power consumption is not in the same ballpark... at all. From what we've heard about Bulldozer... it's not good at all.

EDIT

Do you think this news and this technology advance may have an impact on the timeline Intel has set (internally at least) for its Larrabee project, for example?

Charlie over at Semiaccurate has said Larrabee has gone back for a complete redesign. He was initially expecting a debut as the onboard graphics within Haswell but now does not think you will see it for another 3 - 4 years.
 
Charlie over at Semiaccurate has said Larrabee has gone back for a complete redesign. He was initially expecting a debut as the onboard graphics within Haswell but now does not think you will see it for another 3 - 4 years.
I remember that. I did a search, and Haswell is expected by H1 2013. From the info I gathered it's a "tock", i.e. a new architecture on an old process (by 2013, Intel's 22nm will be old...).
Charlie spoke about it almost a year ago, when Intel officially announced that they had canned Larrabee as it was and announced the "Knights xxx" parts and SCC. Intel may have made the decision earlier and started the redesign a while ago.
Anyway, if Charlie's right and Larrabee V3/V4 cores are included in Haswell (so 2013), it may be an option for systems launching by 2014. I believe costs will prevent manufacturers from doing so, though.
 
How will AMD survive this, seriously?

Hm?
So Intel will have the upper hand in CPUs for the next few years. Big deal... they've had it since the C2D came out.


And how is Intel going to survive the APUs?
They carry a lot more computing potential than Intel's competing parts.
With the rise of OpenCL and DX Compute Shaders, how is Intel going to compete against TFLOPS-capable APUs?
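
(To make the OpenCL point concrete, here is a minimal pyopencl sketch of the kind of data-parallel work that gets offloaded to an APU's integrated GPU; the array size and kernel are illustrative, nothing vendor-specific.)

import numpy as np
import pyopencl as cl

# Pick whatever OpenCL device is available (on an APU, the integrated GPU).
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# Illustrative workload: element-wise add over a million floats.
a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# One work-item per element: the wide, regular parallelism GPUs are built for.
prg = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)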

AMD has admittedly gone a different way, and in the consumer market, it's certainly a lot better IMO.



Besides, this breakthrough should only really show its true advantages in handhelds. An Atom built on 22nm Tri-Gate should finally be able to put down any ARM solution in the pipeline so far.
Let's just hope they don't do it too late, as they did with 32nm.
 
Honestly, AMD has been surviving while losing a bunch of money, selling its fabs, etc. They seemed to have recovered in the last few quarters, but with all that's happened it looks like they will possibly end up in a worse situation than they were. I'm waiting to see how Llano performs. Better than SNB for graphics is a given, but Intel is catching up. In low-power CPUs (not mobile, but laptop) it would not surprise me in the slightest if Intel beat the crippled Llano SKUs (due to their poor power characteristics vs Intel's CPUs).
 
Cool stuff. When I read "Tri-Gate", though, I assumed they had come up with some way to give a switch 3 values, so it's not quite as cool as I'd first envisioned :D

Still, definitely a great step forward. Ivy Bridge is the successor to Sandy Bridge, right? If so, it looks like I'm holding out for that! I guess I can expect significantly higher clock speeds and/or reduced power usage?
 
How will AMD survive this, seriously?

After Windows 8, Intel won't have it better either. For the hundreds of millions of average Windows users who use the computer for internet, multimedia or basic work, an ARM desktop could be a good alternative, without losing anything from the Windows experience. On top of that, much better competition without a monopoly could drive prices down (no more AMD chasing Intel :p).

It's even questionable whether AMD needs to focus on Intel.
 
After Windows 8, Intel won't have it better either. For the hundreds of millions of average Windows users who use the computer for internet, multimedia or basic work, an ARM desktop could be a good alternative, without losing anything from the Windows experience. On top of that, much better competition without a monopoly could drive prices down (no more AMD chasing Intel :p).

It's even questionable whether AMD needs to focus on Intel.

That's not really any different from the current situation. Intel will still be untouchable from a performance point of view; there will just be more competition in the low-cost/low-performance arena. If anything, it's AMD that stands to lose more from Windows running on ARM.
 
How will AMD survive this, seriously?
The latest Core i3 xxxK parts are already doing as well in most cases as AMD's new Phenom X4 980BE, while power consumption is not in the same ballpark... at all. From what we've heard about Bulldozer... it's not good at all.

EDIT

Do you think this news and this technology advance may have an impact on the timeline Intel has set (internally at least) for its Larrabee project, for example?

IMO AMD cannot fail, because of monopoly issues. It's like Apple and Microsoft in the bygone era. AMD will stick around somehow for the near future regardless of its competitive ability.
 
AMD is still great for the cheap desktop: the Athlon X2 runs cool enough, is full-featured, fits in cheap motherboards and is like a midrange Core 2 Duo with an upgrade path.

Soon the motherboards will be compatible with Bulldozer and will feature an IOMMU; then the lowest-end Llano, and later a "Bulldozer Llano", will fill the cheap high-performance role.

ARM CPUs are still years away from that two-year-old competition. They will gain traction; I hope to see them on micro-ATX boards with PCIe slots, UEFI, DIMM slots and all that.
 
So how much more advantage does this offer over HKMG? In the high-voltage/high-performance arena it doesn't seem to provide the same extra boost HKMG did. So GF is getting HKMG for the first time and Intel is getting FinFETs; it seems that for the majority of the x64 market there's not a huge amount changing. Intel is just continuing to increase its lead, but it's a slow, consistent march.
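
(A rough way to see where tri-gate helps, using the standard textbook subthreshold model; this is my framing, not anything Intel disclosed. HKMG attacked gate-oxide leakage, while a fully depleted tri-gate fin instead improves electrostatic control of the channel, i.e. the subthreshold swing:

% Textbook subthreshold model; numbers are illustrative, not Intel's data.
I_{sub} \propto e^{(V_{GS} - V_{th}) / (n V_T)}, \qquad S = n \, V_T \ln 10 \approx n \times 60\ \text{mV/decade}
% Tri-gate pushes the body factor n toward 1, i.e. S toward the ~60 mV/dec ideal,
% so the same I_off is reachable at a lower V_th, which is what lets V_dd drop.

That also suggests it's another one-time structural fix, like HKMG was for gate leakage, rather than a repeatable per-node boost.)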
 