Nanotechnology!!

Oushi

Newcomer
I have a question: what will happen in the future when the industry hits the 32nm process and goes beyond it?
Will we reach a point where we can't make a smaller process? What will happen then?
Are there other technologies that could solve this issue?
 
I wouldn't be so sure about that. Things can and will be done, but from 32nm downwards they need to start doing things quite differently.

There are some experimental propositions, but nothing that would be useable to simply shrink a current design down.
 
You can create transistors and charge carriers right down to the molecular level. The problem is failure density/rate; traditional redundancy methods will probably be insufficient to deal with the level of failure if they go that small ... I think all ICs will become forms of FPGAs at that point, so you can route around failures (possibly even dynamically).
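The route-around idea can be sketched in a few lines: treat the die as a grid of logic cells, mark the failed ones, and search for a signal path that avoids them. This is only an illustrative model under my own assumptions (the grid, the failure map, and the breadth-first search are stand-ins, not how any real FPGA router works):

```python
from collections import deque

def route_around(failed, start, goal):
    """Breadth-first search for a path from start to goal that
    avoids cells marked as failed (True). Returns the path or None."""
    rows, cols = len(failed), len(failed[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and not failed[nr][nc] and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# 3x3 fabric with one dead cell in the centre
failed = [[False, False, False],
          [False, True,  False],
          [False, False, False]]
path = route_around(failed, (0, 0), (2, 2))
print(path)  # a shortest path that detours around (1, 1)
```

The same search could be rerun at power-up against an updated failure map, which is the "possibly even dynamically" part.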
 
You can create transistors and charge carriers right down to the molecular level.
Yes, but there is quite a difference between "growing" molecules that have certain characteristics and simply depositing, doping and etching away material. And using a long string of molecules to pass single electrons around is quite different from using wires.

The problem is failure density/rate, traditional redundancy methods will probably be insufficient to deal with the level of failure if they go that small ...
Not only that, but they have to stop thinking in terms of conductors and insulators, and start building 3D models of the (mostly electromagnetic) energy potentials at each state, because those will predominantly determine what paths the electrons travel.

I think all ICs will become forms of FPGAs at that point so you can route around failures (possibly even dynamically).
In that case, it might not make a lot of sense to downscale if your die becomes bigger and slower than the previous one.
 
You can create transistors and charge carriers right down to the molecular level. The problem is failure density/rate, traditional redundancy methods will probably be insufficient to deal with the level of failure if they go that small ...
Agreed, and maybe we will see Moore's law follow an S-curve while we adapt through a technology transition, sometime around 2025.

I think all ICs will become forms of FPGAs at that point so you can route around failures (possibly even dynamically).
Agreed. For most applications, yes, especially because some system design and fabrication today is already heading in that direction. But it has to be done with fast CLBs.
 
In that case, it might not make a lot of sense to downscale, when your die becomes bigger and slower than the previous one.
Not to start with, but in the end the difference in scale between the smallest functional FinFET and a single-molecule transistor can overcome quite a bit of architectural inefficiency.
 
Not to start with, but in the end the difference in scale between the smallest functional FinFET and a single-molecule transistor can overcome quite a bit of architectural inefficiency.
Now we only have to find out how to grow those molecules. ;)
 
Hm... I think we may see such technologies applied to memory devices first - ordered arrays are just so much easier than complex processing logic, and in thin films it's relatively trivial to achieve ordered nanoporous arrays. How you'd turn those into cells, I don't know...
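Memory is also where tolerating failed elements is a solved pattern: rows that fail manufacturing test get remapped to spare rows, so the part still presents a fully working address space. A toy sketch of that remapping (the class and its methods are hypothetical, purely to illustrate the idea):

```python
class RedundantArray:
    """Toy model of row redundancy in a memory array: failed rows
    are transparently remapped to spare rows behind the address map."""

    def __init__(self, rows, cols, spares):
        self.storage = [[0] * cols for _ in range(rows + spares)]
        self.remap = {}                # failed row -> spare row
        self.next_spare = rows
        self.limit = rows + spares

    def mark_failed(self, row):
        """Record a row as bad and assign it the next spare."""
        if self.next_spare >= self.limit:
            raise RuntimeError("out of spare rows")
        self.remap[row] = self.next_spare
        self.next_spare += 1

    def _row(self, row):
        return self.remap.get(row, row)

    def write(self, row, col, value):
        self.storage[self._row(row)][col] = value

    def read(self, row, col):
        return self.storage[self._row(row)][col]

arr = RedundantArray(rows=4, cols=8, spares=2)
arr.mark_failed(2)             # row 2 failed test; remapped to a spare
arr.write(2, 3, 0xAB)          # caller never sees the remapping
print(hex(arr.read(2, 3)))     # 0xab
```

With failure rates as high as molecular-scale devices would have, the spare pool just has to get a lot bigger.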
 
~32 times as many transistors as at 90nm, and ~5 times the speed.

Are you ready for a 32-teraflops PS5? :smile:

At 90nm we have 680 million transistors; that should be about 21 billion transistors at 16nm.
It's a big number; it will be a hard time cooling it down :devilish: . I hope that by then we can play games that look like real life.
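The figures above follow from simple idealized scaling: transistor density goes with the square of the linear shrink, and speed roughly linearly with it. A quick check (the 680-million figure is the one quoted above; real processes don't scale this cleanly):

```python
# Idealized scaling from a 90nm node to a 16nm node.
old_node, new_node = 90.0, 16.0

density_gain = (old_node / new_node) ** 2   # area shrink -> density
speed_gain = old_node / new_node            # rough gate-length scaling

transistors_90nm = 680e6                    # quoted 90nm transistor count
transistors_16nm = transistors_90nm * density_gain

print(round(density_gain, 1))            # 31.6, i.e. the "~32 times"
print(round(speed_gain, 1))              # 5.6, i.e. the "~5 times"
print(round(transistors_16nm / 1e9, 1))  # 21.5, i.e. "about 21 billion"
```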
 
Not only that, but they have to stop thinking in terms of conductors and insulators, and start building 3D models of the (mostly electromagnetic) energy potentials at each state, because those will predominantly determine what paths the electrons travel.

The problem that arises is that the design methodology used now is based on voltages and currents instead of energy potentials. So this type of modeling implies that most of the theory in use now can be thrown in the bin. On the other hand, it is quite a big problem to fit nano-devices (such as SETs/RTDs) into classical theory, never mind more complex devices. What a dilemma... :cry:
 
A silicon atom is 111 picometers across. Even if they all (somehow) bond together in a straight line, that's 144 silicon atoms in a 16nm space. If the silicon atoms are just sitting in a crystal, it's more like 75 silicon atoms per 16nm.

We're already beginning to approach the molecular level; there's not much more you can shrink once you're down to single molecules (unless someone invents quark transistors :p).
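The arithmetic above is easy to reproduce (taking the quoted 111 pm figure as the per-atom spacing in the best case):

```python
# Back-of-the-envelope: how many silicon atoms span a 16nm feature?
si_atom_pm = 111.0          # quoted size of a silicon atom, in picometers
feature_pm = 16 * 1000.0    # 16 nm expressed in picometers

atoms_in_line = feature_pm / si_atom_pm
print(round(atoms_in_line))  # 144, the straight-line best case above
```

The "more like 75" figure for atoms sitting in a real crystal follows the same division, just with the larger effective spacing the lattice imposes.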
 