Future complexities of chips and the side effects

arjan de lumens said:
• All-optical circuits: Look here for an introduction to optical circuits and how they actually work. This technology hasn't come very far, with basic AND/OR/XOR/NOT gates being made to work only very recently. So far, it seems that gates must be made physically very large (to the point where you can see the individual gates) to work correctly, so it is unknown if this technology can ever reach the degrees of integration that silicon chips have reached.
I'd just like to make a quick comment here on optical circuits. The size of the circuit would pretty much be determined by the wavelength of the light (the circuit must be much larger than the wavelength). This means that if smaller optical circuits are to be designed, shorter wavelength light must be used.

The potential problem here is that the shorter the wavelength of the light, the greater the chance that the light will damage the substance that makes up the optical circuit, which will likely impose a circuit size limitation.
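As a rough illustration of that scaling, here is a quick sketch; the feature sizes are arbitrary examples, and the λ/2 figure is the usual diffraction-limit rule of thumb:

```python
# Rough sketch: a diffraction-limited optical gate can't be much smaller
# than about half the wavelength used, so a target feature size dictates
# the kind of light required. The feature sizes are arbitrary examples.
bands = [(10_000, "infrared"), (1_000, "near infrared"),
         (100, "deep ultraviolet"), (10, "soft X-ray")]

for feature_nm, band in bands:
    wavelength_nm = 2 * feature_nm  # smallest resolvable feature ~ lambda/2
    print(f"{feature_nm:>6} nm gates need ~{wavelength_nm} nm light ({band})")
```

So pushing optical gates down toward the sizes of today's transistors would already put you in the deep ultraviolet or beyond, which is where the damage problem above kicks in.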
 
Is that the wall that's stopping this technology from evolving further? How large is the team of scientists working on this project?
 
"IceKnight, you have no right to attack Uttar and crap in my thread. "

Maybe I can't use rude language, but attack a point of view? What are forums for?

"Uttar can state his opinion regardless of whether you like it or not.
You will receive a warning from a moderator soon."

Uttar can state his opinions, I agree. I can state my opinion too. You don't have to like it either. I got my warning; reason: vocabulary.

"Can we PLEASE get back on track?

Topic: Alternatives to silicon. "

It wasn't me who brought up the nVidia lesser-product buying plans.

Micron, if I get banned, no big deal, just look at how often I post. :rolleyes:
 
I don't think we are going to use any of these technologies in the future.

The way we develop circuits and mechanical devices is with a lot of brute force and little or no efficiency. There is a physical limit to everything we can build, namely the size of the smallest particle. We will get to that limit sooner or later. The technologies stated earlier in this thread have this limit in common with current CMOS devices.

I think in the future our answer is protein-like devices with massively parallel execution units. In our constant pursuit of perfection, we will end up imitating nature and living matter.
 
It's correct that in any technology there will be a hard limit to the number of gates or transistors possible per unit area, as dictated by atomic/quantum-mechanical considerations. But that doesn't mean that there isn't room for substantial speed improvements by going to other technologies than silicon CMOS - in CMOS technology, signals tend to travel through gates and wires at speeds MUCH lower than the speed of light (about 100x for on-chip interconnect, 10000x for the logic gates), so there should be ample room for speedup by going to other technologies.
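To put those slowdown factors in perspective, here is a minimal sketch; the 100x and 10000x figures come from the paragraph above, while the 15 mm die width is an assumed example:

```python
# Time for a signal to cross a die at various effective speeds. The 100x
# and 10000x slowdown factors are taken from the post above; the 15 mm
# die width is an assumed example.
C = 3.0e8          # speed of light in vacuum, m/s
DIE_WIDTH = 15e-3  # assumed die width, m

for label, slowdown in [("speed of light", 1),
                        ("on-chip interconnect (~c/100)", 100),
                        ("logic-gate path (~c/10000)", 10_000)]:
    t = DIE_WIDTH / (C / slowdown)
    print(f"{label:30s}: {t * 1e9:8.2f} ns per die crossing")

# ~0.05 ns at c versus ~500 ns through gate logic: that four-orders-of-
# magnitude gap is the headroom referred to above.
```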

As for biological (DNA/protein) computing, the current problem is that you can run only very specific calculations (which must be planned carefully in advance), and even then only with a setup time of several hours or days.

Massive parallelism is the future, but making good use of it in general-purpose programming is hard, especially with present-day languages.
 
arjan de lumens said:
As for biological (DNA/protein) computing, the current problem is that you can run only very specific calculations (which must be planned carefully in advance), and even then only with a setup time of several hours or days.

Wasn't there a breakthrough in Israel with biological DNA chips?
The chip was very fast; the only problem was that it doesn't know how many 1's and 0's there are. Hence, silicon might be used in conjunction with a biological CPU/GPU.
 
Chalnoth said:
The size of the circuit would pretty much be determined by the wavelength of the light... This means that if smaller optical circuits are to be designed, shorter wavelength light must be used.

LOL, not at you, at the idea of gamma-ray computers, as they have a smaller wavelength.

To tech support: My computer is glowing and my hair is falling out.
Automated response: Update the drivers. If this does not fix your problem, please email us @ likewecare.com.

You are right, btw, but also wrong, because the size is determined by the gear that interprets and propagates the light as well. In an ideal world where we could build whatever we wanted at the molecular level, then perhaps you would be right.
 
Sxotty said:
To tech support: My computer is glowing and my hair is falling out.
Automated response: Update the drivers. If this does not fix your problem, please email us @ likewecare.com.

OMG LMFAO!!! I can imagine that. :LOL:
 
arjan de lumens said:
The main alternatives to silicon CMOS logic that I am aware of:
  • Silicon BJT logic: About 2-4x faster than CMOS, but each gate is large and draws so much more power than a CMOS gate that it's essentially useless in logic designs as large as GPUs. Mainly used for high-speed inter-chip interfaces.
  • Gallium Arsenide: About 5-10x faster than CMOS, but also draws a lot of power per gate, making it mostly useless except for very fast inter-chip interface circuits. Most commonly used for optical<->electrical signal conversion and multi-gigahertz signal lines.
  • Other semiconductors, such as SiGe (silicon-germanium) and InP (indium phosphide) have their own problems as well, limiting them to niche applications.
  • Diamond and silicon carbide have been mentioned as potentially good semiconductors as well, given that they can resist extremely high temperatures, conduct heat very well, and could potentially be faster than silicon. But AFAIK, no functioning logic gate has been demonstrated with either material yet.
  • Superconductors: The RSFQ superconductor logic family has AFAIK been demonstrated to work at 770 GHz, but superconductors that don't require liquid nitrogen cooling seem to be rather far off still. Look here for a company that actually makes a living out of manufacturing superconductor chips today.
  • All-optical circuits: Look here for an introduction to optical circuits and how they actually work. This technology hasn't come very far, with basic AND/OR/XOR/NOT gates being made to work only very recently. So far, it seems that gates must be made physically very large (to the point where you can see the individual gates) to work correctly, so it is unknown if this technology can ever reach the degrees of integration that silicon chips have reached.
  • Carbon nanotubes: Allow for some extremely small and fast transistors to be built, and can apparently be useful for on-chip optical interconnect. Still in its early stages - structures more complex than a few transistors haven't been built yet.
edit: added carbon nanotubes to the list.

You've got to think outside electromagnetism.

Don't forget molecular gate logic (switching speeds in picoseconds), quantum computing (poorly understood as to how to generalize it today, but the field is moving rapidly), and 3-dimensional logic.

IBM recently designed the first working molecular logic (http://www.newsfactor.com/perl/story/19781.html), but the nanotech literature is full of designs. If it comes to fruition, feature size will undergo a few orders of magnitude of reduction, and clock speeds could hit 1 terahertz.

Moreover, molecular logic is easier to reverse, which allows for reversible computing. Every time you have to "erase" information in logic (say, an AND operation: it takes 2 inputs and outputs 1, so you "erase" 1 bit), the laws of thermodynamics dictate that you must spend kT ln 2 of energy per erased bit. However, with reversible logic elements, you can run the computation backwards and recover energy from all the "garbage" bits you throw away in CMOS logic design. This allows extremely power-efficient designs at a cost in performance.
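For a sense of scale, here is a back-of-the-envelope sketch of that kT ln 2 bound at room temperature against an assumed ~1 fJ CMOS gate switching energy; the CMOS figure is a rough placeholder, not a measured value:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # room temperature, K

# Landauer's principle: minimum energy dissipated per erased bit.
landauer_J = k * T * math.log(2)

# Rough assumed switching energy for a CMOS gate (~1 fJ); real values vary
# widely by process generation, this is only for order-of-magnitude feel.
cmos_switch_J = 1e-15

print(f"Landauer limit at 300 K : {landauer_J:.2e} J per erased bit")
print(f"Assumed CMOS gate switch: {cmos_switch_J:.2e} J")
print(f"CMOS spends ~{cmos_switch_J / landauer_J:,.0f}x the thermodynamic minimum")
```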


15-year predictions are insane at this point, but as Feynman might say, "there's still plenty of room at the bottom."
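As for running a computation backwards, here is a toy illustration of what "reversible" means: a Toffoli (controlled-controlled-NOT) gate computes AND without destroying its inputs, so the step can be undone. This is the generic textbook construction, not anything specific to IBM's device:

```python
def toffoli(a: int, b: int, c: int):
    """Reversible AND: flips c when both a and b are 1; a and b pass through."""
    return a, b, c ^ (a & b)

# With c = 0 the third output is a AND b, and no input bit is erased.
for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        # The gate is its own inverse: applying it again recovers the inputs.
        assert toffoli(*out) == (a, b, 0)
        print(f"{a} AND {b} = {out[2]}  (inputs {out[0]}, {out[1]} preserved)")
```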
 
Sxotty said:
LOL, not at you, at the idea of gamma-ray computers, as they have a smaller wavelength.

Gamma rays have short wavelengths, sure. However, they don't tend to reflect to any great degree; they go mostly straight through matter, so building a chip using this tech seems very difficult to me. :)

I wouldn't worry about being irradiated; after all, most of us sit in front of what is essentially an X-ray vacuum tube all day long (the good ol' CRT). The amount of radiation would be minimal and could easily be shielded. The big problem would be getting it to work at all! :D


*G*
 
BTW, IBM announced this today

http://www.bayarea.com/mld/mercurynews/5769467.htm

IBM makes nanotech breakthrough
By Glennda Chui
Mercury News


IBM researchers have created the world's smallest solid-state flashlight -- a tube 50,000 times thinner than a human hair. It emits a glow that is invisible to our eyes, but ideal for devices that use light to send data in fiber-optic cables and the like.

The advance, reported in today's issue of the journal Science, is years away from practical use -- if, indeed, it can ever be commercialized.

But it represents a step toward creating electronic devices whose parts are the size of molecules, one of the major goals of the burgeoning field of nanotechnology.

"This is a fantastic achievement," said Peidong Yang, a physical chemist at the University of California-Berkeley who has been leading a separate effort to coax light out of nanoscale devices. "People have been trying to do this for a long time."

Another expert in the field, David Tomanek of Michigan State University, said that while the work is at an early stage of development, "It's a fundamentally new phenomenon and I believe it has great potential."

The flashlight itself is a carbon "nanotube" -- a sheet of pure graphite rolled into a long, skinny cylinder that is, in fact, a single molecule. These tubes can be single-walled or multi-walled, with one cylinder nested inside another. And depending on their exact arrangement, they can function as metals or semiconductors -- a characteristic that makes them intriguing candidates for the next generation of electronic devices.

In this case, the nanotubes are just 1.4 nanometers in diameter. One nanometer is a billionth of a meter. That's 10 times the diameter of a hydrogen atom or about as far as a fingernail grows in one second.

Four years ago, scientists at IBM's research division in Yorktown Heights, N.Y., reported that they had fashioned single-walled nanotubes into field-effect transistors, the workhorse devices that are jammed by the millions onto computer chips to control the flow of electrical current.

By applying a small voltage to the nanotube, they could turn the current on or off. Later they created logic gates, the basic circuits used in computing.

Now they have taken the work a step further, turning nanotube transistors into light sources. It works this way: They introduce electrons at one end of the tube and holes -- which are positively charged gaps that have been robbed of their electrons -- at the other. When the electrons and holes meet in the middle, they recombine and give off light.

The light is in the infrared part of the spectrum, and at about the same wavelength as the light now used in optical communications. Light is a good choice for this work, Tomanek said, because it carries more information per second than a flowing current of electrons can, yet gives off less heat.

This isn't the first time nanotubes have been used to generate light. A team from Rice University reported last year that it had persuaded a vat of carbon nanotubes to fluoresce by shining laser light onto them.

Other groups have generated light from nanowires -- very thin wires made of various semiconducting materials. Yang's group at Berkeley produced ultraviolet laser light, a wavelength that can be used to store data at very high densities. Another team at Harvard University has produced visible light from nanowires.

The tube used in the IBM flashlight is a fraction of the diameter of nanowires that have been used so far as light emitters.

What makes it notable is that "it's very small, it's solid-state, and it's a transistor, which allows full control of its properties," said Phaedon Avouris, manager of nanoscale science at IBM Research and an author of the report.

As electronic components move to a molecular scale -- something most observers think is inevitable if devices are to keep getting smaller, cheaper and faster -- "it would be valuable to have integrated photonic sources on the same chip," Avouris said. "Our demonstration shows that this would be possible."
 
Gamma rays tend to be strongly ionizing, gradually destroying the molecular structure of whatever matter they pass through. This becomes a bigger problem the smaller the wavelength you wish to use, starting already with ultraviolet light.
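A quick sketch of why the trouble starts around ultraviolet: the photon energy E = hc/λ passes typical covalent bond energies of a few eV right around the UV band. The ~3.6 eV carbon-carbon figure below is an approximate textbook value:

```python
h = 6.626e-34   # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

# A typical covalent bond holds a few eV; ~3.6 eV is an approximate
# textbook value for a carbon-carbon bond.
BOND_eV = 3.6

for name, wavelength_nm in [("visible (green)", 550), ("ultraviolet", 200),
                            ("X-ray", 1.0), ("gamma ray", 0.001)]:
    E_eV = h * c / (wavelength_nm * 1e-9) / eV  # photon energy E = hc/lambda
    verdict = "can break bonds" if E_eV > BOND_eV else "below bond energies"
    print(f"{name:16s} {wavelength_nm:>8} nm: {E_eV:12.1f} eV -> {verdict}")
```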

Molecular gate logic and quantum computing currently seem to be extremely sensitive to thermal noise - time will tell if this obstacle can be overcome.
 
Don't forget molecular gate logic (switching speed in picoseconds) and quantum computing (poorly understood as to how to generalize it today, but field is moving rapidly)...

A company called D-Wave Systems is already working on tools for various simulations and future program development for quantum computing (they did a demonstration of the tools as part of a quantum computing seminar that one of their employees/college alumni set up).
 