In the last half-century, transistors have become millions of times more powerful - but that's not good enough.
Manufacturers are developing ways to make chips with transistors so minuscule, so delicate, that present methods will seem as crude as cutting a diamond with a chain saw.

Nowadays, microelectronics engineers manufacture chips via "optical lithography." In this technique - akin to photography - images of different parts of the chip are projected onto a special plate. The plate is bathed in chemicals and "developed," like a negative in a photo lab. Modern chips have several "layers," like multistory buildings.
Chips are getting so small that their individual components are smaller than the wavelength of visible light, so sub-optical lithography is needed.
Some scientists are investigating forms of lithography that "cut" a superfine microchip with beams of electrons. Others are developing lithography that uses radiation at X-ray and "extreme ultraviolet" wavelengths, which are shorter than those of visible light.
By about 2002 or 2003, scientists at Bell Laboratories/Lucent Technologies expect to shrink chip components to about 0.13 microns. A micron is a millionth of a meter.
At 0.13 micron, optical lithography "just gets harder and harder," says William Brinkman, the company's vice president of physical sciences research. At that point, Lucent will have to shift to electron-beam lithography - assuming it works.
And if it doesn't? "We'll be stuck," Brinkman admits. "This is a thing that has to happen."
Another problem arises as transistors shrink to atomic scale. At that level, the weird world of "quantum physics" starts to take over: The movement of electrons becomes much harder to predict, and is governed by principles of statistics rather than the neat, clean exactitudes of Newtonian physics.
Some fear that chips with components comparable in size to atoms could be hopelessly unreliable.
But quantum effects may be an opportunity, not a barrier. "Quantum computers" could be stunningly faster than anything currently available.
Some scientific problems are so complex that no existing computer can crunch the necessary numbers in a reasonable time. In some cases, the computer would require a period of time equal, in seconds, to the numeral 1 followed by 40 zeros, says Stanley Williams of Hewlett-Packard in Palo Alto, Calif.
The present age of the universe in seconds is believed to be the numeral 1 followed by 17 zeros.
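Williams' comparison can be checked with a few lines of arithmetic. The sketch below (illustrative only; the round figures come straight from the two estimates above) shows how many lifetimes of the universe such a computation would consume:

```python
# Illustrative arithmetic for the scale comparison above.
classical_runtime_s = 10**40  # hypothetical runtime: 1 followed by 40 zeros, in seconds
age_of_universe_s = 10**17    # rough age of the universe: 1 followed by 17 zeros, in seconds

# Number of universe-lifetimes the computation would take:
lifetimes = classical_runtime_s // age_of_universe_s
print(lifetimes)  # 1 followed by 23 zeros
```

In other words, the job would outlast the universe's entire history a hundred sextillion times over.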
Quantum computing already exists, in rudimentary form.
"Exactly a year ago, if I had been asked how long it would be before the simplest quantum computation could be done, I'd have said 10 years," says Williams, who runs Hewlett-Packard's Quantum Structures Research Initiative. "And now, it has already been done at UC-Berkeley, Oxford, Los Alamos, in actual experiments."
One type of quantum computer might store information in atoms of hydrogen, the lightest element. Ordinarily, a single electron orbits a hydrogen atom. By firing a micro-laser at the atom, the electron can be switched between a low orbit and a high orbit, back and forth, like flicking a light switch on and off. To the quantum computer, the low-orbit electron would represent "0" and the high-orbit electron "1." In this way the computer could "calculate" using the same binary code - long strings of 0s and 1s that symbolize data - now used by conventional computers.
Intel co-founder Gordon Moore is more skeptical.
"Maybe I've been in this business too long," he acknowledges, "but from what I've seen of quantum computing, it looks to me more like a way to show some of the counter-intuitive effects of quantum mechanics than like a practical route to making a useful device." Still, Moore adds with a laugh, "Predicting something won't work is one of the most dangerous things to do in this business."