.18 microns seems to be the magic number in today’s manufacturing process. Intel and AMD both boast of their upgraded production and note that it will lead to ever-increasing speeds and capabilities. Quietly, however, a consensus is growing among the scientific community that silicon-based chips are on their way out. Tiny molecular computers are becoming more and more feasible, and may do to silicon what transistors did to vacuum tubes. Across the world, universities and government institutions are making advances in nanotechnology that could shatter today’s concept of electronics.
As far as speed and memory are concerned, the results may be incomprehensible to consumers and businesses alike. Consumers are routinely fooled by the false security of a megahertz rating. Most buyers find an extra 50 MHz appealing, despite a $75-$100 increase on the price tag. True, a 550 MHz Pentium III has a 10% clock-speed advantage over a 500 MHz Pentium III, but it realistically delivers only about a 5% improvement in most applications. Consumers need to understand that clock speed and performance are not the same thing. An extra 100 bucks is hardly worth the 10-12-millisecond improvement when launching Microsoft Word.
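The gap between a 10% clock advantage and a roughly 5% real-world gain is easy to reproduce with a little Amdahl’s-law-style arithmetic: only the CPU-bound fraction of a task speeds up with the clock, while disk and memory waits do not. A minimal sketch (the 50% CPU-bound figure is an illustrative assumption, not a measured value):

```python
def effective_speedup(clock_ratio, cpu_bound_fraction):
    """Overall speedup when only part of the workload scales with the clock.

    clock_ratio: new clock / old clock (e.g. 550/500 = 1.1)
    cpu_bound_fraction: share of runtime that actually scales with the clock
    """
    new_time = cpu_bound_fraction / clock_ratio + (1 - cpu_bound_fraction)
    return 1 / new_time

# 550 MHz vs 500 MHz: a 10% clock advantage...
clock_ratio = 550 / 500
# ...but if only half the workload is CPU-bound, the real gain is under 5%.
gain = (effective_speedup(clock_ratio, 0.5) - 1) * 100
print(f"{gain:.1f}% faster in practice")  # prints "4.8% faster in practice"
```

With a fully CPU-bound workload the same function returns the full 1.1x, which is why synthetic benchmarks flatter a clock bump more than launching Microsoft Word does.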
Still, an 800 MHz Athlon this quarter and a 900 MHz one the next seem to signal the dominance of silicon-based computers for some time. Most computer-chip manufacturers estimate that they will have plenty of business until 2014, when they expect to reach the theoretical limit of silicon: .10 microns. That means processors and other components would be built at a scale of 100 billionths of a meter, or 100 nanometers, 100 nanometers being the distance between each transistor. Now, realize that with nanotechnology we could shrink components down to .001 microns: one nanometer.
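The micron-to-nanometer figures above are just a unit conversion, sketched here for concreteness (1 micron = 1,000 nanometers):

```python
NM_PER_MICRON = 1000  # 1 micron = 1,000 nanometers

silicon_limit_nm = 0.10 * NM_PER_MICRON   # .10 microns -> 100 nm
nano_scale_nm = 0.001 * NM_PER_MICRON     # .001 microns -> 1 nm

# Nanotech features would be 100 times smaller than silicon's limit.
print(round(silicon_limit_nm / nano_scale_nm))  # prints 100
```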
Chips would be exponentially faster, more efficient, and more powerful than anything on the drawing board today. While some labs, like those at UCLA, IBM, and HP, are well publicized, many are working under top-secret conditions and have reportedly made several prototypes of working nanotechnology. One such rumor concerns a molecular device capable of functioning as RAM in a nano-computer. The impact on the scientific and commercial communities would be tremendous. A near-term application, in 2-5 years, might be a DVD-like movie stored in a space half the size of today’s semiconductor chips.
If nanotechnology were to exist today, it would make every CEO in the computer chip industry cringe at the costs they have endured to produce the latest and greatest chips. Current chips are made in multi-billion dollar fabrication plants (fabs) that use light waves to etch layers of circuitry onto a silicon wafer. It is an enormously expensive process, mostly because of the conditions in which the “clean rooms” must be maintained. Any dust or particles in the room would contaminate the chips produced.
Nano-produced computer components would not require any such plant. Under the current trend, the more advanced the computer technology, the more finicky it is to produce; molecular computers, by contrast, could have their components produced in vast numbers without such hindrances. One idea involves massive “self-assembling” vats that produce the chips through chemical reactions at a fraction of current costs. The idea behind nanotechnology is to reproduce what nature already does: build things atom by atom, molecule by molecule.
Not only would this allow humans to control properties like color, texture, and density, but it might also make it possible to create things that repair themselves when damage occurs. Self-assembly seems to be a key concept in the nanotechnology movement, which took off only ten or so years ago. While the idea is not new, advanced microscopes and computer software have carried it from theory and wild speculation to feasibility. For instance, in 1990 IBM brought nanotechnology to the headlines when it spelled “IBM” with 35 atoms of the element xenon.
Nanotechnology carries with it the idea of building anything imaginable, from a diamond coat to paint over your car (to prevent scratches) to diagnosing illnesses from a single droplet of blood. In 1998, the White House Science and Technology Council created the Interagency Working Group, charged with developing ideas for nanotechnology 10-20 years from now. It has drawn up ideas about curing cancers and lesions with nanoparticles that travel through the body to fix it from the inside.
Artificial limbs could be made in batches and personalized by their prospective owners. Of course, memory and storage would be millions, if not billions, of times faster and larger. Unfortunately, we are still in the blueprint and laboratory stage. An Interagency Working Group report noted that nanotechnology today is where the transistor was in the 1950s. Problems persist, not with the application of such technology, but with its execution. No one, for instance, has discovered a way to link together all the nanoparticles that process data as 1s and 0s.
And it was only recently that UCLA was able to get the components to work repeatedly. Before that, they could only write data once and could not switch back and forth between 1s and 0s. As with any other technology, the bumps in the road ahead will be met with new questions and innovative solutions. But the most exciting part of the whole nanocomputer idea is that it will require a radically different architecture, one that would look alien to any computer engineer working in today’s laboratories.
At the Massachusetts Institute of Technology, researchers are working on architectures that resemble their biological counterparts in mammalian brains. The idea is to assemble trillions of circuits and then map out and identify the good and bad pathways, much as the human brain does. A simplified comparison might be declaring faulty sectors on a hard drive off limits for reading and writing. That could mean every nanocomputer would be unique and personalized, much like a human brain. The ideas are innovative, and most depart sharply from current doctrine.
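The bad-sector analogy is concrete enough to sketch. A hypothetical post-fabrication test could walk a grid of cells, record which ones work, and hand later stages only the good coordinates (the grid size, defect rate, and function names here are all illustrative assumptions, not anything from the MIT work):

```python
import random

def usable_cells(grid):
    """Return coordinates of cells that tested good."""
    return [(r, c) for r, row in enumerate(grid)
            for c, ok in enumerate(row) if ok]

# Simulate a fabricated 8x8 grid where roughly 10% of cells come out defective.
random.seed(0)
grid = [[random.random() > 0.1 for _ in range(8)] for _ in range(8)]

good = usable_cells(grid)
# Each chip gets its own map of working pathways,
# much as a drive controller remaps bad sectors.
print(len(good), "of 64 cells usable")
```

Because the defects land randomly, two chips fabricated the same way end up with different maps, which is exactly why every such nanocomputer would be “unique.”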
A program manager at the Pentagon’s Advanced Research Projects Agency noted, “We don’t want to be standing on the shoulders of silicon.” Recently, IBM showed how the circuitry of atomic-scale computing could be achieved. Called a “quantum mirage,” the technique demonstrates that it could one day be technologically practical to make a nano-circuit. Heat would be virtually eliminated, and the computers might be powered by batteries that never run down. The IBM researchers found that they could project the image of one cobalt atom onto a second point within the same structure, about 20 nanometers away.
This experiment shows that it is possible to read and write 1s and 0s without the benefit of wiring. The research is a beneficiary of IBM’s increasing wealth and its correspondingly larger research and development budget. All this comes as the Clinton Administration, with bipartisan support in Congress, proposes to increase nanotechnology research funding from $260 million to $487 million. The increase will mostly benefit university research and joint ventures like the UCLA-Hewlett-Packard alliance.
Federal agencies such as the National Science Foundation, the Department of Defense, the Energy Department, NASA, the National Institutes of Health, and the Commerce Department will all have funding earmarked for nanotechnology development. Will the new millennium usher in a new era of computing and personal electronics? Yes. Will we see nanotechnology tomorrow? No. In 10-20 years, however, supercomputers might be the size of calculators, and consumer computers may fit on your watch.
No one can predict the new abilities we will have, or the products that will fill our closets once we no longer want to use them. In the end, the only real question that remains is: what kind of games will run on these bad boys? Which familiar objects should disappear because of nanotechnology? People living before and through the transition, at first out of attachment to the things we know, and because they have not yet imagined the variety and super-rich realm of new possibilities, will still seek the familiar objects of everyday life, and assembler technology will reproduce them for the public.
People will still want cotton beach towels, although the cotton farmer will no longer be needed when fibers can be manufactured atom by atom from carbon in the air or from limestone. Lots of familiar items will appear “traditional” on the outside, yet possess a multitude of new tricks and functionality made possible by molecular nanotechnology (MNT), such as cars with Utility Fog crash protection. Of course, MNT smart materials can look like anything, yet perform “magic.”
Now the next generation, and the generations to follow, born into the age of nanotechnology with a “clean slate” free of concrete historical prejudices, will design objects and lifestyles that take advantage of the new wealth of possibilities. I expect they will design objects and “environments” that would appear bizarrely alien, extraordinarily novel, even to the most advanced nano-tinkerer today. The general concept is familiar in science fiction; only now we have a clear engineering path to make real the stunning constructs of uninhibited imaginations, including those yet to be born.
The wild card to consider, and the reason it is frankly ludicrous to project past a few decades (more than, say, one generation or so), is the effect nanotechnology will have on intelligence enhancement efforts. Once those efforts are even mildly successful, the “experimenters” will spend much of their time amplifying the enhancement efforts themselves, and the valve controlling what is imaginable and what can be engineered will open at a geometric rate. By definition, what can and will be is unimaginable now, and that is without even adding machine intelligence to the equation. The curve approaches vertical.