Your name takes up a kilobyte of space. With the latest $300 16-gigabyte hard drive in your computer, you have 15,999,999,000 bytes left for Starcraft, Hover, and Myst, not to mention all those e-mails you absolutely must keep, the 700 bookmarks on your Netscape browser, the programs for greeting cards and spreadsheets and slide shows that you hardly ever use, and even, if you work for this magazine, the 20,000 names and addresses of your subscribers and eight complete back issues, in case you can't remember what you wrote. And it all fits on your desk.
Fifty years ago, a computer filled a room (with another room next door dedicated to its air conditioners). "It was a monster," said Nitin Samarth, associate professor of physics, of the famous ENIAC. Samarth gave the third in this year's Frontiers of Science lectures at Penn State on "How Computers Keep Getting Smaller, Faster, Cheaper."
"If you want to know why they've gotten cheaper, you'll have to ask somebody else," he joked. "I don't have the foggiest. It probably has to do with marketing and stuff that I don't understand."
But smaller and faster are the result of advances in basic physics and some very inventive engineering. "Open up your computer and look at the number of different advanced technologies that go into something costing less than $2,000. You'll be amazed."
Whether in chips or the old ENIAC's vacuum tubes, the guts of a computer are its on-off switches. "What does your laptop do? It represents data as bits, or binary numbers, strings of ones or zeros," Samarth said. "There are devices in this machine that are either on or off, one or zero, and on those you perform the software functions."
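Samarth's point, that every piece of data on the machine reduces to strings of ones and zeros, can be seen directly in a few lines of Python (an illustration, not anything from the lecture):

```python
# Any piece of data on a computer is ultimately stored as bits.
# Here we turn the letter 'A' into the ones and zeros a machine keeps.
text = "A"
for byte in text.encode("ascii"):
    bits = format(byte, "08b")   # 8 binary digits per byte
    print(byte, "->", bits)      # 65 -> 01000001
```

Each of those eight digits corresponds to one on-off switch of the kind Samarth describes.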
Until semiconductor transistors came along, these binary switches were simply "way too big" to make a practical PC. "I wonder how many people have seen a vacuum tube? They're huge, clunky things." ENIAC had 18,000 of them; it could carry out about 5,000 arithmetic operations every second. A modern semiconductor chip the size of a dime does the same kind of things, but at speeds up to 10,000 times faster.
A semiconductor is a material between a conductor and an insulator. "We all know what a conductor is, a metal wire that you run electrons through to light up a lightbulb. An insulator—that's wood. It doesn't conduct electricity at all. A semiconductor doesn't want to conduct electricity, but you can make it."
To make a microchip, you start with a tiny seed of silicon and grow it into a crystal. These crystals are "extremely pristine" and so strong you can slice one into membrane-thin wafers that can bend without breaking. Stick a tiny bit of an impurity in the crystal, and suddenly it will conduct electricity.
To make the switches (the transistors) and the rest of the circuitry on a chip can take another 250 steps. First you oxidize the surface of the silicon wafer. Then you coat it with a material called a photoresist. On top of that you put a mask, which has a pattern cut in it. When you shine ultraviolet light onto the wafer, the places not hidden under the mask change in chemical structure. Expose the chip to certain chemicals, and these weakened places etch away. The surface of the wafer now has a pattern of ridges and valleys. Repeat the masking and etching steps until the wafer's surface is highly textured, and the electrons can flow through it only in a precise and controllable way. Finally, dice the wafer into 200 chips.
The etched pattern, said Samarth, "is the basis of everything that goes on in your computer. Imagine that instead of electrons, you had water. It flows from here to there. Now imagine you had a sluice-gate that could stop the flow of the water. That's what these semiconductor transistors—or Si-MOSFETs—do. They control the flow of electrons."
In 1971, the Intel 4004 chip had a "whopping" 2,300 transistors, Samarth said. "The lines on it, defined by this really scientific measure called a 'hair,' were one-tenth of a hair wide."
The Pentium II chip, released in 1997, holds 7.5 million transistors, with lines 1/300th of a hair.
In 1999, IBM expects to market a chip with lines as narrow as 1/500th of a hair.
"It's very hard to go much lower than that," Samarth said, without finding a new way to carve the pattern into the chip.
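The two transistor counts the article quotes are enough to work out the famous doubling rate for yourself. A quick back-of-the-envelope calculation in Python (the 1971 and 1997 figures are from the article; the rest is simple arithmetic):

```python
import math

# Transistor counts quoted above: Intel 4004 (1971) vs. Pentium II (1997).
t0, n0 = 1971, 2_300
t1, n1 = 1997, 7_500_000

doublings = math.log2(n1 / n0)              # how many times the count doubled
years_per_doubling = (t1 - t0) / doublings
print(f"{doublings:.1f} doublings, one every {years_per_doubling:.1f} years")
```

The count doubled roughly every two years over those 26 years, which is Moore's law in action.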
If you tried simply to cut smaller lines on your mask, the pattern would come out blurry unless you changed to a shorter wavelength of light. "IBM seriously considered using x-rays instead of ultraviolet light," Samarth said, but so far it hasn't worked as a manufacturing process. "Electron beams are another possibility, but they're also very difficult to use in large-scale production."
Other problems are caused by "the humble wire" that connects the switches. "Think of a very small water hose. You start pushing more and more water through it and eventually it'll burst. In technical terms, this is called 'electromigration.'" Leakage is also a problem. When transistors and other devices are packed too close, they start influencing each other.
"Those are not fundamental problems of physics," said Samarth, "but of engineering—and device engineers keep coming up with clever solutions.
"But there is a limit," he added. "When you get to very small scales—roughly below 50 nanometers—you start getting into a different regime of physics." Throw a ball at a wall, it bounces back. Throw it at an open window, it goes through. "That's classical physics. But when you throw an electron at a wall, it has a ghost-like probability of going through the wall. If you throw it at a window, it has a possibility of not going through. When you get to very small scales, you enter the strange world of quantum mechanics, and the usual device schemes we've been using don't work."
The Quantum Computer
Quantum mechanics hasn't affected microprocessors so far, Samarth noted. But it has helped hard disks and CD-ROM drives.
"In a hard disk, data is essentially stored in the form of little magnets called 'domains.' These magnets can point in either of two directions. That's all you need to make ones and zeros. The smaller the magnets, the more memory.
"Now, you can construct very dense memory by making materials with very small domains, but there was no way to read their information using the older read heads. You needed a new sensor technology."
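The two-direction scheme Samarth describes maps directly onto binary data. A small sketch (the "up"/"down" labels are my own shorthand for the two domain orientations):

```python
# Each magnetic domain points one of two ways; a read head turns
# that sequence of orientations into ones and zeros.
domains = ["up", "down", "up", "up", "down", "down", "up", "down"]

bits = "".join("1" if d == "up" else "0" for d in domains)
byte = int(bits, 2)
print(bits, "->", byte)   # 10110010 -> 178
```

Eight domains give one byte; shrink the domains and the same patch of disk holds more of them.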
In 1988, physicists doing basic research—they were studying the quantum mechanical behavior of thin magnetic films—found that a sandwich made of certain materials was very sensitive to a magnetic field. "This invention was taken advantage of by an excellent materials physicist at IBM," said Samarth, "who learned how to make these sandwiches using a cheap, scalable process called sputtering. That led to the 16-gigabyte hard disk."
Quantum mechanics research has also led to CD players, laser pointers, cell phones, and telecommunications satellites. Components in these technologies draw on the idea of the "electron in a box." Put a marble in a box. "Can you make it move at the rate of one meter per second? Yes, shake it. But if the marble is an electron, there are restrictions. An electron is more like a violin string than a marble. It can vibrate only at certain resonances—and these depend on the size of the box and the mass of the electron. You can change the size of the box to 'tune' the resonance of the electron."
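The "tuning" can be made quantitative with the standard particle-in-a-box formula, E_n = n²h²/(8mL²), where L is the size of the box (the formula is the textbook result, not something quoted in the lecture):

```python
# Allowed energies of an electron's standing waves in a box of size L:
# E_n = n^2 h^2 / (8 m L^2), so shrinking L "tunes" the resonances.
h = 6.626e-34       # Planck's constant, J*s
m = 9.109e-31       # electron mass, kg
eV = 1.602e-19      # joules per electron-volt

def energy(n, L):
    return n**2 * h**2 / (8 * m * L**2) / eV   # level-n energy, in eV

for L in (10e-9, 5e-9):                         # 10 nm and 5 nm boxes
    print(f"L = {L * 1e9:.0f} nm: E1 = {energy(1, L):.4f} eV")
```

Halving the box quadruples the resonance energy, which is exactly the knob that lets laser designers pick the color of the emitted light.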
Making these electron boxes, Samarth said, is "like playing with atomic Legos. It's a real thrill to take atoms and arrange them in the order you want." Because of quantum mechanics, "this electron, just like a ghost, can tunnel through the walls of the box and hop onto a wire." In a CD player, for example, "a small laser translates the result of electrons hopping into and out of boxes into a stable light beam of precise color and focus. That light beam can then read music encoded as tiny bumps on the surface of a disk.
"Now imagine you could make a box like that, and a switch that could hop a single electron in and out of the box as memory. It sounds like a crazy idea, but it's being done." Except that it's being done at temperatures close to absolute zero using liquid helium. "People are trying hard to make it work reliably at reasonable temperatures."
An even more tempting idea to a physicist like Samarth is to make a fully "quantum computer." "In a quantum computer, the bits are not definite states like zero and one, but combinations of the two." Instead of just "on" and "off," these switches could be "sort of on" and "partly off." Theorists creating the mathematical framework for these strange computers say that some kinds of mathematical operations would be speeded up "immensely." But you won't be buying one for your desktop tomorrow. "I'd be really surprised if I saw a working quantum computer in my lifetime," said Samarth.
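The "combinations of zero and one" Samarth mentions have a simple mathematical form: a qubit's state is a pair of complex amplitudes whose squared magnitudes give the odds of reading a zero or a one. A minimal sketch:

```python
import math

# A classical bit is 0 or 1; a qubit's state is a pair of amplitudes
# (a, b) with |a|^2 + |b|^2 = 1, so it can be "sort of on, partly off".
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)   # an equal superposition

p_zero, p_one = abs(a) ** 2, abs(b) ** 2    # measurement probabilities
print(f"P(0) = {p_zero:.2f}, P(1) = {p_one:.2f}")
```

Measuring this qubit gives zero or one with equal probability; the speedups theorists promise come from manipulating many such amplitudes at once before measuring.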