Advances in technology have improved computers, but what's next?
By John Yaukey
Gannett News Service
What an ungainly little contraption: a tiny slab of germanium, a few specks of gold foil and a paper clip held together with a few pieces of rough-cut plastic and gobs of solder.
The first transistor wasn't much to look at.
In fact, after it was unveiled at a thinly attended press conference in 1948 at Bell Labs in Murray Hill, N.J., The New York Times buried the story.
The transistor went on to change computing, and indeed, all of communications.
Consider cutting-edge computing in 1946: The ENIAC (Electronic Numerical Integrator and Computer), with its 17,468 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays and 6,000 manual switches, occupied an entire room. That kind of computing power can now reside on a chip the size of an aspirin, thanks to the incredible shrinking transistor and advances in the technologies now used to make transistors at a cost of pennies per billion.
"The transistor is so embedded in modern life that it's done a disappearing act," said physicist Michael Riordan, who chronicled the history of the transistor in his book, "Crystal Fire." "They're like the cells in your body, you just don't think about them."
The epochal story of the transistor underscores the central themes of the digital revolution (smaller, faster, cheaper) and raises intriguing questions about where the next big breakthroughs will launch from and where they will take us.
In principle, "as computer intelligence shrinks, it's spreading out into smaller and smaller capillaries of life," said Michael Hawley, a professor of media and technology at Massachusetts Institute of Technology's famed Media Lab.
But specifically, how is this happening?
What are the technologies that will transform the way we communicate and exploit information (in other words, the way we live)?
One candidate is quantum computing, which doesn't depend on shrinking dimensions but rather exploits the bizarre properties of subatomic particles.
It's based on quantum mechanics, the theory of physics that explains the erratic world of fundamental matter. In this Lilliputian realm, basic matter behaves nothing like the familiar tangible objects it makes up in macroreality.
Particles, for example, can exist in high and low energy states simultaneously, like being happy and sad at once. Indeed, think of quantum matter as having multiple personalities that can all act out at once.
If scientists can focus these quantum personalities on a single computing job, they can develop computer processors capable of doing multiple calculations at once instead of just one at a time, the way current sequential processors work.
Where traditional computers rely on transistors switching on and off to perform tasks (the switch is either a 1 for "on" or a 0 for "off"), quantum transistors could be both off and on at once.
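To see what "both off and on at once" means in practice, here is a minimal sketch, not from the article, that simulates the arithmetic of superposition on an ordinary computer using Python and NumPy. The Hadamard gate and state-vector notation are standard quantum-computing conventions, introduced here purely for illustration:

import numpy as np

# Illustrative sketch (not from the article): a classical bit is one
# definite value, while a qubit is a pair of "amplitudes," one each
# for the 0 and 1 states.
classical_bit = 0

# The Hadamard gate puts a qubit into an equal mix of both states.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # a qubit definitely in state 0
superposed = H @ zero         # now "0 and 1 at once"

# Squared amplitudes give the probability of measuring each outcome.
print(superposed**2)          # -> [0.5 0.5]

# Three superposed qubits track 2**3 = 8 amplitudes simultaneously,
# which is the sense in which n qubits can, in effect, work on 2**n
# values in parallel where a classical register holds only one.
register = superposed
for _ in range(2):
    register = np.kron(register, superposed)
print(len(register))          # -> 8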
At the end of last year, IBM scientists announced that they were able to get a test tube of specially designed molecules to perform quantum calculations, using radio-frequency pulses to "program the computer."