The multiple tales in Turing’s Cathedral: The Origins of the Digital Universe take
place in America during the days of spies, World War II, the hydrogen
bomb, and the century’s most constructive invention: the birth of the
computer industry.
A fellow named John von Neumann and a dozen others, including Albert
Einstein, gathered at the Institute for Advanced Study (IAS) in Princeton,
New Jersey, and began planning for the future by writing 1’s and 0’s on a
sheet of paper.
Yes, today’s digital universe started with one computer in one
laboratory in one building, and with a tiny five kilobytes of memory.
Simple as those beginnings were, George Dyson’s book lists more than
eighty principal characters involved in this technological adventure,
including Vannevar Bush, Richard Feynman, Julian Bigelow, and Kurt Gödel,
along with Alan Turing, the British logician and cryptologist who created
the Universal Machine. That creation moved technology beyond the counting
machines in use in the early 1940s: it made the difference between numbers
that mean things and numbers that do things.
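That distinction is easy to see in a toy stored-program machine. The sketch below is purely illustrative (my own invention, with a made-up instruction encoding, not anything from Dyson’s book): instructions and data sit in the same memory, so a number “means” something or “does” something only according to how the machine treats it.

```python
# A toy stored-program machine: instructions and data share one memory,
# so whether a number "means" something or "does" something depends only
# on how the machine reads it. (Illustrative sketch, not from the book.)

# Memory layout (hypothetical): cells 0-7 hold instructions, 8-9 hold data.
# Instruction encoding: 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr, 0 = HALT
memory = [
    1, 8,   # LOAD  the number stored at cell 8 into the accumulator
    2, 9,   # ADD   the number stored at cell 9
    3, 9,   # STORE the result back into cell 9
    0, 0,   # HALT
    40,     # cell 8: data
    2,      # cell 9: data (will be overwritten with 42)
]

def run(memory):
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc], memory[pc + 1]
        pc += 2
        if op == 0:            # HALT
            return acc
        elif op == 1:          # LOAD
            acc = memory[arg]
        elif op == 2:          # ADD
            acc += memory[arg]
        elif op == 3:          # STORE
            memory[arg] = acc

print(run(memory))   # -> 42; the same cells held both code and data
```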
In its simplest form, data is a bit, which can be either a one or a
zero. That is the essence of the digital universe for logic and
mathematics. Today, as we dawdle through our days with gigabytes of
memory and processing power, we can hardly appreciate that in March 1953
there were exactly 53 kilobytes of high-speed random-access memory in
the world. Five kilobytes were in use at the IAS lab in Princeton, 32
kilobytes were divided among eight clones of the IAS computer, and 16
kilobytes were unevenly distributed across six other machines. With no
cross-communication among them, Dyson writes, “Each island in the
archipelago constituted a universe unto itself.”
Alan Turing, whose ideas the IAS machine would embody, conceived a
machine that could read, write, remember, and erase marks on a tape while
moving in either direction, giving birth to the Universal Computing
Machine.
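That description maps almost line for line onto code. Here is a minimal, hypothetical Turing machine: a tape of marks, a head that reads, writes, and erases while moving in either direction, and a small table of rules. This particular example increments a binary number and is my illustration, not Turing’s own construction.

```python
# A minimal Turing machine: a tape of marks, a read/write head that can
# move left or right, and a table of rules. This example increments a
# binary number written on the tape. (Hedged illustration only.)

def run_turing_machine(tape, rules, state="start", head=0):
    tape = dict(enumerate(tape))          # sparse tape, blank = " "
    while state != "halt":
        symbol = tape.get(head, " ")
        write, move, state = rules[(state, symbol)]
        tape[head] = write                # write (or erase) a mark
        head += 1 if move == "R" else -1  # move in either direction
    return "".join(tape[i] for i in sorted(tape)).strip()

# Rules: move right to the end of the number, then add 1 with carries.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),    # 0 plus carry -> 1, done
    ("carry", " "): ("1", "L", "halt"),    # carried off the left edge
}

print(run_turing_machine("1011", rules))   # -> "1100" (11 + 1 = 12)
```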
Dyson takes us through the 20th century with a fascinating American
history of mathematics at a time when sugar was rationed and high-speed
calculation was more experiment than science. It was also a time when
that accelerating knowledge was put to work designing weapons and
powering explosives.
As the U.S. was immersed in World War II, there was an urgent need for
computing power. Engineers were busy building computers from
electrostatic storage tubes and vacuum tubes, the forerunners of modern
silicon memory chips. Suddenly, trajectory calculations that had taken a
human hundreds of hours could be done in minutes.
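The arithmetic behind those firing tables was simple but punishingly repetitive: step a shell forward in small slices of time, adjusting position and velocity for gravity and air drag, over and over. A rough sketch of that kind of calculation, with parameters invented purely for illustration:

```python
import math

# Crude numerical integration of a projectile with air drag, the kind of
# repetitive arithmetic behind wartime firing tables. The drag constant,
# muzzle velocity, and step size are made up for illustration only.

def trajectory_range(speed, angle_deg, drag=0.0001, dt=0.01, g=9.81):
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        ax = -drag * v * vx           # drag opposes horizontal motion
        ay = -g - drag * v * vy       # gravity plus vertical drag
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x                          # horizontal range in meters

print(round(trajectory_range(speed=450.0, angle_deg=45.0)))
```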
These hand-built, room-size machines also fostered next-generation
nuclear weapons and led to the development of the Internet, the
microprocessor, and multiple-warhead ICBMs. ENIAC, occupying a
33-by-55-foot room and built with 500,000 hand-soldered joints, did the
work of twenty human computers and remained in use until 1955.
The work of John von Neumann and Alan Turing gave birth to software
and established principles that would guide the future of computers.
As Alan Turing enters the story, he says goodbye to his family and
sails in steerage from London to New York in 1936, heading for Princeton
to work alongside von Neumann. He carries a heavy brass sextant, and soon
after arrival he delivers his 35-page paper, “On Computable Numbers,”
which came to symbolize the powers of digital machines. The two men work
together for two years while Turing completes a fellowship. His paper
describes a Universal Machine, able to compute any computable number.
Turing’s Cathedral should be required reading for today’s techies, who
will delight in every new development along the way, including a
high-speed wire drive, wound on bicycle wheels and running at 90,000 bits
per second, a memorable forerunner of the tape cartridges and removable
drives that came along later in the 20th century.
As we know today, the technology also made possible computer-assisted
weather forecasting and Monte Carlo simulation, and it grew exponentially
into many thousands of further innovations.
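Monte Carlo deserves a pause: the method, pioneered on those early electronic computers for weapons work, answers hard questions by running many random trials and counting outcomes. A toy version (estimating pi, emphatically not one of the Los Alamos calculations) shows the shape of the idea:

```python
import random

# Monte Carlo in miniature: estimate pi by throwing random points at a
# unit square and counting how many land inside the quarter circle.
# A toy illustration of the method, not a historical calculation.

def estimate_pi(trials=1_000_000, seed=1953):
    rng = random.Random(seed)
    inside = sum(
        1
        for _ in range(trials)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / trials

print(estimate_pi())   # -> roughly 3.14
```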
The history of computing and statistics is woven into the history of the
U.S., World War II, immigration, university life, weather prediction, and
Los Alamos, and Dyson tells it beautifully. Through a well-told story and
rare photographs, his book is both a history lesson and a tribute to the
pioneers of technology who changed the world.