Freeman Dyson reviews James Gleick's The Information: A History, a Theory, a Flood in the March 10, 2011 issue of The New York Review of Books. In this passage he introduces us to Claude Shannon, the founding father of information theory.
For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.
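The abstract quantity Shannon defined is entropy, measured in bits: for a source whose symbols occur with probabilities p, the information per symbol is H = −Σ p log₂ p. A minimal Python sketch of the idea (the example probabilities are illustrative, not from the review):

```python
from math import log2

def entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)), in bits per symbol
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # a fair coin flip: exactly 1 bit
print(entropy_bits([0.9, 0.1]))   # a biased coin: about 0.47 bits
print(entropy_bits([0.25] * 4))   # four equally likely symbols: 2 bits
```

The more predictable a message is, the fewer bits it carries, which is what lets a single definition cover a telephone call and a television picture alike.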
When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography.
In 1945 Shannon wrote a paper, “A Mathematical Theory of Cryptography,” which was stamped SECRET and never saw the light of day. In 1948 he published an expurgated version of the 1945 paper with the title “A Mathematical Theory of Communication.” The 1948 version appeared in the Bell System Technical Journal, the house journal of the Bell Telephone Laboratories, and became an instant classic. It is the founding document for the modern science of information. After Shannon, the technology of information raced ahead, with electronic computers, digital cameras, the Internet, and the World Wide Web.
According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live. The flood began quietly. The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was a chemist by training and a co-founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.
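The compounding behind those figures is easy to check; a quick sketch using the eighteen-month doubling time quoted above:

```python
# Growth factor after a given number of years, doubling every 1.5 years
def growth_factor(years, doubling_time=1.5):
    return 2 ** (years / doubling_time)

print(round(growth_factor(10)))   # ~102: "a factor of a hundred every decade"
print(round(growth_factor(45)))   # 1073741824 = 2**30: Dyson's nine powers of ten
```

Forty-five years is thirty doublings, and 2 to the 30th power is just over a billion.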
In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars.
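For scale, a rough conversion of Shannon's estimate into the units drives are sold in today (assuming 8 bits per byte and decimal terabytes):

```python
bits = 100e12                    # one hundred trillion bits
terabytes = bits / 8 / 1e12      # bits -> bytes -> terabytes
print(terabytes)                 # 12.5 TB, comfortably within a single consumer drive
```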
Labels: information society, information theory