"The it from bit."
Does more information necessarily lead to increased knowledge or wisdom? Have we reached a point of diminishing returns with the level of information overload ushered in by the computer age? (GW)
Little Bits Go a Long Way
The more surprising a message, the more information it contains.
By JOHN HORGAN
Wall Street Journal
March 2, 2011
In 1989, I traveled to a Boston suburb to interview Claude Shannon, the inventor of information theory. Then 73, Shannon was shy and his memory was poor, so his wife, Betty, answered many of my questions. Shannon seemed to enjoy himself most when showing off his collection of games and gadgets, including a juggling W.C. Fields robot, a maze navigated by a mechanical mouse and seven chess-playing machines.
My two-page profile in Scientific American didn't come close to doing justice to Shannon, who died in 2001. After all, this playful polymath—whose work bridged electrical engineering, mathematics, computer science, physics and even philosophical logic—was among our era's most influential thinkers. His work, especially his 1948 paper "A Mathematical Theory of Communication," helped spawn today's digital devices and communications technologies. Information theory has also inspired a radical new scientific worldview, which proposes that reality is composed not of matter but of bits of information.
James Gleick's "The Information" gives Shannon his due and much more. As promised in its subtitle, "The Information" describes Shannon's achievement ("a theory") and helps us appreciate it by tracking information's myriad manifestations ("a history") from 5,000-year-old cuneiform records of barley sales all the way up to today's digital super-abundance ("the flood"). In the course of informing us about information, Mr. Gleick illuminates the histories of mathematics, artificial intelligence, quantum mechanics, genetics and other fields that we have come to understand better thanks to Shannon's theory.
What, exactly, is information? Prior to Shannon, Mr. Gleick notes, the term seemed as hopelessly subjective as "beauty" or "truth." But in 1948 Shannon, then working for Bell Laboratories, gave information an almost magically precise, quantitative definition: The information in a message is inversely related to its probability. Random "noise" is quite uniform; the more surprising a message, the more information it contains. Shannon reduced information to a basic unit called a "bit," short for binary digit. A bit is a message that represents one of two choices: yes or no, heads or tails, one or zero.
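The review keeps Shannon's mathematics offstage. As a quick illustrative sketch, not drawn from the article or the book: Shannon's measure assigns a message of probability p a "surprisal" of log2(1/p) bits, so a fair coin flip is worth exactly one bit and rarer messages are worth more.

import math

def surprisal_bits(p):
    # Information content, in bits, of a message that occurs with probability p.
    return -math.log2(p)

print(surprisal_bits(0.5))     # a fair coin flip: exactly 1.0 bit
print(surprisal_bits(0.99))    # a near-certain message: about 0.014 bits
print(surprisal_bits(1/1024))  # a one-in-1,024 surprise: 10.0 bits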
Shannon's simple formulation provided a framework for coding information digitally and hence more efficiently. Information theory also turned out to have deep connections to other Big Ideas. Entropy, the core concept of thermodynamics, measures the disorder of systems and, paradoxically, their potential for yielding information. The insights of information theory also have helped shed light on the interplay between randomness and order in so-called chaotic phenomena, as well as on the uncertainty principle of quantum mechanics.
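Again as an illustrative sketch rather than anything quoted from the review: the entropy invoked here is the average of that per-message surprisal, H = -Σ p·log2(p), and it sets the floor on how few bits per symbol any digital code can use on average. A predictable, biased source can be compressed further than a maximally disordered one.

import math

def entropy_bits(probs):
    # Shannon entropy: average bits per symbol needed to encode a source
    # whose symbols occur with the given probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit per flip, no shortcut possible
print(entropy_bits([0.9, 0.1]))   # biased coin: about 0.47 bits per flip on average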
The Information: A History, a Theory, a Flood
By James Gleick
Pantheon, 544 pages, $28.95
In the 1990s, the physicist John Wheeler, who Mr. Gleick says was "the last surviving collaborator of both Einstein and Bohr," proposed that all of physics could be recast in terms of information theory. Wheeler dubbed his idea "the it from bit." By "it," Wheeler meant all the components of physical reality, including particles, forces and even space and time. Every "it," he wrote, "derives its function, its meaning, its very existence" from "answers to yes or no questions, binary choices, bits." Today other physicists are beginning to think of the entire universe as a cosmic computer. "Increasingly," Mr. Gleick comments, "the physicists and the information theorists are one and the same."
No author is better equipped for such a wide-ranging tour than Mr. Gleick. Some writers excel at crafting a historical narrative, others at elucidating esoteric theories, still others at humanizing scientists. Mr. Gleick is a master of all these skills. As he traces the evolution of intertwined ideas, he provides vivid portraits of Shannon and other pioneers of our Information Age, including Charles Babbage, whose unbuilt 19th-century "Analytical Engine" anticipated modern computers, and Alan Turing, whose machines helped the Allies crack German codes during World War II.
A key theme throughout "The Information" is that of self-referentiality. When did humans first think about thinking? Write about writing? Conceptualize concepts? What dictionary first defined the word "define"? (It was an English dictionary published in 1582, according to Mr. Gleick.) The ancients knew self-referential conundrums such as the liar's paradox. ("This statement is false.") But in the 1930s Kurt Gödel pinpointed a similar logical knot at the heart of mathematics. His notorious Incompleteness Theorem demolished any hope of discovering a foolproof method for yielding mathematical truth—for, in effect, distinguishing signal from noise.
The inherent limitations of reason loom large toward the end of "The Information," when Mr. Gleick switches his focus to "the flood"—the torrent of information released by our digital technologies. He ponders the same questions that Nicholas Carr fretted over in his recent book, "The Shallows": Is the amount of information inversely proportional to wisdom? As information surges, will noise swamp signals? Will the false and trivial overwhelm the true and meaningful?
Every information technology, Mr. Gleick reminds us, has aroused these sorts of concerns. Plato worried that writing would lead to mental laziness, but without writing we might not know that Plato ever existed. Mr. Gleick concludes on an upbeat note. "Meaningless disorder is to be challenged, not feared," he writes. "We can be overwhelmed or emboldened." Neither information theory—as Shannon often emphasized—nor any other methodology can find meaning for us. Each of us has to discover—or create—meaning on our own.
Mr. Horgan, director of the Center for Science Writings at Stevens Institute of Technology, is the author of "The End of Science" and "Rational Mysticism."