FIGURE 3.2. Slot machine with three rotating fruit pictures, illustrating the concepts microstate and macrostate. (Drawing by David Moser.)
A collection of microstates that share some large-scale property, for example, “pictures all the same—you win” versus “pictures not all the same—you lose,” or “molecules clumped together—we can’t breathe” versus “molecules uniformly spread out—we can breathe,” is called a macrostate of the system. A macrostate can correspond to many different microstates. In the slot machine, there are many different microstates consisting of three nonidentical pictures, each of which corresponds to the single “you lose” macrostate, and only a few microstates that correspond to the “you win” macrostate. This is how casinos are sure to make money. Temperature is a macrostate—it corresponds to the many different possible microstates of molecules whose velocities happen to average out to the same temperature.
Using these ideas, Boltzmann interpreted the second law of thermodynamics as saying simply that an isolated system is more likely to be in a more probable macrostate than in a less probable one. To modern ears this sounds like a tautology, but at the time it was a revolutionary way of thinking about the subject, since it introduced the notion of probability into physical law. Boltzmann defined the entropy of a macrostate as a function of the number of microstates that could give rise to that macrostate. For example, on the slot machine of figure 3.2, where each picture can come up “apple,” “orange,” “cherry,” “pear,” or “lemon,” there are a total of 5 × 5 × 5 = 125 possible combinations (microstates), of which five correspond to the macrostate “pictures all the same—you win” and the remaining 120 correspond to the macrostate “pictures not all the same—you lose.” The latter macrostate clearly has a higher Boltzmann entropy than the former.
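This counting is easy to check directly. The short sketch below (a Python illustration, not from the book, with the five picture names taken from the text) enumerates every microstate of the slot machine, groups the microstates into the two macrostates, and compares the logarithms of the two counts, which is what Boltzmann entropy measures up to a constant factor.

```python
from itertools import product
from math import log

# The five possible pictures in each of the three windows (from the text).
PICTURES = ["apple", "orange", "cherry", "pear", "lemon"]

# Every possible setting of the three windows is one microstate.
microstates = list(product(PICTURES, repeat=3))
assert len(microstates) == 125  # 5 * 5 * 5 combinations

# Group the microstates into the two macrostates described in the text.
win = [m for m in microstates if m[0] == m[1] == m[2]]
lose = [m for m in microstates if not (m[0] == m[1] == m[2])]
print(len(win), len(lose))  # 5 and 120

# Boltzmann entropy of a macrostate grows with the logarithm of the
# number of microstates that realize it (constant factor omitted).
print(log(len(win)), log(len(lose)))  # "lose" has the higher entropy
```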
FIGURE 3.3. Boltzmann’s tombstone, in Vienna. (Photograph courtesy of Martin Roell.)
Boltzmann’s entropy obeys the second law of thermodynamics. Unless work is done on it, an isolated system will move toward macrostates of ever higher Boltzmann entropy until it reaches the macrostate with the highest possible entropy. Boltzmann was able to show that, under many conditions, his simple and intuitive definition of entropy is equivalent to Clausius’s original definition.
The actual equation for Boltzmann’s entropy, now so fundamental to physics, appears on Boltzmann’s tombstone in Vienna (figure 3.3).
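For reference, the equation engraved there reads, in modern notation,

$$ S = k \log W, $$

where W is the number of microstates corresponding to the given macrostate and k is the constant now known as Boltzmann’s constant.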
Shannon Information
Many of the most basic scientific ideas are spurred by advances in technology. The nineteenth-century studies of thermodynamics were inspired and driven by the challenge of improving steam engines. The studies of information by mathematician Claude Shannon were likewise driven by the twentieth-century revolution in communications—particularly the development of the telegraph and telephone. In the 1940s, Shannon adapted Boltzmann’s ideas to the more abstract realm of communications. Shannon worked at Bell Labs, a part of the American Telephone and Telegraph Company (AT&T). One of the most important problems for AT&T was to figure out how to transmit signals more quickly and reliably over telegraph and telephone wires.
Claude Shannon, 1916–2001. (Reprinted with permission of Lucent Technologies Inc./Bell Labs.)
Shannon’s mathematical solution to this problem was the beginning of what is now called information theory. In his 1948 paper “A Mathematical Theory of Communication,” Shannon gave a narrow definition of information and proved a very important theorem, which gave the maximum possible transmission rate of information over a given channel (wire or other medium), even if there are errors in transmission caused by noise on the channel. This maximum transmission rate is called the channel capacity.
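One well-known concrete instance of this result, the Shannon–Hartley form, is stated here for illustration rather than drawn from the text: for a channel of bandwidth B hertz disturbed by Gaussian noise, the capacity is

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

bits per second, where S/N is the ratio of signal power to noise power. Shannon’s theorem says that transmission at any rate below C can be made as reliable as desired with suitable coding, while no coding scheme can sustain a rate above C.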
Shannon’s definition of information involves a source that sends messages to a receiver. For example, figure 3.4 shows two examples of a source talking to a receiver.