Week 6: Cherry's Chapter 5

Reading Questions, by Zach Tomaszewski

for CIS 702, Fall 2004, taught by Dr. Martha Crosby


1. Cherry explains the purpose of chapter 5 as "conveying to the reader some notion of the nature of the subject of statistical communication theory" (p. 167). How would you explain the purpose or role of statistical analysis in support of Cherry's statement: "Communication theory is a scientific theory; it is not a vague descriptive treatment of everyday ideas of 'information'" (pp. 167-8)?

Statistical communication theory concerns itself with the mathematical concept of information. Cherry is correct that this does not include the more vague, everyday uses of the word information. Instead, the study of information concerns itself only with signals, where each signal prompts a selection or discrimination. As Cherry points out, if there is no doubt or need to make a selection between alternatives, there is no information being transmitted.

However, information alone does not equal human communication. As Cherry states, it is assumed that as long as the signals are transmitted correctly, the resulting messages "will have 'meaning', value, truth, reliability, timeliness, and all their other properties" (p. 168). Even assuming this is true, a statistical or informational theory of communication gives no clue as to the nature of these message properties or how they arise from the transmitted signals.

Since an informational theory of communication concerns itself only with signals, which have some physical manifestation, it can be more objective and directly empirical than other communication theories. After transmitting a great number of signal samples, information theorists can statistically analyze the frequency, variation, and other physical aspects of the signals to discover patterns and limits.
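
As a minimal sketch of this kind of analysis (the sample string and code below are hypothetical, not from Cherry), one could tally the relative frequency of each sign in a recorded sample:

    from collections import Counter

    # A hypothetical sample of transmitted signs; in practice this would be
    # a much larger recorded corpus of signals.
    sample = "ALOHA HONOLULU HALEAKALA".replace(" ", "")
    frequencies = Counter(sample)
    total = sum(frequencies.values())
    for sign, count in frequencies.most_common():
        print(sign, count / total)  # relative frequency of each sign in the sample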

2. What is the information rate of the Hawaiian language (assume N = 16) for one cycle per second (p. 172)? Of the English language (assume N = 32)? Of a third language? (short answer)

Information rate = n log2 N, where n is the number of signs transmitted per second and N is the number of possible signs.
Hawaiian language: 1 log2 16 = 4 bits per second.
English language: 1 log2 32 = 5 bits per second.
Binary: 1 log2 2 = 1 bit per second.

This assumes that only one sign (letter) is being transmitted per second; usually n is much higher. Also, it assumes a random distribution of signs; that is, that each sign is equally likely to occur. While this may be true in binary code, it is rare in natural languages, such as English, where e is much more common than z and so carries less information.
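
The arithmetic above is easy to check in code; a minimal sketch, assuming equiprobable signs (the function name information_rate is mine, not Cherry's):

    import math

    def information_rate(signs_per_second, alphabet_size):
        """Bits per second for equiprobable signs: n * log2(N)."""
        return signs_per_second * math.log2(alphabet_size)

    print(information_rate(1, 16))  # Hawaiian-style alphabet: 4.0 bits per second
    print(information_rate(1, 32))  # English-style alphabet: 5.0 bits per second
    print(information_rate(1, 2))   # binary code: 1.0 bit per second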

3. When considering sources of information, when would you use time averaging (p. 192) versus ensemble averaging (p. 195)? Describe the difference. (short answer)

Time averaging would be used on a stationary source--that is, a source that produces information statistics that are relatively unchanging over a long period of time. Because of this, a series of samples can be taken from this single source and then averaged. An example of a stationary source might be yearly rainfall at a certain location.

A moving source--a source whose statistics exhibit a macroscopic trend--can be averaged using ensemble averaging. With this technique, a number of macroscopically identical sources are sampled at the same time, and the average of their microscopic statistics is computed across all the sources at each moment in time, rather than over time for a single source. An example of a moving source might be daily temperatures. Taking samples over a single year would be less useful than taking an average for each date over a number of years, which would allow you to say that this October 1st is particularly cold, hot, or average.
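
A minimal sketch of the difference, using made-up daily temperature data (the numbers and variable names are hypothetical, purely for illustration):

    import numpy as np

    # Hypothetical records: rows are years (an ensemble of macroscopically
    # identical sources), columns are days of the year.
    rng = np.random.default_rng(0)
    seasonal = 20 + 10 * np.sin(np.linspace(0, 2 * np.pi, 365))
    temps = seasonal + rng.normal(0, 2, size=(30, 365))

    time_average = temps.mean(axis=1)      # one average per year, taken over time
    ensemble_average = temps.mean(axis=0)  # one average per calendar day, taken across years
    print(ensemble_average[273])           # the "typical" temperature for roughly October 1st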

4. In simple English, what is the Capacity Theorem (p. 205)? Should engineers attempt to exceed this limit? (short answer)

The Capacity Theorem states the maximum rate at which information can be transmitted through a noisy channel without error. It is a largely theoretical limit: it guarantees that, up to that rate, it is possible to transmit with an arbitrarily small number of errors, though in practice a channel still produces some errors. If engineers believe the Capacity Theorem, they need not try to exceed this limit, because they can never do it. It is like trying to travel faster than light if you believe the Special Theory of Relativity.
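
One well-known form of this limit, for a continuous channel with Gaussian noise, is Shannon's C = B log2(1 + S/N). A minimal sketch (the bandwidth and signal-to-noise figures are arbitrary examples, not from Cherry):

    import math

    def channel_capacity(bandwidth_hz, signal_power, noise_power):
        """Shannon-Hartley limit: C = B * log2(1 + S/N) bits per second."""
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # e.g. a 3000 Hz channel with a signal-to-noise ratio of 1000 (30 dB)
    print(channel_capacity(3000, 1000, 1))  # roughly 29,900 bits per second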

5. Give an example and explain when you would use Mandelbrot's versus Shannon's "ideal coding" (p. 210). (short answer)

Shannon's "ideal coding" is efficient but requires long time delays. He would pack the most information into each sign (or "letter"), which could best be done with binary data, where each sign carries information independently. On the other hand, Mandelbrot assumes that information comes in words, separated by the "space" sign. Though less efficient, this encoding is more practical for ordinary human communication, which is carried on in words. Words that occur most frequently tend to have lower costs (be shorter and quicker to say).
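
As a small illustration of Mandelbrot's point (the sample sentence below is made up, purely for illustration), one can list each word's frequency alongside its cost in signs, counting the separating space:

    from collections import Counter

    # Hypothetical word stream, only for illustration: in natural language the
    # most frequent words ("the", "a", "of", ...) tend to be the shortest.
    text = "the cost of a word is the number of signs in it plus the space sign".split()
    counts = Counter(text)
    for word, n in counts.most_common():
        print(word, n, len(word) + 1)  # word, frequency, cost in signs (letters + space)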

6. What is entropy (p. 213)? What is "the old problem of Maxwell's demon" (p. 214)? How do they relate to communication (p. 212-216)?

Entropy is the mathematical opposite of information. If information is the reduction of choices and doubt, entropy is the increase in disorder and chaos. The Second Law of Thermodynamics states that any closed system will tend toward increased entropy. This can give a sense of time--as time moves forwards, systems run down.

The "old problem of Maxwell's demon" is a scenario in which a demon is receiving information about the particle motions of a gas, and using that information to set up a perpetual motion machine. Though receiving information, the demon does not seem to be increasing the entropy of the system. However, the demon is actually part of the system, and so it is supposed that his actions do increase the entropy of the system as a whole--more so than he can reduce it by harnessing information.

Entropy, as the opposite of the mathematical definition of information, plays a role in a statistical theory of communication. Essentially, entropy is the doubt removed by the transmission of information.
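
In Shannon's formulation, the entropy of a source, H = -sum(p log2 p), measures that doubt in bits per sign; a minimal sketch (the probability distributions below are arbitrary examples):

    import math

    def shannon_entropy(probabilities):
        """H = -sum(p * log2(p)): the average doubt, in bits, removed per received sign."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))     # fair binary source: 1.0 bit per sign
    print(shannon_entropy([1 / 32] * 32))  # 32 equiprobable letters: 5.0 bits per sign
    print(shannon_entropy([0.9, 0.1]))     # skewed source: about 0.47 bits per sign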


Cherry, Colin. On Human Communication: A Review, a Survey, and a Criticism. Cambridge: Technology Press of the Massachusetts Institute of Technology, 1957.