
Richard Hamming (known for the Hamming code, among other things) said that "information theory" ought to have been called "communication theory". Obviously, communication theory is a field in itself, but I think the gist of his argument holds: information theory says little about what information intrinsically is, and instead tells you how much information can be _learned_.

But I think the intuition breaks down there. What is mutual information, then? Classically, it is the information that can be reliably transmitted between a transmitter and a receiver (given some model of both). Yet the unit is still bits.
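
For a concrete sense of the units, here is a minimal sketch (Python, with a made-up joint distribution purely for illustration) that computes entropy and mutual information, both in bits:

    import math

    # Hypothetical joint distribution p(x, y) over two binary variables,
    # made up purely to illustrate the calculation.
    p_xy = {
        (0, 0): 0.40, (0, 1): 0.10,
        (1, 0): 0.10, (1, 1): 0.40,
    }

    # Marginal distributions p(x) and p(y).
    p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

    def entropy(dist):
        # Shannon entropy in bits: H = -sum p * log2(p)
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), also in bits.
    mi = entropy(p_x) + entropy(p_y) - entropy(p_xy)
    print("H(X)   =", round(entropy(p_x), 3), "bits")   # 1.0
    print("I(X;Y) =", round(mi, 3), "bits")             # ~0.278

The point being that I(X;Y) = H(X) + H(Y) - H(X,Y) is measured in the same bits as entropy, even though it describes a relationship between two variables rather than a single source.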




Communication theory is far too narrow a subfield.

Physics has a lot to say about this. Physicists have done a lot of information-theoretic analysis of the laws of physics, and physics-theoretic analysis of information theory, to the point that information theory can no longer be disentangled from the physics itself. That is, which information-processing tasks (including, but not limited to, communication tasks) are possible, and with what complexity, depends on the laws of the universe we live in. Conversely, which universes are even possible (those with "reasonable" laws) is constrained by the theorems of information theory.


Just to be clear, I don't disagree. While "information theory" is a grand name, "communication theory" would not capture its general applicability.


Do you know the exact source of that quote? It might be useful for a future paper of mine in this area. Norman Abramson also held a similar opinion.


Yes, it is from Hamming's "The Art of Doing Science and Engineering".

From page 89:

> Information Theory was created by C. E. Shannon in the late 1940s. The management of Bell Telephone Labs wanted him to call it “Communication Theory” as that is a far more accurate name, but for obvious publicity reasons “Information Theory” has a much greater impact—this Shannon chose and so it is known to this day. The title suggests the theory deals with information—and therefore it must be important since we are entering more and more deeply into the information age.

Then later, on page 90:

> First, we have not defined “information”, we merely gave a formula for measuring the amount. Second, the measure depends on surprise, and while it does match, to a reasonable degree, the situation with machines, say the telephone system, radio, television, computers, and such, it simply does not represent the normal human attitude towards information. Third, it is a relative measure, it depends on the state of your knowledge. If you are looking at a stream of “random numbers” from a random source then you think each number comes as a surprise, but if you know the formula for computing the “random numbers” then the next number contains no surprise at all, hence contains no information! Thus, while the definition Shannon made for information is appropriate in many respects for machines, it does not seem to fit the human use of the word. This is the reason it should have been called “Communication Theory”, and not “Information Theory”. It is too late to undo the definition (which produced so much of its initial popularity, and still makes people think it handles “information”) so we have to live with it, but you should clearly realize how much it distorts the common view of information and deals with something else, which Shannon took to be surprise.

> This is a point which needs to be examined whenever any definition is offered. How far does the proposed definition, for example Shannon’s definition of information, agree with the original concepts you had, and how far does it differ? Almost no definition is exactly congruent with your earlier intuitive concept, but in the long run it is the definition which determines the meaning of the concept—hence the formalization of something via sharp definitions always produces some distortion.
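
To make the "relative measure" point concrete, here is a small sketch (Python; the seed and stream length are arbitrary choices) contrasting two observers of the same pseudorandom stream, one who models it as fair coin flips and one who knows the generating formula:

    import math, random

    # The "formula": a seeded PRNG standing in for Hamming's stream of
    # "random numbers". Seed and length are arbitrary choices.
    random.seed(42)
    stream = [random.getrandbits(1) for _ in range(1000)]

    # Observer A does not know the formula and models each bit as a fair
    # coin flip: every bit carries -log2(1/2) = 1 bit of surprise.
    surprise_a = sum(-math.log2(0.5) for _ in stream)

    # Observer B knows the formula (the seed), reproduces the stream
    # exactly, and assigns each bit probability 1: zero surprise per bit.
    random.seed(42)
    assert [random.getrandbits(1) for _ in range(1000)] == stream
    surprise_b = sum(-math.log2(1.0) for _ in stream)

    print(surprise_a, "bits of information for observer A")  # 1000.0
    print(surprise_b, "bits of information for observer B")  # 0.0

Same stream, 1000 bits of surprise for one observer and zero for the other, which is exactly the sense in which Shannon's measure depends on the state of your knowledge.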


Thank you



