"Claude Elwood Shannon, the mathematician who laid the foundation of modern information theory while working at Bell Labs in the 1940s, died on Saturday. He was 84."
I don't think you could name anyone who would be clearly more important in the formation of our global techno culture. You can read more about information theory here. "Information Theory regards information as only those symbols that are uncertain to the receiver." Yes, you read that right. "Information" is what is uncertain. A completely random process (if such a thing exists) emits maximum information. The incomprehensible ravings of a madman contain far more information than your nightly news report.

As often happens with science, the more you learn about a particular topic, the more you see how scientific knowledge is greatly at odds with the "common everyday understanding" of many things. I thought about this Shannon stuff for a long time, and it really turned my whole cognitive landscape inside out. Here's more from that Lucent page: "The amount of information, or uncertainty, output by an information source is a measure of its entropy. In turn, a source's entropy determines the amount of bits per symbol required to encode the source's information." (There's a little sketch at the end of this post that makes the bits-per-symbol idea concrete.)

Shannon worked out in detail the mathematics dictating the maximum amount of information one can transmit over a given communication channel. This work formed the basis of all our computer network engineering, from the early voice (telephone) networks to this internet thing we're talking on now.

But his work also had more abstract influences. As we formalize the mathematics that seem to work in building machines (computers) that mimic human abilities (like communication), we profoundly affect our understanding of how we, as humans, are able to accomplish those operations in the first place. I think Shannon's work on information was a prime motivator in the explosion of theory-of-language and theory-of-consciousness philosophizing that took place in the '50s and '60s (and continues to this day).

His basic idea might seem simple today, but it was not in the '40s, and I bet that if you really think about the consequences of information being uncertainty, you'll start to be a little confused too. And confusion (is that uncertainty?) is the first step toward thinking something really informative. The world has lost a great mind.
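
Here's that sketch. It's just my own toy in Python (not anything from Shannon's papers or the Lucent page): it estimates the entropy of a message from its symbol frequencies, using Shannon's formula H = -sum(p * log2(p)).

import math
from collections import Counter

def entropy_bits_per_symbol(message):
    # Count how often each symbol appears, turn counts into
    # probabilities, and sum -p * log2(p) over all symbols.
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

print(entropy_bits_per_symbol("aaaaaaaaab"))  # ~0.47 bits/symbol: predictable, little information
print(entropy_bits_per_symbol("abcdefghij"))  # ~3.32 bits/symbol: every symbol uncertain

Feed it a page of news copy and then some random noise, and you'll see the counterintuitive point above in numbers: the noise carries more bits per symbol than the news.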
- jim 2-27-2001 3:59 pm

wow. ive never felt as confused as i am now. i must be brimming with information. what was that about entropy? im feeling a little sluggish. am i getting smarter? yaknow -- bitsmatter. 4 score and the edge of 17. if pies are round, why is pi r squared?
- dave 2-27-2001 4:22 pm


01010100001101101001101101110010000001111010101010101010101010010101000101010010111101000101010000110110100110110111001000000111101010101010101010101001010100010101001011110100010101000011011010011011011100100000011110101010101010101010100101010001010100101111010001010100001101101001101101110010000001111010101010101010101010010101000101010010111101000101010000110110100110110111001000000111101010101010101010101001010100010101001011110100010101000011011010011011011100100000011110101010101010101010100101010001010100101111010001010100001101101001101101110010000001111010101010101010101010010101000101010010111101000101010000110110100110110111001000000111101010101010101010101001010100010101001011110100nytobit


- bill 2-27-2001 4:47 pm


Reminds me of what Alex was saying the other day: that whenever he reads or sees anything in the media on a subject with which he is personally familiar, it is inevitably full of inaccuracies. This has been my experience too.
- steve 2-28-2001 1:16 am




