S E R V E R   S I D E

Probably, like everyone else, I've linked to this before. But here it is again: Neal Stephenson's (short for a book, long for the web) In the Beginning was the Command Line. It takes some time to get through, but Stephenson (Snow Crash, Diamond Age, Cryptonomicon, etc.) is a great writer, and really knows a lot about computers. This might well be the first text on the syllabus for some future college class on the Early History of Computing. So, in case you want to get a jump on the other kids, dig in now. I'll be over here on this $!@#%&' Linux box trying to get that %^&*!!@ wu-ftpd to work by typing cryptic strings of characters into the command line. There's a lot to learn when you start from the beginning.
- jim 2-28-2001 4:28 pm [link] [3 comments]

"Claude Elwood Shannon, the mathematician who laid the foundation of modern information theory while working at Bell Labs in the 1940s, died on Saturday. He was 84."
I don't think you could name anyone who would be clearly more important in the formation of our global techno culture. You can read more about information theory here. "Information Theory regards information as only those symbols that are uncertain to the receiver." Yes, you read that right. "Information" is what is uncertain. A completely random process (if such a thing exists) emits maximum information. The incomprehensible ravings of a madman contain far more information than your nightly news report. As often happens with science, the more you learn about a particular topic, the more you see how scientific knowledge is greatly at odds with the "common everyday understanding" of many things. I thought about this Shannon stuff for a long time and it really turned my whole cognitive landscape inside out.

Here's more from that Lucent page: "The amount of information, or uncertainty, output by an information source is a measure of its entropy. In turn, a source's entropy determines the amount of bits per symbol required to encode the source's information." Shannon worked out in detail the mathematics dictating the maximum amount of information one can transmit over a given communication channel. This work formed the basis of all our computer network engineering, from the early voice (telephone) networks to this internet thing we're talking on now.

But his work also had more abstract influences. As we formalize the mathematics that seem to work in building machines (computers) that mimic human abilities (like communication), we profoundly affect our understanding of how we, as humans, are able to accomplish those operations in the first place. I think Shannon's work on information was a prime motivator in the explosion of theory-of-language and theory-of-consciousness philosophizing that took place in the 50's and 60's (and continues to this day). His basic idea might seem simple today, but it was not in the 40's, and I bet that if you really think about the consequences of information being uncertainty, you'll start to be a little confused too. And confusion (is that uncertainty?) is the first step toward thinking something really informative. The world has lost a great mind.
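To make that "bits per symbol" business a little more concrete, here's a rough sketch of Shannon's entropy formula, H = -sum(p * log2(p)). This is just my own illustration in Python, not anything taken from the Lucent page or from Shannon's papers: it estimates a source's entropy from symbol frequencies, and shows that a predictable source needs far fewer bits per symbol than one where every symbol is equally uncertain.

from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Estimate entropy (bits per symbol) from the symbol frequencies in a message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A repetitive, predictable source carries very little information per symbol...
print(entropy_bits_per_symbol("aaaaaaaaab"))   # about 0.47 bits/symbol
# ...while a source whose ten symbols are all equally likely hits the maximum, log2(10).
print(entropy_bits_per_symbol("abcdefghij"))   # about 3.32 bits/symbol

So the boring, redundant message can be encoded in a fraction of a bit per character, while the maximally "uncertain" one can't be squeezed below about 3.32 bits per character. That's the sense in which uncertainty is information.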
- jim 2-27-2001 3:59 pm [link] [3 comments]
