New Neal Stephenson novel, Quicksilver, due out in September.
In this wonderfully inventive follow-up to his bestseller Cryptonomicon, Neal Stephenson brings to life a cast of unforgettable characters in a time of breathtaking genius and discovery, men and women whose exploits defined an age known as the Baroque. Preview chapter online now.
Computer programs can be very good at textual pattern matching, but they are very bad at semantic matching. Finding every occurrence of 'rose' is no problem; finding every expression of love is impossible.
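To make the contrast concrete, here is a minimal Python sketch (the sample text and the keyword list are my own illustrative inventions, not anything from this post): a regular expression finds every literal 'rose' instantly, while a keyword list is only a crude stand-in for recognizing expressions of love, and it misses the sentence that conveys love without ever using the word.

```python
import re

text = (
    "Every rose has its thorn. "
    "She rose at dawn to make his coffee, as she had for forty years. "
    "I love this song."
)

# Textual pattern matching: trivial. Note that it also matches 'rose' the
# verb, because the program sees characters, not meaning.
occurrences = [m.start() for m in re.finditer(r"\brose\b", text)]
print(f"'rose' occurs {len(occurrences)} times, at offsets {occurrences}")

# "Semantic" matching by keyword: a hopelessly incomplete proxy. It finds
# only the sentence containing the word 'love' and misses the second
# sentence, which arguably expresses love without using the word at all.
love_words = {"love", "adore", "beloved"}
expressions = [
    sentence for sentence in text.split(". ")
    if love_words & {w.strip(",.").lower() for w in sentence.split()}
]
print(expressions)
```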
Given this, and since information on the web is largely found by computer programs (like google), will the web exert pressure (realized or not) on writers to standardize (fossilize?) their use of language?
In other words, will our dependence on google as a means of having our writing discovered by people who are looking for just such things exert pressure on us as authors to use language more uniformly? Or, again, will something like the semantic web emerge, not through marking up our writings with XML tags which specify what we "really mean", but through a general shift towards always using the same word or phrase for a single idea?
You might think of this as the emergent semantic web. Or the bottom-up semantic web. But - and this is the point - you'll have trouble finding all documents on this or any other subject unless we stick to one name or the other.
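A toy sketch of why the name matters for findability, assuming a simple AND-style keyword index (the two posts and their wording are invented for illustration): two documents about the same idea under different names never turn up in the same name-based query.

```python
# Two hypothetical posts that name the same idea differently.
docs = {
    "post_a": "notes on the emergent semantic web",
    "post_b": "notes on the bottom-up semantic web",
}

# Build a tiny inverted index: word -> set of documents containing it.
index = {}
for doc_id, words in docs.items():
    for word in words.split():
        index.setdefault(word, set()).add(doc_id)

def search(query):
    """Return the ids of documents containing every word in the query."""
    hits = [index.get(w, set()) for w in query.split()]
    return set.intersection(*hits) if hits else set()

print(search("emergent semantic web"))   # {'post_a'} only
print(search("bottom-up semantic web"))  # {'post_b'} only
print(search("semantic web"))            # both, but only via the shared words
```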
Will this be good or bad for language? And for humans?