...more recent posts
Blogger undergoes a significant update. Notes on the very nice-looking new designs are here.
From IBM's chief technology officer's speech at the International Electronics Forum:
"Somewhere between 130-nm and 90-nm the whole system fell apart. Things stopped working and nobody seemed to notice." He added, "Scaling is already dead but nobody noticed it had stopped breathing and its lips had turned blue."
We are always seeing stories about "the end of Moore's law," and for years these stories have consistently turned out to be untrue. But this seems a little more specific and a lot more believable.
In a possibly related story, Intel has recently announced a complete change in its future processor roadmap, dropping its massive, and monolithic, P4 flagship in favor of a more energy-efficient dual-core design.
So perhaps CPUs have hit the wall in some sense. But does it really matter? My amateur understanding is that total system performance will still continue to increase, but more and more those increases will come from other links in the chain (from mass storage speed increases, from bus speed increases, etc.) as well as from redesigning toward parallelism.
In other words, while we might not see 6 GHz processors, we will for sure see dual-core 3 GHz processors (very soon), and for most applications this will amount to the same thing. So expect more breathless "Moore's law is invalidated!" stories (even though it's not really a law and can't be invalidated), but don't get too worked up. There is plenty of room still for innovation.
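The trade the paragraph above describes can be sketched in a few lines: instead of one faster core grinding through a job, an application splits the work across two cores. This is a toy illustration, not any particular chip's or vendor's code; the workload (naive prime counting) and all function names are made up for the example.

```python
# Toy sketch: a CPU-bound job run serially vs. split across two worker
# processes, the way a dual-core part lets software trade raw clock
# speed for parallelism.
from concurrent.futures import ProcessPoolExecutor


def count_primes(lo, hi):
    """Naive prime count over [lo, hi) -- deliberately CPU-bound."""
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


def count_primes_parallel(limit, workers=2):
    """Split the range into one chunk per worker (i.e., per core)."""
    step = limit // workers
    chunks = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    los, his = zip(*chunks)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each chunk runs in its own process, so two cores can work at once.
        return sum(pool.map(count_primes, los, his))
```

The result is identical either way; only the wall-clock time changes, which is why, for workloads that split cleanly like this, two 3 GHz cores can "amount to the same thing" as one 6 GHz core.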
Puff piece, but sort of interesting nonetheless, on wearable displays:
Starner expects that the advent of mass-market floating displays will similarly expand what we do with mobile computers. He and his peers predict that someday we'll all be operating in "augmented reality" with the help of our wearable computers. All that really means is we'll eventually have the ability to bring the Web--or some form of it--with us wherever we go.
Clearly the display is the biggest problem going forward (with data entry the second). But will it go this way, or will flexible roll-up displays be the answer? Imagine a cell phone / mobile computer in the form factor of a fat pen. Out of the side of the pen you can pull a flexible screen, like rolling down a window shade. This way you can get an arbitrarily big display in a pocket-sized device. Both technologies are here, and close (two years?) to the retail market.
"Why: Why not?"
Indeed.
Floppy disk drive RAID.
I'm not saying they are there yet, but Apple seems to have gotten something right with the latest iTunes (while also breaking some stuff and going further down the DRM black hole). I think the iMix thing could be huge. See this post for more:
Gyford writes, "It's kind of a social networking tool without really trying. With music as the social lubricant, you have something by which to gauge other people. Never mind those dumb lists of 'My Favourite Music' that Friendster et al suggest you populate. You can see which bands someone actually listens to. After browsing to, say, mechajesus's page, I'm almost interested in getting in touch simply because we share lots of tastes and he also listens to things I've never heard of. For anyone who's all High Fidelity-esque about judging people by their musical tastes (and, let's face it, you can't take anyone seriously who isn't), this is like the proverbial crack cocaine. Well, maybe if you could just download an MP3 with a click ... then I'd have problems leaving my computer to take care of essential bodily functions."
Interesting beta app for the Treo 600 (download link in first post) which gives you lots more control over the camera (it can increase the factory-set quality level from 65 to 99), plus it sort of deals with the low-light blue dot problem. With a small amount of testing I can say that the pictures do look a little better.
Open-Source Mesh Group Releases Software:
The CUWiN project wants to allow self-forming, noncentralized, mesh-based Wi-Fi networks using standard, old PCs with no configuration. Slightly more advanced units could be ruggedized boxes using Compact Flash, but the basic unit would be a 486 or later PC with a bootable CD-ROM or bootable floppy that bootstraps a CD-ROM. Once booted, a unit finds other similar units without any other configuration or control and forms a mesh.
“We’ve been developing software now since about 2000, and our idea is to build software that is super user friendly, super easy for someone who doesn’t understand the nuances of the technology or community wireless networking to set up their own system,” said Meinrath. It’s an attempt to enable community networking to spread beyond the folks who are self-starters.
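The key trick described above is that a freshly booted node finds its peers with no configuration at all. CUWiN's actual protocol isn't detailed here, so as a toy illustration of the idea, here is a sketch of zero-configuration discovery: each node announces itself with a small "hello" datagram, and any listener that hears one learns of a new peer. The packet format, port number, and function names are all invented for this example (the demo runs over loopback rather than a real broadcast).

```python
# Toy sketch of zero-configuration peer discovery: nodes announce
# themselves via small UDP datagrams; listeners collect peer ids.
import json
import socket

DISCOVERY_PORT = 54545  # arbitrary port chosen for this sketch


def make_hello(node_id):
    """Build the tiny 'hello' datagram a node would broadcast on boot."""
    return json.dumps({"type": "hello", "node": node_id}).encode()


def parse_hello(data):
    """Return the announcing node's id, or None for unrelated packets."""
    try:
        msg = json.loads(data.decode())
    except ValueError:
        return None
    return msg.get("node") if msg.get("type") == "hello" else None


# Loopback demo: one socket listens, another announces itself.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", DISCOVERY_PORT))

announcer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
announcer.sendto(make_hello("node-a"), ("127.0.0.1", DISCOVERY_PORT))

data, addr = listener.recvfrom(1024)
peer = parse_hello(data)

listener.close()
announcer.close()
```

A real mesh node would broadcast these announcements on the wireless interface and then build routing tables from the peers it hears, but the "no login, no setup" user experience comes down to exactly this kind of automatic announce-and-listen loop.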
Here is a post with lots of links to very serious deep thinking on this subject. I should have some more to say, and some more specific pointers when I get done plowing through this fascinating stuff.
A completely killer 2 MP camera phone from Samsung that, if the past is any indication, will never ever be released in the U.S.
Okay, I know I have no pull. But can't someone hook me up with a gmail account?
Basic ran on the Dartmouth Time Sharing System, a network of multiple simple terminals connected to a large computer, Kurtz explained. "The development of Basic was a natural step in a whole progression of computer activities that began when I arrived at Dartmouth in 1956," he said. "The whole thrust was to try to make computing easier for people, particularly nonscience and nonengineering people."
My first experience with computer programming was using Basic on a TRS-80 computer at my junior high school.
Around 1960 or so, Kurtz said, he and Kemeny realized that the only way to do that was to develop a time-sharing system that would be especially geared toward small student jobs rather than the "big research stuff."
"The idea was that a time-sharing system made it easy for students or anybody else to get to the computer," Kurtz said. "The user interface to the time-sharing system was very simple. Instead of using things like 'log in' and 'log out,' we used [simple English-language functions] like 'hello' and 'goodbye.'
"We needed a simple language, and that's how Basic got developed," he said. "The languages that were around in those days were just not suitable, so we had to develop one from scratch -- [though] it derived from the existing languages, there's no question about that -- and we also wanted a computing environment where people could use it without having to take a course."