Those bleeding heart military guys:
The most effective way to find and destroy a land mine is to step on it.
This has bad results, of course, if you're a human. But not so much if you're a robot and have as many legs as a centipede sticking out from your body. That's why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however -- an Army colonel -- blew a fuse.
The colonel ordered the test stopped.
Why? asked Tilden. What's wrong?
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
A New Way to look at Networking - 1 hour 20 minute video of a talk at Google given by Van Jacobson who is
...best known for his work in IP network performance and scaling; his work redesigning TCP/IP's flow control algorithms to better handle congestion is said to have saved the Internet from collapsing due to traffic in 1988-1989. This is an incredibly interesting talk with a long and accessible first section that traces the birth and history of packet-based network communications. This is the best thing I've seen explaining the birth of the internet.
Cargill's quandary: "Any design problem can be solved by adding an additional level of indirection, except for the problem of too many levels of indirection."
I basically do three different jobs for my work. I'm a linux system admin, a PHP/MySQL developer, and an HTML/CSS designer (not sure if 'designer' is the right word there.) Of the three the latter has always been my weakest spot.
I first started all this back in 1999 during the dark ages of Netscape Navigator 4. Back then you just built every page out of tables and tried to accept that you didn't really have too much control over exactly how the page was going to look. Over the years this has slowly been changing. CSS (Cascading Style Sheets) came into vogue, and it allows you to separate a lot of the visual formatting of the page out from the more structural HTML. CSS also, in theory at least, gives you a lot more control. The problem has always been that different browsers implement the HTML and CSS specifications in wildly different ways, so most of your effort as a designer goes not into getting a page to look the way you want, but into getting a page to look the way you want in all the different browsers. And just to make matters worse, the most popular browser, Internet Explorer, is by far the least standards-compliant.
Anyway, like I said, this has all been improving, but the improvement is frustratingly slow. Still, IE 7 is much better, and Gecko (Firefox and Mozilla et al) and Safari and Opera pretty much have it all together. So lately I have really been digging into CSS and trying to get fully up to speed. It's still a mess due to browser inconsistencies, but it's at least good enough now that it seems worth the effort to me.
To that end I've been doing a lot of reading, and so I'm going to start linking - for my own memory at least - some of the better resources I have come across. I'll start with CSS guru Eric Meyer's work on style reset. What he has done is to create a baseline CSS file that declares a bunch of rules that are all meant to zero out the different assumptions made by different browsers about how to render a page. It's an effort to create a level playing field, or a common starting point, for making things look the same across browsers.
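Just to give the flavor of the idea (this is not Meyer's actual file, only a stripped-down sketch of what a reset rule looks like):

/* illustrative only: zero out the default margins, padding, and list
   styling that each browser would otherwise apply on its own */
html, body, div, h1, h2, h3, h4, p, blockquote, ul, ol, li {
    margin: 0;
    padding: 0;
    border: 0;
}
ul, ol {
    list-style: none;
}

With something like that at the top of your style sheet every browser starts from the same blank slate, and you then add your own spacing and styling back in deliberately.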
Yahoo has a similar style sheet that they promote in their UI toolkit (which is really sweet btw,) but I think I like Meyer's a bit better. This is really great work that is invaluable to people like me.
I've been having some problems with a new external USB hard drive freezing the Finder when I try to copy a very large file (or group of files) to it. The copy will start out okay, but then freeze after some time. I think the problem is that the drive is spinning down (during the copy!) That seems very strange, but it really does appear to be what's happening. I need to get to the bottom of this, or at least get a better solution in place, but for now I added this to my crontab:
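# keep BigMusic spun up: list the drive every 5 minutes and discard the output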
*/5 * * * * ls /Volumes/BigMusic > /dev/null
and everything appears to be okay. What this does is query the external drive ("BigMusic") for a top level directory listing every 5 minutes (and sends the output of this query into oblivion, so I don't notice it happening.) There is no real point to the command itself; it just forces the drive to stay spun up by touching it every 5 minutes. That is a total hack, so I'm not really happy with it, but at least I can copy files now.
My computer was acting weird on Thursday night, and Friday morning I rebooted after cleaning out some cache files. But it wouldn't boot. I got the post chime, and then the Apple logo on the grey screen, but then it just stuck there with the spinning black wheel. Tried everything I could think of to no avail. In verbose mode (command-v on boot up) I could see it was getting stuck. I could boot into FireWire target disk mode and see and copy my files with no problem. So that was good in one way - I didn't lose any data - but bad in another because it made it seem like a hardware problem (and not with the drive, so that means main logic board, which means game over.)
Still, today I figured it was worth it to reinstall the OS just in case. But I wasn't too hopeful. Thanks to target disk mode it was at least possible. I attached it to another computer, and rebooted that one with an OS X install disk, and then selected my computer as the drive on which to install. Everything went fine, and my computer just booted without a hitch. Applying all the updates now.
Phew.
Project Honeypot looks like a very interesting attempt to curb comment spam (well, it's going after a little more than that, but comment spam is what I am interested in.) A "honeypot" is a server put on the web for the sole purpose of monitoring illicit activity (for example, a server you put on the web that is open to attack, but you mean for it to be attacked so you can observe the attackers - they think it is a "real" server but actually it's just a honeypot meant to trap them.) This project has set up many honeypots all over the web and has been silently collecting data on spammers. Now they are ready to release their data set so that anyone can tap into this collective knowledge. They have an Apache 2 module as well as an API. I'm going to try to write something against the API. This will be great if it works!
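To sketch the sort of thing I'm picturing (in PHP, since that's what I work in) - note that the lookup hostname, the access key, the response layout, and the thresholds below are all assumptions on my part until I've actually read the API docs:

<?php
// Sketch only: the access key, lookup hostname, response octets, and
// threshold values here are assumptions, not the documented API.
$access_key = 'mykey';                   // hypothetical access key
$ip = $_SERVER['REMOTE_ADDR'];           // the commenter's IP address
$reversed = implode('.', array_reverse(explode('.', $ip)));
$query = $access_key . '.' . $reversed . '.dnsbl.httpbl.org';
$answer = gethostbyname($query);
if ($answer != $query) {
    // gethostbyname() returns the query unchanged when the lookup fails,
    // so getting anything else back means the IP is in their data set
    list(, $days_since_seen, $threat_score, $visitor_type) = explode('.', $answer);
    if ($visitor_type > 0 && $threat_score > 20) {
        // looks like a known comment spammer: hold the comment for
        // moderation (or show a captcha) instead of posting it
    }
}
?>

If it really is a simple DNS-style lookup along those lines, the nice part is that it's one cheap query per comment submission, dropped in right before the comment gets saved.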
There is a bug in Quicktime's java handling that opens a pretty serious security hole allowing a malicious website to compromise a user's system. This is pretty embarrassing for Apple, not just because it's a serious security hole, but also because it allows Windows systems with Quicktime installed to be compromised as well (and any computer with iTunes installed has Quicktime installed.) This affects both Firefox (Windows or Mac) and Safari when browsing with java turned on (java, not javascript.) Turning off java in your browser preferences closes the hole. People should definitely do this until a fix is released. (And while this doesn't make it any better - you most likely won't notice any difference browsing with java off, since almost no sites use client-side java applets.)
The datamantic.com domain is six years old today. Time flies.
I've been playing around with the Ext library this week. This is the furthest I've dug into any javascript library. It is really powerful stuff. Pretty mind boggling. Ext lets you create very nice user interfaces out of javascript, and it takes care of all the heavy lifting, most notably ironing out all the cross browser wrinkles. You can see a bunch of examples and demos on this page (click in the left hand column to see dialogs, toolbars, menus, forms, etc...)
So I'm really impressed, and thinking about using this for projects in the future. But there is still something that bugs me about this new world (which I guess people would call Web 2.0, although that doesn't really have a strict definition.) I still think every atomic unit of information (every blog post, every inventory item, every single whatever-it-is you are building a website about) should have its own page (that is, should be addressable by its own unique URI.) That's the whole point of the web, isn't it? I like all the fancy tabbed interfaces with pop up dialogs and whatnot, but if you can't even deep link to a specific item then it's all useless.
But that's not to say you can't build things the right way with Ext. Maybe it's more like Ext just gives you a lot more power with which to go wrong. So you have to be careful. To be very general, I'm thinking that a lot of these fancy UI features will be useful on control pages (i.e., administrative pages; the backend of the website, like [editpage] here) which don't need to be crawled by google or shared by users, and less useful on public-facing content pages.