...more recent posts
Slurpr. Not really practical, but incredibly cool nonetheless. The compact Slurpr box lets you connect wirelessly to up to 5 access points at once and bond that bandwidth into one super large connection (well, sort of; it really just load balances between the connections, but still...). A great-looking hack of an elegant idea. Nice job.
I've had a few days of relatively luxurious solitude to get some work done. Unfortunately I didn't do that much coding. But not because I was slacking off; it's just that I'm up against a really hard problem. Although maybe that does count as slacking off, because this is what I enjoy most about working - battling through the early conceptual stages; trying to fit the whole problem in my head; working through all the possibilities without much to go on, or much to hem me in. I pace around a lot. I make notes like mad when I get a lead. I talk to myself. I pace some more. I have a little porch here, and sometimes I go out there and pace and talk to myself some more. Anyway, it's just a really fun time, before the work of limiting your possibilities that some people call 'actually producing something specific.' Joking. But just now I think I nailed the last and most elusive part. It feels great. Cracked open a beer in its honor (I'm in a much different time zone, so this is more reasonable than the footer on this post makes it seem.) So I just thought I would share. Cheers.
Spamtrap "is an interactive installation piece" consisting of a computer connected to the internet which receives spam emails, automatically prints them out, and feeds the results directly into a bottom-mounted paper shredder. Go check out the picture. Nice one. Too bad it can't feed the spammers directly into the shredder though.
Those bleeding heart military guys:
The most effective way to find and destroy a land mine is to step on it.
This has bad results, of course, if you're a human. But not so much if you're a robot and have as many legs as a centipede sticking out from your body. That's why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however -- an Army colonel -- blew a fuse.
The colonel ordered the test stopped.
Why? asked Tilden. What's wrong?
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
A New Way to Look at Networking - a 1 hour 20 minute video of a talk at Google given by Van Jacobson, who is
...best known for his work in IP network performance and scaling; his work redesigning TCP/IP's flow control algorithms to better handle congestion is said to have saved the Internet from collapsing due to traffic in 1988-1989.
This is an incredibly interesting talk, with a long and accessible first section that traces the birth and history of packet-based network communications. This is the best thing I've seen explaining the birth of the internet.
Cargill's quandary: "Any design problem can be solved by adding an additional level of indirection, except for too many levels of indirection."
I basically do three different jobs for my work. I'm a linux system admin; a PHP/MySQL developer; and an HTML/CSS designer (not sure if 'designer' is the right word there.) Of the three, the latter has always been my weakest spot.
I first started all this back in 1999, during the dark ages of Netscape Navigator 4. Back then you just built every page out of tables and tried to accept that you didn't really have much control over exactly how the page was going to look. Over the years this has slowly been changing. CSS (Cascading Style Sheets) came into vogue, allowing you to separate a lot of the visual formatting of a page from the more structural HTML. CSS also, in theory at least, gives you a lot more control. The problem has always been that different browsers implement the HTML and CSS specifications in wildly different ways, so most of your effort as a designer goes not into getting a page to look the way you want, but into getting a page to look the way you want in all the different browsers. And just to make matters worse, the most popular browser, Internet Explorer, is by far the least standards-compliant.
Anyway, like I said, this has all been improving, but the improvement is frustratingly slow. Still, IE 7 is much better, and Gecko (Firefox and Mozilla et al) and Safari and Opera pretty much have it all together. So lately I have really been digging into CSS and trying to get fully up to speed. It's still a mess due to browser inconsistencies, but it's at least good enough now that it seems worth the effort to me.
To that end I've been doing a lot of reading, and so I'm going to start linking - for my own memory at least - some of the better resources I have come across. I'll start with CSS guru Eric Meyer's work on style reset. What he has done is to create a baseline CSS file that declares a bunch of rules that are all meant to zero out the different assumptions made by different browsers about how to render a page. It's an effort to create a level playing field, or a common starting point, for making things look the same across browsers.
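To make the idea concrete, here are a few rules in the spirit of a reset stylesheet. This is not Meyer's actual file (go to his site for that), just an illustration of the kind of zeroing-out these resets do:

```css
/* Illustrative reset rules (not Meyer's actual file): zero out the
   margins, padding, and borders that browsers apply by default. */
html, body, div, h1, h2, h3, p, blockquote, ul, ol, li {
    margin: 0;
    padding: 0;
    border: 0;
    font-size: 100%;
}

/* Kill the default bullets and numbering on lists. */
ul, ol {
    list-style: none;
}

/* Unify the table border model across browsers. */
table {
    border-collapse: collapse;
    border-spacing: 0;
}
```

With a baseline like this at the top of your stylesheet, every margin and bullet you see afterwards is one you added yourself, rather than a browser default you have to hunt down.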
Yahoo has a similar style sheet that they promote as part of their UI toolkit (which is really sweet, btw), but I think I like Meyer's a bit better. This is really great work, and it's invaluable to people like me.
I've been having some problems with a new external USB hard drive freezing the Finder when I try to copy a very large file (or group of files) to it. The copy will start out okay, but then freeze after some time. I think the problem is that the drive is spinning down (during the copy!). That seems very weird, but it really does appear to be what's happening. I need to get to the bottom of this, or at least get a better solution in place, but for now I added this to my crontab:
*/5 * * * * ls /Volumes/BigMusic > /dev/null
and everything appears to be okay. What this does is query the external drive ("BigMusic") for a top-level directory listing every 5 minutes, and then send the output of that query into oblivion so I don't notice it happening. The listing itself is pointless; the whole point is that making the query every 5 minutes forces the drive to stay spun up. It's a total hack, and I'm not really happy with it, but at least I can copy files now.
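If the spin-down theory is right, a possibly cleaner fix (assuming you don't mind it applying to every disk on the machine, not just the external one) would be to tell OS X's power manager never to spin disks down at all, via pmset:

```shell
# Tell the power manager never to spin disks down (0 = never).
# Note: this setting is system-wide, not per-drive.
sudo pmset -a disksleep 0

# Verify the current setting:
pmset -g | grep disksleep
```

The downside is that the internal drive never sleeps either, which costs a little power, but it avoids polluting the crontab with a do-nothing job.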
My computer was acting weird on Thursday night, and Friday morning I rebooted after cleaning out some cache files. But it wouldn't boot. I got the POST chime, and then the Apple logo on the grey screen, but then it just stuck there with the spinning black wheel. Tried everything I could think of, to no avail. In verbose mode (command-V on boot up) I could see where it was getting stuck. I could boot into FireWire target disk mode and see and copy my files with no problem. So that was good in one way - I didn't lose any data - but bad in another, because it made this seem like a hardware problem (and not with the drive, so that would mean the main logic board, which means game over.)
Still, today I figured it was worth it to reinstall the OS just in case. But I wasn't too hopeful. Thanks to target disk mode it was at least possible. I attached it to another computer, and rebooted that one with an OS X install disk, and then selected my computer as the drive on which to install. Everything went fine, and my computer just booted without a hitch. Applying all the updates now.
Phew.
Project Honeypot looks like a very interesting attempt to curb comment spam (well, it's going after a little more than that, but comment spam is what I am interested in.) A "honeypot" is a server put on the web for the sole purpose of monitoring illicit activity - like a server you put on the web that is open to attack, but you mean for it to be attacked so you can observe the attackers; they think it is a "real" server, but actually it's just a honeypot meant to trap them. This project has set up many honeypots all over the web and has been silently collecting data on spammers. Now they are ready to release their data set so that anyone can tap into this collective knowledge. They have an Apache 2 module as well as an API. I'm going to try to write something against the API. This will be great if it works!
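As a first sketch of what writing something against the API might look like: their lookup service is DNS-based, where you prepend an access key to the visitor's IP with its octets reversed and query their zone, and a listed IP comes back as a 127.x.y.z answer encoding days-since-last-seen, a threat score, and a visitor type. The key, IP, and zone name below are placeholders from my reading of their docs, so treat this as a hedged sketch, not working code:

```shell
#!/bin/sh
# Hypothetical sketch of a Project Honeypot lookup. "yourapikey" is a
# placeholder; real keys are issued when you sign up with the project.

# Reverse the octets of an IPv4 address (DNS blacklists query this way).
reverse_ip() {
    echo "$1" | awk -F. '{ print $4"."$3"."$2"."$1 }'
}

# Decode a hypothetical 127.<days>.<threat>.<type> answer into its fields.
parse_response() {
    days=$(echo "$1" | cut -d. -f2)
    threat=$(echo "$1" | cut -d. -f3)
    type=$(echo "$1" | cut -d. -f4)
    echo "days=$days threat=$threat type=$type"
}

# The actual lookup would be something like (needs a real key + network):
#   host "yourapikey.$(reverse_ip 192.0.2.55).dnsbl.httpbl.org"

reverse_ip 192.0.2.55      # -> 55.2.0.192
parse_response 127.3.45.1  # -> days=3 threat=45 type=1
```

A real comment-spam check would do the DNS lookup for each commenter's IP and hold or reject the comment whenever the threat score comes back above some threshold you choose.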