Ted Leung on the air: Open Source, Java, Python, and ...
Havoc Pennington was writing about the monotone version control system, and concluded with this.
Anyhow, there's unquestionably a lot of room for improvement in our developer tools. One of the questions in my mind is how far you can get if you define the problem as only version control; at some point the really useful stuff would involve the editor, the bug tracker, and so forth as well.
The best version control system (in terms of features, UI is a different story) that I ever used was IBM's CMVC, which integrated very tightly with the bug tracking system. When I used it I also forced it to integrate (badly) with Emacs.
From where I sit, the framing of the problem definitely affects the solution.
Sometime last week I discovered remote growl, which looks to be really useful. I'd like to have some cron jobs on the Debian boxes that I use as servers, and have them report stuff to my PowerBook via growl.
A few days later I discovered this post on a network packet format for Growl. It was neatly juxtaposed with a rant on Jabber. Of course, I then started thinking about how what I really want isn't Growl; it's for Jabber clients, servers, etc. to shape up to the point where I could use Jabber to do much of what I'd use remote growl for. Sure, under some circumstances I'd prefer that my traffic not go through a public XMPP server, but I could always run one inside the firewall for that slice of my traffic. There's even a command line tool, sendxmpp, that would be the perfect basis for doing this from cron/scripts/etc. The problem is that there isn't a client in sight that can do some of the cool stuff that Joe mentioned to me. Having the best, most capable protocol is irrelevant if you don't actually have good software that implements it.
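For the record, what I have in mind is a crontab entry along these lines. The account names are made up, and check sendxmpp's man page for the exact flags -- I'm going from memory on -s, and the credentials would live in ~/.sendxmpprc:

```shell
# Illustrative crontab entry: mail nightly disk stats from a Debian box
# to my Jabber account. Addresses and flags are placeholders.
0 2 * * * df -h / | sendxmpp -s "disk report" me@example.org
```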
At least the Adium boys are increasing their Jabber support, particularly group chats. Selective presence sure would be nice.
If you aren't reading Johanna Rothman's Hiring Technical People, you should be. Here's an example:
Hiring Mistake #2: Hiring for the Future, Not the Present
The second biggest hiring mistake I see is to hire for the eventual future -- but not to create the future from the current reality. I see this mostly when hiring managers and senior staff.
...
Creating a new future is difficult -- possibly the most difficult position a new hire can be in. Hiring managers need to balance the need to create a strategy and act on it, along with the tactical deliverables they already have. Finding someone who can meet the current needs is more important than hiring someone who can only perform the future role. You might even need two different people, one now and someone else later. (But maybe you can coach the person into performing the eventual role.) Always make sure you hire the person who can create the future by working in the present.
Michael McCracken is interested in a shell with better auto completion functionality. I use zsh because it's had super auto completion functionality for a long time. At the hackathon, someone -- Ben Laurie, I think -- was recommending the "new" bash-completion, because it would help with all those gpg commands. The zsh autocompletion for gpg can also complete people's user IDs, which goes beyond what bash-completion does (today, at least). I've been impressed, though, with how fast the bash-completion project is improving. I'd love to see a similar thing for zsh.
In any case, neither of these will produce a code completion style of UI for the command line, which seems to be at the heart of what Michael is asking for. It seems like it shouldn't be that hard, since zsh can already do incremental-search through its history, so it seems like it would be a matter of gluing the appropriate code together.
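For anyone who hasn't turned it on, zsh's completion machinery is just a couple of lines in ~/.zshrc. The zstyle and bindkey lines are optional tweaks, and only a small sketch of what's configurable:

```shell
# Enable zsh's programmable completion system (goes in ~/.zshrc)
autoload -Uz compinit
compinit

# Cycle through candidates with a selectable menu
zstyle ':completion:*' menu select

# Incremental search through history, the feature mentioned above
# (this is the default binding in emacs mode, shown here explicitly)
bindkey '^R' history-incremental-search-backward
```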
At the hackathon, Ben Hyde showed me McCLIM (I think) running under OpenMCL, which would be another source of good ideas for improving the command line experience.
Thinking about shell improvements inevitably led me to thinking about the new MSH that will be part of Longhorn. Jon Udell had a great introduction which demonstrated some of the nice features of the shell. I'm glad to see Microsoft taking a new approach to the command line. Most of what we have is many years old, and I can't believe that we've exhausted all the ways of making command line interfaces better.
According to Martin Fowler, at OOPSLA this year a bunch of patterns people got together and voted some patterns off the island.
Four patterns were voted off. Factory Method (due to the confusion over what it means - the pattern is different to the more common usage of the term), Bridge, Flyweight, and Interpreter. Two patterns, Singleton and Chain of Responsibility, were split decisions.
I'm not sure what the reasoning was other than
people felt they were sufficiently uncommon and that other patterns would probably take their place
I think the reason is that an insufficient number of people are aware of this.
In his ApacheCon keynote last week, Doc Searls talked about the role of modularity, with insights that he's gained during his year-long study of Do It Yourself (DIY) IT. He drew heavily on some ideas from Stewart Brand's How Buildings Learn (the reading list just got taller), particularly the contrasts between magazine vs vernacular architecture and high-road vs low-road construction. He introduced these concepts by using the state of the Alexis Park as a case study, and then branched out from there.
Here are some of the good quotes I was able to capture:
"Linux is free if your time has no value"
"the job of the architect is to bankrupt the builder"
"all buildings are predictions, and all predictions are wrong"
"form follows funding"
Doc feels that a good way to look at the software industry is to look at the construction industry and look for parallels. He views open source software as vernacular architecture and low-road construction. And based on what I learned about those terms, I'd probably agree with that.
He noted that when we build a house, it doesn't usually stay the way it was built, but that it gets rebuilt over time, and that the modularity of construction materials allows this to happen and increases the amount of DIY involved. His argument is that
the chief ideal isn't openness - it's modularity
I found this all to be very interesting and reasonable, although there are differences. Construction materials are not as malleable as software. You can't get more construction materials without buying them, but once you have a piece of software, duplicating it is trivial (modulo legal concerns).
The thing that kind of bothered me is that the argument is all about materials. If your materials have particular properties -- modularity, openness, licensing, etc -- then that's all that is necessary. However, in my view open source is about more than just the materials. I point (for the nth time) at Yochai Benkler's article Coase's Penguin, which abstracts the notion of commons-based peer production from its instantiation as open source software. The properties of the materials do affect the degree to which a field is amenable to commons-based peer production, but they are not the only factor. The ability to match people with tasks, the peer production part, is just as important as getting the properties of the material right. This refers to the "community" aspects of open source projects. When I asked Doc about this issue during Q&A, he didn't have a lot to say -- which is fine -- we're all learning and exploring together here. I do think that looking at the peer-production/community side of the problem is as important as looking at the materials/commons side of the problem. The organization and mobilization of people who are savvy at the essentials of peer-production is an important part of the future of open source software and commons-based peer production efforts of every sort.
I think I'm about to add a new data filtering rule: "If Malcolm Gladwell writes it, I read it". His September piece, The Ketchup Conundrum, is interesting both for the explanations of how mustard/ketchup taste, and for the process which people have used to create and market these condiments.
When I went to dinner with the Sun folks the other night, one of the things that we talked about was the process of writing a book. The formerly unidentified folks in our party were Rich Teer, author of Solaris Systems Programming, his wife Jenny, and Sean Ross. Apparently the Teers have had a long, multiyear slog to get Rich's book done, and today Dave Johnson mentioned that he is working on a book for Manning. Dave is on a tight schedule, which was the same situation for me on my book. It seems that writing a book indicates that you've been through a particular kind of ordeal. Or does it? Guillaume Laforge recently described the welcome package he got for his book on Groovy. I certainly didn't get that kind of a welcome from Wiley/Wrox. I suppose it's just another example of why O'Reilly is one of the top publishers in the computer book space.
I got up to chair Santiago's 8:30AM session (the things you do for friendship) after being up till 3. As we were doing the post session form collecting, etc, Ted Husted came up to me and asked me to sign a copy of my book. This is the first time this has ever happened to me, and I was so stunned that Ted had to tell me what to write. In case you didn't know, Ted is the author of Struts in Action.
Doc Searls was up for today's keynote. Last year was the first time that I heard Doc speak (also at ApacheCon), and I really like the way that he tailors his talks by examining the situation that he finds at the conference. In keeping with his musings on DIY IT, Doc talked about the construction industry and the way that modular/standardized materials facilitate a DIY environment. I've neglected to mention that the conference hotel is under serious construction (the pool is empty, there's rebar in a bunch of places, scaffolding, and plenty of red/yellow plastic tape). In fact, I had to duck under some plastic tape in order to even be able to get to the room for Santiago's session. Doc used the state of the hotel as an example of some of the principles that he was discussing. He showed photo after photo of the state of the hotel, and in many cases, he contrasted photos from the hotel's website with his own photos of the same areas. It was incredibly funny, and a demonstration of Doc's artfulness as a storyteller and speaker. I have some more ponderings that I'll save for another post.
The other session that I went to was the Lightning Talks. This was patterned after the Python Lightning Talks at OSCON. Stefano and Fitz ran this - Stefano found a court jester's hat, and Fitz had a toy that played recordings of Mr. T. The atmosphere was light and circus-like, and for a little while I feared that this was going to be a disastrous session. Fortunately, the community rallied and we had some pretty good 5 minute presentations. I don't remember all of them, but here are the ones that I had notes for. Stefano showed off Agora, William Glass discussed his geek support group's blog, Jeffrey Barnett showed unalog, and Fitz described how he places his entire home directory under svn. While Fitz was talking it occurred to me that it might be interesting to have a session called "Lifehacks of the Apache Community". Maybe next year. All in all, I think that this session went well for its first year. I hope that word will get around (and perhaps some advance warning/PR), and that this will be a place for the long tail to make itself heard.
After the session I was sitting in the exhibit area and Jeffrey Barnett stopped by to talk more about unalog. We talked about how it was and wasn't like del.icio.us (someone asked this during the talk), the context that unalog is being used in (universities) and the importance of the social rather than technological factors of these sorts of sites. Jeffrey also pointed me to a few interesting projects, which I'll record here: Openurl and Sakai.
I always end up having to leave in the middle of the closing plenary in order to catch my flight -- weeknight flights to Seattle are pretty scarce. Fortunately, the closing plenary is dominated by a raffle, which just doesn't float my boat. I would have liked to hear the feedback on the conference, though. From where I sat it was a fun and productive conference, but I have a totally different set of metrics than the regular attendees.
SubEthaEdit 2.1 is out. One undocumented feature is a command line tool to integrate a shell with the native app. The thing that isn't documented is the name of the command line tool. So here it is:
see
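Since the tool is undocumented, the flags below are my best guess from how similar helpers (bbedit, mate) work -- run "see --help" to verify before relying on them:

```shell
# Open a file in SubEthaEdit from the shell:
#   see somefile.txt
# A -w/--wait style flag (if present) makes "see" block until the document
# window closes, which would let it serve as $EDITOR for svn, cvs, etc.:
export EDITOR='see -w'
echo "$EDITOR"
```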
With Day 2 out of the way, I can fully relax. I did my talk in the morning, and it seemed to go well - people came up afterwards, and during the day. I did feel bad that I missed on the timing -- I still don't quite have the pacing down, which is annoying. I was even using PowerPoint 2004 (no fruit please), which has really nice slideshow mode tools for showing your notes, elapsed time, and upcoming slides. Matt Raible came up to introduce himself, on the basis that we were both married to Julies and had eldest daughters named Abigail.
The ConCom managed to rope Miguel de Icaza into giving a keynote. Miguel's talk was mostly a Mono update -- this is a useful talk for the ApacheCon audience, because the Apache community tends to be mostly underinformed about what's going on with the CLR. I always enjoy watching Miguel speak. There's something about his style that I find both entertaining and endearing. After his talk a bunch of us got into an interesting discussion on the merits of Mono/CLR vs Parrot. People who are interested in the details should pester Miguel.
The only talk that I attended was Scott Johnson's talk on the lessons that they learned while building Feedster. I found it interesting that almost all of the problems that he discussed were operational. Problems with hardware in colos, problems with circuit breakers, problems with system software installation, issues related to database administration. As Scott put it -- we are experts at building search software, not at operating big web sites. This reinforces the notion that the success of large internet sites is based on two core competencies, one in the application domain, and the second in operational prowess. Google is the most obvious exemplar of this idea. He did mention one tool that I was unaware of: Jeremy Zawodny's mytop for MySQL.
The rest of the afternoon was filled with interesting conversations with people. I ended up going to dinner with a bunch of people from Sun: Danese, Bruno, Andy Tucker, Dave Johnson, Flip Russel and a few folks whose names I never quite got. By the time we got back from dinner it was midnight. On the way back to my room I ran into Dirk and Stefano, and we ended up going back to the 24 hour lounge, where we stayed up til 3AM with Santiago and Manoj. Late night conversations like these are one of the best parts of ApacheCon.
Here are highlights of the first day of sessions.
Wil Wheaton gave the keynote. The most interesting thing was the real life blog stories including his own. He got great applause when he said that people could record, photograph, etc the performance, as long as it was released under a Creative Commons license.
I'm not attending as many talks this year, but I did go to Matt Raible's talk on Web Frameworks, and learned a bit about frameworks that I haven't had the time to look into, specifically WebWork and Tapestry. Matt's is a very working-programmer's style of talk -- he gives people the stuff they need to get work done. I also went to Howard Lewis Ship's HiveMind session. While I get the technical concepts behind IOC containers, I'm still a little mystified by the amount of excitement over these things. The final session that I attended was Torsten Curdt's session on continuations.
I had a number of good conversations over the course of the day, meeting new friends, and catching up with old friends.
The evening's events consisted of the "Star Trek" reception (again), the PGP keysigning party, a BOF on legal issues, and a trip to see the Incredibles.
At the PGP keysigning, Theo de Winter told me (and a bunch of other people) about CA-Bot, a script for batch signing keys. It also implements an e-mail verification mechanism. This sounded really cool. In fact, I got my messages from Theo this morning. The problem is that the GPG plugin for Mail.app couldn't decrypt the messages, which made things very annoying -- I had to respond to them all by hand. If anyone has a better script for handling this stuff let me know. Robyn convinced me that the use of ID for the keysigning doesn't really help security any, so I'm altering my signing policy appropriately.
The Incredibles is an amazing movie. It's clean, and there were lots of really hilarious moments. It turns out that one of the most hilarious moments occurred on our way to the theater. Roy Fielding got a new car that includes a cool navigation system (my first time riding in a car with such a system). So when Roy turned it on and quickly got directions to the theater (like, faster than I could have done it in MapQuest), and the synthesized voice started giving directions, I exclaimed "I'm gonna write this up". Roy then decided that he was going to show off a little. "Show parking" he intoned. What ensued could only be described as a live version of the Doonesbury strips that lampooned the handwriting in the original Newton. Let's just say that the car was full of laughter. But I do have to admit that it was pretty cool.
Here are some highlights of the second day of the Hackathon:
I did end up spending some time helping people with PGP keys, and the web of trust now extends to ASF developers from Sri Lanka and Japan, among other places. I was sitting at a table that was a bit preoccupied with PGP. Ben Laurie and Robyn Wagner were engaging in a bit of one upmanship regarding the number of signatures per PGP key. This set Ben Hyde off, and we had a discussion about the best way to extend the web of trust. In particular, Ben wanted to find the single person whose key he could sign that would cause the greatest extension of the web of trust. He and Stefano have been collaborating on a bunch of RDF hacks, so (in Dirk's words) Ben got the "emacs look" while writing a Perl script that turns a PGP keyring into N3 triples. He then passed the N3 to Stefano, who fed it to Welkin, the RDF version of his Agora software.
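I never saw Ben's Perl, but the idea is simple enough to sketch. In real life you'd pipe in "gpg --with-colons --list-sigs"; here I've inlined a couple of illustrative records, and the wot: prefix and "signs" predicate are made up for the example:

```shell
# Turn gpg's machine-readable signature listing into N3 triples.
# Field 1 is the record type; field 5 is the key ID for both pub and sig.
OUT=$(awk -F: '
  $1 == "pub"        { key = $5 }              # key currently being listed
  $1 == "sig" && key { printf "wot:%s wot:signs wot:%s .\n", $5, key }
' <<'EOF'
pub:u:1024:17:DEADBEEF00000001:2004-01-01:::u:Alice <alice@example.org>::scaESCA:
sig:::17:DEADBEEF00000002:2004-02-01::::Bob <bob@example.org>:10x:
EOF
)
echo "$OUT"
```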
In the middle there, we joked about the way that we were using open source jujitsu on each other. This, of course, is the ability to get someone to do work for you. You've been particularly effective at this if the other person will actually get great enjoyment out of doing that work.
I was also very happy to meet Robyn in person. The ASF has recently decided that we need to retain our own counsel (a necessity of the times). Robyn's background is in patent law, and I learned a bit about patent law as a specialty. We talked a little bit about the various kinds of "soft" infrastructure needed by open source projects, legal, public relations, etc. She also said that I should try to make it to CodeCon in February, which only reinforced my desire to try to get there next year.
Ben Laurie showed me his cool Sony digital camera, which can take photos in the dark via infrared. He's used the camera to take some cool photos of bats inside a bat house. I wonder how long it will take for this to become a standard feature of digital cameras.
I met David Johnson, the author of Roller, in an interesting way. Ben H, Dirk, Stefano, Santiago, and I were having a discussion about RDF, N3, and AI. We discussed some of the problems with processing RDF, the aspects of RDF/N3 that are amenable to network effects, and all that AI stuff that was done years ago that is making its resurgence via the Semantic Web efforts. As the discussion wound down, David, who had been sitting at the table, turned and introduced himself. Recalling his enthusiastic pre-ApacheCon blog posts, I told him that I hoped we hadn't convinced him that we were all insane.
I talked a bit with Sam about the hacking he's doing on Python for Parrot. I'm looking forward to trying to run Chandler on top of Sam's stuff. One of the things that is a prerequisite for that is the ability to use Python C extensions.
One of our local newspapers did an article on blogging that featured Julie's blog. Be sure to check out her comments on the whole experience.
[Yesterday was day 1, but as with all my conference reports, they are delayed by one day.]
I didn't get a lot of sleep the night before flying out, so I took a bit of dramamine (highly unusual) in anticipation of the boat/limo/flight. This turned out to be a smart move, because there was quite a bit of turbulence on the flight, which totally did not affect me. Due to ferry scheduling, I arrived quite early (in my mind) for the flight. Apparently, this wasn't early enough, because I ended up in the C group of my Southwest flight. I have a love hate relationship with Southwest -- I don't like playing the seating group game (and for this reason I try to fly Southwest when flying with the family, when having a child boosts us to the front of the line). On the other hand, the "food"/snack, and seats are better than lots of other airlines, and the flight attendants generally are pleasant and entertaining. As usual, I got to the Alexis Park too early in the cleaning cycle, so I had to wait to get a room. I headed straight over to the hackathon room, plopped down at a table with Ben and Santiago, and started to make merry.
Not long after that, I went to lunch with Santiago, Ben, the anthropology team from Syracuse (studying FOSS communities, their second year), Sanjiva, and Brian McCallister. One interesting thing from the lunch: Kevin Crowston from Syracuse was surprised to learn that we (the ASF) use CVS to coordinate social activities, etc.
One of the things that the ASF is going to do in the near future is rely much more on PGP signatures and certificates. I spent some time talking with folks from infrastructure about how that's going to happen. Based on that conversation, I'm going to take some time to wander around and get ASF people to get keys and get them signed. We are having a keysigning, but participation isn't as broad as it should be. During the course of these conversations, I got to meet Noel Bergman, who I've wanted to meet in person for a while. Thom and I also spent some time discussing improvements for Planet Apache. I think I was also successful at persuading Ben that we should use krell to generate the config file for PlanetApache. Open source jujitsu at work.
For dinner I went to Gordon Biersch with Stefano, Lars, Dirk, Gregor, Gianugo and a few others. I was the only American at the table. One of the great things about open source is that anybody can play. The geographic distribution of the dinner party was a demonstration of that. I did my hardest laughing all day as Stefano (the Italian) was ragging on Dirk (the Dutchman).
The first two days of ApacheCon are for tutorials, or if you are an ASF committer, "the Hackathon", a chance to sit face to face with other committers and engage in high bandwidth communication. For some of us, it meant the chance to help out a bit with the logistics of the show.
We packed 500 conference bags of swag in an hour. The swag is good this year. The conference bag is nicer than any in recent memory, and there's a USB rechargeable flashlight.
This past Tuesday was my one year anniversary at OSAF. It has been a great experience for me, not only in the technical dimension but in the community dimension. Most of my involvement in the open source community was due to XML/Java stuff that I did at the ASF, and perhaps a little bit due to my involvement with pyblosxom. In the past year, I feel that I'm branching out into the Python community, and I'm looking forward to meeting lots more people as Chandler reaches a state of usability.
Last week when I was in the office, I had my review (it's nice to do that sort of thing in person). I only want to make one comment about the review (other than the fact that I'm not fired ;-)). The word blog/blogging appeared twice in my written review. This was both a surprise and not a surprise to me. Content from the blog occasionally comes up in discussion with folks in the office, and I do occasionally write about OSAF happenings. However, blogging, per se, is not really a direct part of my job responsibilities. It is true that OSAF's organizational culture places a high value on transparency (I mean, what other organization is posting its management level meeting minutes on a public wiki?), and my blogging is a personal reflection of that value. But I'm not an OSAF blogger in the sense that Scoble is for Microsoft or Jeremy Zawodny has become for Yahoo. Since I do have some responsibilities for helping to incubate a Chandler community, it's not surprising that blogging might be viewed as a component of that, which it was. The other place where blogging appeared was in the part of the review where you get evaluated on learning about new technologies and keeping up in the field. This occurrence was the one that was surprising. Abstracting away from me, I think it's interesting that the contents of your blog could be used as input for your review, particularly for this sort of criteria. It's not like the blog is a daily diary of the work that I actually did (although from time to time I contemplate starting a second blog at OSAF to record some of that stuff -- right now we use wiki pages). I wonder what other ways blogging will be integrated into the natural flow of people's jobs.
In response to the semi-(in)famous description of life at Electronic Arts, Charles Miller posted a list of ways to create a good working environment for a programmer:
An office with a door
…and no phone
A culture of asynchronous communication
A fast workstation
…and two monitors. You wouldn’t believe how much difference a second monitor makes
…and their operating system of choice
Good development tools.
A fast Internet connection
Snacks and drinks they don’t need to leave the office for
A good-natured working environment
Flexible working hours
Tasks appropriate to their ability
… and if at all possible, that they find interesting
Investment (emotional or financial) in the end-product
According to this list, I'm doing pretty good. I have an office with a door, but I do have a phone (and an iSight). OSAF does have a culture of asynchronous communication - couldn't do open source any other way. I don't have a fast workstation, but that's because PowerBook G5s don't exist. But, I do have two displays and Mac OS X, my current workstation OS of choice. I have the best tools that you can get for dealing with Python, and a shiny new fast connection. I have 3 meals a day in the office (at home) and the OSAF San Francisco offices are the best in my career for snacks/drinks, etc. Working hours are flexible, the tasks are doable and interesting, and I have a bunch of investment in the final product, the organization and its culture, and my co-workers.
Not bad for my first year.
Today I needed to change the CVSROOT for a repository that has a bunch of modified files in it. Most of the time when this has happened to me I've just checked out a copy from the new server and left it at that. This time, it was going to be too painful. I started to write a script, but fortunately, I got to Google before I started typing. It turns out that CVS's contrib section contains a script to switch roots. I had to Google for it, though - it didn't show up in the Debian packages for cvs either.
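If you'd rather not hunt down the contrib script, the core of the job is just rewriting the CVS/Root file in every directory of the working copy. Here's a rough sketch, using a throwaway demo tree and a placeholder root string (a real tree may also need CVS/Repository adjusted if the repository path changed):

```shell
# Build a tiny stand-in for a CVS working copy
mkdir -p demo/module/CVS
echo ':ext:olduser@oldhost:/old' > demo/module/CVS/Root

# Point every CVS/Root file in the tree at the new repository
NEWROOT=':ext:user@newhost.example.org:/cvsroot'
find demo -type f -path '*/CVS/Root' | while read -r f; do
    printf '%s\n' "$NEWROOT" > "$f"
done

cat demo/module/CVS/Root
```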
While I'm not having problems with downloading Debian packages (I have a cron job that downloads updated packages daily), I think that additional uses for bittorrent are a good thing. apt-torrent is a proxy for apt that uses bittorrent as the transport.