September 12, 2008
I wrote this in July (!!) as a morning-after continuation of this update on my life.
Since it still accurately reflects my thoughts at this time, and since I keep sucking at connecting with old friends in real life, I’m publishing it.
I have started thinking through my career trajectory for the next 2-5 years.
I’ve been lucky thus far. Yeah, there were some rough patches (many depressing rounds of layoffs at Excite, for example), but I’ve had a number of excellent managers, and been able to observe and learn from many awesome co-workers. Webshots paid off both literally (twice!) and figuratively, in all that I’ve had the opportunity to stumble through.
What I didn’t have before, really, was the ability to choose my path.
In 2000, I thought what I really wanted to do was earn enough money to go back and pursue my PhD. I’d even started discussions with my manager at the time to this effect. Thankfully, the layoffs came, and I discovered that the real geniuses in a company (the PhDs) get let go before the inexperienced idjits (that would be, uh, me). And then they go over to Google and make a ton of money.
Er. What was my point?
When I landed at Webshots, there was one thing I’d wanted to accomplish, and I had many thoughts of leaving. Then I developed my weird health issues, and it was all I could do to get to work (almost) every day. And yet, they stuck with me, and I worked it out, and I accomplished and learned quite a bit, and it worked out remarkably well for me.
Three employers, four job titles, six job responsibilities, and eight teams later, here I am.
That kind of change certainly helped keep me from stagnating.
And now… now, I am having a great time, applying lessons on scaling in a new context, increasing the operational complexity I have to tame, and learning (slowly) how to navigate political waters in a large, established company.
Is this where I’ll be in 2 years? Will this be what will make me happiest for the next 18 months?
Honestly, I don’t know.
What I do know is that I intend to intentionally choose what I do next.
July 5, 2008
As you almost undoubtedly know, American Greetings Interactive bought Webshots in October. Operational control transferred over to AGI in March, and with that switch, we lost a lot of talented people who had (to be politically correct about it) been lacking the tools to make Webshots what it should have been.
Most of them have landed on their feet and are doing well at start-ups of one stripe or another, or else taking time off to evaluate their next big thing.
As for me?
I’ve had maybe three full weeks in San Francisco since March.
In November, I accepted AGI’s offer of employment as an architect…in Cleveland.
Yinghua and I spent quite a bit of time meticulously calculating the move. There were certain assumptions about my position within the company, my career options, what the work would look like (would it be interesting?), what the team would look like, housing prices versus salary, quality of life, diversity of the food landscape, and so on, and so forth.
All of which was for naught because all the factors have changed.
There’s a lot of stuff I probably should not blog about, not least of which because some of it’s still in flux, but the most important change is that the position is now in Seattle.
As a result of all of this, I’ve been commuting between San Francisco and either Cleveland or Seattle. Thankfully, mostly I’ve been flying to Seattle.
Oh how I love love love Virgin America.
It’s been a great experience for me.
I’ve rediscovered my love of airplanes and airports. Many years ago, I considered jobs that involved a lot of travel. I was single, I was smoking crack (or something like that), and I loved observing people.
My health issues had made me claustrophobic, and flying became quite stressful for me. But now I’m back to loving it.
Maybe it’s the cute, young, hip flight attendants at Virgin America? Or their delicious cuisine? (No, the flight attendants at Virgin America aren’t edible. Though that would be a good perk for their frequent flyer members!)
I’ve also had the opportunity to stay at some great hotels, and, until this month, it was all on the company’s dime. Who could ask for more?
So, yes, Virginia, I’m alive.
And, for this weekend at least, in San Francisco, with some time to relax (with my in-laws no less), start checking my e-mail (ahem), return some phone calls, …and, oh yeah, post to my blog. Which is the last item in priority, but the second I’ve taken action on (the first being relax!!!!).
February 3, 2008
This Tuesday, I will do two things I’ve never done before.
First, I will vote in a primary. (I’ve shown up at the poll during primaries, just never voted in a primary race.)
Second, I will vote for a Democratic Presidential candidate.
I am registered as unaffiliated and, in California, that means I can vote in the Democratic primary if I choose. This year, I will choose to do so in the hopes of doing my part to see to it that Obama, and not Clinton, gets the Democratic nomination.
And if he does, he will get my vote in November, too.
My Political Philosophy in Six Paragraphs
Not to digress too much, but, briefly, here’s where I’m coming from.
What defines a government is its political processes. Do the processes that exist provide some degree of epistemological certainty that its powers are being used legitimately? Are violations of the processes few, brought to light, and corrected for?
That’s not to say that any of its policies are right, necessarily, just that the decisions on how to use that power are sound.
More practically, there are three real problems with the concentration of power. These affect private organizations too, but governments almost always have more power concentrated in them than any private organization does.
The first is corruption. Power attracts corruption, and can then be used to increase corruption. People who want power will do anything to get it. So you need a government that exposes corruption and corrects for it. You’re never going to eliminate it, and that’s OK–as long as sound processes are usually followed in using power, some amount of corruption can be tolerated and worked around.
The second is consequence. Any action that a powerful government takes has far-reaching consequences. So you need a deliberation process that takes this into account, and you need to implement your policies in a way that makes all consequences–intended and unintended–as transparent as possible. Then you need to alleviate the negative consequences.
The third is momentum. Once you’ve taken an action, it’s mighty difficult to stop reinforcing that action, and near impossible to take a completely different course even if the consequences are disastrous. And while the market can react quickly to changing circumstances, strong, powerful organizations almost never can.
What I Like About Obama
Barack Obama strikes me as a reasonable person, who will make reasonable decisions given his biases.
He’s engaged with scientific and technical communities early in the formation of his policies.
He advocates for more transparency in government.
He seems to recognize the problem of momentum.
He focuses on ideas and policies.
He seems to grasp that what was so despicable about the Iraq War was that it was executed without following sound political processes.
Contrast this with Clinton. She totally misses (or maybe she really doesn’t) that the stupid policies she’s advocated for–from authorizing force in Iraq because, apparently, she thought she was playing a game of chicken, to her original bologna national health care plan, all the way back to her support of the Clipper Chip–would have any negative consequences at all.
And, let’s face it, she and her hubby campaign like people who want power and will do anything to get it.
Another thing in Obama’s favor is his very real dealings with multicultural, multitheological environments. His religious faith was really a journey and seems to be a good parallel to how he makes political decisions. To quote Wikipedia:
Obama writes that he “was not raised in a religious household.” He describes his mother, raised by non-religious parents, as detached from religion, yet “in many ways the most spiritually awakened person that I have ever known.” He describes his Kenyan father as “raised a Muslim,” but a “confirmed atheist” by the time his parents met, and his Indonesian stepfather as “a man who saw religion as not particularly useful.” […] Obama writes: “It was because of these newfound understandings—that religious commitment did not require me to suspend critical thinking, disengage from the battle for economic and social justice, or otherwise retreat from the world that I knew and loved—that I was finally able to walk down the aisle of Trinity United Church of Christ one day and be baptized.”
I am not voting for Obama because I particularly agree with all of his policies.
For example, I do truly believe that nationalized health care will fail, fail utterly, and fail spectacularly. But I also do truly believe that it’s inevitable because we’re tired of mere philosophizing and need first-hand experience. And the alternatives offered by the Republicans aren’t exactly sound.
What Obama seems to offer for this inevitable expansion of government is that he will, again, follow sound political processes and be transparent about its implementation. Which, by the way, gives it a higher chance of succeeding.
And I think, if any politician today could stop the wave, or keep it from knocking over houses when it crashes, or maybe even keep it working well long enough to find a better alternative, it’s Obama.
Of course, it could all be for show. Given that Obama (a) is a politician, (b) is a Democrat, (c) is a candidate for President, and (d) has raised $130M in the last year … well, the chances are good that it’s all an elaborate con. That’s the trouble with power.
Still, I’m voting Tuesday without hesitation. If the hard-core Democrats are smart enough to nominate him, I will, of course, look for more evidence of his reasonableness before November. I think I’ll find plenty.
January 9, 2008
Ever have one of those periods where you feel utterly burned out?
Where you’re itching to do something more meaningful than yet another social network or infotainment aggregator?
Where simply enabling the transient, meandering passions of people just isn’t enough?
Where you think to yourself, “There’s got to be more than this gallery of shiny new objects?”
Where there is something just beyond the horizon in your brain, but you can’t quite make out what it is?
That’s the period I’ve been in.
If you were wondering.
September 22, 2007
I think I’ve finally done it.
This week, I found myself struggling within the confines of existing infrastructure code that had no unit tests. The task was relatively simple: a new feature at the infrastructure-level that would enable many other product-level features in coming months. If I’d gone down the hackish route and not worked at the infrastructure level, I could’ve isolated this code and tested it in complete isolation. But that would just lead to more bugs down the road, and I’d still have a lot of integration testing to do.
It was one of the more painful and slow development experiences in recent memory. Even worse than dealing with 10-year-old, undocumented, legacy Perl code written by non-programmers. Yeah, that painful.
For a long time, I contended that unit testing’s primary benefit was long-term: ensuring that refactoring code which you did not write and do not fully understand does not break. And that you need to reach a certain critical mass in terms of number and variety of tests before it becomes effective. And that’s still true. I think my skepticism rubbed off, too, because I hardly hear anyone saying, “But the unit tests pass, there can’t be a bug” anymore.
But then I started looking at how I approached the practice of programming. Before I write code–often as part of designing the system–I write out stubs of how I’m going to use the code. Because an aesthetically unpleasing or overly complicated API is an error-prone API. And it dawned on me that this is one of the big benefits of test-first development.
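To make the idea concrete, here’s a toy sketch of that write-the-usage-first habit, in shell (not from the original post; `slugify` and its behavior are invented for illustration). The test function is written before the implementation exists, so it doubles as the API design:

```shell
#!/bin/sh
# Hypothetical test-first sketch. The test is written first and pins down
# how the (not-yet-written) function will be called and what it returns.

test_slugify() {
  [ "$(slugify 'Hello, World!')" = "hello-world" ] || { echo "FAIL: basic"; return 1; }
  [ "$(slugify '  spaces  ')" = "spaces" ]         || { echo "FAIL: trim"; return 1; }
  echo "PASS"
}

# Implementation written only after the test above fixed the interface:
# lowercase, collapse non-alphanumeric runs to '-', strip edge dashes.
slugify() {
  printf '%s' "$1" | tr 'A-Z' 'a-z' | tr -cs 'a-z0-9' '-' | sed 's/^-*//; s/-*$//'
}

test_slugify
```

An awkward call site (say, having to pre-lowercase the input yourself) would be visible in the test before any implementation effort was spent, which is the point.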
Last year, when I read Agile Software Development: Principles, Patterns, and Practices (finally after it had been on my bookshelf for years), it was a bit like a revelation. Agile development, really, is formalizing what are otherwise good development-time practices anyway. So I resolved to better formalize my development activities.
It’s taken a while, but now I am so accustomed to using unit tests to isolate problems, and the quick turnaround that entails, that there is a significant mental barrier to any other way of doing it. I’ve turned that corner and am heading down NotReallyUtopiaButGoodEnough Street.
September 18, 2007
Back in June, I blogged about the motherboard on my Dell PowerEdge going out.
Last Wednesday, my GE refrigerator/freezer went out. The fan was still blowing, but the air wasn’t near cold enough.
Most of the food in the freezer thawed before we even realized there was a problem, and the refrigerator was above 45 degrees.
We tried our best to eat all our still-fresh-smelling recently-thawed meat, and bought a bit of dry ice and regular ice to help with the refrigerated stuff, but I wasn’t going to take a chance with most of the food. Replacing $300 of food is less painful than throwing up for 3 days. (No, I don’t need to scientifically verify the fact, thankyouverymuch.)
Today, the friendly GE repairman came and isolated the problem to… the main control board (motherboard). The battery had burned a cute little hole straight through the board.
Unlike the Dell, he replaced it within 15 minutes and was gone. No need to reinstall an OS, even.
Of course, the fridge is several times more expensive and the extended warranty was $80…
July 8, 2007
I’m not dead! Woohoo!
Since moving to Google Reader, I’ve realized just how much stuff I’d been missing doing things manually all this time. And, as with everybody who starts reading feeds, I’ve found it’s completely overwhelming–and trying to read too much is counterproductive. (I’ve tried to share interesting posts on my link blog, but original thinking and writing have suffered.)
I knew this, but it all came to a head a few days ago when Scoble started putting LOL Cats and other crap in his link blog. Now, that in itself isn’t too bad, but one big problem with subscribing to the RSS feed of a shared Google Reader account is you don’t know whose feed you’re reading. You only see the name of the original blog and original author. So I spent an hour trying to track down if maybe Google had messed up, or some other feed was spewing trash, before I realized it was Scoble.
And that he was posting, on average, something like 20 stories a day.
And some of the blogs I thought I was already subscribed to were really coming from Scoble’s feed.
And many other blogs he linked to I was already subscribed to. (Call it osmosis.)
In other words, I didn’t have a handle on my feeds.
So, since I was already feeling miserable this weekend due to my allergies blocking up my ears, I pruned my feeds. Post too much? Gone. Low signal to noise? Gone. Have to click 2 or more times to get to mediocre content (ahem, Artima)? Gone.
I feel so much better about it, and I think I’m down to a reasonable enough traffic that I can at least get through the software, industry analysis, and humor blogs every day. And, while I didn’t completely unsubscribe from all partial feeds, I have segregated them to their own folder, which I will check far less often.
The diff on my OPML file is ugly. I’m keeping it for future reference.
I’m also abandoning Google’s “sort by auto” feature, which prioritizes feeds that are updated less often. It gives a false sense of how much progress I’ve made.
I have in mind a few interesting (to me) things to write about, and with my feeds now under control, and feeling better about things at work, I should get to them in the coming weeks.
June 10, 2007
I’m kicking myself for not thinking of this earlier–fastboot! This allows me to boot my old FC6 system, which allows me to be productive and hold off on configuring a new system (I’ve concluded Kubuntu is just broken; too many things didn’t work for me). I don’t get to take advantage of the new 64-bit system, but I can get some work done. Which is more important?
On the off-chance somebody else may find themselves in a similar position someday, searching Google or Technorati for an answer at 4AM, here’s essentially what I did to get up and running (granted, I took the scenic route). Remember, on the system with the fried motherboard, I have an old IDE primary HDD, but the new system only boots SATA. I also bought my new PowerEdge with 2x160GB SATA drives (“for free” if you believe Dell’s special offers). This means:
- Disk A. Old primary IDE HDD, with working system and no dependencies on secondary drives.
- Disk B. New primary SATA HDD.
- Disk C. New secondary SATA HDD.
If I only had a single new primary SATA HDD, I probably would have bought a cheap IDE-to-SATA adapter from Fry’s to see if the new system would boot it that way. But I try to avoid Fry’s if I can, especially the one in Milpitas.
Step 1. Install new Linux distro on Disk B. Doesn’t really matter which distro at this point, since you’re not going to use it.
Step 2. Copy an image of Disk A onto Disk C. Be sure to connect disk A before booting. In my case, it meant disconnecting the CD-ROM and sitting the old drive on top of the case. Your command might look like this:
$ sudo dd if=/dev/sda of=/dev/sdc
This will, undoubtedly, take a while (I got 18.1MB/s for 80GB data–for a total of 74 minutes). It doesn’t matter if disk A is smaller than disk C.
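Before trusting the image, it’s worth verifying it matches the source. Here’s a sketch of the same dd-then-verify pattern, run on throwaway temp files rather than real devices so it’s safe to experiment with (the file names and sizes are illustrative; on real disks you’d substitute `/dev/sda` and `/dev/sdc`, and a larger `bs` such as `bs=1M` speeds things up considerably):

```shell
#!/bin/sh
# Illustrative only: exercise the copy-and-verify pattern on temp files
# instead of whole disks, so a mistake costs nothing.
src=$(mktemp)
dst=$(mktemp)

# Stand-in for "disk A": 64KB of random data.
dd if=/dev/urandom of="$src" bs=1024 count=64 2>/dev/null

# The copy, as in Step 2.
dd if="$src" of="$dst" bs=1024 2>/dev/null

# Byte-for-byte comparison before relying on the copy.
if cmp -s "$src" "$dst"; then
  echo "copy verified"
else
  echo "copy MISMATCH"
fi

rm -f "$src" "$dst"
```

On real disks of different sizes, `cmp` would report a trailing difference once it runs past the source’s length, so comparing checksums of just the first N bytes (the size of disk A) is the more practical check there.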
As above, an IDE-to-SATA adapter might have worked. But this way, whatever happens to your data as you’re messing with it (including the remote possibility of a poorly-wired new system that fries all your components), you’re just working off a copy.
Step 3. Reboot. This is the only way I know to get Linux to refresh its view of the volumes on disk C. There may be other ways (partprobe, from the parted package, can ask the kernel to reread a partition table without a reboot).
Step 4. fsck your new volumes manually. Just to make sure they’re OK.
$ sudo e2fsck -f -c /dev/sdc1
$ sudo e2fsck -f -c /dev/sdc2
Step 5. Disable init-time fsck. I think the problem with e2fsck failing during init has something to do with moving from IDE to SATA, with the device view switcheroo that entails. Never really figured it out.
$ sudo mkdir /mnt/sdc2
$ sudo mount /dev/sdc2 /mnt/sdc2
$ sudo mv /mnt/sdc2/.autofsck /mnt/sdc2/NO.autofsck
$ sudo touch /mnt/sdc2/fastboot
Step 6. Disable disk B and boot into disk C. In your BIOS, turn off SATA controller 0, or else it’ll never boot disk C.
You should now see the GRUB (or LILO) boot menu from your old system. I tried passing various kernel boot parameters to the system (GRUB allows you to edit the command before booting), but no hint I provided the kernel about the root device or root filesystem prevented the e2fsck failure. Hence, the need for fastboot.
Your experience and results may vary. I’m just excited to be able to get some work done.
June 9, 2007
This is just a rant, nothing more.
This is the first time I’ve run a Debian-based distribution since Debian 1.3.1 in 1997! That was my first Linux distribution, and, aside from the cool package management UI, I was not satisfied with some of its configuration choices. I know the version because I still have the 9 floppies I used at that time.
Debian is also where I learned to pronounce “Linux”–incorrectly! So, I happily went around for a year sounding like a total fool (some would say I’ve sounded like a total fool for the last 30 years, har har). Update: Actually, it was probably the Linux FAQ at li.org with the pronunciation guide for “Linux”.
So far, I am less impressed with Kubuntu than I expected. Sure, it boots fast. But several things that worked out of the box with Fedora, just don’t in Kubuntu.
For example, SCIM/SKIM. I’ve futzed with that for hours, and still no love. Comments on various forums and blogs lead me to believe this is a long-standing problem with Kubuntu/Ubuntu, since everybody’s concluded it just won’t work with KDE. But, that’s patently untrue. FC guides me through a wizard at install, and it works the first time I log in. I’ve spent more time on this than I should. Sure, it’s not a core tool I use–I use it for practicing Chinese and doing a quick sanity check of how well my applications support international input, both of which I could do in Windows–but it’s still irritating.
Also, for some reason, Kubuntu does not install all KDE components. I’m not talking about the games and edutainment, I’m talking about adjusting your monitor resolution. I had to manually install all the KDE components just to do simple configuration tasks. (I do give Kubuntu credit for correctly detecting my widescreen monitor, which I was afraid wouldn’t work given the specs for this on-board video card. Still, I prefer not to run at full resolution because it hurts my eyes.)
Speaking of manual package installation–the whole “desktop” versus “server” downloads for Ubuntu tripped me up. Kubuntu is only the “desktop” option, and there doesn’t appear to be any metapackage that allows me to install all the “server” packages. I’ve tracked down a number of packages–emacs, mysql client, etc., none of which I really consider “server” software–but still have more to install.
The disheartening thing is all this futzing is in addition to the normal upgrade woes. I’ve grown accustomed to removing the god-awful gcj and so on. That takes an hour or two anyway.
This is really why I try to avoid clean installs, and why I tend to stick with the same distro. Upgrades put you through enough pain, but complete reinstalls make it even worse.
I tried booting my old disk, but the new PowerEdge won’t boot IDE disks, grub won’t even see IDE disks for chain loading, and copying the disk image onto a SATA drive results in a phantom boot-time e2fsck failure (despite manual e2fsck runs working just fine). So much for trying to save time until I can truly spare a few days to configure a new system.
Now I’m starting to remember why I considered never buying from Dell again. That motherboard is the first time I’ve had any computer die on me in 20 years. I wouldn’t even be surprised if it caused a chain reaction, perhaps starting with this 9-year old Pentium II sitting around collecting dust, whose (sentimental) disk contents I still haven’t backed up…
Looks like I still have a weekend of configuration to do, then a week of quiet meditation, then maybe I can get some real work done.
June 2, 2007
So, my Dell PowerEdge 400SC has given up on living in a world run by war-mongers. Or, at least, that’s the explanation I prefer for why it has refused to boot since we returned from vacation.
In truth, the motherboards on these are known to have problems with faulty capacitors, causing the board to overheat. Sure enough, mine keeps complaining about a thermal event–that is, when it’ll boot at all. More often than not, the fan powers up for half a second, but not even the diagnostic lights come on. None of my capacitors look to be leaking or even bulging.
I bought mine a few years ago when I could get it cheap. $250, I think, not including the store-bought RAM (never buy RAM from Dell!). My primary motivation for the PowerEdge was support for more RAM, no need to buy an OS (== cheaper), and a bigger tower (== more room for hard drives if you don’t buy them up front from Dell).
I should say I am neither a gamer nor an uber geek–I never need the latest-and-greatest, and, in general, I’m not good at spending money on myself. I do wish to do a fair bit of computation on my home box (which I can’t do currently), as I have some personal projects lined up and I’m growing increasingly spoiled by my work environments.
I’ve considered replacing the motherboard (something I haven’t done in ages, since they started gluing heatsinks to the CPU), and I’ve also looked at HP’s ProLiant, which is a better class machine, has better service, and is only a bit more than the new Dell PowerEdges anyway.
I appreciate any suggestions. What do you run at home for development stuff?