Thirty years ago this week I started my first full-time, paid position as a software developer. I'm not sure of the actual day; when I moved to Philadelphia I purged a lot of old papers, and the offer letter for this job was one of them. I was a nineteen-year-old who had left college (that's another story), and whose father had been very clear about the consequences. Fortunately, the winter of 1983/84 was a good time to be looking for software jobs in the Boston area.
I landed at Think Technologies, the creators of Macintosh Pascal, as an entry-level programmer. My chief contribution to that product was a demo program that ended up as the box artwork (also shown in the link above).
I started the same week that the Macintosh was released. As a result, I saw neither the machine nor the product while I was interviewing. What I did see was the Apple Lisa, with its mouse and graphical UI, and the sense of “this is something big” moved Think to the top of my list.
In retrospect, Think may have been one of the most valuable jobs of my career, and not just for getting in on GUIs at the beginning. I learned a lot about startups, and about the pain of real-world software development. I also learned to keep my ego in check: the company was filled with bright people, and Mel, the person whose ideas were the basis of the product, combined a particularly creative career with a reserved, self-effacing exterior.
And I learned that no job lasts forever. Eventually, you reach an intellectual or perceptual plateau, and it's time to move on to something new. As a professional, you must of course balance your personal desire for growth against your responsibility to see your work through to production. But it's easy to get trapped on the plateau, and that's something I quickly learned to avoid.
The intervening years have seen a lot of change in the industry: graphical user interfaces are now the standard, as are integrated development environments (Macintosh Pascal wasn't the first IDE that I used, but it was the first good one); computer networks are pervasive, as is the resulting knowledge-web of Internet-hosted documentation and anonymous helpful strangers; web-apps have taken us back to a central server with dumb terminals, although client-side JavaScript is an echo of the micro-computer revolution; and lastly, the machines are almost infinitely more powerful.
But programming is still about moving bits from one place to another without dropping any.