
And now for something completely different.
I’ve spent the last forty — not quite, but almost — years of my life paying the rent by writing computer code. It hasn’t been a bad living, and I can’t complain. It’s also been more fun than not: nothing deep about it, really, but intellectually interesting enough to make the days pass quickly.
I always tell people that it’s like solving crossword puzzles for a living, which is not exactly right but isn’t too far off, at least as regards the hour-by-hour texture of the work. It’s very finical. A letter out of place spoils the whole thing.
The element that isn’t captured by that description is that there’s a certain very modest scope for creativity. A crossword puzzle, after all, has only one right answer, but any programming problem has many, and coders, like Talmudists, dearly love to wrangle about why solution A is better than solution B. There’s such a thing as elegant code, and such a thing as ugly, clunky code, and we all aspire to write the former.
This too is fun. People who don’t appear macho at all suddenly become very assertive. And people who you would think have no taste at all — judging by the way they dress, and the books they read — suddenly come out as aesthetes, and make good their claim to the title.
Alas, that’s all gone now. At least in the corporate world where I still have to make a living. I bet it’s still alive in the open-source world — in fact, I know it is. But in the sweatshops where most of us coders earn our bread, the glory has departed. It’s mere drudgery now: Taylorized and overmanaged. One of the chief villains is something called ‘Agile methodology’, nicely satirized in the TV series ‘Silicon Valley’ as ‘scrum’. ‘Agile’ is a horrible neocult, and it has become the bane of my existence.
If you read the Agile manifesto or the Wikipedia entry, it doesn’t sound too unreasonable, though the smarminess of the prose ought to ring alarm bells in any reasonably alert and cautious person’s mind.
But what it has led to in practice is an extraordinary bureaucratization of programming, accompanied by an extraordinary proliferation of cult-speak. For example, there really are ‘scrums’. In fact there’s one every day, usually in the morning. Scrums are presided over by ‘scrum masters’. I am not making this up.
I believe it is now possible to get some kind of credential as a scrum master. Scrum masters are usually people who don’t write code, and don’t know how to write code, but presumably understand scrums.
One is expected to stand during scrums, and they really do consist of moving post-it notes around on a whiteboard. This is very important. The post-it notes must also be color-coded, which is a sore trial to color-blind me. I always use a note of the wrong color, and must be called sharply to order, usually by a scrum master who is younger than some of my own children.
These things are referred to as ‘scrum ceremonies’. Again, I assure you: I am not making this up.
There are other picturesquely-named personages too, besides the scrum master: stakeholders and advocates and what not. But as the old proverb has it, shit rolls downhill, and all the shit ends up on the heads of the coders. Coders are supposed to be able to push back, but in practice they can’t, and so they get stuck with whimsical, arbitrary deadlines for code whose behavior has never been clearly specified. Then somebody changes his mind about the expected behavior halfway through the development process. It’s a case of bricks without straw, Pharaoh!
It’s a testimony to the decency of human nature, and perhaps to a certain vestigial sense of Munchkin solidarity, that the coders seldom turn on each other and try to shift the blame when the preposterous deadlines are missed. (Though there are exceptions, may they burn in Hell.) The usual excuse, often true enough, is that some emergency supervened because the last delivery of crummy code failed in production, and a fire had to be fought.
One could go on and on — really, somebody ought to write a book about it — but perhaps this gives you some idea.
All this has of course made the programming workplace a much more anxious, unpleasant setting. The sense of solidarity, though it’s not entirely gone, is much impaired. We all used to make merciless fun of the boss, among ourselves. No more. Every so often an easily-disavowed emoji will turn up in some chat application. That’s about the extent of it.
Much of this work is done by contractors rather than regular employees. I’m one of that mercenary army myself, these days. Contractors don’t ordinarily stick around very long — and for that matter regular employees don’t either. There is, of course, no such thing as job security for either category of prole.
One of the sad consequences of this transience and casualness, combined with the nutty deadlines and the absence of specification, is that one really ceases to care about writing good code. It will probably never go into production anyway, and if it does, you’ll be long gone by the time it blows up and exposes the bank to a billion-dollar lawsuit or public disgrace. Or both. And if it does, they will richly deserve it.
Not that I would ever deliberately leave a time bomb in some odious employer’s code base. Oh no. As Richard Nixon once memorably said, ‘That would be wrong.’ Retro me, Sathanas!
But I write shit code these days. I used to care about error handling. I used to make no assumptions at all about the validity and canonicality of the data feed I was working with. I used to spend a lot of effort making sure that my code worked with improbable but possible edge cases. I used to care about optimization and thread and socket pooling and re-use and short execution paths and compact footprints. I used to care whether threads really bought you anything, or just made the code easier to write.
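By way of illustration, here is the kind of defensive handling I mean, in Java. It is only a sketch: the pipe-delimited record format, the field names, and the little Quote class are all invented for the purpose, not lifted from any real feed.

    import java.util.Optional;

    // A sketch of defensive feed parsing. The pipe-delimited format and
    // the field names are invented for illustration.
    public final class QuoteParser {

        public record Quote(String symbol, double price) {}

        // Assume nothing about the input: check the field count, trim
        // whitespace, reject non-numeric, non-finite, or non-positive
        // prices, and return an empty Optional instead of throwing.
        public static Optional<Quote> parse(String line) {
            if (line == null || line.isBlank()) {
                return Optional.empty();
            }
            String[] fields = line.split("\\|");
            if (fields.length != 2) {
                return Optional.empty(); // wrong shape; skip the record
            }
            String symbol = fields[0].trim();
            if (symbol.isEmpty()) {
                return Optional.empty();
            }
            double price;
            try {
                price = Double.parseDouble(fields[1].trim());
            } catch (NumberFormatException e) {
                return Optional.empty(); // not a number at all
            }
            if (!Double.isFinite(price) || price <= 0.0) {
                return Optional.empty(); // the improbable-but-possible edge case
            }
            return Optional.of(new Quote(symbol, price));
        }

        public static void main(String[] args) {
            System.out.println(parse("IBM | 142.50")); // a valid Quote
            System.out.println(parse("IBM|NaN"));      // Optional.empty
            System.out.println(parse("garbage"));      // Optional.empty
        }
    }

The point is simply that bad input gets treated as a normal, expected condition, rather than as an occasion for an unhandled exception at three in the morning.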
Admittedly, I was never very good at including comments. I have gotten better at that, because after all, they’re much easier to write than code.
I haven’t quite descended to the Skid Row of coding yet, which is what I call ‘cut-and-paste code’. You take a block of four or five lines, and in your editor you replicate it a dozen times, from top of screen to bottom, with one or two variable names changed in each copied block. This is a sin against God and man. What you ought to do is factor out the repeated common logic into a function or macro or something, and call it repeatedly with different parameters.
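To make the contrast concrete, here is a toy example in Java. The fee calculation, the rate, and the account names are all invented; the point is only the shape of the two approaches.

    // A sketch of cut-and-paste code versus factored code. The fee
    // logic and account names are made up for illustration.
    public final class FeeExample {

        static final double FEE_RATE = 0.0025;
        static final double MIN_FEE  = 1.00;

        // The honest version: the repeated logic lives in one place.
        static double fee(double balance) {
            return Math.max(balance * FEE_RATE, MIN_FEE);
        }

        public static void main(String[] args) {
            double checkingBalance  = 1200.00;
            double savingsBalance   = 350.00;
            double brokerageBalance = 98000.00;

            // The sin: the same two lines pasted over and over, one
            // variable name changed in each copy. Imagine a dozen of
            // these, top of screen to bottom.
            double checkingFee = checkingBalance * FEE_RATE;
            if (checkingFee < MIN_FEE) checkingFee = MIN_FEE;
            double savingsFee = savingsBalance * FEE_RATE;
            if (savingsFee < MIN_FEE) savingsFee = MIN_FEE;

            // The honest version: call the factored-out function with
            // different parameters.
            double brokerageFee = fee(brokerageBalance);

            System.out.printf("%.2f %.2f %.2f%n",
                    checkingFee, savingsFee, brokerageFee);
        }
    }

Two pasted copies are shown where the real offender would have a dozen; the factored version shrinks each to a single call, and when the fee logic changes, it changes in one place.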
But you see the cut-and-paste approach a lot these days, created by people who certainly know better. A couple of years back I had a boss at a big American bank — one that you love to hate; trust me — who insisted that I ought to take the cut-and-paste approach. Why? Because it was easier for him to read the code that way.
It hasn’t gotten quite that bad for me yet; the cut-and-paste gig didn’t last, and the bad boss is now a more or less distant memory. (God, how I hated him at the time, though.) But my product is crap these days, even without that particular descent to the Ninth Circle.
And it seems that really, this is what my employers want. Or rather, as Madeleine Albright once said, it’s a price they’re willing to pay for turning an eccentric, artisanal occupation into assembly-line work.