There’s a conversation going on right now in the tech-industry neighborhoods of the web about what might, for want of a better word, be called the evangelistic movement to teach coding to everyone. Indeed, it’s a conversation that’s been going on for years now – but the recent writing I’ve read on it from the movement’s most ardent defenders seems to hinge on some odd assertions.
I dipped into this subject a week or so ago when a friend linked to Quincy Larson’s article “Please do learn to code” on Medium.com’s freeCodeCamp site. Here’s what seems to be Larson’s thesis statement: “[P]rogramming is how humans talk to machines. … For computers to succeed at the jobs we’ve assigned them, they need us humans to give them extremely clear instructions. That means coding. Coding isn’t some niche skill. It really is ‘the new literacy.’” He then goes on to more or less restate:
Once a history-shaping new technology comes out of the genie bottle, you can’t put it back. This was true for airplanes, antibiotics, and nuclear warheads. And it’s true for microprocessors, the internet, and machine learning.
Those who adapt to these permanent waves of changes flourish. Those who shrug them off — or fail to even realize they exist — asymptotically approach irrelevance.
Coding is the new literacy. Like reading was in the 12th century, writing was in the 16th century, arithmetic was in the 18th century, and driving a car was in the 20th century.
It’s a bold assertion (and bolded in the original as well), and it feels compelling. But is it the right analogy? Is coding actually the new literacy? Well, not the way I understand the word.
Larson is responding in part to a TechCrunch article by Basel Farag with the somewhat cheeky title “Please don’t learn to code,” a response in its own right to what he calls “the cultural shenanigans of Silicon Valley” in the form of the Learn to Code movement. Farag, equally boldly, starts out with the opposite view: “Coding is not the new literacy,” he says up front, and to be blunt, he makes his case a great deal better than Larson does.
(As an aside here, I’ll admit that I feel that Larson’s piece suffers specifically from being in the typical style of articles on Medium.com, with their one-sentence paragraphs and their high-level view of their subjects. They’re often built on broad statements with no real arguments behind them, and they often deal with their subjects on a level that’s abstracted to a fault. In particular, I’ve read several pieces on Medium on the subject of “design,” and as someone who designs things, I frankly have almost no gorram clue what any of them mean by that word.)
Farag’s argument is actually a lot more nuanced than his title suggests. He acknowledges coding as an important skill in the 21st-century workforce – “But only in the right context, and only for the type of person willing to put in the necessary blood, sweat and tears to succeed. The same could be said of many other skills. I would no more urge everyone to learn to program than I would urge everyone to learn to plumb.” And he goes on to back this up with concrete reasons for being cautious about plunging into the pursuit of coding, especially as a career move: The fact that many programmers, once they have the hammer of coding, tend to see every problem as a nail; the slippery, ever-changing world of cutting-edge programming languages, which can become obsolete literally overnight; and the challenges of the tech field that mean coding proficiency isn’t by any stretch a golden ticket to success.
In contrast, Larson more or less says, “No, you’re wrong” several different ways without ever making his case beyond, “It really is ‘the new literacy.’” And the real problem with that repeated statement is that he fails to explain why.
It’s not enough to say that it’s an extension of the pattern that began with learning to read and write and work with numbers centuries ago; the parallels don’t hold up. Because saying “Programming is how humans talk to machines” is both true and false. Programming is how programmers talk to machines, and if they’ve done their job competently, part of what they tell those machines to do is give non-programmers a point of access – which is why we have the concept of a user interface.
Actual literacy, as in the ability to read and write the alphabet our language is set down in, is a skill whose lack is actually – to guardedly use a word tainted with ableism – crippling. A certain level of numeracy – enough to do basic calculations – is also pretty vital to getting by in the modern world. A baseline of what we used to call computer literacy is indeed becoming well-nigh impossible to do without: The ability to know your way around an operating system, understand drive and folder structures, create a basic spreadsheet, work with a word processor, put a string into a search engine that will produce useful results. And knowing how to operate a motor vehicle is indispensable for many (but not all) of us – but just as most drivers get by perfectly well without knowing how to strip and rebuild an internal combustion engine, most people who operate computers have plenty of work to do with them without ever needing to pop the metaphorical hood on the machines they command. Having either skillset might save you some trouble and expense when things go wrong, depending on how deep your knowledge goes, but you may well be just as happy paying an expert to apply their depth of experience to the problem. That is, after all, what we have experts for.
Of course, the problem here is that coders – who are, by profession and inclination, hackers in the most neutral sense of the word – invariably do talk to machines on a deeper level than most of us; and because they tinker with their world in this way, many of them imagine that’s what the rest of us should do as well, whether it’s to our benefit or not, whether it solves an actual problem we have or not. It’s a strange sort of myopia that presumes what they do is self-evidently the most important and useful skill there is, and therefore they are obviously on the crest of the wave of The Future. The truth, I imagine, is rather more complicated than that.
So as a person whose bread and butter is language, I’m afraid I have to call this “new literacy” business out as hyperbolic nonsense. The comparison doesn’t hold up. The ability to write and understand and troubleshoot code simply isn’t anywhere near as central to our getting by from day to day as the ability to interpret the glyphs of our writing system into sounds and words – or even the ability to string written language together in a way that conveys useful meaning. (Indeed, while we’re at it, if we’re going to advocate for skills in the IT field that everyone should learn, I would say there’s just as strong a case that everyone should learn technical writing, which in its own way is just as much the means by which humans learn to talk to machines as programming is – and the ability to write clear, plain, easy-to-follow instructions for navigating a UI is a frustratingly rare talent.) Nor do I see it ever becoming as central; in a world where our day-to-day interactions with technology are only becoming more natural and intuitive, I’m having a hard time imagining a World of Tomorrow where the only way to get a machine to do anything is to input a string of code. Computers are tools, and programming languages are the tools necessary to build the tools. That makes coding enormously important to the modern world, but only as important as the end it serves. Making it an end in itself seems like it’s deeply and incoherently missing the point, and conflating it with the other ways in which language plays a role in our lives distorts our perception of both of them.
(Do I think programming is a good thing to teach in schools? In fact I absolutely do, just as I think grammar and algebra and art and music and woodshop and home ec are good things to teach in schools. Because understanding a broad range of subjects is the path to being a well-rounded human being with good cultural intelligence, and because seeing how things work under the surface demystifies the world in useful ways, and because young people can’t be expected to know what path in life they want to pursue unless they have exposure to a broad range of interesting things. And learning code as an adult might be a good idea for many of the same reasons. But let’s not give it unearned status as either a One True Way or as the mystical key to unlocking our future security, and let’s especially not place the burden of those fairytales on our kids.)
 Unless, somewhat ironically, you’re in one of the professions that builds things – ironically because many of us are, unfortunately, tempted to look down on the blue-collar careers that rely on trade schools rather than so-called higher education, which does a deep disservice to the talent and skill (often including several fairly sophisticated mathematical disciplines) required to succeed at them.
 And, indeed, one of the unintended consequences of a movement to give everyone a particular skillset is that genuine expertise is inevitably devalued, because it’s that much easier for the management of a company to handwave the need for people who specialize – a variation of the problem that cratered my own profession when the bosses started to say, “Why do I need an editor when I already have a spellchecker?” The biggest problem facing the workforce of the future isn’t going to be lack of technical savvy, it’s going to be the same thing that’s frustrated workforces since time immemorial: A management culture dedicated to finding a magic formula that will let them cheat the law of fast/cheap/good, and which refuses to accept that that trick never works.