Conventional wisdom since the dawn of the digital age has held that coding is an essential skill that far too few people have. For a long time, there wasn’t an executive or thought leader in tech who didn’t take the chance to bemoan the lack of coding in the labor force. “Our policy at Facebook is literally to hire as many talented engineers as we can find,” Mark Zuckerberg famously said in 2013. “There just aren’t enough people who are trained and have these skills today.” He put his money where his mouth was, joining Bill Gates to back the nonprofit Code.org. As much as a platform for learning to code, that website is a gathering place for public figures to extol its benefits. Few positions in the political and social spheres enjoy such broad support as the necessity of large-scale coding education.
In recent years, that consensus has started to be called into question. “The headlong global frenzy to teach programming in schools is coming about 20 years too late,” wrote Newsweek tech columnist Kevin Maney last May. Our current system of interacting with computers using “a made-up language” dates from a time when machines were clunky and weak, when it was easier for us to learn a language they could understand than to teach them ours. That era, according to many thinkers, is coming to an end.
What is programming code, really?
Forget the world of computers for a second. If I told you that I had received a note that my friend had written me in code, what would you expect that note to look like? Would you expect it to be plain English that you could read? Probably not: you’d likely imagine something strange-looking and unintelligible that only the sender and I knew how to decode.
A code, fundamentally, is a communication between two entities that share an exclusive knowledge base. There are many kinds of codes—not just wartime codes that opposing militaries sought to crack, but social codes that signal status, power, and social competence among people who share the ability to interpret them. Dress codes exist to maintain a certain level of personal presentation among anyone who seeks to be part of the group that writes them. Code signifies inclusion and exclusion; it distinguishes “with us” from “against us.”
It’s not a semantic wrinkle that this is the word that we use to refer to the languages that humans use to talk to computers. A computer code is just the same as any of those other codes: complex and esoteric. Unlike a dress code at a fancy restaurant, though, computer code wasn’t invented to exclude people, but rather to be able to interact with machines that were very fast but very dumb, and which needed instructions spelled out exhaustively in order to do anything.
But if there’s one thing that we can take away from the well-publicized lack of coding in the workforce, it’s that computer coding is exclusive. Not intentionally so, of course—it is, as all those thought leaders claim, the result of an education gap. There is little political significance to the exclusivity of coding. Still, though, the fact that code represents a barrier to non-developers being able to create software is something that stands in the way of wider digital adoption.
Code makes it hard to interact with computers on a granular, operational level, purely because in order to do it, you have to first learn the code. Making things difficult is contrary to the nature of technology. From that conclusion alone, we can expect digital technology to soon catch up with the general population of the world, and make accessible what today is fairly inaccessible.
There’s evidence that this evolution is already underway. DARPA, the U.S. Department of Defense’s research agency, is developing a project called Mining and Understanding Software Enclaves, or MUSE. This initiative will map “hundreds of billions of lines of open source code” to find redundancies and attempt to construct a database so that any functionality a user needs can be located, via MUSE’s index, and assembled. It sets the stage for a potential revolution in code accessibility. As Maney writes, “Theoretically, just one person on the planet will have to write the raw code that makes a computer perform a certain task.”
Digital literacy is the answer
Rather than pitching code as the lone skill that tomorrow’s economy needs, we should be trying to increase digital literacy. The specific coding skills that are valuable now not only change too fast to pin down; they may well be outdated by the time today’s youngsters reach their prime productivity.
Demanding that the general public learn how to code would be the same as Henry Ford saying, 100 years ago, that America needed to become a nation of mechanics in order to thrive in the automotive future. As it turns out, that wouldn’t have been much help. Much more effective in spreading the use of cars is something like driver’s education, which includes practice using the vehicle and instruction on how to drive safely with other cars on the road. This is a far better social-level initiative than, say, national training on how to build a Model T in 1915.
Digital literacy is much more generalized knowledge than coding, but it is probably more effective in teaching a society’s worth of young people how to interact with technology. On the one hand, computers are catching up with human intellect to the point that what students will need in the future, to accomplish what engineers now need code to do, will be logical thinking skills. On the other, it makes sense to demand of a society that the widely-taught skill children are exposed to be a flexible, accessible education that they can dig into if they choose, or apply natively if they don’t.
We have a skills gap in this country, and it is a crisis of economic productivity. But finding the right solutions to it will require more than just parroting what is already done. The solution lies further out, in projecting the arc of technology as the story of how it will assist human performance—not the other way around.