There’s a lot of talk happening right now, on Twitter and elsewhere, about whether coding—programming computers—is a skill that should be learned by most people as a basic tool to make their lives easier, or whether it can happily remain a skill restricted to those of us who do it for a living.
Daniel Jalkut, creator of MarsEdit, suggests that current levels of programming literacy can be compared to a period in the past where literacy itself was rare. He extrapolates from there, and concludes that the ability to program could become a similarly fundamental skill.
Literacy isn’t about becoming a Hemingway or a Chabon. It’s about learning the basic tools to get a job done. I think programming — coding — is much the same. You don’t have to be the world’s best programmer to develop a means of expressing yourself, of solving a problem, of making something happen.
There’s no question that a lot of people would benefit from knowing how to code. If you’ve ever done anything repetitive at a computer, you would have been better off knowing how to get it done automatically. Every programmer I know has tales of watching people slog through monotonous tasks that a computer could have finished in moments. Think of things like renaming files one by one, or copying snippets of data from one file into another file in a different format.
If that sounds familiar then you’re probably doing work that a computer could do much faster and more accurately than you can. If only you knew the right way to instruct the computer to do it for you.
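To make the file-renaming example concrete, here’s a minimal sketch in Python (the filenames and naming convention are hypothetical, just for illustration) of the kind of task people do by hand that a few lines of code can automate:

```python
import pathlib

def normalized_name(path: pathlib.Path) -> str:
    # Replace spaces with underscores and standardize the extension,
    # e.g. "scan 001.jpeg" becomes "scan_001.jpg".
    return path.stem.replace(" ", "_") + ".jpg"

# Renaming a whole folder of files is then a single loop:
# for p in pathlib.Path(".").glob("*.jpeg"):
#     p.rename(p.with_name(normalized_name(p)))
```

The rename loop is commented out so the logic can be tried safely first; point the glob at a real folder to actually move files.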
There’s another side to this coin though. Sure, knowing how to code might have helped you out once in a while. But similarly there have been times in my life that things would have gone smoother for me if I’d known the Spanish for, “No, I’m not in the market for a donkey right now.” And yet I’ve never learned Spanish. It’s a matter of choosing to learn the skills that will be useful often, versus those that will be useful only occasionally. (I’m ignoring here the motivating factor that coding can be a lot of fun.)
My justification for not learning Spanish is that I’ve spent only a matter of weeks in Spanish-speaking countries in my life. Can you say the same about the amount of time you’ve spent in front of a computer? Most of us now spend our entire work days, and in many cases also a lot of our free time, interacting with a computer in one way or another.
Maybe learning the computer’s language will turn out to be worth your while if you’re in that situation, even if we never reach the point where young kids learn the “3 Rs and 1 P” in primary school.
Not to mention that we sometimes code things up because we’re too lazy to go out and find where they already exist ;-).
If the future doesn’t bring computers that can understand my instructions communicated in ‘literary English’ then frankly you geeks aren’t doing your jobs…
Funny you should say that, sj. Way back when I began, in 1981, an electrical engineer walked up to me and said just that. Claimed that he had more job security.
Now it may happen that natural speech recognition finally comes … but my outsider perspective (programmer, though not in AI) is that current efforts (iPhone) are more mining patterns than “understanding.”
Who knows though ;-), it could happen, this time!
Hey someone has to build and maintain those robots…
Anyway my wife’s response to reading this post was ‘No, everyone just needs a decent IT department’.
As Rory says, it’s about the most constructive use of time – we have a couple of IT people who write scripts when we need to do repetitive tasks. Asking them to do it means I can get on with other work.
I suppose the point is that computer programs are designed to be used by people who don’t know how to code – I’m not an Office expert, but I can use the functions in Excel when I need to. I can’t see a future where that won’t be enough – surely it’s the case that, as interface design progresses, less specialist knowledge is needed, rather than more?
One more thought – without knowing any computer code at all, my understanding is that knowing how to ‘code’ is a 2-step process – the first step is understanding the logic of the instructions you want to give. The second step is actually knowing the code.
As a Philosophy graduate (stop laughing), I’ve actually taken a course in propositional logic. From conversations with nerds, it’s clear that the leap to computer code is considerably smaller if you understand how logic (and therefore logic gates) works.
So I reckon the proposal should be that everyone should learn logic. Only then can those with a satisfactory understanding of basic reasoning be allowed to progress to coding.
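The jump this commenter describes really is small: a propositional formula translates almost symbol-for-symbol into a boolean expression. A tiny illustrative sketch (not from the comment itself), using the formula (p ∧ q) → r as the hypothetical example:

```python
def implies(p: bool, q: bool) -> bool:
    # Material implication: p -> q is false only when p is true and q is false.
    return (not p) or q

def formula(p: bool, q: bool, r: bool) -> bool:
    # The propositional formula (p AND q) -> r, written directly in code.
    return implies(p and q, r)
```

Anyone who can fill in a truth table for the formula can check the function against it case by case, which is exactly the overlap between logic training and programming.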
Philosophy and logic are a good part of it, but so also is seeing a raft of users interacting, some more logically than others.
Often a non-programmer can lay out a logical flow for what he wants to achieve (first hurdle) but won’t see (nay, expresses frustration at) corner cases that stress the problem space.
At some point you need to pick up a feel for “many things happening at once.”