TL;DR: I think coding is an essential skill for modern humans surrounded by code and machines that run it. Please learn to code.

I disagree with Jeff Atwood on "Please don't learn to code":

The "everyone should learn to code" movement isn't just wrong because it falsely equates coding with essential life skills like reading, writing, and math. I wish. It is wrong in so many other ways.

In fact, I do regard coding as an essential modern skill. Yes, right alongside reading, writing, and 'rithmetic. At least this part of the post had me nodding:

I suppose I can support learning a tiny bit about programming just so you can recognize what code is, and when code might be an appropriate way to approach a problem you have. But I can also recognize plumbing problems when I see them without any particular training in the area.

Visible pipes

Luckily, pipes are not as occult as code. If you go into a basement or open the door under a sink, you can see them and follow where they go. That's some training, albeit informal or self-directed.

I'm not sure how you'd get exposed to code in the same way: View Source in a browser used to be a decent start on the web, but that's less helpful lately. Code elsewhere has typically been hard to get at, Open Source notwithstanding.

Still, I'd bet there are people in the world for whom running water comes from magic and drains into magic. Where magic means: "I never thought about it, never needed to, and am sometimes vaguely afraid of it."

For that class of homeowner, when the sink springs a leak, the kitchen floods until an expert arrives. It's not the end of the world, and plenty of people get by just fine like that. But, I certainly wouldn't agree that "Please don't learn about pipes" is good advice in general.

Learning by doing

Admittedly, "learn to plumb" isn't the same as "learn about pipes". But, is there a difference between "learn to code" and "learn about code"? I don't think so. Like writing, code doesn't seem like something that's easy to learn about without doing it.

When I write "coder", I generally mean this: Someone who is capable of encoding his or her intent and decision process into a form that can drive a CPU to perform tasks on his or her behalf.

That's a very broad definition, but it implies a lot. First, you have to realize that you can make a CPU do things on your behalf--it's okay, you won't break it. It's a tool made by humans and you as a human can understand it. Then, you need a notion of algorithmic thinking, in order to formulate your intent and reasoning in a form that a CPU can execute. These are not natural or intuitive things.
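To make that less abstract, here's a tiny, made-up sketch of what "encoding a decision process" looks like. The rule and the numbers are invented for illustration; the shape of the thing is the point:

// A made-up decision rule, written in a form a CPU can execute:
// "If a customer hasn't ordered in 90 days and spent over $500 last
// year, flag them for a follow-up call."
function needsFollowUp(daysSinceLastOrder, lastYearSpend) {
  return daysSinceLastOrder > 90 && lastYearSpend > 500;
}

// The intent lives in your head; the encoding is what the machine runs.
needsFollowUp(120, 800);   // true  -> call this customer
needsFollowUp(30, 1200);   // false -> leave them alone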

I agree that it's good to "recognize what code is, and when code might be an appropriate way to approach a problem". But, if you've never made a CPU do your bidding, it's easy to see it as a mysterious black box with a will of its own--possibly malicious, or at least capricious.

And, if you've never worked to force your thoughts into the confoundingly literal, common-sense-free constraints of computer programming, it's hard to even imagine how code works. If you can't imagine how it works, how do you work it into your mental toolkit for getting things done?

Learn to code, and a lot of other things get dragged into your head along the way.

Who needs all these coders?

And then, there were these bits from "Please Don't Learn to Code":

It assumes that adding naive, novice, not-even-sure-they-like-this-whole-programming-thing coders to the workforce is a net positive for the world.

It implies that there's a thin, easily permeable membrane between learning to program and getting paid to program professionally.

This is looking at work from the wrong angle. It isn't about getting paid to program so much as it is about coding to be good at your job.

I'm not talking about Java-heads who live all day in Eclipse. I'm talking about the Excel-head who used to rock VBA macros, who maybe just started playing with Google Apps Script. I have no idea how popular Google Apps Script might or might not be, but I've seen some crazy amazing things done in VBA by sales and account reps who'd punch you in the nose if you called them geeks.
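To make that concrete, here's a minimal sketch of the kind of thing I mean, in Google Apps Script. The sheet name, the column layout, and the email address are all hypothetical:

// Hypothetical example: scan a sheet of invoices and email yourself
// a summary of the overdue ones. Assumes columns: customer, amount, due date.
function emailOverdueInvoices() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Invoices');
  var rows = sheet.getDataRange().getValues();
  var today = new Date();
  var overdue = [];

  for (var i = 1; i < rows.length; i++) {  // start at 1 to skip the header row
    var customer = rows[i][0];
    var amount = rows[i][1];
    var dueDate = new Date(rows[i][2]);
    if (dueDate < today) {
      overdue.push(customer + ': $' + amount + ' (due ' + dueDate.toDateString() + ')');
    }
  }

  if (overdue.length > 0) {
    MailApp.sendEmail('me@example.com', 'Overdue invoices', overdue.join('\n'));
  }
}

That's not software engineering, and it doesn't need to be. It's a professional encoding their own intent and decision process, and letting the CPU do the grunt work.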

As far as I can tell, the future of work involves ever greater volumes of information and data. Should professional programmers be the only people in an organization who know how to apply computational power to solve problems? Maybe the vendors will sweep in, clean up all the cybercrud, and get the real work done for us.

Programming should not be a priesthood

Consider writing: there's a lot to learn and it used to be a thing done only by a few scribes. But, people today get a lot of mileage out of just sticky notes and email. Sure, improving your grammar and learning how to structure an essay can help in many, many ways. But, you don't need to be a professional writer to be a professional who uses written language.

The same can apply to coding. The problem, though, is that the coding equivalent of that sticky-notes-and-email level of competence barely exists, or is nearly impossible to get at. So, not only do I think we need more coders--we also need more tools that support coding and make coding more accessible. I think we should support professionals who use code.

More than that, I think we should encourage and support humans who code. I really do consider coding next to reading, writing, and math. I don't think we can all rely on someone else to write the perfect app for the work we'll need to do in the future. I expect the successful people will be those who can apply Taco Bell programming to reams of data and find answers. We'll need to ride bicycles, not tricycles.