This post was conceived at OSBridge in Portland this past June, but its birth has taken a while. And in the meantime, this brilliant talk by Leslie Hawthorn, given at eurucamp 2015, appeared:

Leslie says everything that I say here, only better and more succinctly. If you want the TL;DR version, just go watch her. I had already written these words, though, and I want to tell my story as it relates to the state of being “technical.” Or not.

When it comes right down to it, we are all “technical.” And technology is everywhere in our lives, not because software is eating the world, but because human beings make all sorts of tools, and every single one of those tools is a technology.

“Techne” = “art, craft, skill”

“Technical” and its close cognate, “technology,” have come to be defined more narrowly over the past 50+ years, to refer almost exclusively to industrial and scientific applications (and those terms have in their turn been defined more narrowly over the centuries). But what’s happened to the word “technical” goes beyond simple shifts in usage over time. This word has become a key way to identify people — not things made by people — as belonging to a certain privileged elite. Or, in the case of “non-technical” — being ruthlessly excluded from that same elite. (Because then we can wring our hands over the “problem” of “bringing non-technical people into tech.” Whatever that’s supposed to mean.)

This matters not just because “technical” is a useful word in many contexts other than software programming. It’s important because this particular word has come to distinguish the coders — who are also The Most Important People — from everybody else. There are “technical” contributors to software projects, and there are “non-technical” contributors. And the technical guys (noun choice deliberate) stand tall over all the rest of us.

A few years ago, “technical” had already come to refer mostly, if not entirely, to software development, but its use was still somewhat broader than it is today. When I started working at my present company, nearly eight years ago now, my manager told me that she had hired me in spite of the reservations of one of my interviewers, who had expressed concern about my apparent lack of “technical” knowledge or skills. What exactly that meant, nobody seemed to know. I certainly wasn’t expected to code. But being insufficiently technical was a thing, and was considered a cause for concern.

Fast forward five years. I’m compulsively curious, I like to learn new things, and I work with programmers. I’d been reading code for years (before my assessment as “not technical enough”), but now I learned to code, at least enough to write sample code for an SDK, and to work with the programmers on code comments to produce reference documentation. When the SDK was presented to the company, the architect acknowledged my contributions and described my work explicitly as “very technical.” I’m sad to say, in hindsight, that I was proud at that moment. A Real Programmer had pronounced me Technical. I had Arrived. I was part of the in-crowd.

Because that’s the point behind current uses of the word “technical.” It separates the sheep from the goats. The folks with the real cred from everyone else.

And in the years since I was welcomed into the ranks of the “very technical,” the word’s definition has grown narrower and narrower. At two major conferences I attended recently, I encountered it constantly, used to describe only the act of writing software code. Testers not allowed. Only Real Full-time Programmers can be technical, in this view of the world. By that standard, I no longer have “technical” status — but then, I no longer want or seek it.

Yes, I want to develop my coding skills, and to apply them more in my daily work. But I want no part of a taxonomy designed to mark an exclusive and narrow geek elite as fundamentally More Important Than All The Rest of Us. I want to challenge all such taxonomies, and the taxonomy of “technical” and “technology” in particular.

When I started teaching with the web — which amounted to no more than putting my course syllabuses online, and updating them with links to external resources — I was in the vanguard of classroom technology. Most of my students had computers, but they didn’t all have Internet access, and those who did had dial-up. Most of them accessed the web from library workstations. I still printed out paper syllabuses, and wrote the updates on the chalkboard. I wrote out my lecture outlines on that same chalkboard, or took notes on the board if class discussion was the order of the day. Yes, PowerPoint was available, but I never went that route. (You really shouldn’t have to ask …)

I started every term with a pencil and a pen in my hand, together with the syllabus projected on a screen. And I took the time to explain that the laptop on my lectern and the pen and pencil in my hand were all technologies — tools made to help humans make other things. I’d mention the printing press — a taste of things to come in the Renaissance history courses I often taught. The astrolabe, the compass, the Jacquard loom (an important progenitor of programming technologies), the automobile — these are all technologies, and I’d mention them too. But I’d start with the simple pencil and pen because they were so accessible, and because they and the laptop computer — or more precisely, the web page that I was displaying in its browser — were equivalent technologies for the purposes of the classroom. It was absolutely vital that in bringing a new technology into the classroom I do everything I could to keep the classroom experience familiar, and accessible, and unthreatening. Like the world my students knew, not different from it. Representing continuity with the past, not a break from it.

Sure, the context of those not-really-so-very-long-ago classrooms was different. But software development these days, and the rest of the world too, could use a more generous understanding of what technology means, what it means to “be technical” — a clearer-eyed and all-encompassing view of the many different techniques required to produce a piece of software, or the product of any other human endeavor.

Software is eating the world, and its creators and masters are all too eager to stake their exclusive claims to the privilege that their work confers. Let’s take back one not-so-little word, and celebrate the whole of human technical endeavor.