I remember laughing out loud when John Ralston Saul defined the word “dictionary” in The Doubter’s Companion (1994) as: “Opinion presented as truth in alphabetical order.” I’m suspicious of dictionaries. Always have been. But I’m even more suspicious of those who love dictionaries. There’s a certain kind of guy who gets visibly aroused whenever he talks about the ironclad certainty of dictionary definitions and grammatical rules. He’s got a visceral hatred of ambiguity, uncertainty, and doubt. What he wants, more than anything else, is to nail reality down, once and for all. He shakes his fist at Living Language, soaring far above him in the sky, and mutters menacingly: “I will catch you in my nets! I will clip your stupid little wings! I will put you in my cage! I will make you my pet! I will make you my slave! And if I can’t catch you with my nets, I’ll shoot you! And stuff you! Put you on my wall with the rest!”
When you deprive language of its freedom to be somewhat slippery, squishy, and imprecise, you deprive it, as well, of its talismanic power to illuminate the world around us. Language can’t be nailed down because it’s alive. And, like all living things, it’s in a constant state of evolution. Random mutations abound. New meanings emerge. Old meanings die. Existing meanings are modified. Such is the nature of language. For instance, the word “gay” has changed quite dramatically. In A Child’s History of England (1851), Charles Dickens said of Thomas Wolsey: “He was a gay man”—meaning, of course, that the great Cardinal Wolsey was a cheerful, lighthearted, happy man, not given to bouts of melancholy; a man who loved life and knew how to have a good time. Gay didn’t mean homosexual in 1851. Indeed, it didn’t take on that meaning until well into the twentieth century. But by now, in 2017, the new meaning of gay (homosexual) has completely eclipsed the old meaning of gay (joyful)—so much so that homophobic jocks at Abbott don’t want to be caught reading Nietzsche’s Gay Science (a book I assign often). One guy—the quarterback, if memory serves—went so far as to make a cover for his copy of The Gay Science. Though it pains me to admit it, the cover was actually quite pretty.
If It Was Good Enough for Jane Austen . . .
When I was in high school, my teachers told me that I couldn’t use the word “I” in a formal essay. I was taught to refer to the human race as “mankind” or “man” (e.g., Man’s Search for Meaning). When referring to a hypothetical individual, my teachers maintained that masculine pronouns such as “he” and “his” ought to be used (e.g., When the average student contemplates his future in these difficult economic times, he invariably worries about whether or not he’ll be able to find a good job after he graduates).
When I was an undergraduate at Concordia University, my professors told me that using “I” in a formal essay was perfectly acceptable. What’s more, they told me that referring to the human race as “mankind” was sexist; “humankind” was to be used instead. My professors also told me that using masculine pronouns to refer to the hypothetical individual was sexist. But they never really provided me with a viable alternative, making philosophical essays especially difficult to write. Most of us got around the problem by avoiding personal pronouns altogether. When personal pronouns were absolutely unavoidable, we generally resorted to the gender-neutral “they”.
When I was a graduate student at Johns Hopkins University, my professors taught me that “they” referred to more than one person; it was plural, and could not be used to describe a hypothetical individual. Thankfully, these professors did provide us with alternatives. But they were all more or less ugly and awkward (e.g., “he/she”, “he or she”); even the best of the proposed compromises, which involved alternating between “he” and “she” throughout your essay, proved clumsy in practice. Still, I’ve been preaching this grammatical gospel to my students for years, dutifully correcting their improper usage of the word “they”.
But I’ve recently discovered that “they” is a perfectly acceptable all-purpose gender-neutral pronoun. Always has been. Jane Austen’s Mansfield Park (1814) is a case in point: “I would have every body marry if they can do it properly.” If it was good enough for Jane Austen, then it’s good enough for you and me.
The philosopher Joseph Heath recently published an op-ed in the Ottawa Citizen about the state of Canadian higher education. It’s gone viral, and for good reason: it’s provocative, well-argued, and, to my mind, pretty much right on. Even so, the first person to post a critical review of Heath’s article in the comments section didn’t take issue with his thesis; instead, he faulted Heath for spelling “minuscule” like this: “miniscule”. As it turns out, this alternate spelling of the word has been living in our linguistic country for over a century now. Which begs the question: Isn’t it time we granted him full citizenship? Or, at the very least, permanent residency? Regardless, treating him like an illegal alien seems, at this point, rather odd. Shouldn’t we just grandfather him in?
Grammar Nazis and Fallacy Fetishists
There are distinctions that remain richly meaningful and useful in practice, despite the fact that they are patently false in theory—which, to my mind, calls into question the usefulness of a certain kind of argumentative prissiness. It’s not unlike the problems faced by grammatical outlaws. If speech acts like “I ain’t got nothing” and “I ain’t feeling good” communicate clearly within a particular context whilst violating the rules of grammar and logic, what’s at fault: the lawless language or the rigid rule? Fallacy fetishists defend the rigid rule every time; meanwhile, the poets, who always seem to side with the living, say the fault lies with the grammatical rule. But I’m inclined to think that here, as elsewhere, we can live and let live: by upholding the linguistic convention most of the time and—at one and the same time—remaining open to linguistic innovation. We have to remember that language is alive, just as we’re alive, and, as a consequence, it’s in a constant state of evolution. Demanding that linguistic conventions stay the same forever is about as silly as telling your ten-year-old to stop growing up. Besides, as Nietzsche puts it in Human, All Too Human (1886): “Every word is a pocket into which now this, now that, now several things at once have been put!”
What bugs me most about fallacy fetishism is that it’s so often used, not to uphold reason, but to undermine common sense. The use and abuse of the “ad hominem fallacy” is a case in point. If a shoe salesman goes on and on about how much I need another pair of shoes, I should probably consider the source. That being said, it would be stupidly cynical of me to conclude that everything the shoe salesman says about shoes is suspect simply because he’s a shoe salesman. After all, he could be an honest man, an honorable man, who’s telling the truth. Maybe I really do need another pair of shoes. Still, the facts remain the facts: the shoe salesman has a vested interest in selling me as many pairs of shoes as possible. Most would conclude that taking this into account is entirely reasonable.
But the fallacy fetishist begs to differ. So he pulls you over and shows you his badge: “Afternoon, ma’am: name’s Patrick the Patronizing. I’m an officer of The Law of Logic, sworn to serve and protect the good citizens of Reasonable Land. What ya did back there, ma’am: well, it was a clear violation of The Rules of Argument™. But I’m gonna let you off this time, gonna let you off with a warning. But mark my words: if I catch you mentioning the vested interests of that shoe salesman again, I’m gonna have to cite you for speeding through an Ad Hominem Zone.” He’d like you to believe that evidence of this kind was declared INADMISSIBLE by The Supreme Court of Logic long ago. Usually, alas, because he’s a shoe salesman impersonating a police officer.
Thinking about things, analyzing things, invariably involves categorization of some kind: you stereotype, you pigeonhole, you make distinctions. It’s a messy business, no doubt about that. Sometimes it’s even ugly. But the best we can hope for are finer distinctions and more accurate categories. Getting rid of categories altogether, getting rid of all generalizations, would ultimately unravel language itself. These imperfect instruments we refer to as “words” are all we have, and if we wish to make sense of the world, we have to use them. Are the limitations of language a problem? Sure. But they’re not a curse. Only the rigidly dogmatic see them as such. Like the dream of a selfless scientific objectivity, the dream of a language purged of all ambiguity—a pure and precise language, a language of Science—has been sometimes noble, sometimes silly, sometimes dangerous, and sometimes benign, but always destined to failure. Be that as it may, there’s no reason to despair. So long as your attitude towards language is sufficiently playful and poetic, you’ll be able to say whatever it is you need to say. Well, um, sort of.
Apples and Hand Grenades
I once spent the better part of a wedding reception arguing with an investment banker who adamantly maintained that all sorts of seemingly altruistic actions were, in fact, selfishly motivated. We went around in circles for hours and hours, downing whiskey after whiskey, until I asked him what, in retrospect, I should have asked him at the beginning of the night: Do you think anything we do, no matter how seemingly sweet, can be called altruistic? Nope, he said. Everything we do is selfish. Alas, I realized at that point, much to my chagrin, that we’d been talking past each other for hours. I was working with a rather conventional understanding of the word “selfish” whilst he was working with a decidedly nonstandard and altogether idiosyncratic definition, a definition so capacious that it could house the whole of human action. When a word’s meaning is stretched this much, it ceases to mean anything. As the philosopher Dmitriĭ Vladimirovich Nikulin quite rightly observes, in On Dialogue (2006): “Every theory should have its limitations: if it explains everything, it explains nothing (in particular).”
Playing with language the way that toddlers play with play-dough is tempting. No doubt about that. But it’s a temptation that we ought to resist, especially when we’re talking politics, for at least four reasons: (1) Words like “fascist” and “racist” and “misogynist” and “violence” are like plastic bags: you can stretch them a bit, but they break and become useless if you put too much stuff in them. (2) It betrays a troubling lack of interest in facts, history, reality, and truth; e.g., comparing the situation of anglophones in Quebec to the situation of South African blacks under Apartheid discredits you in the eyes of anyone with common sense. (3) It undermines your credibility in the eyes of those you wish to persuade, as it suggests that you’re either a fool or a charlatan; viz., either you have no sense of proportion, and are thus stupid, or you have no intellectual conscience, and are thus untrustworthy. (4) It degrades discourse by reducing all of it to propaganda. Social trust is based, to some extent, upon a democratic faith: in language and each other. When we play fast and loose with definitions we undermine that trust, and make meaningful dialogue rarer than it might otherwise be. Reasonable conversations are undermined, not by those who compare apples and oranges, but by those who compare apples and hand grenades.
—John Faithful Hamer