Monthly Archives: September 2015

Fake It, Make It, Be It

“The parsimonious explanation for why you feel like a fraud is that you are one.”—Aaron Haspel, Everything (2015)


I’ve had people tell me a couple of times in the last decade something along the lines of: “I feel like a fraud, like an impostor, despite my successes. I guess I have that Impostor Syndrome everyone’s talking about. Guess I need some therapy or drugs to fix this, make me feel better about myself.” When people tell me this, I’m almost always thinking to myself: “um, well, yeah, so far as I can tell, you kinda are a fraud.” But the solution, as my friend Jed Trott rightly observed last night, isn’t to fix the way you feel, it’s to fix the way you are—viz., to actually become the person you’re pretending to be. Three of Aaron Haspel’s aphorisms are especially good on this issue: (1) “All intellectuals must begin as pseudo-intellectuals.” (2) “To be better it is first necessary to pretend to be; and objections to improvement often masquerade as objections to pretense.” (3) “It is impossible to recognize your betters until you acknowledge that they exist.”

Recognizing that your betters exist is often profoundly uncomfortable. It might even make you feel, well, to some extent, like a fraud. But that’s okay. Indeed, it’s often salutary, good for you—like vegetables, working out, and fresh air. Yet again, I propose that we return to the sensible advice proffered by the Roman Stoic Epictetus. In The Art of Living, he maintains that “one of the best ways to elevate your character immediately is to find worthy role models to emulate. . . . Invoke the characteristics of the people you admire most and adopt their manners, speech, and behavior as your own. There is nothing false in this. We all carry the seeds of greatness within us, but we need an image as a point of focus in order that they may sprout.”

—John Faithful Hamer, From Here (2016)

The Simulacrum

The idea of the simulacrum, which I discussed in the following review (slightly revised here), written for a sociology class in my undergrad days, strikes me in retrospect as a good example of a postmodernist tendency to take a good concept and overextend it, a tendency which, ironically enough, postmodernist thinkers have criticised modernist, Enlightenment-inspired thinkers for having in abundance, manifesting in particular as an impulse to create “totalising” ideologies. The dangers of sweeping generalisation are perennial.

Chapter 7 of Sturken and Cartwright’s “Practices of Looking: An Introduction to Visual Culture”, titled “Postmodernism and Popular Culture”, begins by discussing the “simulacrum” in the context of postmodernism. The simulacrum is what replaces the idea of “the original” – whether that original is an image, a text, or a more abstract entity such as “an ideal small town” – as a result of what might be called the postmodern reordering of cultural space. It has been able to do so because of the rise to predominance of the image over other forms of cultural media, and of technologies of (apparently) perfect reproduction.

The simulacrum cannot be, and does not need to be, tied to a particular referent. In the case of “an ideal small town”, Disneyland’s Main Street is a simulacrum par excellence: it refers to no particular town anywhere, as opposed to the referent of, let’s say, an actual town in Maine as referred to in the memory of someone who has been to it. Its very generic nature is its essence, and thus it is easily reproducible, and any and every re-production is as essentially “original” as the initial production. The fact that an image or other entity might be the first instance of production rather than the thousandth has no significance when that entity is a simulacrum. They are all originals and thus none are originals.

Postmodernism questions the modernist assumption of progressive linearity of development, of historical necessity, and thus the temporal causal chain by which one would trace back to the “original” is potentially brought into question. [pp. 251-252] “Presence”, the immediacy of direct perception, of experience, is itself challenged as not so direct, in fact as always mediated. [p. 252] (This is a familiar, central concept in phenomenology.) How do you know that this is the “original” you are experiencing? All you have are your sense impressions, interpreted by your brain; thus you perceive not the “real world”, but your brain’s constructed simulacrum of it. Postmodernism also challenges the claims to universality of various philosophical concepts and the institutions which are founded upon them. How do you determine “authenticity” when its basis – the values underlying it – may be culturally contingent and thus limited in scope? [p. 252]

As a musician who works in digital media, as well as a philosopher, I find a certain irony in the notion of the predominance of the simulacrum in the age of the image and digital reproduction, in the notion that there are, or may be, no longer any originals. I would say that even if the first instance, the first digital file, of a composition I create on my laptop in Ableton Live is not “original”, in the sense that it’s indistinguishable from every subsequent copy I make and distribute electronically, there is yet an irreducible and inextinguishable originality in the act of its creation, which, after all, happens first in one and only one place: my brain. (The memetic question of how original any creative person actually can be, given unconscious influences by others, and the deterministic causal chains those imply, is a different one, not requiring resolution for this determination of the originality – in the sense of being different from and prior to all copies – of the products of my brain’s activity.) This sense of originality adheres to the composition forever thereafter, no matter how many copies I or anyone else may make of it. Its referent is durable. Thus, the irony lies in the sweeping nature of the claim of ubiquity for the simulacrum; perhaps just the sort of metanarrative which postmodernism aspires to abjure.

—Kaï Matthews


John Abbott College Teachers 89% in Favor of Strike

“Speak softly, and carry a big stick.”
—Theodore Roosevelt

Earlier this evening, as the September sun fell into a blood-red sky, John Abbott College teachers voted 89% in favor of a strike—with the largest voter turnout I’ve ever seen at a union meeting. We’ve just given our union representatives an overwhelming mandate, a big stick which they can take to the bargaining table with them. Let’s hope they don’t have to use it.

The core issues involved in this strike vote don’t really have anything to do with the paycheck deposited into my account every two weeks. It’s about pencil-pushing bureaucrats who’ve never set foot in a classroom telling trained professionals how to do their jobs. It’s about the vast majority of my younger colleagues who are still living precariously contract to contract after years of loyal service. It’s about classes that have gone from 25 to 45. This isn’t about the money; it’s about honour and respect.

—John Faithful Hamer

Sticks and Stones May Break Your Bones, but Aaron Haspel Draws Blood: A Review of Everything: A Book of Aphorisms (2015)

“I approach deep problems such as I do cold baths: fast in, fast out. That this is no way to get to the depths, to get deep enough, is the superstition of those who fear water, the enemies of cold water; they speak without experience. Oh, the great cold makes one fast! And incidentally: does a matter stay unrecognized, not understood, merely because it has been touched in flight; is only glanced at, seen in a flash? Does one absolutely have to sit firmly on it first? Have brooded on it as on an egg? Diu noctuque incubando, as Newton said of himself? At least there are truths that are especially shy and ticklish and can’t be caught except suddenly—that one must surprise or leave alone.”—Friedrich Nietzsche, The Gay Science (1887)

In a letter to a friend, Nietzsche maintained that the only readers who could really claim to have understood his Zarathustra (1891) were those who were, at times, profoundly wounded by it. I couldn’t help but think of this remark as I read Everything (2015). Although this book is quite short and extraordinarily clear, it’s not an easy read. Far from it actually. Haspel says that he asks but “one thing of literature: that it draw blood.” And he delivers on this score, again and again, with aphorisms like the following: (i) “Whatever you think you like — are you sure you like it? Or do you like being the sort of person who likes it?” (ii) “Whatever you have done, you are the sort of person who would do that.” (iii) “It never seems to occur to the teacher who complains of inattentive students that he may not be worth attending to.” (iv) “If you want to destroy your marriage talk about it.”

But these are only some of the most obviously challenging aphorisms contained in this volume. The more insidious ones are like time-bombs or retroviruses: I rarely “get” them the first time I read them. Don’t even necessarily get them when I’m reading them. Instead, something happens or someone says something, days or even weeks later, and a bell goes off in my head and I think “a-ha”—that’s what he meant! For instance, this aphorism (which I posted the other day on Facebook) is loved at first for almost all of the wrong reasons: “If it has never crossed your mind that you might be stupid, you are.” People who’ve been (like me), at times, painfully aware of their inadequacy, read this and feel smart. Until, that is, they realize, a few days or weeks later, that although failing the aphorism’s test proves that you’re stupid, passing it doesn’t prove that you’re smart. A week or two later, however, it gets worse: the self-congratulatory glow loses what’s left of its luster when you realize that you can be stupid and know you’re stupid.

Some of Haspel’s aphorisms are laugh-out-loud funny, such as: (i) “Passion, n. An overwhelming urge to spend your life at something you don’t do especially well.” (ii) “The ideal work environment for a writer is jail.” (iii) “Blaming an actor for being a narcissist is like blaming a tiger for being a carnivore.” (iv) “It is when we recognize our hopeless inadequacy at everything else that we discover our vocation.” And some of them are straightforwardly brilliant, such as this one, which is, to my mind, the best summary of the Socratic way of life I have ever read: “A grudging willingness to admit error does not suffice; you have to cultivate a taste for it.”

Still, if you’re looking for the kind of writer beloved of avid readers of The New Yorker—the kind who knows how to make his educated liberal audience feel superior to all of those yahoos in the sticks who hunt, pray, vote Republican, and believe in weird stuff—don’t buy this book. Seriously, don’t. Because you’ll hate it. Haspel holds up a mirror, and, trust me, you’re not going to like everything you see. I know I didn’t. If Haspel has an overarching message that he wants to impart it’s that we’re not exempt from the follies of our day, even (and perhaps especially) when we think we are: “We are more like our contemporaries than we imagine, and less like our ancestors.”

I read a great deal (probably more than I should), and I’ve been a great lover of the aphoristic genre for over twenty years. Yet never before have I encountered so many aphorisms of such high quality written by a contemporary: Haspel is in a league of his own. At his best, Nassim Nicholas Taleb, in The Bed of Procrustes (2010), rivals Epicurus (e.g., “Love without sacrifice is like theft” is something I wish I had written). But my fellow Canadian, George Murray, probably deserves the prize for second place. His most recent collection of aphorisms, Glimpse (2010), is often outstanding (e.g., “Rubble becomes ruin when the tourists arrive”). Even so, the collection is scandalously uneven, and it really doesn’t hold a candle to Everything. To wit: Aaron Haspel is the greatest master of the aphoristic form writing in English today. It’s always hard to know which books will stand the test of time, which books will be read 300 years from now. But if I were a betting man, I’d bet on Everything.

—John Faithful Hamer, The Goldfish (2016)

A Silent Spring for Democracy?

I am increasingly seeing calls to ban people from expressing racist / misogynistic / homophobic / xenophobic views on social media. Now, first of all, no one who knows me can imagine that I would ever personally be in favour of any perspective that judges people primarily along superficial and random demographic lines. However, determining whether or not something should be said ought never to be based on whether one personally supports or agrees with the viewpoint or not. If anything, we ought, like Voltaire before us, to not only tolerate but in fact fight for the right of those who disagree with us to freely, without censure, express their particular take on the world.

We ought to be vigilant about the fact that there now (again) appears to be a disturbing movement afoot, not least among those on the left (with which I otherwise largely sympathize), to prohibit expressions of any points of view that do not roughly align with their own, or with which they take exception. Let’s not beat about the bush: this is totalitarianism.

History shows us that zealous control and prohibition of others’ views tends to have several counter-productive results. As happened with Neo-Nazism in post-war West Germany, it may drive those whose views differ sufficiently from the current moral majority’s underground, which is potentially risky, since society loses not only the ability to keep tabs on their numbers and activities, but also valuable opportunities for establishing and maintaining an open dialogue. Allowing expression is *not* the same as agreeing with it. In fact it is only by allowing expressions of what we disagree with that we can achieve a dialogue. The ensuing argument may prove vicious, but rather that than an authoritarian and oppressive attitude to the thoughts and opinions of other people.

The importance of maintaining a society where wildly different points of view are tolerated, and even encouraged, isn’t just a good idea in order to be able to keep tabs on a potential enemy. There is something else, something which ought on reflection to be apparent: in a society where totalitarian values and principles have begun to hold sway, it may not be long before any of us find ourselves on the “wrong” side of a given issue (according to the administrative or judicial powers, the moral majority, or whoever else) and discover that we are no longer allowed to express our views or argue our case, an insupportable situation for most individuals.

It is therefore paramount that we remember that democracy contains a built-in contradiction well worth honouring, namely: in a democracy you are allowed to give vent to undemocratic views. The moment we overlook or renounce that important contradiction, and claim that democracy as (perceived) content (such as a particular view of humanity, e.g.) is more important than democracy as a systemic framework or form of government, we have set foot on a path that leads into great darkness.

—Marie Clausén, author of Sacred Architecture in a Secular Age (2016)

Miniscule Matter?

“The punster, the grammarian, the nitpicking fact-checker
all display contempt for what is being said.
They counterfeit attention.”—Aaron Haspel,
Everything: A Book of Aphorisms (2015)

The philosopher Joseph Heath recently published an op-ed in the Ottawa Citizen about the state of Canadian higher education. It’s pretty much gone viral, and for good reason: it’s provocative, well-argued, and, to my mind, pretty much right on. Even so, the first person to post a critical review of Heath’s article in the comments section didn’t take issue with his thesis; instead, he faulted Heath for spelling “minuscule” like this: “miniscule”. As it turns out, this alternate spelling of the word has been living in our linguistic country for over a century now. Which begs the question: Isn’t it time we granted him full citizenship? Or, at the very least, permanent residency? Regardless, treating him like an illegal alien seems, at this point, rather odd. Shouldn’t we just grandfather him in?

—John Faithful Hamer, From Here (2015)

Burner Alley: Desecrated and Destroyed by Dolts

For years now, I’ve been taking visiting friends and relatives, my students, and countless others, to see a little alley about half a block away from our place on Laval Avenue. The locals call it Burner Alley.

I’m not sure when it happened, but at some point Burner Alley was taken over by a bunch of artists and hippies who own (or rent) properties adjacent to the alley. What these creative minds did to the alley is hard to describe, but this photo album might help. Regardless, this much I can tell you: it was art, it was amazing, and it was revised often.

Burner Alley was an attraction, a place of beauty and wonder. And yet a small army of city workers destroyed it yesterday. They showed up on a Saturday morning with bulldozers and dump trucks. They ripped it all down. And then they took it all away.

I’m so angry right now that I’m actually finding it hard to write this. Really don’t know how to make sense of such senseless stupidity. Why anyone thought this was a good idea is beyond me. The alley wasn’t a fire hazard. It wasn’t attracting pests. Nor was it messy. Quite to the contrary actually: it was exceptionally well maintained. Always clean. Always neat.

Burner Alley was a magical place, a sacred place: and now it’s gone. Desecrated and destroyed by dolts. For nothing.

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

Dressing for the Heat

It’s during hot weather like today’s that one of the deeply insane aspects of our dominant Western culture stands out to me: the insistence, even during warmer weather, on adherence to standard workplace dress codes, especially suits and ties, which are at best suited only to mildly cool weather. This means, of course, that during outdoor walking or public transit commutes many people are quite overdressed, and workplaces, shops, and theatres, for instance, are overly air-conditioned to meet this norm. Everyone who’s worked in an office building has probably witnessed the incongruity of the secretary who brings in a sweater to wear indoors when it’s sweltering outside.

All of this is of course extremely wasteful; so much energy could be saved if we all wore seasonally appropriate clothes. (I won’t propose nudism for really hot weather, although that’s actually the most logical option.) In some hot countries (e.g., Israel or The Philippines, where you usually see politicians on the news in short sleeve shirts and no jackets or ties), they do wear more appropriate attire. But not here in North America or Europe during hot weather. No, we insist on denying the reality of the season and modifying as much of our environment as possible to create an artificial spring or autumn. We resent nature and want it to conform to our needs, to accommodate our dress codes. (I don’t include myself, or many people I know, for that matter, in this “we”, of course. It’s merely a rhetorical device.) This truly is insane.

—Kaï Matthews

Walking, Thinking, and Attention

Saw a posting (via Brain Pickings) about Rebecca Solnit’s Wanderlust: A History of Walking (2000) a few weeks back, and just got around to buying the e-book. Walking, as a basic, mundane, yet profound way of being in the world, is its theme.

For me, walking has been essential to knowing and remembering, to processing and understanding both the world of things and the abstractions of thinking. Some of my earliest memories involve it: I remember the side street of our block on Ismaningerstraße in central Munich, Wehrlestraße, walking with my older brother holding my hand; I was less than two, maybe even younger; we moved out to the inner suburb of Solln very shortly thereafter. We left Germany when I was four and a half. But when we went back to visit when I was 19, I was surprised at how accurate my memories of the places I’d walked were; sharper than the vague ones of our apartment and then our house in Solln. The buildings, the church whose tower we could see from our building’s back courtyard, everything on Wehrlestraße. In Solln there was a patch of woods (which is still there, as are the farmers’ fields nearby) at the end of our street with a path that led to a commercial strip on the main road (Wolfratshauserstraße), where my grandmother would take me to the local Konditorei for pastries and Gummibärchen. Again, my 19-year-old self, retracing my 3-year-old self’s steps, found it just as I’d remembered it.

I’ve written before of how whenever I’m in a new place and want to really know it, I walk. Driving through a place is like skimming the CliffsNotes of War and Peace; even a bicycle is still a machine which mediates one’s experience, which takes a piece of one’s attention away from one’s surroundings. Walking is so automatic that it barely registers on one’s awareness, leaving it free to contemplate one’s surroundings; and when walking does impinge upon that awareness, when we stumble over some obstacle, for instance, that is still a direct engagement with those surroundings. There is no more intimate way of being in the wider world. We’re animals. Moving under our own power is what we do and how we have known the world for far longer than any other means.

It is also why we can feel a loss of mobility, whether our own or another’s, so keenly. Tom, a First Nations guy who’s been one of the two men who do the garbage, yard work, and snow removal around our building, and who has cheerfully dug out my car with his snowblower every winter (I pay him), got cancer a few months ago, lost the ability to walk, may have only six more months to live, is bald from chemo, and is getting around in a motorized wheelchair. I saw him yesterday as I was out shopping and told him I was glad to see he could still get out and about, all over the wider neighbourhood; he laughed and said, yeah, it’s nice. But it’s still heartbreaking to see him no longer walking or riding his bike around, to say nothing of his not having long to live.

Like fellow Concordia grad and flâneur Chris Erb, I’m baffled by people who are perfectly able to walk (and whose local environment lacks any major impediments to it – an objection raised in earlier discussions of this piece) but who dislike it, who prefer to minimize it by using their cars or other means. (Chris once described a visit with some Fredericton friends: he wanted to show them some place that was a ways away, and thought nothing of walking there, but as they proceeded and realized that it was going to be more than a couple of blocks, they became anxious and balked.) It makes me wonder whether rigidity of thought goes with a lack of significantly frequent walking – and I don’t just mean the lack of exercise which can also impair blood-flow and thus cognitive function.

Walking as a meditative activity can be in familiar surroundings, where that familiarity facilitates the automaticity of it, so that one’s thoughts can wander abstract pathways; or it can be in novel surroundings which trigger new associative thinking – free wandering parallels free association. In either case, something about the act of moving on our feet seems to set our thoughts in motion in a way that doesn’t happen as much when we’re sedentary. (There’s an image of philosophers as armchair thinkers, but I think a look at the lives of various thinkers throughout the ages would reveal that many of them engaged in perambulatory meditations much of the time.)

Why should moving about on our two feet be so intimately involved in our cognitive engagement with the world? A folk/evolutionary psychology explanation would probably cite the fact that we’ve been walking for several million years, and in a world where paying acute attention while walking was essential to survival. Possibly, but that’s only the bare bones of an hypothesis.

I’m looking forward to reading what Solnit has to say about it and much else.

—Kaï Matthews

Postscript, after a fair amount of Facebook feedback; I reproduce here my most recent comments: 

It’s interesting that the comments on my piece should so quickly turn to the practical impediments to walking in our built environments. While one quite valid response to my piece could be to focus on the potential elitism of extolling the singular virtues (for the able-bodied, to be sure) of walking – “Well, aren’t you just so fortunate to have places to walk! Bit of a luxury, innit?! Ain’t no sidewalks where I live!” – I focused on its relationship to our grasp of the world for a quite practical reason.

If walking did not have such a unique beneficial quality to it, one that is lost when other modes of transport are substituted for it, then there would be far less reason to fight for its existence as an everyday and primary means of locomotion. Zipping around our cities in cars, on public transit, on bikes or scooters, or via Futurama-style pneumatic tubes, for that matter, would do just as well, and that would have obvious consequences for urban planning. Any policy worth implementing should have a sound philosophical and scientific basis; speculating about what makes walking special thus is not some indulgence for lazy elites but rather is relevant to everyone, no matter their circumstances.

I want to argue for the essential, unique, and irreplaceable value of walking, a value I think is rooted in our ancient bipedal nature, in the way that the evolution of our cognition may be intimately bound up in it.

There was a black guy in LA I remember seeing on the news back in the 90s (IIRC) interviewed about his pushback against cops harassing him for his penchant for taking long, long walks around the city. He spoke of how important it was to him, not merely for the sense of being able to exercise his legal but all too poorly respected right to walk wherever he felt like in public spaces, just like any white person, but also because it helped him think. I remember thinking, yeah! I know just what you mean! The news anchors interviewing him seemed not to register that aspect of his well-articulated explanation of his grievance against the LAPD; they could only view him through a lens of “black man complains about not being able to walk outside his ‘hood.” The idea that he could be pondering and philosophizing during his walks, that that might even be his primary reason for his extensive perambulations, seemed not to occur to them.

Post-postscript: There’s an old joke about the difference between New York and LA: in LA they say “Have a nice day!”, but they’re thinking “F@&# you!”; in NY they say “F@&# you!”, but they’re thinking “Have a nice day!” But for me an even more pertinent difference, and why I prefer NY, is that LA is a sprawling, spread-out city of cars (as the old pop tune goes, “Nobody Walks in LA”, which isn’t strictly true – lots of poor black and Chicano folks do – but true enough), whereas NY truly is a pedestrian city, dense and compact enough for it to be feasible, and a place where it’s not unusual for lifelong residents to never get a driver’s licence.

Post-post-postscript: a number of comments I’ve received have affirmed and expanded upon my themes, so I present them here:

There is something inherently healing as well about walking. I don’t run. I walk. And I notice. Walking pulls me away from my thoughts, and brings me closer to them, at the same time. There is an inward-outward movement, like a wave, from my inner world, to noticing and hearing what’s around me. I suppose that’s why I like to visit cities. I can leave the car where it is, and explore, meet people, engage in my surroundings. – Leeça St.-Aubin

My mother was the first to teach me the power of walking, but many have since encouraged me. Nietzsche admonishes us to trust only those thoughts which come to us while walking and Taleb tells us in an excellent essay precisely why he walks. I’ve witnessed the strange mix of panic and exasperation that many suburban North Americans express when you tell them they’re about to walk for more than 20 minutes, and it is indeed bizarre. Just happened two weeks ago actually. “Why don’t we drive?” she blurted out, trying to conceal her anger. I think you’re on to something concerning the relationship between categorical thinking devoid of nuance and excessive driving. Something to that. – John Faithful Hamer

“All truly great thoughts are conceived while walking.” From Twilight of the Idols. (Nietzsche) – Rich James

Many years ago I decided that the best way to explore Brooklyn was to walk from Brighton Beach to the Manhattan Bridge. I took the el train to Brighton Beach, had breakfast at some bar/breakfast place and decided to ask the locals for suggestions. A lively conversation about bus versus metro ensued, but when I clarified that I was planning on walking there was unanimity: that is impossible! I did it, but the route I chose was more or less the equivalent of walking Jean Talon from end to end. Not the most exciting walk…. But I did learn a thing or two about Brooklyn during my meandering. I think that I did something like this: From Brighton Beach to Manhattan Bridge via Coney Island Ave. – Zvi Leve

Taking the train to TO tomorrow and will be reading Dan Rubinstein’s Born to Walk: The Transformative Power of a Pedestrian Act along the way! – Marilyn Berzan-Montblanch

The Native American chief Red Cloud was born to a mother named “Walks As She Thinks”… – Jaffer Ali

Over the last 36 years I’ve run around 65,000 miles. Hiked, skied, climbed, and backpacked much more. For me the motivation is not the opportunity for meditation, but rather the opportunity to observe nature with all my senses that gets me out there. Forget pace. Remember beauty. – Tom Bohannon

Totally agree about the walks: it’s a biological necessity that’s treated like a luxury. – Diana Young

Hell yes!! I love walking!! Especially after a big meal, that’s just the best. When I’m traveling in a new city I walk for hours to soak up the atmosphere. I live in China so wherever I walk I turn heads and get the occasional dirty look and muttered assumptions, but it’s a small price to pay for a good constitutional. I also love running, and biking, and just propelling myself in general without any mechanical assistance, though, so maybe I’m the strange one. – Mike Benner

REDSHIRTS, by John Scalzi

[SPOILERS for a three-year-old book to follow]

Redshirts won the 2013 Hugo Award for Best Novel, and it received a lot of buzz at the time for being a great read. I dutifully added it to my ‘to-read’ pile, and a few days ago was able to check it out digitally from my local library. E-books are superior to print books in many ways, but one advantage of print is that you can tell simply by holding it – perhaps after a glance at the font size – how long it will take to finish. I surprised myself by starting and finishing Redshirts in a single day: a round trip to Toronto from Mississauga accounted for the bulk of it, with about an hour at home to finish it off. As you may gather from this, it’s a light read.

The story is set aboard the starship Intrepid, flagship of the Universal Union, a ship which takes on a variety of missions – diplomatic, military, exploration – with the common feature being that on each mission, one of the junior crew dies. Senior staff, including the brash captain and the coolly unemotional, alien science officer, are oblivious to this trend, but the junior crew certainly aren’t; all of them find elaborate ways to avoid going on inevitably lethal away missions. In other words, Redshirts is essentially a parody fanfic of the original Star Trek TV show. In the first half of the novel, the humour is awfully broad, and consists almost entirely of pointing out the ways in which the Intrepid and its officers don’t behave in the way we’d expect a real military craft and crew to act. The bridge chairs don’t have seat belts! The uniforms don’t have pockets! The bridge shakes, no matter where or how the ship has been damaged! The science crew are always able to come up with an answer to the problem of the moment, but only at the last minute, and they always miss something that the science officer notices immediately! This is bad enough, but the humour worsens as it gets more meta and starts pointing out the ways that the conventions of televised fiction don’t line up with reality. In real life, you see, people rarely stop to give expository speeches explaining the problem of the moment, especially to people who are already familiar with it, or to their subordinates.

These insights are hardly original: people have been making these gibes at Star Trek‘s expense since the show first aired, almost fifty years ago. I remember David Gerrold making the pockets-and-seat-belts jokes in his Star Trek memoir, and that came out in 1973. The only original bit that author John Scalzi offers is a novel explanation for a notorious situation aboard the Enterprise: the fact that, whenever the senior staff walk the corridors of the ship, the junior staff around them are all hurriedly moving, never stopping to talk with their superiors or with each other. It turns out that the crew are afraid that if they catch an officer’s attention, they’ll be ordered to join a presumably fatal away mission, so they avoid the command staff wherever possible.

The only thing sustaining interest in the first part of the story is the question of why the away missions are so dangerous. I was expecting a different answer than the one the story eventually gives us. The superficial emotional tone of the first half of the book is lamely comic, but it’s a thin cover for a thick layer of horror. Crew on away missions die, and in terrible ways. Knowing this, the crew go to great lengths to avoid going on these missions… and this includes finding ways to ensure their fellows, not they themselves, get assigned first. This subversion of solidarity is chilling, and it only becomes worse on the missions themselves, where the crew – out of sight of their commanders – begin to actively sabotage each other, to ensure that someone else becomes the necessary sacrifice. The mood is genuinely and surprisingly bleak. Unfortunately, that mood isn’t sustained, because Scalzi can’t avoid having to explain why the Intrepid is the way that it is. (The burden of science fiction is that everything must have an explanation.) I had assumed, given the established tone, something along the lines of The Cabin in the Woods, a roughly simultaneous exercise in genre metafiction: the crew die as sacrifices to some alien force, and the bridge crew are complicit with this arrangement to preserve what they see as the greater good. Scalzi goes a different way: it turns out that the Intrepid and its crew are all fictional, the stars of an early-twenty-first-century TV show, and its junior crew die so often because the head writer of the show is a hack and doesn’t know any other way to build dramatic tension.

With that mystery solved, the book’s second half becomes even less engaging than the first, as the story transforms from a parody of Star Trek generally into a parody of Star Trek IV. In place of dangerous away missions and ensigns scheming to make someone, anyone else get killed, we have the book’s viewpoint characters, the newly-minted junior staff of the Intrepid, travel to 2012 Burbank, California, to confront the creators of the science-fiction show that has taken over their lives. Aside from more lame jokes in the key of fish-out-of-water, Scalzi offers another interesting bit: the Intrepid crew are doppelgangers of the actors who played their parts on the series, leading to several cases of mistaken identity. Most notably, one of the crew is a perfect physical copy of the showrunner’s son, who appeared on the show briefly but then, days before the Intrepid crew arrived, had a serious motorcycle accident, leaving him mangled and brain-dead. His grieving father was about to pull the plug, but in exchange for his son’s life being saved by the deus ex machina of Otherworldly Medical Science, he agrees to change his show so that the junior crew won’t die anymore. Mission accomplished, our heroes return to their world, confident that they’ll own their own destinies now, or at least as much as anyone in their situation can.

If that were all, I’d rate Redshirts as a slight read, as fanfic with delusions of grandeur. But the entire exercise is redeemed after the book proper ends. I suspect Scalzi intuited as much, given that the full title of his book is Redshirts: A Novel with Three Codas. It’s those codas that pay the whole thing off. They deal with the people left behind in 2012, who are now aware that their TV show isn’t just a basic-cable triviality, but is also a machine with existential significance. In one coda, the show’s head writer battles writer’s block, as he’s terrified that if he writes about someone dying, someone actually will die. In another, an actress deals with the fact that her doppelganger was the love of someone’s life, and that when her doppelganger died, the grief almost destroyed him. And yet she, the original, is alone, and has inspired such passion in nobody. And in the most powerful coda, the showrunner’s son becomes aware that his body and brain had been smashed into wreckage, and that, impossibly, he’s been given a second chance at youth and health… and yet his life to that point has been wasted in shallow pursuits. Of all people, he least deserves the gift he’s been given. So what should he do, having received it?

I found the codas genuinely moving, and far more engaging than the entire novel that preceded them. Science fiction is supposed to be a literature of ideas, and these ideas – what significance do our lives have, and how should we live them? – are much more interesting than wondering why, in the future, military uniforms don’t have pockets.

—Andrew Miller