
Guys Who Don’t Put It Down, Don’t Go Down

SCARY PREDICTION: In 2019, the American Sociological Association’s prestigious Best Dissertation Award will go to a PhD dissertation entitled: “Guys Who Don’t Put It Down, Don’t Go Down: An Analysis of the High Correlation Between Men Who Refuse to Put the Toilet Seat Down After They Pee and Men Who Refuse to Perform Oral Sex After They Cum.” An article summarizing the study’s findings will be published in the Spring 2020 edition of The Journal of Incredibly Obvious Results. Men’s rights organizations, like A Voice for Men, will immediately question the study’s methodology. Barbara Kay will claim in a National Post editorial that the very existence of this study is further proof of something she’s been saying for years: namely, that we live in a misandric society. Meanwhile, in Montreal, after hearing about the study’s findings on the clock-radio in his bedroom, our Prime Minister will smile, shake his head in agreement, and head south.

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

The Spicy Pleasures of Stoic Practice

“A Stoic is a Buddhist with attitude, one who says ‘f*** you’ to fate.”—Nassim Nicholas Taleb, Antifragile: Things That Gain from Disorder (2012)

Unless you were probed by a fish-faced alien last night, an alien who refused to use lube (much less protection), your problems are probably pretty generic, but that doesn’t make them any less problematic. Putting your problems into perspective works well when you’re a fly on the wall; not so much when you’re the fly caught in the web. We’re very good at being philosophical about other people’s losses. For instance, we find it easy to calmly remark “Well, you know, these things happen” when we learn that our next-door neighbor’s kid has accidentally broken a glass, scratched the new car, or stained the white-leather sofa. But when your kid breaks your precious-little-antique-cup-you-got-in-Cape-Cod-when-you-were-eleven, you freak out. When it’s your new car that’s been scratched, you freak out. When it’s your fancy sofa that’s been stained, you freak out. Stoic practice seeks to remedy this by making you just as philosophical about your own losses as you are about other people’s losses.

In The Art of Living, the Roman Stoic Epictetus maintains that we should meditate—each and every day—on what it would be like to lose everything we care about: our stuff, our health, our wealth, our reputation, even our loved ones. This is all supposed to make us better able to deal with adversity when it comes our way. But in my experience it rarely does. I’ve seen stoical John Wayne types fall apart under pressure and I’ve seen emotional basket cases behave heroically in a crisis. So I’m inclined to believe, with Aristotle, that people often surprise you, and that a man’s “philosophy” isn’t a particularly good predictor of how he’s going to behave in the face of adversity. If you really want to know what a person’s made of, you’re going to have to wait and see what they’re like in the face of actual (as opposed to theoretical) adversity. Be that as it may, I think the aforementioned Stoic visualization technique is good for you regardless of whether or not it helps you deal with future losses. Why? Because it feels good. Because it’s pleasurable. In fact, as I said to my friend Graeme Blake on Mount Royal yesterday, I’ve come to suspect that Stoicism is really just a refined form of Hedonism.

As baby-food manufacturers well know, some things, like the sweetness of sugar and the saltiness of salt, taste good the first time. They’re straightforwardly and immediately pleasurable to all. Other tastes—like the face-pinching sour of vinegar, the burn of really spicy food, and the bite of hard alcohol—must be acquired. Things of this kind do not taste good the first time. All to the contrary! For instance, I’ll never forget the first time I accidentally ate a mouthful of really spicy food. We were in a crowded Cuban restaurant in Washington, DC. And it was horrible. Felt like a near-death experience or a panic attack or a really bad acid trip. At first everything got really quiet, like someone had just turned down the volume on the room. Even the voices of the people at my table sounded muffled, distant, and barely audible. My chest seized up and I forgot how to breathe for so long that I thought I might pass out. And then came the burn—oh, the burn—a burn so hot it felt cold at first. I frantically reached for my glass and brought it to my lips. But the water only made things worse. Way worse. Tears streamed down my face and my light-grey t-shirt was soon drenched in sweat. Alas, this was not a positive experience. Even so, I grew to love spicy food, just as I’ve grown to love many things I initially hated, like whiskey, wine, and Stoic meditation.

Like the thrill of a roller-coaster ride or a really good horror movie, the pleasure we derive from Stoic visualization is always, to some extent, masochistic pleasure. At first we’re shocked and unsettled by the horrors we’re imagining, but then we’re comforted by the realization that everything’s actually okay and we’re really not in danger. The spicy pleasures of Stoicism are, then, not so different from the pleasures people derive from skydiving and bungee-jumping. We’re dealing, here, with the thrill of the near-death experience.

—John Faithful Hamer, From Here (2017)

Jerome Rodale and the Founding of Prevention Magazine

“When you don’t need a doctor, you really don’t need one.”—Aaron Haspel, Everything (2015)

When Jerome Rodale began publishing Prevention magazine in June 1950, the United States was a society largely governed by experts. The professions had grown from humble beginnings in the late nineteenth and early twentieth centuries into formidable institutions that commanded respect and even a certain amount of awe. The anti-elitist Jacksonian tradition, so central to earlier periods, was largely dormant in the first two decades of the post-war era. Muckraking was, likewise, a rather marginal journalistic genre during this period. For the most part, Americans had faith in their experts. Although much of what these experts did happened behind closed doors, few seemed to mind. These highly trained men and women could be trusted to do the right thing and work towards the common good. Most Americans believed, as the Commissioner of the Food and Drug Administration put it in 1947, “that most manufacturers make sincere efforts to meet all legal requirements not only because they are the law of the land, but because it is the right thing to do.”

A series of public scandals tested America’s faith in its experts during the 1960s and 1970s. Watergate and the government’s mismanagement of the Vietnam War are only the most memorable episodes. The crisis in authority was widespread and affected multiple facets of expert culture from politics to science, but its effects upon regulatory agencies, health professionals, and food scientists are of particular interest. First, there were the cancers caused by thymus irradiation. Then the estimated five million pregnant women who—acting on their physicians’ advice—took the synthetic hormone diethylstilbestrol (DES) to avert miscarriage and preterm labor, only to discover later on that it severely deformed the reproductive organs of their unborn children. Then there was the swine flu fiasco and the sulfa drugs that gave people kidney stones. There was also the milk formula that was supposed to be better than mother’s milk, but which contained so little vitamin B6 that it caused convulsions and brain damage. Headline after headline suggested that perhaps the experts entrusted with the health of the American people were not doing a very good job. What if the experts meant well but did not really know what they were doing? Or worse: What if the experts did not mean well at all? What if they could not be trusted to work towards the common good? What if the experts were in fact in league with special interests whose aims were at cross-purposes with the public good? In the 1962 bestseller Silent Spring, Rachel Carson argued that government scientists entrusted with the public welfare were shirking their responsibilities and allowing private interests to poison the earth and sky. The book sparked a vitriolic debate over the merits and demerits of the widespread use of agricultural chemicals.

The controversy surrounding Carson’s Silent Spring launched two of the most influential and most uniformly white, middle-class movements of the late twentieth century: the environmental movement and the natural health movement. There were precursors, of course. For instance, Jerome Rodale—who actually coined the term organic in 1942—made the same argument and raised all of the same concerns about pesticides and herbicides in his organic farming classics, Pay Dirt (1945) and The Organic Front (1948), which were both published long before Silent Spring (1962). Rodale reiterated the same concerns throughout the 1940s in Organic Farming magazine and its successor, Organic Gardening and Farming. In the 1950s and early 1960s, he hammered away at the same themes in numerous books, and in his health magazine, Prevention. Even so, for the most part, Rodale’s message had fallen on ears still attuned to the authority of experts. His ideas, as well as his publications, existed at the fringes of mainstream American society.

Rodale was nevertheless well placed after 1962 to focus the public outrage unleashed by Silent Spring. Carson lacked the will and the means to institutionalize her dissent. Moreover, she died of cancer less than two years after the publication of Silent Spring. Rodale, on the other hand, was in good health and had already amassed a small fortune, the result of several successful entrepreneurial ventures. He had two established monthly magazines as well as a publishing house, Rodale Press, which allowed him to get his message out to the health-conscious subculture on a regular basis. Prevention magazine was where all sorts of unorthodox health writers vetted their new ideas and wrote monthly columns that paid the bills; Rodale Press—located in Emmaus, Pennsylvania—also published many of their books when no one else would touch them. When an influential member of the natural health movement wanted to get out a message to the health conscious, he or she did it in Prevention. While Rachel Carson must therefore be credited with initiating the public debate that transformed the concerns of a quirky subculture into the ideology of a movement, it is Jerome Rodale, more than anyone else, who created and maintained the institutions that grew the natural health movement from a passing fad into a political and cultural force to be reckoned with.

Along with an institutional framework, Jerome Rodale possessed an aversion to expert culture that made him particularly attractive to a significant segment of the American population that was growing skeptical of expert opinion. Unlike so many health gurus past and present, he did not buy a mail-order PhD, nor did he pretend to have any kind of specialized training in the science of nutrition. Adolphus Hohensee and Carlton Fredericks were not nearly as scrupulous. Hohensee, a health reformer whose popularity peaked in the late 1940s and early 1950s, described himself as a PhD in nutrition. He did have a doctorate. But it was worthless in the eyes of his would-be colleagues, for he had purchased it from a shady, unaccredited institution.

Carlton Fredericks, famous for his syndicated radio show “Good Health,” also pretended to have specialized training in nutrition, but he did so in a way that must have pleased his lawyer. Fredericks reinvented himself as a nutritionist soon after he legally discarded the name he was born with (Harold Frederick Caplan). A prolific writer of articles and books from the early 1950s until his death in 1987, Fredericks made a point of putting a PhD after his name. He studiously failed, though, to specify the discipline within which he had completed his graduate work. He wrote on nutrition extensively—and described himself as “a nutritionist”—so his readers assumed that he was a professionally trained nutritionist. This was not the case. His formal training was neither in medicine nor nutrition, but rather communications.

Jerome Rodale never felt it necessary to engage in this kind of duplicity. He did, like Fredericks, change his name (he was born Jerome Irving Cohen). But this was not an uncommon thing for an American Jew to do in the first half of the twentieth century. Aside from the name, Rodale was who he said he was. He flaunted his lack of formal education. His charm lay precisely in his down-to-earth persona. He was the little guy who made up for his lack of degrees with an extra helping of common sense; a feisty, unpolished, five-foot-six-and-a-half-inch-tall New York City Jew who refused to go on blindly trusting an Anglo-Protestant establishment that seemed to be doing such a terrible job. The emperor had no clothes, and he was going to shout it from the rooftops.

What Rodale abhorred, more than anything else, was language that was deliberately and unnecessarily abstruse, language that was intended to exclude the uninitiated. In his books and magazine articles, he often did little more than translate the specialized discourse of science journals into lay terms. He informed his readers about the results of medical experiments that had demonstrated, he believed, the harm done to the human body by certain chemicals or the good done by certain vitamins and minerals. His work is replete with references to prestigious medical journals. To a certain extent, therefore, he was a popularizer—that worst kind of heretic who dares to translate a profession’s jealously guarded esoteric knowledge into plain speech. “The scientist,” argued Rodale, “sits in his ivory tower”—and “invests himself in a phony cloak of omniscient authority.” He expects to be trusted implicitly and left alone. But health was far too important a topic to be left to the experts, especially when so many of those experts were being funded directly or indirectly by the food and drug industries; industries that Rodale believed did not have the public good in mind. “Matters have been in the hands of the scientific specialists long enough,” Rodale once wrote, “it is time for the public to take a vigorous hand in what is going on in science, or it will have to pay in the form of living shorter and less enjoyable lives.”

Before Prevention magazine arrived on the scene in 1950, the leading health magazine in the United States had only 10,000 subscribers. But Prevention’s first issue went out to 50,000 paid subscribers, making it—from its inception to the present day—by far the most widely read health magazine in America. Ever the astute businessman, Jerome Rodale had been drumming up support for his new venture months in advance. Subscriptions rose steadily in the 1950s and 1960s, reaching just over 400,000 in late 1969. The pace picked up considerably in the early 1970s. Between January 1970 and March 1971, subscriptions doubled to 800,000; by July of that same year, they had cleared the one million mark. Four years later, in 1975, subscription numbers had grown to one and a half million, and the magazine was being read by over four and a half million Americans a month. By decade’s end, in 1980, Prevention was going out to well over two million people a month, with an estimated readership of at least six million. Ten years later, in 1990, it had just over three million paid subscribers and a readership of close to eight million. In 2008, Prevention was reaching more than ten million readers a month, making it the eleventh largest magazine in the United States, and, by a significant margin, the country’s top health title. Furthermore, Prevention became the most widely read health magazine in the world, with editions in dozens of countries, including Finland, Poland, and twenty-two Latin American nations.

The dramatic upsurge in Prevention’s circulation numbers during the early 1970s corresponds remarkably well with a striking increase in public support for the natural health movement’s ideas, industries, and institutions. The demand for wheat germ, vitamin E (extracted from wheat germ oil), sunflower seeds, pumpkin seeds, ascorbic acid (the synthetic form of vitamin C), fish livers (a source of vitamin A), and other health-food products, had grown so large by the early 1970s that it often outstripped supply. Likewise, more and more Americans joined health clubs during this period, engaged in regular exercise, and worried about pesticides, herbicides, and food additives. With the natural health movement, the anti-nuclear movement and, to a lesser extent, the environmental movement, Americans began to put together new kinds of social movements, ones with weak or fragmented institutional structures and enormous diversity, but held together by a few fundamental principles, a connection to personal lifestyle, and a skillful use of the means of communication.

No one understood the historical significance of what was happening in the early 1970s more than the natural health movement’s two most vocal and vociferous critics, Harvard nutritionists Fredrick John Stare and Elizabeth Whelan. Assessing the situation in 1975, they wrote: “There was nothing gradual about the arrival of the modern health food movement. It struck like lightning during the early part of 1970.” Whelan and Stare correctly acknowledged that most of the movement’s central tenets had been kicking around at the margins of American culture for well over a hundred years—at least as far back as that moment in 1833 when the first Graham Boardinghouse was established in New York City. They quickly added, however, that Sylvester Graham and all the unorthodox health reformers that followed him—from John Harvey Kellogg and Bernarr Macfadden to DeForest Clinton Jarvis—“catered to the emotional needs of a relatively small portion of American eaters” and had little effect on mainstream American attitudes toward food, health, and disease. The difference was a matter of scale, not novelty. Whelan and Stare adroitly noted that the health enthusiasms of the 1970s differed from those that preceded them precisely because they could no longer be accurately described as a passing fad; they were “becoming the material of a movement.” As they put it: “reasonably normal people”—“not just young, long-haired radical members of the lunatic fringe”—were shopping at health-food stores and reporting that they received “most of their nutritional information from Adelle Davis and the Rodale Press.”

The Rodales benefited greatly from the expansion of the natural health movement. And as the family fortune grew, so did their power and influence within the movement as a whole. Advertising in Prevention magazine became a necessity for any business that wanted to help meet the multimillion-dollar demand for vitamins and other food supplements. The magazine’s position as more or less the only game in town allowed the Rodales to keep the price of advertising high. In 1974, a one-page advertisement in Prevention cost between five and six thousand dollars, a great deal of money for a publication of its size. Ten years later, in 1984, a full-page black-and-white advertisement cost $15,047; the same page in color went for $26,550, while the highly visible back page cost $34,500. Advertisers clearly believed that they were getting a good return on their investment, for each edition of Prevention contained between sixty and one hundred pages of advertising out of a total of about two hundred pages. Prevention’s power and influence permitted its publisher, Robert Rodale, to command these handsome sums; it also allowed the magazine’s editors to keep its advertisers on a short leash, policing content in ways that would have made advertisers in other industries cringe in horror.

Getting on the wrong side of the editorial staff of Prevention magazine was tantamount to financial suicide for a vitamin manufacturer. Thus, while other magazines coddled their advertisers—removing offensive articles, firing offensive writers—Prevention magazine could (and did) dictate strict guidelines, to which its advertisers were obliged to adhere. Each month the editors ran a page-long column that piously discussed their exacting advertising policy. Rodale Press even went so far as to hire a private laboratory to spot check the products advertised in Prevention and ensure their strength and purity. Any advertiser found wanting was banned from advertising in Prevention. This stern approach towards advertisers reflects the character of the magazine’s founder.

Jerome Rodale was enthusiastic, quick-tempered, and impulsive, and he was anything but careful and measured in his writings. Beyond the class and cultural motives that fueled Rodale’s assault on expert authority was a more visceral fear of death by heart attack. As the youngest child in a family with a serious propensity toward heart disease, he stood by and watched cardiac arrest kill most of his immediate family at relatively young ages. For obvious reasons this experience informed much of his obsession with health and wellness. Rodale claimed to live according to a rigorous regimen that included healthy organic food and plenty of exercise—and for the most part, he probably did. Nevertheless, it is hard to believe that he was quite as virtuous as he claimed to be. Photographs of him taken throughout his life suggest that his adherence to the Spartan ideal was spotty and subject to a fair amount of backsliding. He was not always the picture of health. Often pudgy, and sometimes downright overweight, Jerome Rodale had a prominent double chin, which was hidden from late middle age on by his signature goatee. An awareness of his own weakness and susceptibility to temptation could, perhaps, have been precisely what fueled Rodale’s apostolic fury. Regardless, while he was at the helm, a sense of urgency and anger pervaded Prevention magazine. This changed noticeably when his son Robert took over in 1971. Prevention became more respectable.

Robert Rodale was dignified, cautious, and reserved. He lacked his father’s mercurial temperament and short stature. At five-feet-eleven-inches tall, he stood nearly half a foot taller than his father. Robert was more moderate and more inclined to compromise. Even so, the populist drive to make scientific knowledge available to a lay audience remained central to Prevention’s mission after his father’s death. In 1978, Robert Rodale declared that in “writing Prevention articles we try to use the bare minimum of technical terms. And when we do have to use them, we are sure to explain their meaning in a way anyone can understand.” Numerous articles were devoted merely to offering plain English definitions for commonly used medical terms. In short, Prevention writers sought to unmask the use of jargon for what it too often is: a cynically employed intimidation technique that silences the uninitiated and “perverts the purpose of language—communication—by creating confusion, ambiguity, misinterpretation, inadvertent humor or sheer tedium.”

—John Faithful Hamer, In Healthy Living We Trust (2016)

Authority and the American Body

For much of the second half of the twentieth century, a public debate raged between professionally-trained health experts—doctors, nutritionists, and food scientists—and a motley crew of health reformers. Some of the latter—such as Jerome Rodale, Robert Rodale, and Mark Bricklin—were principled populists who resented the smug elitism of the health professionals and questioned their judgment; others, like Adolphus Hohensee, were opportunistic charlatans who peddled hope and capitalized on human frailty in the most reprehensible fashion; still others—such as Benjamin Feingold, Adelle Davis, and Robert Atkins—were professionally-trained health experts who, for one reason or another, defected to the enemy camp. Mark Bricklin, often the movement’s most articulate voice, maintained that the conflict between health professionals and the natural health movement was just another episode in the unending struggle between those who think that society should be paternalistically governed by experts and those who believe that an enlightened populace has no need for this kind of guidance.

Despite their claims to professional competence, the orthodox health experts were quite clearly wrong about some things. For instance, Jerome Rodale, Adolphus Hohensee, Adelle Davis, and Carlton Fredericks advocated breastfeeding consistently—throughout the 1940s, 1950s and 1960s—long before the American Academy of Pediatrics got on the bandwagon. The best and brightest of the medical elite insisted, for far too long, that store-bought formula was superior to breast milk. During the same period, American doctors regularly prescribed amphetamine—that is, “speed”—and diuretics to pregnant women to ensure that they did not gain more than fifteen pounds with each pregnancy. Moreover, at present, the consensus among medical experts is that antibiotics have been overprescribed, and that close to ninety percent of the tonsillectomies performed in the decades following World War Two were completely unnecessary.

Health reformers were also often wrong—indeed, manifestly so. Vitamin therapy and herbal remedies, for example, have been subjected to well-designed double-blind studies again and again, and the results have been decidedly underwhelming. Vitamin C does not cure the common cold, as two-time Nobel Prize winner Linus Pauling famously maintained; it is, however, a moderately effective antihistamine. Echinacea appears to be even less useful. An herbal supplement derived from purple coneflowers, echinacea was, in 2005, used by close to 15 million Americans, making it the most popular natural product. Yet another excellent study, published in The New England Journal of Medicine, found that echinacea neither prevented colds nor eased cold symptoms. Times have been tough for vitamin E, too. Results from a ten-year Harvard study of 20,000 women reported in the July 6, 2005 issue of The Journal of the American Medical Association have debunked the notion that vitamin E supplementation prevents cancer, Alzheimer’s disease, or anything else for that matter. Researchers now suspect that long-term vitamin E supplementation may even be harmful. Things are not looking up for the herb St. John’s Wort either. Once believed to be Mother Nature’s answer to Prozac, it now seems capable of clearing up, at best, a mild case of the blues. As a treatment for even the more moderate forms of clinical depression, however, St. John’s Wort has thus far been proven thoroughly ineffective. In any case, for those who seek to understand the deeper cultural significance of the natural health movement, these time-sensitive judgments are of only limited interest.

The nature of the scientific enterprise is such that the truth it produces is always tentative and imperfect. The facts it manufactures have a shelf-life. For instance, researchers may discover ten years from now that baby formula is, once again, better than mother’s milk. Some pediatricians would surely drag their feet and resist the change, but if the evidence were overwhelming, they would all, in time, recommend store-bought formula. The same could not be said with confidence about unorthodox health reformers. Among their ranks, commitment to the scientific method and the falsification process has been much less consistent. Evidence that contradicts first principles has often been willfully ignored. Health writer Carlton Fredericks’s wholesale rejection of blood sugar research is a conspicuous example. An ideologue if ever there was one, Fredericks put on the mantle of science whenever it suited him. Yet he flagrantly disregarded numerous scientific studies that demonstrated, quite conclusively, that hypoglycemia—that is, chronic low blood sugar—was not, as he so often claimed, a national epidemic.

Health reformers spoke and wrote in the language of science, and many of their theories were based upon sound research. But these epiphenomena should not distract us from the essentially disingenuous nature of their approach. Men and women like Jerome Rodale and Adelle Davis were not looking for truth when they pored over medical journals; they were looking for vindicating evidence. A priori assumptions guided these plundering expeditions shamelessly, from start to finish. Even so, what was at stake during the second half of the twentieth century was much bigger than any real or imagined fidelity to scientific due process; it was bigger, as well, than the truth or falsity of this or that medical proposition.

Sickness and death are weighty matters and the cadre in any given society—be they priests, witchdoctors, physicians, or pure food activists—that successfully lays claim to epistemic privilege where these things are concerned occupies a powerful place. Mark Bricklin insisted that white-coated scientists occupied that place in twentieth-century America. Both the priest and the scientist, he argued, “feel that they are quite intimately in touch with the true nature of reality. Both the priest and the scientist have to master vast bodies of literature involving strange language and obscure terminology.” “Today,” he added, “if we are uncertain about something, we look to science to give us an explanation. If there is trouble with our crops or trouble with our health, we moderns consult those priests in white coats called scientists.” Even so, Bricklin maintained that perhaps the most telling development was the simple fact that the word “unscientific” was being bandied about in the public sphere—to discredit people and ideas—in much “the same way the word blasphemous used to be used.”

Medical professionals were by far the most numerous members of the priestly scientific caste to which Bricklin referred; they were also the most consequential, since they had face-to-face dealings with the general public on a regular basis. Doctors, in particular, personified scientific authority in a way that other men and women of science could not. “Of all those who consider themselves to be members of the sciencehood,” Bricklin declared, “none are more priest-like than physicians.” It took the medical professionals almost half a century to beat back their rivals—the midwives, homeopaths, chiropractors, and faith healers—and establish themselves as the principal authority on health matters in America. The natural health movement constituted a sustained attack upon this authority.

Looking back in 1984, Mark Bricklin noted that “scientific medicine” had enjoyed unprecedented power and prestige in the 1940s and 1950s. Health professionals had cornered the market; they had a near monopoly on the truth about the American body. But, he added, “we can see that the party mood began to sour about 10 or 15 years ago.” “People,” another Prevention writer declared, “have begun to break the spell of awe surrounding doctors and take care of themselves.” “Ultimately,” alleged yet another, “self-care is a political way of looking at things: Who exercises the power around health? Up until recently, it’s been the doctors and professionals, but now it’s moving toward the consumers and laypeople.”

—John Faithful Hamer, In Healthy Living We Trust (2016)

Jian etc.

The first time we meet it’s at the Ivanhoe on Main Street, a bar where drug addicts and students mingle. Located by Vancouver’s bus depot, which marks the border to the lower east side, it is the kind of place I would not go alone, although it is a popular enough place among my peers. Beer is cheap, two dollars a glass. I’m only 19 years old, but that’s old enough to know the beer here tastes like piss and the carpets smell the same. This is where I meet the guy who turns out to be my rapist, although I won’t know to call him by that word until much later.

After all, what is rape? It seems like something that should be relatively straightforward to define, yet when you talk to people it is anything but clear. What constitutes consent? What is the difference between date-rape and aggravated sexual assault? Do rapists who make an “honest mistake” get put in the same category as the armed, cartoon-like stranger lurking in dark alleys?

Increasingly, popular discourse has been willing to entertain the idea that rape is not something done solely by masked criminals. Discussions of rape come in and out of public discourse with relative frequency, and the term “rape culture,” coined by radical feminists in the 1970s, has received increasing attention with the spotlight now on Jian Ghomeshi.

At 19, I had not heard of “rape culture”. However, my early experiences around sex were marked less by eroticism than by shame and power. My first sexual experience, when I was twelve, happened in the bedroom of a boyfriend who decided to take off my shirt and suck on my barely existent nipples. I did not object; I was too surprised. I was also too uncertain. Perhaps, I thought, this is normal. In hindsight, it was a ludicrous attempt at adult sexuality, but in truth it scarred me. What scarred me was not the act itself, which was only unpleasant, but my boyfriend’s retaliation when I broke up with him the next day. In what can only be described as a kind of public shaming ritual, he found me in the park, threw me on the ground by my hair and spat on me. He said something—slut or bitch, I can’t remember. Around me stood a circle of my peers—some of them my friends—who did nothing. What I remember is their silence, and their unwillingness to look at me.

I was so aware of the existence of rape culture before I ever heard the term that when I finally did, it was like discovering the name of a bird or a flower that you’ve seen since childhood. Nevertheless, there are plenty of women—from bell hooks to Camille Paglia—who reject the concept.

On the Canadian scene, rape culture made its way into the National Post with commentator Barbara Kay last year. She claims that the term mischaracterizes male behavior and results in misandry: “You can produce any culture you like if you dumb deviancy down. If you change ‘against her will’ to ‘without her consent,’ as we have, that is a huge paradigm shift from what we used to think of as rape: i.e. forced sex. And if a drunk woman can’t give her consent, another moved goalpost, she is ipso facto raped.” Kay’s comments—which claim a radical distinction between acts that are against someone’s will and acts that are without someone’s consent—advocate a return to the masked-criminal definition of rape. More significantly, Kay’s comments treat questions of law as questions of cultural definition, which is interesting for those interested in the dialectic between culture and law, but fundamentally misleading. (For a more detailed look at the importance of consciousness and active consent, see the Supreme Court ruling here.) Kay’s thesis is unsurprising to those familiar with her conservative anti-feminism.

More surprising (at the time) was Jian Ghomeshi’s silence last year during a debate that he organized between Lise Gotell and Heather McDonald about rape culture on his radio program Q. Ghomeshi’s reluctance to intervene when McDonald’s denial of rape culture quickly turned to victim-blaming shocked many of the CBC’s faithful listeners.

Canadians were perhaps less surprised by Ghomeshi’s silence on rape culture when he fell from grace after showing CBC producers a video of him appearing to sexually assault a woman. It wasn’t exactly the first time a celebrity’s reputation had been sullied by a sexual assault accusation, but unlike many of the others, it was a story that I followed obsessively. Why? Because in this particular instance, the person in question was somebody I liked. Also, because it appeared that the issue was not whether there was consent; the stories seemed to suggest that the absence of consent was precisely (and importantly) what turned him on.

A woman goes back to a celebrity’s house. A woman who is planning on having sex with him. Instead of kissing her, he slaps her; instead of seducing her, he degrades her. He then pretends that everything is normal. He might offer her a ride home. He might ask her if she will see him again for cocktails. For those who have read the accounts of the women accusing Ghomeshi, the stories all sound strangely familiar. They follow a pattern of normalcy, bizarre and disorienting violence, and then normalcy again. What makes him so successful in evading reprisal is that he is otherwise, as a lover at any rate, so incredibly boring.

My rapist is also boring. He is the nephew of my English professor. It is my second semester at Langara College, and I love this professor. The last Friday of the semester, my professor invites our class to join him at the Ivanhoe. It must be winter, which in Vancouver means rain. Class gets out at dusk and the sky, which has been heavy all day, begins to fall.

Because I love this professor so much, I’ve come to the Ivanhoe even though it is a bar I do not like. I bring my friend, Mindy (not her real name), because we plan on partying later. Mindy is hot in the most conventional sense of the word. Six feet tall, blonde, her mother was a British model when she was young. Mindy looks like a Bond girl and has also done some modelling. But she isn’t available because she’s married to a tattooed drummer named Eli (also not his real name). My professor’s nephew, let’s call him Jason, wants to sleep with Mindy. He is trying to impress her, trying to be funny and/or clever. He keeps talking about the books he has read. He’s in grad school. He doesn’t know that Mindy doesn’t take his uncle’s class, that Mindy works as a waitress and that she is not interested in college.

Mindy is not impressed. “Who is the loser?” she asks, although not loud enough for him to hear. She doesn’t like the way Jason styles his hair, which is parted in the middle and cut in a sort of bob; it lies flat against his head. He reminds her of a goat.

Predictably, Jason starts hitting on me when he realizes Mindy is taken. I don’t mind his hair. I think he’s kind of cute.

“What are you girls up to after?” he asks.

“We’re thinking of getting some coke,” I say.

Jason wants to hang out, wants to pay for the drugs. We let him, but we get sick of him soon. He’s trying too hard. We do not care about how smart he is. We leave him on the street corner halfway through the night, jumping into a cab and telling him bye. We are mean to him. By this point, he already has my number.

Why do some men rape?

December 2012: a group of men gang-rape and kill a young woman in Delhi. This was not a date rape. It was a premeditated, clear-cut aggravated sexual assault. A medical student, Jyoti Singh had been to a movie with her male friend. They thought they were getting on a bus, but it would prove to be a torture chamber, where she would be repeatedly raped and beaten for hours, finally dying from internal injuries sustained after her attackers decided to rape her with a rusty steel pipe. She and her companion were found at the side of the road barely breathing, thrown from the bus after her rapists were finally through with her. Rape is fairly common in India; however, the violence of the crime, the level of planning it required, and the fact that it resulted in a virtuous woman’s death left many people around the globe stunned. Why would anyone do such a thing?

In the early days after news of the Delhi attack spread, Heather Timmons put this question to psychologist David Lisak. Lisak lists biological, historical and cultural explanations for rape, but ultimately warns against seeing rape as motivated by something purely sexual: “I think sometimes the sexual element clouds our understanding of what rape is. Fundamentally, it is targeting a group of people they hold hate for.” In short, rape is a hate crime, motivated by a profound antipathy towards women and targeting that part of her anatomy that makes her female. But rape is also about entitlement and control. If a man feels that he is superior to a woman, then rape is a way of asserting that superiority, of proving to her and to himself that she is the weaker sex.

What happens when the victim doesn’t die? What happens when she doesn’t even act damaged? The date rape survivors who move on with their lives–we are harder to immortalize. We are easier to hate.

Jason calls me to see if I might like to come to Victoria to visit him. With Mindy’s negative impression of him out of the way, I say yes.

“Bring some work to do,” he says. “I have a paper to write that weekend, but I’d really like to see you.”

Jason is a graduate student at the university that I am thinking of applying to for my undergraduate degree. I am attracted to him. I want to see him. I know that I will probably have sex with him.

Saturday morning, I catch the ferry from Tsawwassen to Vancouver Island. It is a grey day. The sky is heavy. I feel nervous, knowing that I am going to the house of someone I do not know very well, but I don’t really worry too much. He is my professor’s nephew after all.

At the ferry terminal, Jason is waiting in a black Tercel. He waves to me, and I throw my bag in the back of his car. We give each other an awkward hug.

“Sorry about being rude to you that night,” I say.

“Yeah,” he says, “that was pretty lame.”

I don’t say anything. I know he’s right. The conversation shifts to innocuous subjects. He is casual, friendly. I feel that I have been forgiven, and notice that he has changed the style of his hair. I also notice that he is older than me, well established in his twenties. His hand, clutching the steering wheel, looks bonier than my own, which is still soft and girlish. The tendons stick out like ropes along his forearm.

Jason lives in the basement suite of a house. Glass doors lead onto a patio. The apartment is nice, sparse but well-lit with only one room, a bed in one corner next to the bathroom and a small screen which separates the bed from the desk. Immediately upon arrival, Jason gets into the shower. I am surprised by this, but I don’t say anything. Instead, I put down my bag and sit on his bed. I remove my hairpins and lay them on the bedside table. I wait.

A few minutes later he gets out of the shower. He comes to me on the bed and removes his towel. He has an erection which is level with my face. I think I laugh. I can’t remember. He then leans over and kisses me, but without tenderness. He is pressing my shoulders down on the bed. My feet are still on the floor, and I feel them lift as his weight settles on me. I am surprised, but I kiss him back. After all, this is why I am here. Then he is fumbling with my jeans. He pulls them down, pulls down my underpants, and thrusts his penis inside me.

“Wait,” I say. I am not ready, he is hurting me.

He says nothing. His eyes look into mine but they are not friendly. He does not try to kiss me again. His eyes are black, opaque, like drops of crude oil.

“Stop,” I say.

“Shut up,” he says. He is holding my hands on the bed, his arms weighted against my arms. I squirm but it only excites him.

He finishes, a short hard grunt. Then he gets up and dresses.

“Do you want to get something to eat?” he asks.

His face is now casual, friendly. I know that something important has happened but I don’t know what to call it.

According to the American Psychological Association, normal responses to sexual abuse include shock, fear and disbelief. However, these are short-term responses and are often replaced by defense mechanisms that have more far-reaching effects. Of the various defense mechanisms that arise in response to trauma, repression and denial are considered two of the worst, since they alter the nature of reality and can lead to maladaptive behaviors. Unlike repression, suppression, the conscious effort not to think about traumatic events, is actually quite adaptive. According to Harvard researcher George Vaillant, suppression is “the defensive style most closely associated with successful adaptation.” Humor is also thought to be one of these more adaptive defenses against trauma, as is sublimation—the use of art, writing, sports or other socially acceptable pursuits to channel the negative energy generated from a traumatic event.

In rape cases where a high-profile figure is the accused, public backlash against the accusers is almost a given. People like me, who watched events unfold in Ghomeshi’s case last year, were fascinated to see how this progressed. First one accusation, the predictable argument, the now-cliché invocation of Fifty Shades of Grey, and finally the shattering of Ghomeshi’s defense by a slew of credible women, all claiming to have been assaulted by him at one point. The backlash against these women was also predictable—why didn’t they come forward sooner? Why not press charges? I’m guessing that most of these women chose to forget about it. They chose to forget about it because it was something they could, more or less, forget about. Was the backlash against these women because they had not come forward, or because they weren’t damaged enough? The expectation that a woman be somehow destroyed by sexual assault—permanently damaged, incapable of moving on with her life—is part of the same cultural attitude that permits rape and sees women as natural victims. And if Jian is allowed to be irrational and mercurial, why can’t the same defense work for those he assaulted?

Objections are made when date-rape is discussed in the same breath as rape’s more violent manifestations, but I think these objections are misplaced. No one is disputing that what happened to Jyoti Singh is worse than what happened to me or to many other women who have been date-raped, just as no one would dispute the distinction between petty theft and armed robbery. However, both are theft, and in the case of date-rape and aggravated sexual assault, both are rape. They follow a similar logic; they are both defended and supported by rape culture.

Sunday morning I leave before dawn and take the bus to the ferry terminal. Jason is still sleeping and I make sure not to wake him. The air is damp and it plays lightly in my hair, which I now wear loose around my shoulders. In September, I will go to the university. I will see Jason around campus. I will chat with him. I will see him with his girlfriend. I store what has happened between us, a kernel for a future mind, an event that is so mysterious and so banal that it becomes archetypal. Or perhaps, an event that is so universal that it needs a symbol, something feminine and ordinary, like an egg or a lost hairpin.

—Claire Russell

*Originally published at Slattern. Republished with permission.

Something Democratic

“There’s something democratic about being the occasional asshole—you make a mistake, you apologize and everyone else breathes easier”—Tony Hoagland, “Dear John,” What Narcissism Means to Me (2003)

Aristotle maintains that you’ll never know if someone is really your friend until the shit hits the fan. So long as you’re fun or useful to them, you just can’t be sure. The friendship’s true colors will come into view only at that moment when you cease to be useful and fun. For instance, I know a charismatic young creep who befriended a professor friend of mine just as long as he needed letters of recommendation and mentoring from him. But as soon as my friend’s usefulness to him was done, he broke off contact and moved to Japan. What’s worse, when my professor friend’s daughter went out for dinner and drinks with this guy in Japan, he proceeded to trash talk her father the entire night. Apparently he had never even liked my friend. Alas, their “friendship” was never more than a matter of convenience for him. Betrayals like this are always sad. But Aristotle insists they’re for the best. Purging your life of false friends is one of those things that’s best done sooner rather than later. True friends stick by each other even when it’s no longer convenient. And it’s good to know who your true friends are.

Nassim Nicholas Taleb, a rather Aristotelian philosopher when it comes to ethical matters, maintains that you can never really know what a friendship is made of until you mess up. This is, I think, a rather important addition to Aristotle’s theory of friendship. Can your friendship survive “the occasional asshole” incident? Is it ruined by it? Does it become stronger? Can your friend accept a heartfelt apology? Can they forgive you? Will they hold a grudge? These are questions of vital importance, questions that will ultimately decide whether or not it’s possible to have a long-term friendship with this person. But alas, these questions can be answered only after someone makes a mistake. As such, perhaps it’s best to get this stuff out of the way as soon as possible. To that end, I suggest that you make a big scene at tomorrow night’s dinner party. Seriously, get wasted and make a total ass out of yourself. Talk about how much you love The Da Vinci Code and Ann Coulter. Be loud and silly. Spill a drink or two. And let the chips fall where they may.

—John Faithful Hamer, The Goldfish (2016)

Wedding Day Rainbow

“Whenever . . . the rainbow appears in the clouds, I will remember my covenant between me and you . . . .”—Genesis 9:14-15

After that horrible Flood, there was an olive leaf, and then a beautiful rainbow: pregnant with the promise of redemption and renewal—which is why we laughed (uncontrollably) on the hotel shuttle bus, when a rainbow appeared in the sky above the entrance to Grounds for Sculpture, a New Jersey sculpture garden.

Because let’s face it, Alanis Morissette was wrong: there’s nothing ironic about rain on your wedding day—especially when you’re getting married outside at six, and it’s raining cats and dogs at five. But the deluge subsided just in time, as if by Divine Intervention, and a magical rainbow appeared—a rainbow that seemed to say that the Flood of tears—and freak-outs and blow-ups and money and time and effort—was all so very worth it.

Because this—this wedding, made out of the hearts, wallets, and imaginations of faulted human beings, was perfect in every detail. From the bride’s dress to the groom’s promise to grab his wife’s ass, each and every day, so long as we both shall live.

(for Lara and Edwin)

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

Down-to-Earth

down-to-earth, adj. As unexceptional as me; devoid of excellence; that which does not make me feel inadequate, insecure, jealous, or envious.

It’s important to communicate effectively: by using examples and language that are readily accessible to those you wish to communicate with. But you don’t have to act like a fucking idiot to connect. That’s pandering, and it’s NOT important. Perhaps it’s the teacher in me . . . or is it the preacher? Regardless, I want to raise you up to my level; I don’t want to stoop down to yours. And I want YOU to raise me up to your level; I don’t want you to stoop down to mine. I want to enlighten you. And I want you to enlighten me. I want to bring out the best in you. And I want you to bring out the best in me. I want to celebrate your excellence, and I want you to be okay with mine.

Why must we debase ourselves to make others comfortable with their own mediocrity? Why do we expect others to debase themselves to make us comfortable with our own mediocrity? Why do we ask this of our politicians, our public figures, even our friends? Previous generations may have been squeamish about sex, but we’re just as squeamish about real human excellence—unless, of course, it involves music or sports. In The Republic, Plato seems to suggest that this is a basic feature of every democratic age. I hope he was wrong.

Democracy needs to learn how to celebrate human excellence with a clean conscience. Baptizing every success story in the holy water of hard work, Malcolm-Gladwell-style, isn’t the solution. As my friend Kaï says, you can’t escape the unfairness of nature by sanctifying hard work because the ability to work very hard may itself be something that’s unequally distributed from Day One. The same is true of the seemingly innate loves and proclivities which draw us to a vocation or avocation in the first place, and make much of the hard work easier than it might otherwise be. If, like me, you wish to wage war against the maddening unfairness of nature, do so by sharing your stuff and caring for the weak and vulnerable.

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

Who Created the Undisciplined Generation?

“It is my contention,” Jerome Rodale once declared, “that the deficient, fragmentized, refined modern diet is at the bottom of much crime today. The brain is not nourished properly. Thus there is confused thinking, and vicious behavior.” Like many social conservatives, Rodale was horrified by the street violence and hippie drug culture that became a regular feature of the news during the 1960s. He thought that America’s youth—the baby-boomer generation—had been “spoiled by a life of affluence” and were “too lazy to work for a ‘straight’ living.” Jerome Rodale was a self-made millionaire, a man who read Horatio Alger novels in his free time. He subscribed to the American Dream without reservation and, for the life of him, simply could not understand why anyone would want to “drop-out” or engage in public protest in the United States. Minority anger and student activism were, for Rodale, essentially pathological in nature. Resentment toward “the system” was really just misdirected hostility stemming from poor dietary habits.

Rodale did not understand why sociologists and criminologists concerned with juvenile delinquency refused to entertain the possibility that America’s horrendous eating habits might have something to do with the growing crime problem. “They scoff,” he grumbled, “at such suggestions.” Yet while they do, “the crime rate seems to be going up and up, until one of these days it won’t be safe for anyone to walk down any side street at night, or perhaps even in the daytime.” In a disapproving nod toward the Moynihan Report, Rodale claimed that experts in government and academia seemed convinced that the juvenile delinquency problem was essentially a cultural problem—that, in sum, “these hoodlums come from broken homes.” “Well I have news for them,” Rodale declared. “This is not so.” “The world,” as Prevention’s John Yates put it, “is a very orderly place, everything follows from something else, and if you abuse your body, you’ve got more of a chance of getting into trouble with the law.”

In answer to the question—Who Created the Undisciplined Generation?—Prevention staff writers professed, in 1971, that “a whole generation was raised in the United States on potato chips, soda, greasy hamburgers and assorted candies and snacks. Such a typical teenage diet, loaded with refined starches and sugars, produces chronic ups and downs in blood sugar levels.” A diet such as this led to “mental confusion, depression, anxiety and abrupt mood changes.” Young people brought up in such a dissolute fashion shunned moral complexity and tended toward extremes; they were attracted to radicalism, outlaw culture, and violence like moths to a flame. Even so, Rodale piously declared: “It is my considered opinion that we can feed our young ones into decency, and even honesty.” Proper nutrition improves the character and can turn a criminal into “a model citizen.” Thus, Rodale maintained that the “best place to solve the problem of juvenile delinquency [was] in the cooking pots of the homes.”

In Natural Health, Sugar and the Criminal Mind (1968), Jerome Rodale went as far as to suggest that the Communist threat, the Kennedy assassination and Hitler’s crimes against humanity might have all, albeit indirectly, been caused by the excessive consumption of refined white sugar. In a particularly creative historical stretch, Rodale claimed that if Ivan the Terrible had “understood the principles of nutrition, the entire course of the Russian monarchy might have been different, the revolution might never have happened, and there might be no Soviet menace today.” Likewise, Rodale contended that Lee Harvey Oswald must have been suffering from hypoglycemia, and that he probably would have refrained from shooting the President of the United States were it not for his abominable addiction to the sweet stuff.

“Adolf Hitler,” argued Rodale, “makes a startling case for the harmful effect of sugar on an individual, for Hitler was a sugar drunkard. This, no doubt, is one of the factors that contributed to his becoming a restless, shouting, trigger-brained, raving maniac.” “There can be no question,” contended Rodale, “that Hitler suffered from low blood sugar, due to an over-consumption of sugar.” “Hitler could never get enough of his favorite whipped-cream cakes. There was always a box of candy near him”—and he quite simply “could not drink wine unless he put sugar in it.” Rodale maintained that all of that sugar caused Hitler “to lose his sense of values.” The same thing, incidentally, happened to Napoleon, who was also “addicted to sugar and pastries.”

Virginia B. Jaspers, the Connecticut nurse made infamous in the 1960s by the murder of three babies under her care, was also, argued Rodale, a sugar junkie who had been rendered mentally defective by an over-consumption of sweet snacks. Jaspers had lost her patience and shaken the infants to death after they refused to take their bottles. Rodale insisted that the root of her maniacal behavior was to be found in her “child’s passion for ice cream and soda pop. She was a sugar addict and had to have a box of candy at her side all the time. All day long she drank sweet carbonated beverages.” Thinking along similar lines, Adelle Davis maintained that sugar consumption was behind the atrocious murder of Sharon Tate. Charles Manson’s “Family” had been, she claimed, subsisting on candy bars for days before they went on their infamous rampage. The diet had, she insisted, driven them mad. “Where the diet is good,” Davis declared, “there is no crime.”

Jerome Rodale could not believe that such a dangerous substance had been granted such wide public acceptance. In 1968, with palpable disgust, he described a family excursion to an ice cream festival sponsored by the local Parent Teacher Association, wherein “more than five hundred persons” gorged themselves “on all kinds of over-refined carbohydrate foods—ice cream, candy, cakes, soda pop, hot dogs and the rest.” Rodale lamented the fact that children learn from their elders, and that the elders at the festival were setting such a terrible example. “Grown men were eating away on the circus type of sugar cotton on a stick—great big colored puffballs which were pure sugar.” If nothing changed and current trends continued, Rodale maintained that by 1988 “marriage and the family” would “be threatened” and half the adult population would consist of hippies.

In 1971, it was already well known that the Governor of California, Ronald Reagan, had presidential ambitions; it was also known that Reagan had quite a sweet tooth, and that he kept a bowl of jellybeans on his desk at all times. Jerome Rodale sympathized with Reagan’s conservative politics and his mistrust of hippies. Even so, he could not support the idea of a Reagan White House because he found the Governor’s penchant for candy profoundly disturbing. Rodale issued public statements to newspapers warning “that a man who eats jelly beans, gum drops and chocolate-covered peanuts would be a poor performer in the presidency.” He did not relish the thought of an emotionally unstable jellybean addict in the White House with his finger next to the button.

—John Faithful Hamer, In Healthy Living We Trust (2016)

The Crusade Against White Bread

If anything has united health-conscious North Americans, it is the belief that white bread is inherently evil. Indeed, they have denounced refined white flour with a consistency that is matched only by the equally steadfast manner in which they have condemned refined white sugar. But this is a well-worn position that dates back to the very moment when the refinement process was patented in the mid-nineteenth century. The newfangled loaves had only just started to appear in American kitchens when health reformer Sylvester Graham set out on his quixotic crusade against white bread. His diatribes against the refinement of flour had all of the righteous indignation of an itinerant preacher’s altar call.

But Graham’s jeremiad was only heard in small circles, and even there he probably found some hard hearts. Americans liked their white bread, for the most part. They liked the look of it. They liked its ethereal fluffiness, and its delicate (some would say nonexistent) flavor. Besides, white bread stayed fresh considerably longer than its virtuous brown predecessor, because the refined flour from which it was made had been emancipated from its more earthbound, perishable parts. For Graham, the impoverished remnants of the denuded wheat berry that went into white bread represented everything that was wrong with the industrialization of the American food supply. He believed that something precious and essential was irretrievably lost during this violent process.

Graham’s sentiments have been recycled and reused by pure-food activists, almost verbatim, for over a century and a half. One Prevention writer described white bread as “pre-sliced absorbent cotton” with the nutritional value of sawdust, whilst another maintained that consuming it was a mortal sin: “Destroying God’s temple takes place when we ingest material that has been bleached, processed, and stripped of all its God-given nutrients.” Adelle Davis went so far as to claim that France was easily overwhelmed by the Nazis in 1940, in part, because of “the enfeebling French passion for white bread.” There was certainty and perhaps some comfort to be found in this stridency. Newly-minted health enthusiasts, still trying to figure out what was required of them, could be absolutely sure of at least one thing: white flour products were expressly forbidden.

Did it then follow that whole wheat bread—made of virginal, unsullied flour—was permitted? Not necessarily. Even the subject of bread could be thorny. Jerome Rodale, for one, stood squarely against the consumption of bread in any form to his dying day. “Bread,” as he so often declared, “has no place in the Prevention System. It is not the staff of life, even though it is whole wheat.” “To me,” he added, “this prescription against bread is one of the most important planks in the Prevention System, and applies to the organically-raised wheat as well as that raised with chemical fertilizers.” Unambiguously clear statements such as these left little room for amendments or innovation.

All the same, as if to keep Prevention’s six million readers on their toes, Robert Rodale successfully performed yet another doctrinal about-face in the late 1970s. On his watch, bread not only made it back on the table, it became a staple, one of the centerpieces of his new and improved, high-carbohydrate version of the Prevention System. Prevention magazine continued to sing the praises of whole-grain bread in the 1980s and for much of the 1990s. Jerome Rodale must have rolled over in his grave!

—John Faithful Hamer, In Healthy Living We Trust (2016)