Category Archives: Human Nature

The Golden Age

“Creating the future is a frightening enterprise, especially when we do it without any awareness of the past. I am amazed how little we actually care to examine past human experience. It’s like hunting in a wood full of bears, ignoring all the disarticulated skeletons of dead hunters, and confidently proclaiming that bears don’t really exist. They belong to the past!”—Joseph Gresham Miller

Do you dream primarily of what is, what once was, what could have been, or what could be? Your answer to this question tells me almost everything I need to know about you. Political conservatives locate their Golden Age somewhere in the not-too-distant past (e.g., the 1950s), whilst religious fundamentalists locate it somewhere in the unsullied early history of their movement (e.g., the Early Church for Pentecostals, the Pious Predecessors for Salafists). Progressives and starry-eyed idealists locate it somewhere in a future purged of the sins of the present, whilst Romantics locate it in a past purged of modernity, a pastoral place that looks a whole lot like The Shire described by J.R.R. Tolkien in The Lord of the Rings. Most environmentalists seem to locate it in some eco-friendly pre-modern past wherein we all lived in happy harmony with sweet Mother Earth. Computer geeks locate it in a shiny future replete with flying cars, robots, and killer apps, whilst defenders of the status quo, apologists of the present like Steven Pinker, insist that we’re living in a Golden Age right now. The outliers, of course, are the pessimists, like Arthur Schopenhauer and St. Augustine, who insist that life in The City of Man has always more or less sucked, and that there has never been, nor will there ever be, a Golden Age.

St. Augustine argues in The City of God that Original Sin has so corrupted human nature and the natural world—with sin, disease, and death—that the reformation of the individual and of society will always, of necessity, have to be a highly circumscribed exercise. All is not possible, insists the Bishop, because the freedom to do good is habitually hemmed in by this-worldly corruption. “The choice of the will,” avers Augustine, “is genuinely free only when it is not subservient to faults and sins.” St. Paul the Apostle likewise believes that decisive victory in the war against sin is not possible in a fallen world; the battle is, instead, fated to rage on and on, even within his body: “I know,” he once lamented, “that in me (that is, in my flesh,) dwelleth no good thing: for to will is present with me; but how to perform that which is good I find not. For the good that I would I do not: but the evil which I would not, that I do” (Romans 7:18-19). Like Paul, Augustine maintains that there are some intractable human problems which the individual and society will have to grapple with again and again, until the end of time. Perfection can be nothing more than a noble goal in The City of Man. Always before us, yet perpetually out of reach. A beacon on the horizon of a fallen world.

—John Faithful Hamer, The Myth of the Fuckbuddy (2017)

Does Money Make You Mean?

“Human nature has a flaw. Under conditions of apparent competition, when a hierarchy of relative winners and losers is created, no matter how, the people at the top tend to fall for something called a self-affirmation fallacy which causes them to attribute their high status to their own merits and qualities, even if they became rich by winning at some gamble which could have gone the other way. Being rich literally makes people change, makes people less sympathetic, less compassionate, less law-abiding, less honest.”—Helga Vierich, Professor of Anthropology, Yellowhead Tribal College (Spruce Grove, Alberta)

After years of being an overweight sweetheart, this guy I knew in high school started working out, lost all of the weight, and eventually looked like Brad Pitt in Fight Club. Before this dramatic transformation, he had plenty of female friends who adored him and confided in him (but alas, never hooked up with him). The girls saw him as a sweet, understanding, empathetic guy. But soon after his manly metamorphosis, he became a repulsive “bro” who used girls with the indifference of a sociopath. And, just so we’re clear, I’m not talking about a garden-variety player. I’m talking about a full-blown misogynistic asshole with the conscience of a turnip! At one point I confronted him about his nasty behavior: “What happened to you? You used to be such a nice guy.” “I’m hot now,” he said, with a sleazy smile, “and you don’t have to be nice when you’re hot.”

That’s when I realized that he was, in fact, always an asshole; he was just really good at hiding it. The power that came with his newfound hotness afforded him the opportunity to behave in ways that accorded with inclinations that were always there. Nassim Nicholas Taleb’s aphorism—“You will never know for sure if someone is an asshole until he becomes rich”—follows the same logic: money doesn’t make people mean, it just allows mean people to be mean. Or, to put it another way, as Taleb once did on his Facebook page, in a clarifying remark: “People reveal their temperament when they have choices.” Paul Piff’s research into the relationship between social class and unethical behavior suggests that Taleb may be wrong about this. In numerous experiments, he has demonstrated that you can turn a completely normal person into a sociopathic jerk. It’s actually quite easy: just give them some power. If Piff is right, then it’s not so much that latent asshole tendencies are brought out by wealth but that wealth (in and of itself) can turn many perfectly normal people into assholes.

—John Faithful Hamer, Blue Notes (2017)

Ethical Followership

A well-functioning society cannot consist merely of leaders. We can’t all be leaders at the same time. Most of us have to be followers most of the time. Yet you won’t see any wealthy suburban kids going to Followership Camp this summer. Nope, they’ll be going to Leadership Camp. Nor will you see any of the same kids enrolling in Followership Programs next semester. Nope, they’ll be enrolling in Leadership Programs. It’s laughable, when you really think about it, and dangerous: because the biggest ethical challenges these young people are likely to face in their lives will be about ethical followership, not ethical leadership.

As sophisticated moral dramas like NCIS make clear, ethical followership is all about balancing the competing claims of equally noble virtues. It’s about knowing when to acknowledge the claims of loyalty and when to listen to the cries of justice; when to follow orders and when to disobey them; when to trust your boss’s judgement and when to question it; when to play by the rules and when to break them; when to cover for your colleagues and when to blow the whistle on them.

Moral dilemmas such as these are resolved easily by none but the single-minded. After all, die-hard supporters and die-hard detractors have at least one thing in common: they’re never forced to make difficult choices. Because it’s easy to say YES all the time or NO all the time. What’s hard is to know when it’s time to say YES and when it’s time to say NO.

—John Faithful Hamer, The Myth of the Fuckbuddy (2017)

Emotional Intelligence and Hissy Fits: The Cultural Ecology of Antifragility


We have all experienced it at times: other people can drive us crazy! We love our families and friends, so why the old saying that fish and house-guests stink after three days? Why can’t we live together peacefully, like elephants? Why aren’t we rational enough to avoid doing things that annoy each other?

Look at the list of things about, um, other people that can grind our gears, drive friends and family wild with frustration, or even drive them apart with resentful anger: recklessness, cruelty, meanness, inconsistency, pranking, deceit, maudlin sentimentality, duplicity, illogical beliefs, gullibility, hubris, sanctimoniousness, jealousy, manipulative wheedling, conniving, and sheer over-the-top emotionality (making “a scene”, being a “drama queen”).

What if I suggested that such things about human behavior are not bugs but features? What if they are all part of the overall adaptation of human nature, part of what turned the adjustments of living in social groups into the building blocks of a whole second replicator?

I suggest that “rationality” and analytical intelligence are evolved traits, with a starring role in shifting our species into a new level of networking and communicating, bumping up the flow of information, and of personnel, within much larger communities and much wider geographical ranges than are characteristic of any other primate. Inter-links between people at several or more degrees of separation meant that networking actually disarticulated the individual from restriction to any local group. I suggest that even territoriality, linked to defensive aggression and such a normal feature of the behavior of many primates, fell under negative selection in hominids at some point in our evolutionary history.

I suggest, furthermore, that dominance hierarchies and ranking systems based on aggression were actively curtailed. They had to be, to permit the evolution of the degree of infant helplessness, and the longer childhoods, that accompanied brain enlargement during human evolution. Sure, humans are capable of violence, especially in groups. But I am suggesting that this was because violently aggressive individuals have always had to be contained and countered by coalitions of the brave and compassionate. Without such opposition from the “good guys” who rally behind heroes, there would never have been sufficient blow-back to keep bullies and killers in line.

We individual humans are, for the most part, the products of a long evolutionary history that has favored compassion and cooperation, but that does not mean we are so uniformly kind and rational that we never lose our tempers, never yearn to get our own way, never wish for the personal luxury of solitude or of having a beautiful object (a bauble or a blanket) all to ourselves.

Now we might ask ourselves, what exactly was the evolutionary environment that gave a thumbs up to hyper-sociability, and a thumbs down to inter-group and intra-group competition and aggression? What possible environment generated higher fitness for individuals whose activity tended to flatten gradients of stress and life expectancy?

My initial insights arose from a field study among a patient and kindly bunch of hunter-gatherers. The Kua were my teachers for three years, and yet, as I left the Kalahari, my dominant sensation was not that I was leaving a group of peaceful and “noble savages”, but rather that this foraging economy produced individuals as ordinary, as flawed, as insightful, wistful, funny, and sometimes as intensely annoying, as any other humans I have ever known. It was merely a different economy, not another way of being human.

I have thought about this over the intervening years. What if our obvious capacity for small deceptions, fractiousness, and occasional surliness actually balances our kindness and sociability not by accident but, as it were, by design? We can hardly ignore these aspects of human interpersonal antics today. Well, what if some kind of continuing see-saw between naughty and nice, convivial and argumentative, politeness interspersed with occasional huffy misunderstandings and temperamental behavior, was precisely the behavioral mechanism that kept these bipedal apes ecologically solvent?

What if, in the long game of playing off individual genetic destinies against benefits to the collective cognitive niche, the occasionally explosive mix of emotional and irrational behavior was the key to generating “antifragile” cultural ecologies that were less likely to over-exploit any given local resource?

Thus, as humans evolved, reflection literally was an after-thought. As irritations and small conflicts increased, even as individuals found themselves holding back from escalating an argument, even as everyone’s impulse control was tested, there was always “the last straw”: an emotional scene that might set everyone packing to leave. And, just as we still often find ourselves doing today, reflection after the event would then supply “good reasons” to justify it.

The fact that this pattern is at least partly learned, and not just an innate drive, made it more flexible still. It permitted more condensed and sedentary organization in richer ecosystems, and more dispersed and mobile organization in poorer ones. Further, as a learned system, it could incorporate tighter social control during the more condensed phases within a cultural repertoire, without sacrificing the overall scope of individual networking.

Even today, people living in more crowded and sedentary communities still tend to establish networks through marriage and friendship, and those of each individual are still variable and rarely identical even among siblings. Furthermore, these networks tend not to be limited to a single community or neighborhood.

Despite the idea of “tribal” tendencies that cause links between people in groups to converge, individual life histories among human beings still tend to create ties (even “weak” ties) to more physically distant relatives, acquaintances, “pen pals”, and “old childhood friends”. Such links tend to be kept up more actively by some individuals, and sociological research into networks has suggested that such people are hubs in terms of information flows between communities. The idea that people across continents are hardly ever more than six links away from everyone else, the “six degrees of separation” model, has been experimentally confirmed many times. It began with a seminal piece of work that appeared in 1961: a doctoral thesis by Michael Gurevitch, entitled “The Social Structure of Acquaintanceship Networks,” presented and accepted by the Department of Economics and Social Science at the Massachusetts Institute of Technology.

This research, and the many studies that followed, suggest that extensive networking is a human adaptation to culture, an aspect of the “social brain”: so perhaps it is not a contingency of any one kind of economic system. It is species-specific, not culture-specific. And we come by it through our evolutionary history as social mammals and, particularly, as social apes.

People appear to activate networks to achieve some consensus about who should undertake leadership roles. Such leadership roles in rituals, in setting up task forces, in dispute resolution, in disciplinary courts, and in safeguarding community assets often went to quiet and modest people who could be trusted not to abuse their positions. Often such responsibilities fell upon older people, especially those who were already hubs within local networks.

A reputation-based system of rank thus imposes a burden of responsibility on the most trusted elders, giving them authority over communal working groups as well as responsibility for convening assemblies to undertake dispute resolution.

Given that such ephemeral institutions for conflict resolution can emerge at times of greater aggregation, it seems that even mobile hunter-gatherers can stick it out despite arguments with neighbors and even intimate betrayal. Thus impulse control and reflective philosophizing over human foibles come into their own in keeping the volatile human primate tractable at trying times. And this is incorporated into even the most mobile forager culture. Networks of family and friends, therefore, can effectively restrain people: no one wants to lose a hard-won reputation for strength of character.

The historical and ethnographic record from hunter-gatherer societies suggests that such roles can disappear and reappear with the seasonal cycles of aggregation and dispersal. The fact that almost all the ethnographic data indicate patterns of aggregation and dispersal of people over the course of an annual pattern of resource use is critical. Mobile hunter-gatherers are not nomadic in the sense of wandering ceaselessly in search of food: on the contrary, they circulate through a variety of locations with known resources.

Arrangements between families to meet at particular localities to camp together are often made during seasonal aggregations, and are always negotiated via networks among friends and relatives. So the times of aggregation could be characterized as a kind of network convergence, pulled toward those particular gregarious and trusted persons who serve as hubs linking many individual networks together. And this temporary integration of networks in a larger gathering, under the leadership of the most trusted and respected persons, affords people the necessary time to negotiate camping parties and permissions with those who hold primary rights to each small local part of the overall territory within the aggregate.

It is conceivable that this flexibility – what Julian Steward called various “levels of integration” above simple “bands” – represents a capacity for organizational complexity not often attributed to foragers. And yes, it does indicate that even mobile foragers have the capacity for political and social organizational arrangements well beyond the scale and scope of the simple camping party.

Recently, David Graeber and David Wengrow suggested that the emergence of such leadership and more complex organization, during hunter-gatherer aggregations, indicates that humans have an innate tendency to develop political hierarchy. Is the term hierarchy the correct one in this case? The term is synonymous with “pecking order” and has often been used to describe the way dominance of one animal over another in a ranked system is related to access to food and solace. It conjures up a flow of authority and even coercion from the individual at the “top”, which controls the movement and opportunities of individuals further down.

Brian Hayden has even suggested that “aggrandizer” personalities make use of these emerging hierarchies during periods of aggregation to seize power over others, partly by persuasion and partly by Machiavellian manipulation of others.

Hayden suggests that these self-promoting persons may have some overlap with the sociopathic traits seen on Hare’s checklist. In other words, when people live in more settled aggregations, they become vulnerable to the self-serving aspirations of a narcissistic and psychopathic minority, who make themselves “Big Men” and assume power over others. On this view, the emergence of the bully gang explains the way hierarchical political power evolved in humans. (1)

One of the difficulties with this interpretation is that it does not always correspond with observed behaviour in people who are diagnosed as psychopaths today (2). Another is that it does not situate the cultural behavior (or the ruthless individual) in terms of the consequences within that particular environment (3). The most striking aspect is, of course, the way both the New Guinea and the NW coastal systems of leadership tend to exhort their communities to produce surpluses. There is an obligation to contribute to a communal store of fish or other food and even material goods, a store managed by a trusted – or haranguing – senior leader. This results in higher overall productivity than is called for by the simple calculus of dependency ratios.

This communal store is risk insurance. Food and other assistance can be secured for families who meet with illness or injury. I would suggest that this is why leadership in a band or tribal system is a function of trust and respect; if leaders merely hoarded or extorted tribute for personal gain, they would not last long.

Such surpluses also fuel a certain level of recurrent ceremonial socializing. Feasts can be planned for, which assemble people from many more surrounding communities. Thus, while a display of generosity towards those in hardship within a community can demonstrate the character of the leader, any display of generosity where a village hosts many of its neighbors during a festival goes well beyond this. It demonstrates the quality of the people of the hosting community. The net effect is that the people in each community are given additional motivation to work harder.

Why is this important? I suggest that such regional festivals also redistribute food across regions where not all harvests are likely to be equal. Each local community is thus less exposed to risks of famine. The community that had the most surplus food in any given year trades this food for higher prestige and simultaneously reduces the chances that hungry neighbors will come to raid.

What happens if the concentrated settlement becomes more permanent: a village? Organizational improvisations can become entrenched institutions, with people developing hereditary rights to leadership roles, especially in adjudicating disputes. Vested interests that resist change can entail internal conflict, which can be resolved by proof of generosity and an earned reputation for diligence. In this case, the famous “potlatch” can also offset conflicts between neighboring communities over access to fixed resources. Political and judicial roles maintain cooperation, restore peace, and offset risks in a sedentary community.

Lineages and “big man” systems, therefore, appear to be risk aversion strategies – aspects of cultural adaptation, not evidence of selection pressures on human genomes causing novel shifts in innate behaviours during the Holocene. Hierarchies of coercion and the self-affirming narcissists are not, as Hayden suggests, products of evolutionary genetic change, but rather, I think, illustrations of the behavioral plasticity of human beings, and of the way people have learned to collectively cope with higher environmental risk.

Meanwhile, we see further cultural reification of emotional sensitivities to behavior causing physical or reputational damage to other persons: this takes the form of legal codes, codes of ethics and human rights, and codes of polite behavior. This always involves symbolic evaluation: labeling behaviors as negative or positive, even as sacred or profane.

However, there is a danger under such circumstances. I doubt that it comes from people who are born psychopaths. What the foragers all seem to have understood only too well was that human “behavioural plasticity” can take a wicked turn: people have a great emotional weakness, the “sin” of pride, more specifically the kind of hubris that comes of being placed somehow above one’s fellows (4). That was the point that Richard Lee was trying to drive home when he wrote “Eating Christmas in the Kalahari”. One old guy’s comment was: “If a man is praised for sharing the meat of his kill, he may come to think he is better (more important) than other people. Someday he might kill someone.”

It has taken years of research to uncover this aspect of our human nature: the fact that the assumption of authority or wealth, even the conformity that prompts a person to surrender their own judgement to a higher authority, can give rise to evil actions that hurt other people. Even in an experimental setting, putting people into roles that permit harm to others somehow turns off empathy and compassion. It seems that even just being richer than others, or higher up the chain of a corporate or civil service ladder, can set in motion the “banality of evil.” This is a human characteristic that goes far beyond normal fractiousness and occasional hissy fits, and it gives rise to far more serious trauma and human tragedy than mere incidents of rage and tears.

The only good thing in this research is that it does not happen to everyone – there are people who see what is happening and fight it. People who say “this is wrong”. Often they are the folks who either stop the experiment or, in real life, resist tyranny and injustice. They risk their lives, or die on the barricades. Human beings do have the capacity to act with heroism. The fact that we have a word for this in every known culture should tell us something.

By the way, the word for “hero” among foragers is often translated incorrectly as “warrior”, since it means one who fights on behalf of others. I have a feeling that the first battles among human beings were fought, in fact, by heroes of this kind. In his book Hierarchy in the Forest, Christopher Boehm suggested that one of the very early developments on the path that led to the evolution of our species was an overthrow of aggression-based dominance hierarchy. This led to an egalitarian revolution led by coalitions of people who resisted bullies and protected the vulnerable. If so, this converted the desirable ideal of adulthood from a self-serving “alpha” into a heroic “first among equals”: the epitome of the trusted leader.

A human being who lives as a hunter-gatherer could thus refuse injustice, could fight for equal treatment, or walk away. Personal faults and foibles, jealousies and temper tantrums were possibly part of a human nature that evolved to create a relatively antifragile economy in which high mobility makes it possible to vote with one’s feet. A hunter-gatherer inhabits an economic system that preserved and even enhanced the stability and diversity of the ecosystem that supported that way of life. A hunter-gatherer cannot be thrown out of their job or lodgings.

But most humans on this planet can be, and frequently are. Entire peoples have had their whole landscape taken out from under them. Look at the Scottish highland clearances. And that was done by their own clan leaders. And the pain of people under such circumstances, and the guts it takes for them to try to remake their lives elsewhere, is heart-breaking. It makes me weep. And we wonder why the world is full of people in a rage, crying out for justice and radicalized, while those who are relatively well-off tend to develop elaborate explanations that affirm their own superiority.


1) Brian Hayden, “Big Man, Big Heart? The Political Role of Aggrandizers in Egalitarian and Transegalitarian Societies”:


Anthropological theories of elites (leaders) in traditional societies tend to focus on how elites can be viewed as helping the community at large. The origin of elites is cast in functionalist or communitarian terms (viewing societies as adaptive systems). A minority opinion argues that elites were not established by communities for the community benefit, but emerged as a result of manipulative strategies used by ambitious, exploitative individuals (aggrandizers). While the communitarian perspective may be appropriate for understanding simple hunter/gatherer communities, I argue that elites in complex hunter/gatherer communities and horticultural communities operate much more in accordance with aggrandizer principles, and that it is their pursuit of aggrandizer self-interests that really explains the initial emergence of elites. This occurs preferentially under conditions of resource abundance and involves a variety of strategies used to manipulate community opinions, values, surplus production, and surplus use.

2) Although Hare does suggest that psychopaths might be more successful within aggressively competitive systems, their comparative rarity even after some five thousand years of hierarchical civilization tends to weaken arguments that such systems are functionally dependent upon the success of a type of personality. It seems more likely to me that the development of stratified societies may have occasionally increased the chances of highborn psychopaths not being spotted and eliminated.

3) See: “Pathways to Power: Principles for Creating Socioeconomic Inequalities”, in Foundations of Social Inequality, edited by T. D. Price and G. Feinman, 1995.

4) See Monbiot on “the Self-affirmation Fallacy”, where he summarizes recent research showing that socio-economic inequality generates precisely the kinds of narcissism that Hayden wishes us to believe is psychopathology expressed in hierarchical leaders. “The findings of the psychologist Daniel Kahneman, winner of a Nobel economics prize, are devastating to the beliefs that financial high-fliers entertain about themselves. He discovered that their apparent success is a cognitive illusion. For example, he studied the results achieved by 25 wealth advisers, across eight years. He found that the consistency of their performance was zero. “The results resembled what you would expect from a dice-rolling contest, not a game of skill.” Those who received the biggest bonuses had simply got lucky.

Such results have been widely replicated. They show that traders and fund managers across Wall Street receive their massive remuneration for doing no better than would a chimpanzee flipping a coin. When Kahneman tried to point this out they blanked him. “The illusion of skill … is deeply ingrained in their culture.”

So much for the financial sector and its super-educated analysts. As for other kinds of business, you tell me. Is your boss possessed of judgment, vision and management skills superior to those of anyone else in the firm, or did he or she get there through bluff, bullshit and bullying?”

In contrast, of course, the operation of networks, which can be sensitive communicators of reputations based on observed ethical and kind behavior, continues to do in these other forms of economic system exactly what it does in hunting and gathering economies.

Being Yourself vs. Being Original

“It is unhealthy, and extremely modern, to worry over one’s originality. The Elizabethan poets used to rewrite each other’s poems to try to improve on them. That was a far superior attitude.”—Aaron Haspel

If a time machine like the one described in David Fiore’s Hypocritic Days (2014) was discovered tomorrow, and I was asked to write a travel brochure for the 21st-century West next week, I’d be sure to mention individualism as one of our era’s big attractions. The freedom to be yourself, do your own thing, choose your own profession, move to a new place, break with tradition, make a new family, be a little weird, have a little privacy: we take these things for granted far too often. Many of our ancestors would kill for what we have. Many of mine died for it.

Many of yours too.

Still, individualism is a human thing, and, like all human things, it’s flawed. And it comes with a cost. Sometimes a hefty cost. So don’t get me wrong: I know full well how much trouble the emancipation of the individual has caused. But I would nevertheless argue that the freedom to be yourself is one of our culture’s greatest accomplishments. It’s well worth fighting for, despite its drawbacks.

At some point, however, in the not-so-distant past, we seem to have collectively forgotten what it is that we were fighting for all along, what it really means to be authentic, what it really means to be yourself—and I think I know why: we’ve confused being yourself with being original.

Recognizing your own ordinariness can be hard when you’ve been raised to believe that originality is a cardinal virtue. But it’s a bitter pill that most of us have to swallow. Because we can’t all be original. Just as there’s a limited amount of beachfront property in the world, there’s a limited number of people who can be first, unique, singular, and truly original (sui generis). To some extent this is a function of the limited number of geniuses in the world. But it’s mostly a function of dumb luck: some people just happen to be the first one to think or do something new. After all, someone has to be first.

If, like Sam in Garden State (2004), you think that to be an individual, to be yourself, you’ve got to “do something that has never, ever been done before . . . throughout human existence,” you’re bound to go through life profoundly disappointed with yourself. Because this is an unrealistic goal, a silly ideal. You’re setting yourself up for failure. It’s time to return to the sensible authenticity proposed by the Roman Stoic Epictetus. In The Art of Living, he maintains that “one of the best ways to elevate your character immediately is to find worthy role models to emulate. . . . Invoke the characteristics of the people you admire most and adopt their manners, speech, and behavior as your own. There is nothing false in this. We all carry the seeds of greatness within us, but we need an image as a point of focus in order that they may sprout.”

Schopenhauer makes a similar point in “On Thinking for Yourself” (1851), wherein he stresses that being the first one to think a particular thought isn’t what’s important; what’s important is that you make a thought your own. What’s important is that this newly discovered idea enter “into the whole system of your thought” as “an integral part, a living member”; “that it stand in complete and firm relation with what you already know; that it is understood with all that underlies it and follows from it; that it wears the color, the precise shade, the distinguishing mark, of your own way of thinking . . . . This is the perfect application of Goethe’s advice to earn our inheritance for ourselves so that we may really possess it: ‘What you have inherited from your fathers, earn over again for yourselves or it will not be yours.’”

It occurs to me now, and only in retrospect, that this is probably the original purpose of that annoying high school injunction: don’t just copy it out, rephrase it in your own words. I always found that exercise tedious and pointless. Drove me nuts. Seemed like a complete and utter waste of time. After all, if Aristotle said it so well, why can’t I just quote him? I remember asking a few of my teachers questions of this stamp. Not once did I receive a good answer. And I strongly suspect that’s because they didn’t have one to give.

But I do. Now. Finally. At 42.

Rephrasing one of, say, Nietzsche’s aphorisms, in your own words, using examples derived from your own lived experience, is in fact a worthwhile exercise. I see that now, at long last. Because to do it, and do it well, you have to truly grasp the idea Nietzsche’s referring to; and if you can truly grasp the idea, it’s yours just as much as it’s Nietzsche’s. This isn’t plagiarism; it’s pedagogy. The ideas I present to my students semester after semester are no more “mine” than the air we breathe in the classroom or the water we drink in the hall. They’re a part of a vast spiritual commons, part of the shared intellectual property of the most fascinating animal ever to walk on God’s Green Earth.

—John Faithful Hamer, From Here (2017)

Pink Elephants with Purple Polka Dots

“Don’t think about a pink elephant with purple polka dots!” My students burst out laughing whenever I say this, with faux-seriousness, because it’s impossible to heed the injunction. The very mention of such a comical creature causes the image of a pink elephant with purple polka dots to spring to mind with a reflexive immediacy that bypasses all rational thought. The same is true of the emotionally-charged categories we use to make sense of what’s happening to us. For instance, whenever I smell freshly-baked bread, I remember the bread my mother made every day from scratch in the early afternoon. I remember the way its intoxicating smell permeated every corner of our little basement apartment on Airlie Street. I remember the way you could smell it in the building, long before you got to our apartment. At times, you could even smell it on the street, long before you got to our building! In short, whenever I smell freshly-baked bread, I’m 7 years old again. Likewise, whenever I burn my tongue, I remember the first time I burned my tongue, when I was 12 years old, on some hot chocolate at Bad Boys, the 24-hour doughnut shop on Wellington Street. When you’re having an emotionally-charged experience, you remember every other experience you’ve had of that kind. It happens instantaneously, automatically—with a reflexive immediacy that bypasses all rational thought. As such, asking your partner, in the middle of an argument, to refrain from bringing up ancient history—that is, bad experiences of a similar stamp—is about as silly as asking them to refrain from thinking about pink elephants with purple polka dots.

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

The Year of Living Homerically

Getting sucked into the insanity of the 2016 election was like getting sucked into an ancient myth. One minute you’re living your life, next minute you’re a character in Homer’s Odyssey. Seriously, I feel like I should write a sequel to A. J. Jacobs’s The Year of Living Biblically (2007) entitled The Year of Living Homerically (2017). Were we not, like Odysseus’s men, turned into swine? Were we not, like Odysseus, bewitched? Did we not lose track of time, trumping till two, night after night? Waking up this past weekend, after a thoroughly unhealthy, year-long obsession with American politics, I felt like disoriented Odysseus, coming to his senses on the Island of Ogygia.

Angry people are incredibly easy to manipulate. Same is true of the self-righteous. The more “political” you become, the more you become a mere pawn in someone else’s chess game. Your ideas are no longer your own. They’re not even your friends’ ideas. They are, instead, prefabricated ideas, manufactured by spin-doctors, mad scientists of the spirit, who understand human nature better than most, and are practiced in the art of deception. These master manipulators understand that the pleasures of politics may be ugly pleasures, but they’re pleasures nonetheless. Anger feels good. Self-righteousness feels good.

But these pleasures come at a cost. Politics erodes your creativity far more than it erodes your humanity. I can’t believe how boring I’ve become. I can’t believe how boring many of my friends have become. Thinking prefabricated ideas all the time is sort of like moving into a prefabricated suburban row house. You get to choose the drapes, what color to paint the walls, little else.

Oh Aristotle, stop snickering in the back row! Yes, yes, yes, I know! Man is indeed the political animal. But it’s equally true that the political too often brings out the animal in the man. And you, Edmund, for God’s sake, save your breath! I know what you’re gonna say: “All that is necessary for the triumph of evil is that good men do nothing.” Of course there’s truth to what you say, much truth. But can you not conceive of a species of evil that’s akin to quicksand? Can you not see why Epicurus admonished his followers to shun politics?

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

Don’t Trust Any Idea Over 30?

Humanities Heuristic: If every book on the syllabus is younger than your mom, drop the class.

When student activist Jack Weinberg declared “Don’t trust anyone over 30”—at the height of the Free Speech Movement at UC Berkeley in the mid-1960s—he was, to some extent, speaking for an entire generation, a generation that had lost faith in the wisdom of their elders, a generation that had concluded that the present had little or nothing to learn from the past. But he was also giving voice to an intuition that flows quite naturally out of cultural currents that predate the baby boomers, such as the theory of the avant-garde, the Whiggish faith in progress, the modernist obsession with all things new—which the philosopher Nassim Nicholas Taleb has aptly dubbed neomania—and the sense, so well articulated by Henry Adams in The Education of Henry Adams (1907), that the modern world constitutes a radical break with history: “in essentials like religion, ethics, philosophy; in history, literature, art; in the concepts of all science, except perhaps mathematics, the American boy of 1854 stood nearer the year 1 than to the year 1900. The education he had received bore little relation to the education he needed. Speaking as an American of 1900, he had as yet no education at all. He knew not even where or how to begin.”

Is this modernist mistrust of the past justified? I used to think so. But lately, not so much. Inventions like the microscope and the telescope have made it possible for scientists in fields like molecular cell biology and particle physics to see things—faraway stars, subatomic particles, and microscopic viruses—which simply couldn’t be seen in the ancient world. As such, the rapidly changing received wisdom in fields which benefit from these amazing technological innovations is easy enough to explain and justify. The rapidly changing received wisdom in the humanities and the social sciences is far less easy to explain and justify. Is there any technological advance which has made it possible for us to “see” things about human nature which would have been “invisible” to thoughtful people in the ancient world? I can’t, for the life of me, seem to think of one. Has modern life, and everything it entails, so fundamentally rewired our brains that human nature is, in the twenty-first century, dramatically different from the human nature which prevailed in, say, the Egypt of the Pharaohs? I doubt it. And this doubt leads me to two troubling questions: If our capacity to “see” human nature hasn’t changed much, and human nature hasn’t changed much, how can we justify and explain the rapidly changing received wisdom in the humanities and the social sciences? What’s more, if little has changed, how can we justify the claim that the present has little or nothing to learn from the past?

—John Faithful Hamer, The Myth of the Fuckbuddy (2016)

Why Pick-Up Artists Should Be Sued For False Advertising

“Roosh is tall and well-built and actually rather good-looking for, you know, a monster.”—Laurie Penny, “I’m With The Banned,” Medium (July 21, 2016)

If you’re hot for a guy who’s an asshole, it’s not because he’s an asshole; it’s probably because he’s hot. This is precisely why Pick-Up Artists aren’t just evil and gross, they’re also guilty of false advertising.

Take, for example, the reigning king of the Pick-Up Artists: Daryush Valizadeh (Roosh V). What a profoundly delusional idiot this guy is! He actually thinks that his sociopathic “skills” are what gets him laid. Of course it’s obvious to any objective outside observer with common sense—indeed, even to hard-core feminists like Laurie Penny who loathe him—that he gets laid a lot because he’s hot. Roosh V is guilty of what Nassim Nicholas Taleb refers to as the Green Lumber Fallacy.

As Taleb makes clear in Antifragile (2012), people who are successful at something are often blissfully unaware of why they’re successful at it. They might think they know why they’re successful, but they’re often dead wrong. He refers to this as the Green Lumber Fallacy, after the trader who made a fortune buying and selling green lumber without knowing what it was. Dude thought green lumber was actually “green” as opposed to freshly cut. Funny, I know. But what’s not funny is watching a homely computer programmer trying to apply Roosh V’s creepy techniques. He fails miserably because the techniques aren’t just morally repugnant; they’re also ineffective.

What is effective? I’ve noticed three discernible trends when it comes to straight guys who get a lot of play: (1) they genuinely like women and/or (2) they’re hot and/or (3) they’re powerful, which is kinda hot. Successful Pick-Up Artists need to realize that they’re getting laid in spite of their douche-y-ness, not because of it. That being said, there’s something to the whole bad boy thing that Roosh V has got going on. Once again, however, it’s not what he thinks. After three games of pool and way too many shots of Jameson, a lesbian friend of mine once said to me: “Took me ten years to realize I didn’t wanna be with a bad boy, I wanted to be a bad boy.” I’ve suspected ever since that this is central to the bad boy’s appeal. What is the bad boy, after all, if not a person who flouts society’s rules? And who’s more oppressed by society’s rules: men or women?

—John Faithful Hamer, The Myth of the Fuckbuddy (2017)

The Much Misunderstood Alpha Male

“The ‘alpha male’ exists most loudly in the fantasy of omega losers, Last Men who dream they are the Overman.”—Joseph Gresham Miller

Pretty much everything you think you know about the alpha male is wrong. Our understanding of who alpha males are, what they do, and what their function is has been seriously revised by the last two decades of ethological research. Two things strike me as especially fascinating:

(1) Alpha males are just as good at following as leading. They’re exceptionally good at working with others and deferring respectfully to the skills and superiority of others. If they’ve got a problem, it’s with horizontal (as opposed to vertical) relationships. In short, real alpha males have outstanding social skills.

(2) Alpha males are basically cops. They keep the peace within the group. And the biggest ongoing threat to social peace in most primate communities isn’t angry young men, by the way; it’s mothers. Yep, mom’s Public Enemy #1. A typical scenario runs something like this: two young primates are playing; one of them gets too rough and hurts the other; hurt playmate starts crying; hurt playmate’s mom freaks out, runs over, and smacks the one who hurt her kid; now that kid starts crying; and his mom freaks out, runs over, and smacks the mom who hurt her kid. Now you’ve got two adult females in a full-blown brawl. Within seconds the grandmothers and all of the aunts jump into the fray.

Fights like this can quickly escalate, splitting a group in two, which is bad for everyone because smaller groups are less able to defend themselves against predators and other groups. Anyhow, that’s where the alpha male comes in. The job of an alpha male is to step in and break up the fight before it escalates. And he never plays favorites, even if one of the moms happens to be his sister.

The alpha male fantasy one finds amongst pick-up artists, and on websites like A Voice for Men, is largely a product of the adolescent male imagination. Real alpha males aren’t self-centered pricks who get to do whatever they want, whenever they want. All to the contrary, real alpha males are often the greatest and most selfless servants of the common good.

—John Faithful Hamer, Blue Notes (2016)