Thoughts on Fallacies


Once upon a time, Margaret Mead was in conversation with James Baldwin about the responsibility they felt for the future of their children. He said: “The world is scarcely habitable for the conscious young… There is a tremendous national, global, moral waste.”

Mead replied: “I know.”


Baldwin went on: “And the question is, how can it be arrested? That’s the enormous question. Look, you and I both are whatever we have become, and whatever happens to us now doesn’t really matter. We’re done. It’s a matter of the curtain coming down eventually. But what should we do about the children? We are responsible; so far as we are responsible at all, our responsibility lies there, toward them. We have to assume that we are responsible for the future of this world.”

Mead eventually said: “then we come to a point where I would say it matters to know where we came from. That it matters to know the long, long road that we’ve come through. And this is the thing that gives me hope we can go further.” [1]

They were discussing racially motivated murders that happened during the Civil Rights movement, and they were discussing war and suffering around the world. Mead’s comment about the importance of knowing “the long, long road that we’ve come through” really jumped out at me, because Mead was an anthropologist.


Her “long road” is, therefore, not merely historical, it is evolutionary. Racism, terrorism, warfare, and genocide are the scourges of history, but are they the scourges of our entire evolutionary past? Do they represent some inevitable and enduring aspect of human nature? There are many people who would affirm that humans have always been hierarchical, xenophobic, and violent; that these are characteristics deeply ingrained in our nature.

To explain the human capacity for tolerance, charity, and gentleness, many scholars refer to the effects of civilization. Thus, Thomas Hobbes, for example, believed that humans in a “state of nature,” or what today we would call hunter-gatherer societies, lived a life that was “solitary, poor, nasty, brutish and short” in which there existed a “war of all against all.” This led him to conclude, as many apologists for states have since, that a stable society required leadership in order to control the rapacious violence that was inherent to human nature. In Capitalism: The Unknown Ideal, Ayn Rand wrote that “Collectivism is the tribal premise of primordial savages who, unable to conceive of individual rights, believed that the tribe is a supreme, omnipotent ruler, that it owns the lives of its members and may sacrifice them whenever it pleases.” Rand advocated industrial capitalism to free humans of such fetters.

Meanwhile others were insisting that the human mind was a blank slate receptive to any social system to which it was exposed.

What a confused and tangled set of misconceptions about human nature! When otherwise educated people have a misconception, they tend not to take kindly to information that contradicts it. Of course, this is because they do not consider this to be a misconception, but rather a received truth. And, in the case of war, genocide, and xenophobia, after many thousands of years of such practices being widespread in those same societies responsible for recorded history, the idea that such behavior arises out of human nature is an understandable position.

So, what do we have to counter it? There are several pieces of information, equally undeniable, that should give pause to even the most stalwart followers of Rand or Hobbes.

Fallacy #1) Collectivism suppresses individuality.

This is clearly a fallacy. The human species is intensely pro-social; all human societies, even capitalism, are collective endeavors. The attribute “collectivism” therefore does not entail suppression of creativity and individuality – even the most “simple” economies feature innovation, as well as conservation of knowledge and technologies. Their values and ideologies tend to channel, not prevent, individualism.

Fallacy #2) That modern civilization decreases mortality due to violence.

This one is still hotly disputed. There are, in human societies, three main causes of deliberate death by conspecifics: a) interpersonal violence, b) lethal social controls, and c) warfare. While the first two categories of violent death appear to occur in most cultures, the final one most definitely does not. Warfare is a rare occurrence between groups in hunter-gatherer economies. This does not appear to be an artifact of recent history: there is very limited evidence of inter-group warfare in the archaeological record from the Pleistocene, when everybody was a hunter-gatherer.

There IS some evidence of interpersonal violence, even cannibalism, but murder and eating people happen in contexts other than war. Even genocidal violence, as when a whole party of men, women, and children is massacred, can happen in other contexts, such as retaliatory vengeance, or fear of disease or of spiritual contamination.

The steady decline of violence in state societies – popularized by Steven Pinker in his book The Better Angels of Our Nature – is not an empirical fallacy, but it is a statistical one. Transforming data on violent death from absolute numbers into percentages of total population tends to produce a picture of declining rates. This is perhaps partly an artifact of the simple fact that population growth, in most agricultural economic systems, has far exceeded the increase in violent deaths for several thousand years now, and has become most clearly exponential in the last hundred years.
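The arithmetic behind this statistical artifact is easy to demonstrate. With entirely invented numbers, the absolute count of violent deaths can rise steadily while the per-capita “rate” falls, simply because the population grows faster:

```python
# Hypothetical illustration only: absolute violent deaths rise,
# yet the per-capita rate falls, because population grows faster.
# All figures below are invented for demonstration.

periods = [
    # (label, population, violent deaths)
    ("era A", 1_000_000, 500),
    ("era B", 10_000_000, 2_000),
    ("era C", 100_000_000, 10_000),
]

for label, pop, deaths in periods:
    rate = deaths / pop * 100_000  # deaths per 100,000 people
    print(f"{label}: {deaths:>6} deaths, {rate:.1f} per 100,000")
```

Here deaths climb from 500 to 10,000, yet the “rate” drops from 50 to 10 per 100,000 – the same body of data telling two opposite stories depending on the denominator chosen.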

Accepting this idea of declining “rates” further implies that there is actually some sort of inevitable rate of general mayhem, murder and violent death baked into human nature. If so, we then must ask what might be the cause of such rates, if indeed they are some inevitable part of the human condition? And how do some populations get stuck in more violent cultures than others?

More pointedly, we might ask ourselves what, if anything, this ubiquitous human irascibility, and occasional lethal violence, has to do with warfare. If we plot mayhem caused by violence, crime, malnutrition, disease, toxic exposure, and poverty, we could play the same statistical game. Indeed, some people have done so. But what evidence do we have that such things as epidemics of disease, and natural disasters resulting in starvation, occur at some regular rate linked to any particular economy?

Here we enter the intellectual territory well trodden by students of animal ecology. Population regulation is well understood in  other species. It appears to be achieved, in most natural wild populations of animals, by density dependent changes: as the numbers approach carrying capacity, deaths due to stress-induced aggression, reproductive failure, and diseases increase – even before signs of malnutrition appear.

The experimental research on rats done years ago, as well as studies of wild rabbit colonies, of wolf packs, of caribou, and of relationships between wild hares and lynx, are interesting in this regard. They show that populations begin to fall long before food supplies run out. In fact it now appears that many of the deaths – even in epidemics – result not from the introduction of novel microbes, but from the stress-induced drop in immunity attendant upon over-crowded populations. Moreover, deaths by violence also increase in many species when they are overcrowded. Hunter-gatherers generally live at lower population densities than people in other economies, and yet at the highest densities, as in huge modern cities, there are quite low rates of violence.
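The textbook expression of density dependence is the logistic growth model, in which per-capita growth shrinks as numbers approach the carrying capacity. A minimal sketch, with purely illustrative parameter values, shows growth stalling as the ceiling nears:

```python
# Minimal sketch of density-dependent (logistic) population growth.
# Discrete update of dN/dt = r * N * (1 - N/K): per-capita growth
# declines toward zero as N approaches the carrying capacity K.
# Parameter values are illustrative, not drawn from any field study.

r, K = 0.5, 1000.0   # intrinsic growth rate, carrying capacity
N = 10.0             # small founding population

for year in range(40):
    N += r * N * (1 - N / K)

print(round(N))  # settles near the carrying capacity K
```

The point of the model is only that the brake is applied by density itself – stress, aggression, reproductive failure – not by outright starvation, which is exactly what the field studies above report.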

There is thus another whole category of causality that has an effect on mortality: “structural” violence. This is down to racism, socio-economic inequality, and discrimination against “deviant” forms of sexuality, minority religious beliefs, or even political ideology (for example, communism has been targeted, as has capitalism). These permit levels of hardship and social rejection that create extreme stress for disadvantaged people, such that their lives are often shortened. Even the life expectancy of their descendants, if they manage to have any, can be reduced. Such structural violence is unknown among mobile hunter-gatherers, and yet has been a feature of state societies, with very few exceptions.

Fallacy #3) Humans are naturally prone to xenophobia.

This is also known as the “in-group vs out-group” or “tribal” tendency. Here we enter another very contentious area.

However, I think it IS a fallacy.

Why? Well, for one thing, preference for, and defense of, known and familiar companions is not the same as hostility to unknown or unfamiliar people. There is no evidence that people, even in “a state of nature,” are inevitably hostile towards strangers – or neighbors. Early accounts of encounters between explorers like Columbus and the native people of the Caribbean, for example, record curiosity, friendly offers to trade, and high levels of hospitality – to the point that Columbus was enthusiastic about the potential enslavement of such innocents. The later hostility that greeted European settlers had as much to do with these early experiences of misunderstanding, and exploitation, as it did with the high-handed attitude of new outsiders who came, clearly, with intent to usurp the lands of the people.

Children do not automatically show fear or dislike of age-mates or playmates based on skin color, dress, accents, or other aspects of superficial appearance. Experiments have shown, however, that assignment of people to “outsider” status does happen very quickly in young children. Rather than an evolved “tribal” tendency, an instinctive xenophobia, this is usually based on teaching: it arises when adults assign inferior moral or intellectual abilities to others – effectively “other-ing” those who are differentiated by appearance, behaviour, or symbolic tags. Jane Elliott’s work showed that children quickly catch on, and start actively being horrible even to former friends and classmates, and do so on the most arbitrary evidence of difference, such as eye colour. All they need is a specific and authoritarian assertion that some tag indicates who is an inferior or wicked person.

Three additional fallacies concern hypotheses about historical trends that interrelate with one another to underpin the myth of progress.

Fallacy #4)  Human life span has been increasing since “the Stone Age”.

This one is very pervasive. In fact, however, it is life expectancy at birth which varies a great deal between cultures, not the age to which people CAN live. Life span appears to be species specific: humans can live about 30 years longer than most great apes; but many decades short of the life span of certain species of trees and tortoises.

Life expectancy, on the other hand, is a feature of death rates at various ages, and thus represents a statistical probability of surviving to various ages. In a cultural ecology with high rates of malnutrition, stress, or infection, life expectancy will be low. This was the case in 17th-century France, where life expectancy for males was under 30, as it was throughout most of human history, and is among some Pygmies in the Congo today.
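The distinction can be made concrete with a toy life-table calculation. Even when the maximum lifespan is identical, high infant mortality drags life expectancy at birth far below it. The mortality schedules below are invented purely for illustration:

```python
# Toy life-table arithmetic: life expectancy at birth is the average
# age at death, weighted by the fraction of the cohort dying at each
# age. The schedules below are invented to illustrate the principle.

def life_expectancy(deaths_by_age):
    """deaths_by_age maps age-at-death -> fraction of cohort."""
    return sum(age * frac for age, frac in deaths_by_age.items())

# Same maximum lifespan (75) in both cases, but very different
# infant and childhood mortality:
pre_industrial = {1: 0.40, 15: 0.10, 45: 0.30, 75: 0.20}
industrial     = {1: 0.02, 15: 0.02, 45: 0.16, 75: 0.80}

print(life_expectancy(pre_industrial))  # about 30
print(life_expectancy(industrial))      # about 68
```

Nobody in the first schedule is biologically incapable of reaching 75; the low average simply reflects how many die in infancy, which is exactly why curbing childhood deaths from microbes produced such a dramatic jump in life expectancy.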


Life expectancy might very well have fallen far lower, even in industrialized economies, had it not been for the invention of vaccines and the discovery of antibiotics. The highest rates of mortality tend to occur at the youngest ages, as immune systems get their training wheels, so prevention of deaths caused by microbes produced a massive jump in life expectancy over the past hundred years. Life expectancy varies with income throughout the industrial world, and tends to be lowest among colonized people, whether they are Scots in the UK, native Canadians, or Aboriginal Australians today.


Fallacy #5) The assertion that all economies, prior to the industrial age, were inadequate in meeting human needs.

The entire colonial program summarized by the unfortunate phrase “White Man’s Burden” as well as the overt racism in Rand’s view of “primitives” stems from this. International food aid programs and the activities undertaken by the Ford and Rockefeller Foundation institutes to spread “green revolution” technologies, were often predicated on the assumption that traditional societies had woefully inadequate systems of farming and animal husbandry.

The idea is still very widespread that this inadequacy is responsible for malnutrition in the “third world”. This is related to the previous point, in that it mistakes the causes of innovation. Rand’s assumption was that things tend to be invented due to individual striving for perfection and are manifestations of genius.

In fact, there is considerable evidence that slash-and-burn horticulture, nomadic pastoralism, and forager economies are adequate, and even produce abundant food at the cost of considerably less arduous labour than was typical of agricultural economies until the mechanization of farming. Certainly these economies featured higher rates of infant and childhood mortality, but so did pre-industrial feudal society. Humanitarian concerns leading to widespread vaccination and health care also caused unprecedented population growth. What tends to be overlooked is the fact that this in turn led to changes in land use, which resulted in malnutrition, local competition over resources, and suffering due to violence, racism, and poverty.

The historically accurate view is that innovations tend to occur to solve problems. Seen thus, the whole industrial era could be seen as a scramble to innovate fast enough to solve all the problems arising from previous innovations!

Not so much progress, as redress, then.


Fallacy #6) The assumption that there is some kind of evolutionary master plan programmed into humans.

The evolutionary trajectory – both physical and economic – of our species is often pictured as “progress”. Thus, cultural “evolution” is tacked onto models of prehistory showing descent of bipedal creatures from tree-dwelling apes, gradual increases in brain size and technological sophistication, and the emergence of anatomically modern humans.


This sometimes creates the impression that the growth of population, and the shifts in economic and organizational complexity, over the last 10,000 years, occurred because of increased cognitive prowess – or “genetic pacification” or “self-domestication”.  This is often presented as the march of evolutionary progress, in human welfare and even in consciousness.


Fallacy #7)  That all human societies tend to be hierarchically organized, resulting from competition, so the strongest males dominate everyone else, and males tend to dominate most females.

This is clearly a fallacy, since most hunter-gatherers tend to have levelling mechanisms that create a relatively egalitarian access to food, shelter, solace, and reproductive opportunities. If anything, what has been proposed for much of the human evolutionary period, is a kind of reversal of dominance, where the strongest individuals actively ensure the welfare of the young and more vulnerable members of their groups.
Socio-economic inequality is not an inevitable outcome of the Neolithic revolution, either.

Fallacy #8) Humans are special snowflakes because God said so.

Can we really posit that humans are that different from other animals? Does the idea that density dependent changes in behaviour occur in humans seem so threatening to modern people, most of whom live in densely populated urban areas… so threatening that we cannot even explore it? Can we also look at how humans function as part of an ecosystem, in fact, often playing an active part for good or ill, in the web of life?

I would like to end by decrying a false dichotomy. This is created when someone presents the human past, evolving within a hunter-gatherer economy, as a lost and peaceful Eden, an “evolutionary environment” that shaped our species and made us ill-suited to the denser aggregations, carbohydrate-rich diets, and fast pace of life in civilization.

There is no real evidence that humans are genetically shaped for the activities of any particular economy.  People only a few generations removed from living as hunter-gatherers take readily to careers in livestock farming, computer science, banking, stand-up comedy, and so on.  Conversely, hunter-gatherer diets even today vary considerably, and many are hardly lacking in carbohydrates from cereal or starchy roots. Indeed, it is because of this that these were among the first domesticated plants.

The point of most contention is always the issue of war – or, as some phrase it “coalitional inter-group lethal violence”.

The relative absence of war among mobile hunter-gatherers is often mistaken for assertions that all such societies, so typical of our evolutionary past, were pacifist paradises occupied by “noble savages”.  Critics, having first erected this straw man, then contest the evidence by pointing to reports of violence and murder in ethnographic reports and archaeological discoveries. They also tend to confuse the issue by mixing in reports from extant or prehistoric sedentary hunter-gatherers, and even from horticultural or pastoral economies.

A few go so far as to publish insinuations that researchers specializing in the study of hunter-gatherers were, at best, suffering from romantic delusions or, at worst, dishonest. The presentation of modern-day hunter-gatherers, as if their economy survived only due to isolation, is of course closely linked to Fallacy #5. If you believe that hunting and gathering was riskier and more arduous than keeping livestock and growing crops, naturally what follows is an assumption that people only need to see these more desirable options and they will then emulate them. Closely linked to this fallacy is the assumption that there is a progressive directionality in economic and cultural change as innovations (like domestication and more substantial housing) are acquired because they “make life easier” or less risky.

What if the truth is stranger? What if sedentary life, food storage, plant and animal domestication, and institutions dedicated to leadership and social control were in fact developed to deal with the repeated failures?  What if the accumulative inventory of creative  solutions sometimes resulted in economic practices even MORE arduous and risky?  Does this destroy anything at all beyond our myth of progress?

I would like to plead for another piece of middle ground. Research among modern-day hunter-gatherers may have overturned Hobbes, but does demolition of such previous negative stereotypes necessarily require that we depreciate either farming or civilization? They are riskier ventures, true, less stable at extreme densities, but they nonetheless bear stunning testimony to the adaptive scope and power of the collective cognitive niche: the fusion of two heritable but very different replicators.



Emotional Intelligence and Hissy Fits: The Cultural Ecology of Antifragility


We all have experienced this at times: other people can drive us crazy! We love our families and friends, so why this old saying: fish and house-guests stink after three days?   Why can’t we live together peacefully, like elephants? Why aren’t we rational enough to avoid doing things that annoy each other?

Look at the list of things about, um, other people that can grind our gears… and even drive friends and family wild with frustration, or apart with resentful anger: recklessness, cruelty, meanness, inconsistency, pranking, deceit, maudlin sentimentality, duplicity, illogical beliefs, gullibility, hubris, sanctimoniousness, jealousy, manipulative wheedling, conniving, and sheer over-the-top emotionality (making “a scene”, being a “drama queen”).

What if I suggested that such things about human behavior are not bugs but features? What if they are all part of the overall adaptation of human nature, that somehow helped turn our adjustments to living in social groups into the building blocks of a whole second replicator?

I suggest that “rationality” and analytical intelligence are evolved traits, with a starring role in shifting our species into a new level of networking and communicating, bumping up the flow of information, and personnel, within much larger communities and much wider geographical ranges than are characteristic of any other primate. Inter-links between people at several degrees of separation meant that individual networking actually disarticulated the individual from restriction to any local group. I suggest that even territoriality, linked to defensive aggression, and such a normal feature of the behavior of many primates, fell under negative selection in hominids at some point in our evolutionary history.

I, furthermore, suggest that dominance hierarchies and ranking systems, based on aggression, were actively curtailed. They had to be, to permit the evolution of the degree of infant helplessness, and the longer childhoods that accompanied brain enlargement during human evolution.   Sure, humans are capable of violence, especially in groups.   But I am suggesting that this was because violently aggressive individuals have always had to be contained and countered by coalitions of the brave and compassionate.   Without such opposition from the “good guys” who rally behind heroes, there would never have been sufficient blow-back to keep bullies and killers in line.

We individual humans are, for the most part, the products of a long evolutionary history that has favored compassion and cooperation, but that does not mean we are uniformly so kind and rational that we never lose our tempers, never yearn to get our own way, never wish for the personal luxury of solitude, having a beautiful object (a bauble or a blanket…!)

Now we might ask ourselves, what exactly was the evolutionary environment that gave a thumbs up to hyper-sociability, and a thumbs down to inter-group and intra-group competition and aggression? What possible environment generated higher fitness for individuals whose activity tended to flatten gradients of stress and life expectancy?

My initial insights in trying to answer this question arose from a field study among a patient and kindly bunch of hunter-gatherers. The Kua were my teachers for three years, and yet, as I left the Kalahari, my dominant sensation was not that I was leaving a group of peaceful and “noble savages”, but rather that this foraging economy produced individuals as ordinary, as flawed, as insightful, wistful, funny, and sometimes as intensely annoying, as any other humans I have ever known. It was merely a different economy, not another way of being human.

I have thought about this over the intervening years. What if our obvious capacity, for small deceptions, fractiousness,  and occasional surliness,  actually balances our kindness and sociability not by accident but, rather, as it were, by design? We can hardly ignore these aspects of human interpersonal antics today… well, what if it was precisely some kind of continuing see-saw between naughty and nice, convivial and argumentative, politeness interspersed with occasional huffy misunderstandings and temperamental behaviour, that was precisely the behavioural mechanism that kept these bipedal apes ecologically solvent?

What if, in the long game of playing off individual genetic destinies against benefits to the collective cognitive niche, the occasionally explosive mix of emotional and irrational behavior was the key to generating “antifragile” cultural ecologies that were less likely to over-exploit any given local resource?

Thus, as humans evolved, reflection literally was an after-thought. As irritations and small conflicts increased, even as individuals found themselves holding back from escalating an argument, even as everyone’s impulse control was tested, there was always “the last straw”: an emotional scene that might set everyone packing to leave.   And, just as we still often find ourselves doing today, reflection after the event will then supply “good reasons” to justify it.

The fact that this pattern is at least partly learned, and not just an innate drive, made it more flexible still. It permitted more condensed and sedentary organization in richer ecosystems, and more dispersed and mobile organization in poorer ones. Further, as a learned system, it could incorporate tighter social control during the more condensed phases of a cultural repertoire or an annual round of economic activity, without sacrificing the overall scope of individual networking.

People today, when living in more crowded and sedentary communities, still tend to establish networks through marriage and friendship, and those of each individual are still variable and rarely identical, even among siblings. Furthermore, these tend not to be limited to a single community or neighbourhood; indeed, many individuals have maintained networks spanning the globe.

Despite the idea of “tribal” tendencies that cause links between people in groups to converge, individual life histories among human beings still tend to create ties (even “weak” ties) to more physically distant relatives, acquaintances, “pen pals”, and “old childhood friends”. Such links tend to be kept up more actively by some individuals. Sociological research into networks has suggested that such people are hubs in terms of information flows between communities. The idea that people across continents are hardly ever more than six links away from everyone else – the “six degrees of separation” model – has been experimentally confirmed many times. It began with the appearance, in 1961, of a seminal piece of work, in the form of a doctoral thesis by Michael Gurevitch, entitled “The social structure of acquaintanceship networks”. This was presented to, and accepted by, the Department of Economics and Social Science at the Massachusetts Institute of Technology.
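The role such “hub” individuals play in linking communities can be sketched with a toy network: two clusters of acquaintances whose only connection runs through one gregarious person. The names and ties below are, of course, entirely invented.

```python
# Toy illustration of a network "hub": two clusters of acquaintances
# bridged by a single well-connected individual. All names and ties
# are invented for demonstration.
from collections import deque

ties = {
    "Ann": ["Bea", "Cal", "Hub"],
    "Bea": ["Ann", "Cal"],
    "Cal": ["Ann", "Bea"],
    "Hub": ["Ann", "Dee"],          # the hub bridges both clusters
    "Dee": ["Hub", "Eli", "Fay"],
    "Eli": ["Dee", "Fay"],
    "Fay": ["Dee", "Eli"],
}

def degrees_of_separation(start, goal):
    """Breadth-first search for the shortest chain of acquaintances."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, steps = queue.popleft()
        if person == goal:
            return steps
        for friend in ties[person]:
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, steps + 1))
    return None  # unreachable

print(degrees_of_separation("Bea", "Fay"))  # Bea-Ann-Hub-Dee-Fay -> 4
```

Every path between the two clusters passes through “Hub”: remove that one person and the communities fall out of contact, which is why such individuals matter so much to information flow.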

This research, and the many studies that followed, suggest that extensive networking is a human adaptation to culture, an aspect of the “social brain”: so perhaps it is not a contingency of any one kind of economic system. It is species specific, not culture specific. And we come by it through our evolutionary history as social mammals, and particularly, as social apes.

People appear to activate networks to achieve some consensus about who should undertake leadership roles.   In small scale subsistence economies, such leadership roles – in rituals, in setting up task forces, in dispute resolution, and in disciplinary courts, and in safeguarding community assets – often go to quiet and modest people that can be trusted not to abuse their positions. Often such responsibilities fall upon older people, especially those who are already hubs within local networks.

A reputation-based system of rank, thus, imposes a burden of responsibility on the most trusted elders, so they have authority over communal working groups, as well as for the convening of assemblies to undertake dispute resolution.

Even mobile hunter-gatherers can stick it out despite arguments with neighbors and even intimate betrayal, especially at times of greater aggregation, given that such ephemeral institutions for conflict resolution emerge at such times.  The rest of the year,  impulse control and reflective philosophizing over human foibles comes into its own.  And this is incorporated into even the most mobile forager culture. Networks of family and friends, therefore, can effectively restrain people: no one wants to lose a hard-won reputation for strength of character.

That the historical and ethnographic record from hunter-gatherer societies suggests that such roles can disappear and reappear with the seasonal cycles of aggregation and dispersal is critical. Mobile hunter-gatherers are not nomadic in the sense of wandering ceaselessly in search of food: on the contrary, they circulate through a variety of locations with known resources.

Arrangements between families to meet at particular localities to camp together are often made during seasonal aggregations, and are always negotiated via networks among friends and relatives. So the times of aggregation could be characterized as a kind of network convergence, pulled toward those particular gregarious and trusted persons who serve as hubs linking many individual networks together. And this temporary integration of networks in a larger gathering, under leadership of the most trusted and respected persons, affords people the necessary time to negotiate camping parties and permissions with those who hold primary rights to each small local part of the overall territory within the aggregate.

It is conceivable that this flexibility – what Julian Steward called various “levels of integration” above simple “bands” – represents a capacity for organizational complexity not often attributed to foragers. And yes, it does indicate that even mobile foragers have the capacity for political and social organizational arrangements well beyond the scale and scope of the simple camping party.

Recently, David Graeber and David Wengrow suggested that the emergence of such leadership and more complex organization, during hunter-gatherer aggregations, indicates that humans have an innate tendency to develop political hierarchy. Is the term hierarchy the correct one in this case?   The term is synonymous with “pecking order” and has often been used to describe the way dominance of one animal over another in a ranked system is related to access to food and solace.   It conjures up a flow of authority and even coercion from the individual at the “top” which controls the movement and opportunities of individuals further down.

Brian Hayden has even suggested that “aggrandizer” personalities make use of these emerging hierarchies during periods of aggregation to seize power over others, partly by persuasion and partly by Machiavellian manipulation of others.

Hayden suggests that these self-promoting persons may have some overlap with the sociopathic traits seen on Hare’s checklist. In other words, when people live in more settled aggregations, they become vulnerable to the self-serving aspirations of a narcissistic and psychopathic minority, who make themselves “big Men” and assume power over others. In other words, the emergence of the bully gang explains the way hierarchical political power evolved in humans. (1)

One of the difficulties with this interpretation is that it does not always correspond with observed behaviour in people who are diagnosed as psychopaths today (2).  Another is that it does not situate the cultural behavior (or the ruthless individual) in terms of the consequences within that particular environment (3).  The most striking aspect is, of course, the way both the New Guinea and the NW coastal systems of leadership tend to exhort their communities to produce surpluses.   There is an obligation to contribute to a communal store of fish or other food and even material goods, a store managed by a trusted – and haranguing – senior leader. This results in higher overall productivity than is called for by the simple calculus of dependency ratios.

This communal store is risk insurance. Food and other assistance can be secured for families who meet with illness or injury. I would suggest that is why leadership in a band or tribal system is a function of trust and respect; if leaders merely hoarded or extorted tribute for personal gain, they would not last long.

Such surpluses also fuel a certain level of recurrent ceremonial socializing. Feasts can be planned which assemble people from many more surrounding communities. Thus, while a display of generosity towards those in hardship within a community can demonstrate the character of the leader, any display of generosity where a village hosts many of its neighbors during a festival goes well beyond this. It demonstrates the quality of the people of the hosting community. The net effect is that the people in each community are given additional motivation to work harder.

Why is this important? I suggest that such regional festivals also redistribute food across regions where not all harvests are likely to be equal. Each local community is thus less exposed to risks of famine. The community with the most surplus food in any given year trades this food for higher prestige, and simultaneously reduces the chances that hungry neighbors will come to raid.
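The variance-pooling logic behind such feasting can be illustrated with a toy Monte Carlo sketch. All the numbers here are hypothetical, chosen only for illustration: ten communities, each with an independent one-in-five chance of a failed harvest in a given year, and a pooling region assumed to suffer famine only when more than half of its harvests fail.

```python
import random

random.seed(42)

N_COMMUNITIES = 10   # hypothetical number of neighbouring communities
P_FAIL = 0.2         # chance any one community's harvest fails in a year
YEARS = 100_000      # number of simulated years

famine_alone = 0     # a lone community starves whenever its own harvest fails
famine_pooled = 0    # a pooling region starves only if most harvests fail at once

for _ in range(YEARS):
    failures = sum(random.random() < P_FAIL for _ in range(N_COMMUNITIES))
    if random.random() < P_FAIL:
        famine_alone += 1
    if failures > N_COMMUNITIES // 2:
        famine_pooled += 1

print(f"famine risk alone:  {famine_alone / YEARS:.3f}")
print(f"famine risk pooled: {famine_pooled / YEARS:.3f}")
```

With these assumptions, the isolated community faces famine roughly one year in five, while the pooling region almost never does. The point is only that sharing across harvests that fail independently drastically cuts the risk any single community bears.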

What happens if the concentrated settlement becomes more permanent: a village? Organizational improvisations can become entrenched institutions, with people developing hereditary rights to leadership roles – especially in adjudicating disputes. Vested interests that resist change can generate internal conflict, which can be resolved by proof of generosity and an earned reputation for diligence. In this case, the famous “potlatch” can also offset conflicts between neighboring communities over access to fixed resources. Political and judicial roles maintain cooperation, restore peace, and offset risks in a sedentary community.

Lineages and “big man” systems, therefore, appear to be risk-aversion strategies – aspects of cultural adaptation, not evidence of selection pressures on human genomes causing novel shifts in innate behaviours during the Holocene. Hierarchies of coercion and self-affirming narcissists are not, as Hayden suggests, products of evolutionary genetic change but rather, I think, illustrations of the behavioral plasticity of human beings, and of the way people have learned to collectively cope with higher environmental risk.

Meanwhile, we see further cultural reification of emotional sensitivities to behavior causing physical or reputational damage to other persons. This takes the form of legal codes, ethics, human rights, and codes of polite behavior. It always involves symbolic evaluation: labeling behaviors as negative or positive, and even as sacred or profane.

However, the danger under such circumstances comes not from people who are born psychopaths but from brain changes caused by power. What foragers all seem to have understood only too well is that human “behavioural plasticity” can take a wicked turn: people have a great emotional weakness – the “sin” of pride, more specifically the kind of hubris that comes of being placed somehow above one’s fellows (4). That was the point Richard Lee was trying to drive home when he wrote “Eating Christmas in the Kalahari”. One old guy’s comment was: “If a man is praised for sharing the meat of his kill, he may come to think he is better (more important) than other people. Someday he might kill someone.”

It has taken years of research to uncover this aspect of our human nature: the fact that the assumption of authority or wealth, even the conformity that prompts a person to surrender their own judgement to a higher authority, can give rise to evil actions that hurt other people. Even in an experimental setting, putting people into roles that permit harm to others somehow turns off empathy and compassion. It seems that even just being richer than others, or higher up a corporate or civil-service ladder, can set in motion the “banality of evil.” This is a human characteristic far beyond normal fractiousness and occasional hissy fits, and it gives rise to far more serious trauma and human tragedy than mere incidents of rage and tears.

The only good thing this research discovered is that it does not happen to everyone – there are people who see what is happening and fight it. People who say “this is wrong”. Often they are the folks who either stop the experiment or, in real life, resist tyranny and injustice. They risk their lives – or die on the barricades. Human beings do have the capacity to act with heroism. The fact that we have a word for this in every known culture should tell us something.

By the way, the word for “hero” among foragers is often mistranslated as “warrior”, since it actually means one who fights on behalf of others. I have a feeling that the first battles among human beings were fought, in fact, by heroes of this kind. In his book Hierarchy in the Forest, Christopher Boehm suggested that one of the very early developments on the path that led to the evolution of our species was the overthrow of the aggression-based dominance hierarchy. This was an egalitarian revolution, led by coalitions of people who resisted bullies and protected the vulnerable. If so, it converted the desirable ideal of adulthood from a self-serving “alpha” into a heroic “first among equals” – the epitome of the trusted leader.

A human being who lives as a hunter-gatherer could thus refuse injustice, could fight for equal treatment – or walk away. Personal faults and foibles, jealousies and temper tantrums could be absorbed, because high mobility made it possible to vote with one’s feet, creating a relatively antifragile economy. A hunter-gatherer inhabited an economic system that preserved and even enhanced the stability and diversity of the ecosystem supporting that way of life. A hunter-gatherer cannot be thrown out of their job or lodgings.

But most humans on this planet can be, and frequently are. Entire peoples have had their whole landscape taken out from under them. Look at the Scottish Highland Clearances – and that was done by their own clan leaders. The pain of people under such circumstances, and the guts it takes for them to try to remake their lives elsewhere, is heart-breaking. It makes me weep. And we wonder why the world is full of people in a rage, crying out for justice and radicalized, while those who are relatively well-off tend to develop elaborate explanations that affirm their own superiority.


1) Brian Hayden, “Big Man, Big Heart? The Political Role of Aggrandizers in Egalitarian and Transegalitarian Societies”


Anthropological theories of elites (leaders) in traditional societies tend to focus on how elites can be viewed as helping the community at large. The origin of elites is cast in functionalist or communitarian terms (viewing societies as adaptive systems). A minority opinion argues that elites were not established by communities for the community benefit, but emerged as a result of manipulative strategies used by ambitious, exploitative individuals (aggrandizers). While the communitarian perspective may be appropriate for understanding simple hunter/gatherer communities, I argue that elites in complex hunter/gatherer communities and horticultural communities operate much more in accordance with aggrandizer principles, and that it is their pursuit of aggrandizer self-interests that really explains the initial emergence of elites. This occurs preferentially under conditions of resource abundance and involves a variety of strategies used to manipulate community opinions, values, surplus production, and surplus use.

2) Although Hare does suggest that psychopaths might be more successful within aggressively competitive systems, their comparative rarity even after some five thousand years of hierarchical civilization tends to weaken arguments that such systems are functionally dependent upon the success of a type of personality. It seems more likely to me that the development of stratified societies may have occasionally increased the chances of highborn psychopaths not being spotted and eliminated.

3) See “Pathways to Power: Principles for Creating Socioeconomic Inequalities” in Foundations of Social Inequality, edited by T. D. Price and G. Feinman, 1995.

4) See Monbiot on “the Self-attribution Fallacy”, where he summarizes recent research showing that socio-economic inequality generates precisely the kind of narcissism that Hayden would have us believe is psychopathology expressed in hierarchical leaders. “The findings of the psychologist Daniel Kahneman, winner of a Nobel economics prize, are devastating to the beliefs that financial high-fliers entertain about themselves. He discovered that their apparent success is a cognitive illusion. For example, he studied the results achieved by 25 wealth advisers, across eight years. He found that the consistency of their performance was zero. “The results resembled what you would expect from a dice-rolling contest, not a game of skill.” Those who received the biggest bonuses had simply got lucky.

Such results have been widely replicated. They show that traders and fund managers across Wall Street receive their massive remuneration for doing no better than would a chimpanzee flipping a coin. When Kahneman tried to point this out they blanked him. “The illusion of skill … is deeply ingrained in their culture.”

So much for the financial sector and its super-educated analysts. As for other kinds of business, you tell me. Is your boss possessed of judgment, vision and management skills superior to those of anyone else in the firm, or did he or she get there through bluff, bullshit and bullying?”

In contrast, of course, the operation of networks – which can be sensitive communicators of reputations based on observed ethical and kind behavior – continues to do, in these other forms of economic system, exactly what it does in hunting and gathering economies.

Are Humans Innately Warlike?

A book by Steven LeBlanc, anthropologist, has me in a kind of outraged shock. It seems that he has fallen for the view that humans are naturally violent, aggressive, deceitful, manipulative. Machiavellian, in fact. Here is his article, the text of which I have included below, along with my response:

“Not only are human societies never alone, but regardless of how well they control their own population or act ecologically, they cannot control their neighbors’ behavior. Each society must confront the real possibility that its neighbors will not live in ecological balance but will grow its numbers and attempt to take the resources from nearby groups. Not only have societies always lived in a changing environment, but they always have neighbors. The best way to survive in such a milieu is not to live in ecological balance with slow growth, but to grow rapidly and be able to fend off competitors as well as take resources from others.

“To see how this most human dynamic works, imagine an extremely simple world with only two societies and no unoccupied land. Under normal conditions, neither group would have much motivation to take resources from the other. People may be somewhat hungry, but not hungry enough to risk getting killed in order to eat a little better. A few members of either group may die indirectly from food shortages—via disease or infant mortality, for example—but from an individual’s perspective, he or she is much more likely to be killed trying to take food from the neighbors than from the usual provisioning shortfalls. Such a constant world would never last for long. Populations would grow and human activity would degrade the land or resources, reducing their abundance.

“Even if, by sheer luck, all things remained equal, it must be remembered that the climate would never be constant: Times of food stress occur because of changes in the weather, especially over the course of several generations. When a very bad year or series of years occurs, the willingness to risk a fight increases because the likelihood of starving goes up.

“If one group is much bigger, better organized, or has better fighters among its members and the group faces starvation, the motivation to take over the territory of its neighbor is high, because it is very likely to succeed. Since human groups are never identical, there will always be some groups for whom warfare as a solution is a rational choice in any food crisis, because they are likely to succeed in getting more resources by warring on their neighbors.

“Now comes the most important part of this overly simplified story: The group with the larger population always has an advantage in any competition over resources, whatever those resources may be. Over the course of human history, one side rarely has better weapons or tactics for any length of time, and most such warfare between smaller societies is attritional. With equal skills and weapons, each side would be expected to kill an equal number of its opponents. Over time, the larger group will finally overwhelm the smaller one. This advantage of size is well recognized by humans all over the world, and they go to great lengths to keep their numbers comparable to their potential enemies.

“This is observed anthropologically by the universal desire to have many allies, and the common tactic of smaller groups inviting other societies to join them, even in times of food stress.

“Assume for a moment that by some miracle one of our two groups is full of farsighted, ecological geniuses. They are able to keep their population in check and, moreover, keep it far enough below the carrying capacity that minor changes in the weather, or even longer-term changes in the climate, do not result in food stress. If they need to consume only half of what is available each year, even if there is a terrible year, this group will probably come through the hardship just fine. More important, when a few good years come along, these masterfully ecological people will /not/ grow rapidly, because to do so would mean that they would have trouble when the good times end. Think of them as the ecological equivalent of the industrious ants.

“The second group, on the other hand, is just the opposite—it consists of ecological dimwits. They have no wonderful processes available to control their population. They are forever on the edge of the carrying capacity, they reproduce with abandon, and they frequently suffer food shortages and the inevitable consequences. Think of this bunch as the ecological equivalent of the carefree grasshoppers. When the good years come, they have more children and grow their population rapidly. Twenty years later, they have doubled their numbers and quickly run out of food at the first minor change in the weather. Of course, had this been a group of “noble savages” who eschewed warfare, they would have starved to death and only a much smaller and more sustainable group survived.

“This is not a bunch of noble savages; these are ecological dimwits and they attack their good neighbors in order to save their own skins. Since they now outnumber their good neighbors two to one, the dimwits prevail after heavy attrition on both sides. The “good” ants turn out to be dead ants, and the “bad” grasshoppers inherit the earth.

“The moral of this tale is that if any group can get itself into ecological balance and stabilize its population even in the face of environmental change, it will be tremendously disadvantaged against societies that do not behave that way. The long-term successful society, in a world with many different societies, will be the one that grows when it can and fights when it runs out of resources. It is useless to live an ecologically sustainable existence in the “Garden of Eden” unless the neighbors do so as well. Only one non-conservationist society in an entire region can begin a process of conflict and expansion by the “grasshoppers” at the expense of the Eden-dwelling “ants.”

“This smacks of a Darwinian competition—survival of the fittest—between societies. Note that the “fittest” of our two groups was not the more ecological, it was the one that grew faster. The idea of such Darwinian competition is unpalatable to many, especially when the “bad” folks appear to be the winners.”
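LeBlanc’s size-advantage argument – equal skill and weapons, equal casualties per engagement, so the larger group eventually prevails – can be sketched as a crude attrition model. The numbers below are purely illustrative, not taken from his book:

```python
def attrition(side_a: int, side_b: int, losses_per_round: int = 10) -> tuple[int, int]:
    """Both sides lose the same number per round; fight until one side is gone."""
    while side_a > 0 and side_b > 0:
        side_a -= losses_per_round
        side_b -= losses_per_round
    return max(side_a, 0), max(side_b, 0)

# A group of 200 versus a group of 100, equally matched in skill:
print(attrition(200, 100))  # the larger side is left standing: (100, 0)
```

Under these assumptions the smaller side is annihilated while the larger one keeps half its numbers – which is exactly why, in LeBlanc’s telling, groups “go to great lengths to keep their numbers comparable to their potential enemies.”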

Helga’s response:

My first objection to LeBlanc’s scenario is twofold: first, that human populations do not always grow; second, that their birthrates (let alone their whole cultural systems) are not necessarily under conscious control.

Think about it.

INTENTIONAL ecological balance? What, now humans are in some kind of intentional control over their cultural systems? Surely no one could be that naive? One might try to create such control with careful permaculture systems under strictly controlled laboratory conditions – but there always seem to be elements of chaos that intervene, some of which are social, some microbial, and some just oversights of reality.

No, truly, such things could hardly have evolved. Why would they? For most of our evolutionary history, humans were foragers. Among mobile foragers on a diet of wild plants and animals, the mechanisms of birth spacing, infant mortality, accidental death, periodic disease, and predation would have balanced the population without any thought being required. And this would have been the case during 99% of human evolution.

The only time thought was required was when too many kids started being born too closely spaced, and enough of them survived to accelerate the doubling time to the point where local game and wild plant foods became scarce. This might have happened, once in a while, to sedentary foraging peoples based on fixed resources like annual fish spawning runs or huge stands of wild grain, but it would hardly have been typical of most mobile forager groups.

However, when, throughout a culture area, reciprocal access to resources was no longer a viable strategy for long-term survival, then the scenario so skillfully imagined by Steve DOES obtain. THEN the whole game has to change to the nastier one, where you simply went over to your neighbors and took their food away (if they had any), and killed them all so that next year they would not do the same to you.

This is of course a pretty awful but effective survival strategy. There are plenty of indications that it became increasingly common during the Mesolithic period, just before food production systems got underway (another adaptation to resource scarcity and local plant depletion). In fact, the evidence for this pattern is so overwhelming for this period, and so common among contemporary horticultural, pastoral, and agricultural cultures, that it was the subject of a well-researched book by Lawrence Keeley. Reading this, it is fairly easy to forget that there is also a case to be made that it was an adaptation first seen among the small fraction of humanity who got stuck in a demographic trap.

Which means it is within the human range of possible responses to high population. It is a behavioural algorithm that requires a trigger. That trigger, it appears, was usually an upward shift in the population-to-resource ratio over a large culture area – a shift that precluded options based on reciprocal access (redistributive feasting, trade, and migration) and made raiding and warfare into an adaptive strategy for keeping that ratio from climbing much higher.

Just because the resort to inter-group violence is within the range of human behavior does not, however, make it a likely part of our evolutionary environment of adaptation. The scientific evidence, both archaeological and ethnographic, does not support such a conclusion. The Mesolithic began only, at most, 12-15,000 years ago, and it did not begin then for all humanity, but only for a TINY proportion of the world’s human population. Most humans were still foragers until well into the last three thousand years; indeed, in Australia, much of North America, and sub-equatorial Africa, most were still foragers until 150 years ago.

Steve LeBlanc seems to assume that population growth rates are under conscious control. There is no real evidence that this is true of most human cultures. There is some evidence that individuals and family groups might decide to kill or abort the occasional child for various reasons, but no evidence that anyone fully understood the relationship between breastfeeding, prolonged weaning, hormonal cascades affecting ovulation, and the profound effects on this system of high-calorie weaning foods.

Among many mobile hunter-gatherers, birth spacing is much longer than among farming or pastoral people because of breast-feeding that continues well into the third year of a child’s life. Regular stimulation of the mother’s nipples by suckling causes a cascade of hormonal responses that tends to prevent ovulation – as long as regular breast-feeding frequency is sustained throughout the 24-hour cycle (every 2-3 hours). As long as the infant sleeps with its mother, breastfeeding can continue throughout the night without much disturbing the parent’s sleep. Among hunter-gatherers, where high-calorie weaning foods such as cereals and animal milk are not available, this continuing lactation gives the child’s gut time to grow large enough to handle enough fruit, vegetables, and meat to complete the weaning process during the fourth year of life.

Steve LeBlanc does not go into any of this. He ASSUMES that a rate of population growth similar to that of a modern farming community was true of Paleolithic hunter-gatherers. Many archaeologists do. However, we have lots of evidence that mobile foragers did NOT have this level of population growth. And, while it is known that foragers have variable birth spacing depending upon diet and activity levels, no past forager culture had viable alternatives to maternal lactation.

I would suggest that there was a fairly rapid shortening of the birth-spacing interval – from an average of 48 months to about 24 months – with the onset of sedentary villages around stores of food cached for long periods (like dried fish, cereal grains, potatoes, etc.). Generally, these stored foods provided high-calorie weaning foods of a kind that mobile hunter-gatherers did not have on hand very often. So then, since their infants did not continue to suckle as frequently, mothers got pregnant sooner than they would have under the old forager system.
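The demographic force of that shift can be put in rough numbers. This is a back-of-the-envelope sketch: the ~18-year reproductive span and the 50% survivorship figure are my illustrative assumptions, not data from the ethnographic record.

```python
# Rough arithmetic for the birth-spacing argument (illustrative assumptions:
# an ~18-year reproductive span per mother, and survivorship such that half
# of all children born go on to reproduce).
REPRODUCTIVE_SPAN_MONTHS = 18 * 12
SURVIVING_FRACTION = 0.5

def daughters_per_mother(spacing_months: int) -> float:
    """Surviving daughters per mother: births over the reproductive span,
    halved for sons, then reduced by child mortality."""
    births = REPRODUCTIVE_SPAN_MONTHS / spacing_months
    return births * 0.5 * SURVIVING_FRACTION

forager = daughters_per_mother(48)   # ~48-month spacing
villager = daughters_per_mother(24)  # ~24-month spacing

print(f"forager:  {forager:.2f} surviving daughters per mother")
print(f"villager: {villager:.2f} surviving daughters per mother")
```

On these assumptions a 48-month spacing hovers near replacement (about 1.1 surviving daughters per mother), while a 24-month spacing yields over two – enough to roughly double the population each generation, which is the demographic trap in a nutshell.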

This means that LeBlanc’s book is not about humans during the first 99% of their history, not about how evolutionary forces shaped human nature. No, it is about the demographic trap that happened during the Mesolithic, which led to war, starvation, rich and poor, the domestication of animals and plants, and eventually, civilization. It does not describe in detail just how the process of settling into more permanent villages, around food storage facilities holding millions of calories (of cereals, dried vegetation, meat, and fish), led to a demographic trap that no one could have foreseen and resulted in a population explosion. But that is what would have had to happen BEFORE anything unfolded into the sort of scenario that Steve LeBlanc discusses.

HOWEVER, you have to remember: this scenario is rarely true of human inter-group relations during the earlier period – in a world of foragers, things would be a bit different.

SO, no, we are not the dazed survivors of millions of years of little territorial groups who survived because we frequently went out and beat the shit out of each other and stole each other’s lands and females. Please. We evolved to be smarter and considerably more nuanced in our inter-group behavior than such a chimp-based model would suggest.

We would hardly need all those inhibitory brain connections leading out of the prefrontal cortex into the old brain. Now there is an algorithm-generating module with an interesting agenda: it is the CEO of the final actions taken by the system, unless overridden by high emotion, fear, or “orders” from some political hierarchy… and it also permits humans to “stand back” mentally and evaluate impulses initiated by both rational and intuitive parts of the brain.

I suspect that the rapid expansion of the prefrontal cortex in our species was to permit the full integration of information and careful evaluation of options for responding to culturally complex situations, both within and between cultures. I think we evolved to be strategic thinkers, not only in the Machiavellian sense but also in the Humanist sense – we tend to consider the long-term benefits of alliances and trading partnerships, both in terms of expanding our own group’s options in times of scarcity and in terms of expanding our access to a wider gene pool.

Finessing inter-cultural relations that permitted trading networks to span entire continents took subtlety and self-control far superior to that involved in resorting to violence every time someone had resources you wanted or needed. There is a reason we humans evolved a brain that can easily handle not just one but many languages, and not just one but many cultural inter-faces.

Sorry to be a bit short-tempered about all this, but just because the “man as nasty beast” model is currently popular (it has been since the days of Plato and Aristotle) does not mean it is based on the science. It is based on the wishful thinking that some kind of state control system must control human badness (which is assumed to be inevitable) and is therefore justifiable. That part of the philosopher’s toolkit of ideas was always propaganda: a justification for a ruling class, and a mythology to rationalize the expansion that took land away from hunter-gatherers all over Eurasia.

Newsflash: Man is not a nasty beast. He is smart and funny and, given half a chance, would rather talk things over than get into a fight that might hurt him or sour relationships with potential trading partners and allies – or even potential mates and in-laws. Give humanity credit for having evolved to be a bit smarter than other chimps. Please.

The competitiveness of the cultures in Steve LeBlanc’s example only obtains if there is an ecological constraint – an eventual limitation of resources such that if one culture keeps expanding its population, it must also keep expanding its territory, and that it must do so at the expense of neighbouring cultures.

My second objection is a bit more complex. You see, if you consider the kind of cultural pattern that LeBlanc suggests would be successful under conditions of competition between cultures for access to resources, it is the aggressive, pro-natalist warrior-culture. Even if humans are born innocent of any genetically mediated tendencies for aggression and violence, what LeBlanc proposes is that most humans on the planet today are descended from the winners of a fairly deadly competition between rival cultural systems.

Yet the archaeological and ethnographic record does not support this. For 99% of our evolutionary history (which spans about 5 million years) we were foragers, and we were pretty thin on the ground.

Cultures would not have needed to compete over resources. In fact, points of contact with various neighboring cultures would have been conduits through which a forager society might gain access through trading and gifting relationships to resources from a much wider range of ecosystems than lay within their own annual round of movement. We have evidence of such exchange within hunter-gatherer societies during the last 200,000 years which spanned entire continents.

We have evidence, from hunter-gatherer cultures living 12,000 years ago – 2,000 years before the domestication of plants and animals – of cooperative efforts which appear to have created purely ritual and ceremonial sites, bringing hundreds and possibly thousands of people together, possibly several times a year, from across a vast inhabited wilderness teeming with wildlife and rich plant life. These were not competing cultures; they were cooperating. The sites were places of healing and ritual. In one article, one of the sites is even fancifully referred to as a possible source of the myth of a Garden of Eden. These cultures were not associated with any evidence of warfare or violent death.

We evolved without much need to be “warding off the neighbouring tribe”, since there WERE no “tribes” until fairly recently. Tribal organization arises from a combination of corporate groups based on lineal descent (patrilineal or matrilineal) involved in allocating primary rights to a fixed natural resource (a salmon run, an area of land producing wild cereals reliably, a herd of animals like reindeer or cattle, etc.). It also usually involves sedentism for at least part of the year, a higher rate of population growth than among mobile hunter-gatherers, and some sort of status ranking, both among individuals and among various lineages. Higher population growth rates would inevitably lead to competition over fixed resources, and this, inevitably, to some fighting between groups with opposing claims. Hence, warfare. But we do not see any real evidence of warfare much before 10,000 BP, although of course we do see evidence of murder and cannibalism.

Proposition: It is unlikely that anatomically modern humans evolved in a context of frequent violent group conflict among themselves. Most contemporary studies of mobile foragers have revealed a consistent economic pattern involving reciprocal access to resources. This means that when the rain did not come, or the antelope failed to migrate near your own home range, you did not have to go take away your more fortunate neighbour’s food or territory, you simply went and lived with them for the duration. Since your neighbours were usually relatives of one kind or another -even fictive kin will do- they could and did do the same when the position was reversed.

In fact, you might just want to go visit your neighbours anyway, in the course of a yearly round, and they might just want to come visit you. Picture this not in terms of any permanent houses and villages, but as a set of inter-related people, say 2000 strong, spread out over thousands of square miles, all of them living in small camping groups.

Mobile foragers live in camping groups of 3-5 families, and these are fluid rather than fixed in their membership. Every few weeks or months, when camps break up and move, chances are that at least some families will go camp with other friends or relatives than the ones they were living with before. Camps are loosely organized around kinship lines, but residential patterns are neither necessarily matrilocal nor patrilocal.

We know a good deal about the way modern and recent mobile hunter-gatherers live in the region of sub-Saharan Africa where all humanity originated. We also have recently confirmed that the “San” (formerly often known as “Bushmen”) hunter-gatherers of this region are the most genetically diverse of all human groups and are the modern day representatives of the source population from which all the rest of humanity came.

I lived with a group called the Kua San, who were primarily mobile foragers. They had fairly typical bilateral kinship reckoning (meaning both the father’s and the mother’s relations were considered equally important, and the child was not a “member” of one kin-group to the exclusion of others). They were ruthlessly egalitarian. I say ruthless because even children could not be ordered about by adults or be made to do work for them, or be sent to bed.

There was no rape. No child “abuse”, no wife-beating, no chief or headman, no permanent public roles of leadership. Social control was a matter of strict sharing protocols and of public put-downs, by mocking and gossip, of any and all even potentially pretentious behaviour. The most respected and sought-after people were the most generous, diligent, witty, and “open-hearted”.

If there was competition for dominance or socially acknowledged rank, it was played out in this arena of behaviour. Also greatly valued was intelligent foresight, in terms of organizing camp movements and anticipating the timing and locations of resource windfalls (like local peaks in berry production seven months after a fire, or the movements of migratory herds). So “dominance” was achieved neither by aggression nor by wealth, certainly not by any kind of swaggering, and never by being acknowledged as a dangerous person.

There was, however, murder, or at least murderous attack. Most of this stemmed from some explosion of passion over adultery or injustice, and was endlessly discussed in shaming gossip.

Most of the incidents I recorded were of fights or attacks that stopped short of being lethal (simply for their subsistence activities, the men are all lethally armed with poisoned arrows and sharp knives, and women always carry knives).

Apparently, murder was rare enough in any one generation that people had vivid memories of such events. The most recent murder had happened only forty years earlier, and was especially remembered because it involved a man who showed no remorse, who had frequently “bothered” others because he seemed to have a “distant heart” (no empathy) and was inclined to be selfish, even deliberately eating alone. He did this especially when he found the kinds of treats that any self-respecting person would normally have shared with as many people as possible (like the tails of a particularly tasty lizard, or recently laid water-bird eggs). He was charming enough to survive in this society well into his twenties, but then there was some dispute with a young woman he was courting, and she was killed, or was severely injured and may have later died (my informants varied in their accounts of this).

In any event, since all these people are excellent trackers, the evidence in the sand clearly indicated that this fellow had been responsible. He denied it, then admitted it, but claimed it was not his fault, that others had driven him into a temper. Yet the killing did not seem to have been done in temper, but by stealth and surprise.

After some time, everyone was very uneasy about the murderer. They did not like to have him in their camping group, so he became a bit of a wanderer, for even his parents and siblings did not like to have him close. He did make some friends among more distant cousins and spent time with them.

Then, apparently, it nearly happened again. This time the victim lived. It was enough. To make a long story short, his closest relatives took the responsibility and set a trap for him: he was ambushed and killed. They showed me where he was staked out for the scavengers… because, sadly, he was not considered a human.

I have no proof, but this person sounds to me like a psychopath… and, if so, this was how they dealt with a psychopath.

I did further interviews to find out more about such “low-hearted” people. It seems that those who simply failed to show the expected kind of active empathy for others – especially if this was seen from childhood on – were gradually marginalized. They could not be trusted, and were generally kept under control by gossip, mocking, and, whenever anything occurred that showed their selfish tendencies or any unconcern for another person, incredulous laughter. These people were the only ones in the society who generally wound up camped alone on the margins of each camp group. Everyone felt sorry for them, and they were not, of course, excluded from sharing networks. There was one woman of this kind among the Kua when I lived there; she had failed to attract any permanent mate, although she had one child.

If anything, I suspect that this kind of neurological defect (if that is what it is) has become more common since tribal societies developed during the Mesolithic and after the Neolithic, owing to the lessening of negative selection of the kind implied in the example above. This is especially true since highly stratified societies arose, in which a psychopath could be born into a highly placed lineage and be protected by his rank from ordinary social controls. Today, given the exponential rise in human numbers, we undoubtedly have millions of them around, just due to the sheer volume of humans.

I doubt that, aside possibly from such psychopathology, any human being is “born bad”. Steve LeBlanc suggests that there is a certain inevitable tendency for aggressive and selfish cultures eventually to out-compete peaceful human groups who controlled their population and lived sustainably. In other words, his model suggests that modern humans are predominantly descendants of a long evolutionary history favouring those who did not control their population growth and therefore aggressively expanded their territories at the expense of their neighbours.

LeBlanc’s model, then, could be taken as support for the idea that most of humanity is doomed to be irrational and aggressive because we are mostly the descendants of what he calls “ecological dimwits”. Who are these people? Read on: “They have no wonderful processes available to control their population. They are forever on the edge of the carrying capacity, they reproduce with abandon, and they frequently suffer food shortages and the inevitable consequences”.

Take a look at a later line from his article:

“This is not a bunch of noble savages; these are ecological dimwits and they attack their good neighbors in order to save their own skins. Since they now outnumber their good neighbors two to one, the dimwits prevail after heavy attrition on both sides. The “good” ants turn out to be dead ants, and the “bad” grasshoppers inherit the earth.”

Note that LeBlanc does not make it explicit whether the state of ecological dimwittedness is occasioned by “human nature” (it is in our genes) or whether it is due to cultural conditioning.

This is quite clever of him, for to have stated outright that it was biological would put him squarely in the camp of Robert Ardrey (The Territorial Imperative), Lawrence Keeley (War Before Civilization), Malcolm Potts (Sex and War), and even John Gray (Straw Dogs) and Steven Pinker, among many others. That camp goes back to Plato and Aristotle, whose works of political philosophy take the greed and violence of humans in a “state of nature” for granted, and conclude that humans do much better when “governed” by elites of wise and learned men within a city state. Later philosophers like Hobbes and even John Locke essentially took the same position; Hobbes, of course, is famous for describing human life in a “state of nature” as “nasty, brutish and short”.

I do not particularly like the idea of humans being “born bad” – this is not what the science shows.

Whether we look at studies of young children’s behaviour, cognitive functioning, and neural imagery, or at the ethnographic record of foragers, or at archaeological evidence from the pre-Neolithic period, we find widespread trade and intermarriage among neighboring cultures, and even cooperative ventures such as the building of massive ritual sites.

Few ethnographers have lived with hunter-gatherers, but of those who have, many have questioned this judgment. Some have gone further, like Richard Lee, whose work among the foragers of the Kalahari turned Hobbes on his head.

Proposition 2: There is every indication that humans evolved to be adapted to learning a cultural system and a language. So if LeBlanc’s evolutionary winners were the “dimwits” they must, in my view, have been made irrational and ecologically dimwitted by their upbringing – in other words, they were taught those ways of thinking and behaving by their parents and by the rest of the culture they were born into.

WHY is this important? Well, consider the implications: what is the half-life of a species doomed by its very “nature” to be “ecological dimwits”? Either we are “born bad”, or we learn to be (ecologically) stupid in some cultural systems. It makes a big difference: in the first model, the whole human species is doomed; in the alternative view, only certain cultural systems are.

We are a relatively young species. Compared to various species of sharks, which have been around for millions of years, we are newly minted, barely 200,000 years old. Perhaps I speak partly out of personal hope that we might be here for a while longer, but I also contest the very evidence used in support of the idea that humans are hard-wired for a level of aggression and competitiveness that will ultimately be self-destructive. I think we can marshal plenty of evidence that these “bad” behaviors are even more subject to the parameters and exigencies of culture than are “good” behaviors like altruism and compassion.

Has Homo sapiens not spent more time becoming genetically and cognitively fine-tuned to be cooperative and pragmatic in our dealings with conspecific neighbors than to be competitive and hostile?

Let’s look at the evidence. First of all, cooperation and compassion are found even in one of our nearest living relations: the bonobo (Pan paniscus, or pygmy chimpanzee).

Frans de Waal’s recent work is summarized by the following quote from Wikipedia: “His research into the innate capacity for empathy among primates has led De Waal to the conclusion that non-human great apes and humans are simply different types of apes, and that empathic and cooperative tendencies are continuous between these species. His belief is illustrated in the following quote from The Age of Empathy:

“We start out postulating sharp boundaries, such as between humans and apes, or between apes and monkeys, but are in fact dealing with sand castles that lose much of their structure when the sea of knowledge washes over them. They turn into hills, leveled ever more, until we are back to where evolutionary theory always leads us: a gently sloping beach.”

What always amazes me is the power of our dominant cultural paradigms. The idea of original sin, for instance, was most likely a notion seized on during the late Mesolithic/early Neolithic period. It arose in those cultures where an organized priesthood was developing to prop up the rights of a ruling class in an increasingly crowded and stratified society. It is a made-up story, no matter how it is phrased (whether in terms of some inherent wickedness or in terms of a soul’s long progression towards perfection through numerous lifetimes) – made up, in fact, as the perfect tool for social control. Clearly, if all of our present reality (including the conditions we are born into) is divinely ordained and purposeful, then we only need to be shown the rule book to get through it and on to something better. God forbid we should rebel, kill our rulers, end injustice, and live better, if it is our “lot” in life to be born poor. Hence the widespread appeal of Christianity, which makes a kind of back-assed virtue out of poverty and suffering.

It is amazing to see people succumb to this idea of human nature being inherently “bad”, violent, flawed, “rapacious” (as in John Gray’s Straw Dogs). The popularity of Malcolm Potts’s book Sex and War is another example. John Gray, whose work has been compared to that of Richard Dawkins for its influence on modern scholarship concerning the human condition within an evolutionary paradigm, is an author I respect and admire, but even he makes a classic error (or should I say falls victim to his cultural paradigm) when he says things like the following:

“The destruction of the natural world is not the result of global capitalism, industrialization, ‘Western civilization’ or any flaw in human institutions. It is a consequence of the evolutionary success of an exceptionally rapacious primate. Throughout all of history and prehistory, human advance has coincided with ecological devastation.” ~ John Gray, Straw Dogs

Well, no. I lived with a group of hunter-gatherers in the Central Kalahari, a people who have been foragers since the dawn of our species, and that region is home to one of the highest known biomasses of wildlife on the planet today. As mobile foragers, within the environment where we evolved, we are hardly “rapacious primates”.

The key term here is perhaps “human advance” – but surely this is highly ambiguous? Does he mean “progress” in some absolute sense of greater numbers, knowledge, or some other parameter? Or does he mean physical spread out of Africa? If the latter, then he is treading on precarious logical ground. He is confusing the results of adding a new species to an ecosystem (often a disruptive thing: just look at how rabbits practically ate Australia) with, dare I say it, some kind of flaw in human nature (sounds like “original sin” to me).

So why war?

“The first time this issue was brought up in the mainstream scientific community was in 1986 when scientists from around the world got together to discuss the psychological and biological evidence proving that human nature is no excuse for violent behavior. The findings that were released came to be known as “The Seville Statement”.

This statement made 5 propositions, which are:
1. “It is scientifically incorrect to say that we have inherited a tendency to make war from our animal ancestors.”
2. “It is scientifically incorrect to say that war or any other violent behavior is genetically programmed into our human nature.”
3. “It is scientifically incorrect to say that in the course of human evolution there has been a selection for aggressive behavior more than for other kinds of behavior.”
4. “It is scientifically incorrect to say that humans have a ‘violent brain’.”
5. “It is scientifically incorrect to say that war is caused by ‘instinct’ or any single motivation.”
Since the Seville statement there have been many more studies reconfirming the propositions put forward. Just this past February a new study by a biologist named Frans de Waal showed that animals are naturally prone to cooperation when in the right circumstances.”

(Source of quote: )

Some anthropologists have suggested that warfare and the subjugation of women (including practices like female infanticide) are cultural adaptations to the overpopulation dangers inherent in the sedentary lifestyle and cereal-based diet of post-Neolithic societies. Remember that I previously mentioned that birth spacing among foragers tends to be about 48 months, compared to 24 months or less in farming economies. This shorter spacing leads to overpopulation and the attendant danger of starvation and extinction. Most larger intensive agricultural civilizations have failed. In smaller societies reliant upon shifting cultivation and usufruct tenure (where land is held in common and use rights are temporary), the ratio of forest and secondary growth in “long fallow” to cultivated area in any given year is very high – often only about 20% is cultivated.

If the population rises beyond the point that can be supported by the food grown on that 20%, then the length of the fallow period must drop, and the forest barely has time to grow back before it is again cut down. Fertility suffers, so even more land must be put under cultivation in any given year… and the system rapidly reaches a crisis in which soil degradation causes the whole thing to collapse. (Or, as in a few places historically, an even more intensive system featuring irrigation, ploughing, and animal or human manure was instituted. Deforestation and soil losses mark the birth of “civilization”.) All of these changes generally led to such an increase in the need for labour, and in competition for land, that the result was expansive and predatory warfare to procure both and to counter the greater risk of total collapse. Elites managed common welfare and kept internal peace, all the while supervising external expansion by violent means, or annexation by threat of such violence. These arrangements eventually became the phenomenon we know today as “state-level” society, and such systems have now enveloped the whole human world and are in the process of adding the resources and/or labour of the last surviving hunter-gatherer, horticultural, and pastoral economies to their futile Ponzi scheme.

The horticultural economies still found in the world at present are generally those that have found a way to avoid this demographically induced disaster. The way most of them have done it is through the combination of persistent endemic warfare, feuding and raiding between villages, the development of a warrior cult, and the simultaneous abasement of the status of women. Women often become the targets of raids, and their levels of emotional stress, abuse, malnutrition, and even genital mutilation tend to be rather high. This is often coupled with dietary restrictions during pregnancy and higher rates of death in childbirth, and, of course, a much higher rate of female infanticide.

That certainly keeps the rate of population growth down. Meanwhile, because of ongoing hostilities and fear of raiding, villages tend to be spaced widely. This means that there are larger zones of forested wild land between villages, which supply wild plant foods, medicines, and animal protein. Predictably, for example, the most warlike and violent villages among the Yanomami studied by Napoleon Chagnon were also the ones with the most territory and the healthiest people.

Cultural adaptations that “work” and result in sustainability within an ecosystem are not always the most pleasant for the individuals within those systems, which is why each such culture contains vehement rationalizations for misery. Explaining away women’s subjugation as a necessary condition following from some kind of natural inferiority of the female mind or temperament is common. The Victorians actually believed that higher education or participation in a profession withered women’s reproductive organs or resulted in hysterical behaviour!

It is all very frustrating, but it is much easier to counter if it is seen not as an ideological issue but as an outcome of systemic problems with the nature of agricultural (and, recently, industrial) economies. If horticultural systems had had effective birth control, I imagine, none of this war and infanticide would have developed: it was only RELATIVELY more successful than peaceful alternatives. And all the hopeful rhetoric about the way birthrates have fallen as child mortality has gone down in industrial societies only goes to show that, given a choice, most couples would rather have fewer children and invest much more in each one, as was the case throughout most of human evolution. In fact, large industrial states did and still do have an overpopulation problem, since they have only managed agricultural surpluses by relying on fossil-fuel-based chemicals and machines… and this is unsustainable without extreme damage to the ecosystem now that we have passed Peak Oil.

—Helga Ingeborg Vierich