All the notes were taken directly from the source mentioned.
– – –
Between 1976 and 2005 the average homicide rate for white Americans was 4.8 per 100,000 per year, while the average rate for black Americans was 36.9.
These United States. When it comes to violence, the United States is not a country; it’s three countries.
North-South difference is not a by-product of the white-black difference. Southern whites are more violent than northern whites, and southern blacks are more violent than northern blacks.
Why has the South had such a long history of violence? The most sweeping answer is that the civilizing mission of government never penetrated the American South as deeply as it had the Northeast, to say nothing of Europe.
American South is marked by an obsession with credible deterrence, otherwise known as a culture of honor.
Is there an exogenous cause that might explain why the South rather than the North developed a culture of honor? Certainly the brutality needed to maintain a slave economy might have been a factor, but the most violent parts of the South were backcountry regions that never depended on plantation slavery
So it’s sufficient to assume that settlers from the remote parts of Britain ended up in the remote parts of the South, and that both regions were lawless for a long time, fostering a culture of honor.
These regions were peopled by young, single men who had fled impoverished farms and urban ghettos to seek their fortune in the harsh frontier. The one great universal in the study of violence is that most of it is committed by fifteen-to-thirty-year-old men.102 Not only are males the more competitive sex in most mammalian species, but with Homo sapiens a man’s position in the pecking order is secured by reputation.
A famous study that tracked a thousand low-income Boston teenagers for forty-five years discovered that two factors predicted whether a delinquent would go on to avoid a life of crime: getting a stable job, and marrying a woman he cared about and supporting her and her children.
In fact, the proportion of unwanted children could even have increased if women were emboldened by the abortion option to have more unprotected sex in the heat of the moment, but then procrastinated or had second thoughts once they were pregnant.
Several studies have borne this out. Young pregnant women who opt for abortions get better grades, are less likely to be on welfare, and are more likely to finish school than their counterparts who have miscarriages or carry their pregnancies to term. The availability of abortion thus may have led to a generation that is more prone to crime because it weeded out just the children who, whether through genes or environment, were most likely to exercise maturity and self-control.
The first is that the Leviathan got bigger, smarter, and more effective. The second is that the Civilizing Process, which the counterculture had tried to reverse in the 1960s, was restored to its forward direction. Indeed, it seems to have entered a new phase.
Imprisonment physically removes the most crime-prone individuals from the streets, incapacitating them and subtracting the crimes they would have committed from the statistics.
Incarceration is especially effective when a small number of individuals commit a large number of crimes. A classic study of criminal records in Philadelphia, for example, found that 6 percent of the young male population committed more than half the offenses.
Moreover, people who commit violent crimes get into trouble in other ways, because they tend to favor instant gratification over long-term benefits. They are more likely to drop out of school, quit work, get into accidents, provoke fights, engage in petty theft and vandalism, and abuse alcohol and drugs.158
Incarceration can also reduce violence by the familiar but less direct route of deterrence. An ex-convict might think twice about committing another crime once he gets out of jail, and the people who know about him might think twice about following in his footsteps.
But proving that incarceration deters people (as opposed to incapacitating them) is easier said than done, because the statistics at any time are inherently stacked against it. The regions with the highest rates of crime will throw the most people in jail, creating the illusion that imprisonment increases crime rather than decreasing it. But with suitable ingenuity (for example, correlating increases in imprisonment at one time with decreases in crime at a later time, or seeing if a court order to reduce the prison population leads to a subsequent increase in crime), the deterrence effect can be tested.
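The lagged-correlation idea can be sketched with synthetic data (the model and every parameter here are illustrative assumptions, not estimates from any real crime dataset): imprisonment tracks current crime, so the same-year correlation is positive, but if imprisonment suppresses the following year's crime, the lagged correlation comes out negative.

```python
# Illustrative sketch (synthetic data, not real crime statistics): why
# contemporaneous correlations can mask a deterrent effect. High-crime
# regions imprison more people (positive same-year correlation), but if
# imprisonment lowers crime the following year, the lagged correlation
# turns negative.
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

random.seed(0)
crime, prison = [50.0], []
for t in range(200):
    # Imprisonment tracks current crime...
    prison.append(0.8 * crime[t] + random.gauss(0, 2))
    # ...but (by assumption) suppresses next year's crime.
    crime.append(50.0 + 0.5 * crime[t] - 0.6 * prison[t] + random.gauss(0, 2))

same_year = pearson(prison, crime[:-1])   # prison_t vs crime_t
lagged = pearson(prison, crime[1:])       # prison_t vs crime_{t+1}
print(f"same-year r = {same_year:+.2f}, lagged r = {lagged:+.2f}")
```

The same-year correlation is positive even though, by construction, imprisonment reduces crime; only the lagged correlation reveals the deterrent effect, which is the point of the research design described above.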
Mass incarceration, even if it does lower violence, introduces problems of its own. Once the most violent individuals have been locked up, imprisoning more of them rapidly reaches a point of diminishing returns, because each additional prisoner becomes less and less dangerous, and pulling them off the streets makes a smaller and smaller dent in the violence rate.162
Also, since people tend to get less violent as they get older, keeping men in prison beyond a certain point does little to reduce crime.
The insight behind James Q. Wilson and George Kelling’s famous Broken Windows theory was that an orderly environment serves as a reminder that police and residents are dedicated to keeping the peace, whereas a vandalized and unruly one is a signal that no one is in charge.166
Criminologist Franklin Zimring examines these explanations in The Great American Crime Decline.
The researchers argued that an orderly environment fosters a sense of responsibility not so much by deterrence (since Groningen police rarely penalize litterers) as by the signaling of a social norm: This is the kind of place where people obey the rules.170 Ultimately, we must look to a change in norms to understand the 1990s crime bust, just as it was a change in norms that helped explain the boom three decades earlier. Though policing reforms almost certainly contributed to the headlong decline in American violence, particularly in New York, remember that Canada and Western Europe saw declines as well (albeit not by the same amount), and they did not bulk up their prisons or police to nearly the same degree. Even some of the hardest-headed crime statisticians have thrown up their hands and concluded that much of the explanation must lie in difficult-to-quantify cultural and psychological changes.171 The Great Crime Decline of the 1990s was part of a change in sensibilities that can fairly be called a recivilizing process.
Only a sample of criminal behavior can ever be detected and punished, and the sampling should be fair enough that citizens perceive the entire regime to be legitimate.
She realized that she belonged among the weak, in the camp of the weak, in the country of the weak, and that she had to be faithful to them precisely because they were weak and gasped for breath in the middle of sentences.
But when the strong were too weak to hurt the weak, the weak had to be strong enough to leave.
Most of the infractions that sent a person to the rack or the stake were nonviolent, and today many are not even considered legally punishable, such as heresy, blasphemy, apostasy, criticism of the government, gossip, scolding, adultery, and unconventional sexual practices. Both the Christian and secular legal systems, inspired by Roman law, used torture to extract a confession and thereby convict a suspect, in defiance of the obvious fact that a person will say anything to stop the pain.
All of the first complex civilizations were absolutist theocracies which punished victimless crimes with torture and mutilation.6
In the modern West and much of the rest of the world, capital and corporal punishments have been effectively eliminated, governments’ power to use violence against their subjects has been severely curtailed, slavery has been abolished, and people have lost their thirst for cruelty. All this happened in a narrow slice of history, beginning in the Age of Reason in the 17th century and cresting with the Enlightenment at the end of the 18th.
But in Inventing Human Rights, the historian Lynn Hunt notes that human rights have been conspicuously affirmed at two moments in history. One was the end of the 18th century, which saw the American Declaration of Independence in 1776 and the French Declaration of the Rights of Man and Citizen in 1789. The other was the midpoint of the 20th century, which saw the Universal Declaration of Human Rights in 1948, followed by a cascade of Rights Revolutions in the ensuing decades
The persecution of witches began to subside during the 17th century, when several European states abolished witch trials. In 1716 a woman was hanged as a witch in England for the last time, and in 1749 a woman was burned as a witch anywhere in Europe for the last time.26
When England introduced drop hanging in 1783 and France introduced the guillotine in 1792, it was a moral advance, because an execution that instantly renders the victim unconscious is more humane than one that is designed to prolong his suffering.
Slavery was upheld in the Hebrew and Christian Bibles, and was justified by Plato and Aristotle as a natural institution that was essential to civilized society. So-called democratic Athens in the time of Pericles enslaved 35 percent of its population,
Slavery was abolished in Qatar in 1952; in Saudi Arabia and Yemen in 1962; and in Mauritania in 1980.77
Credit checks, credit ratings, loan insurance, and credit cards are just some of the ways that economic life continued after borrowers could no longer be deterred by the threat of legal coercion.
While the real number of slaves is the largest there has ever been, it is also probably the smallest proportion of the world population ever in slavery.
Today, we don’t have to win the legal battle; there’s a law against it in every country. We don’t have to win the economic argument; no economy is dependent on slavery (unlike in the 19th century, when whole industries could have collapsed). And we don’t have to win the moral argument; no one is trying to justify it any more.
Caesar was one of the thirty-four Roman emperors (out of the total of forty-nine that reigned until the division of the empire) who were killed by guards, high officials, or members of their own families.
One in eight European monarchs was murdered in office, mostly by noblemen, and a third of the killers took over the throne.95
In the 5th century BCE the Chinese philosopher Mozi, founder of a school of thought that rivaled Confucianism and Taoism, condemned offensive warfare.
In the 15th, 16th, and 17th centuries, wars broke out between European countries at a rate of about three new wars a year.
One of the foremost was gentle commerce, the theory that the positive-sum payoff of trade should be more appealing than the zero-sum or negative-sum payoff of war.
The Abbé de Saint-Pierre (1713), Montesquieu (1748), Adam Smith (1776), George Washington (1788), and Immanuel Kant (1795) were some of the writers who extolled free trade because it yoked the material interests of nations and thus encouraged them to value one another’s well-being.
He then outlined his three conditions for perpetual peace. The first is that states should be democratic. Kant himself preferred the term republican, because he associated the word democracy with mob rule; what he had in mind was a government dedicated to freedom, equality, and the rule of law. Democracies are unlikely to fight each other, Kant argued, for two reasons. One is that a democracy is a form of government that by design (“having sprung from the pure source of the concept of law”) is built around nonviolence. A democratic government wields its power only to safeguard the rights of its citizens. Democracies, Kant reasoned, are apt to externalize this principle to their dealings with other nations, who are no more deserving of domination by force than are their own citizens. More important, democracies tend to avoid wars because the benefits of war go to a country’s leaders whereas the costs are paid by its citizens.
Kant’s second condition for perpetual peace was that “the law of nations shall be founded on a Federation of Free States” – in effect, a “League of Nations.”
The third condition for perpetual peace is “universal hospitality” or “world citizenship.” People from one country should be free to live in safety in others, as long as they don’t bring an army in with them.
…”violation of rights in one place is felt throughout the world.”
In explaining the Humanitarian Revolution, then, we don’t have to decide between unspoken norms and explicit moral argumentation.
The most sweeping change in everyday sensibilities left by the Humanitarian Revolution is the reaction to suffering in other living things.
In the two centuries after Gutenberg, publishing became a high-tech venture, and productivity in printing and papermaking grew more than twentyfold, faster than the growth rate of the entire British economy during the Industrial Revolution.
The growth of writing and literacy strikes me as the best candidate for an exogenous change that helped set off the Humanitarian Revolution.
Reading is a technology for perspective-taking. When someone else’s thoughts are in your head, you are observing the world from that person’s vantage point. Not only are you taking in sights and sounds that you could not experience firsthand, but you have stepped inside that person’s mind and are temporarily sharing his or her attitudes and reactions.
Richard Henry Dana’s Two Years Before the Mast: A Personal Narrative of Life at Sea (1840) and Herman Melville’s White Jacket helped end the flogging of sailors. In the past century Erich Maria Remarque’s All Quiet on the Western Front, George Orwell’s 1984, Arthur Koestler’s Darkness at Noon, Aleksandr Solzhenitsyn’s One Day in the Life of Ivan Denisovich, Harper Lee’s To Kill a Mockingbird, Elie Wiesel’s Night, Kurt Vonnegut’s Slaughterhouse-Five, Alex Haley’s Roots, Anchee Min’s Red Azalea, Azar Nafisi’s Reading Lolita in Tehran, and Alice Walker’s Possessing the Secret of Joy
Uncle Tom’s Cabin
Charles Dickens’s Oliver Twist (1838) and Nicholas Nickleby (1839)
The human mind is adept at packaging a complicated idea into a chunk, combining it with other ideas into a more complex assembly, packaging that assembly into a still bigger contrivance, combining it with still other ideas, and so on.144 But to do so it needs a steady supply of plug-ins and subassemblies, which can come only from a network of other minds.
Classical Athens, Renaissance Venice, revolutionary Boston and Philadelphia, and the cities of the Low Countries are examples of cities where new democracies were gestated, and today urbanization and democracy tend to go together.
The ideas of thinkers like Hobbes, Spinoza, Descartes, Locke, David Hume, Mary Astell, Kant, Beccaria, Smith, Mary Wollstonecraft, Madison, Jefferson, Hamilton, and John Stuart Mill
The two explanations overlap – both appeal to an expansion of empathy and to the pacifying effects of positive-sum cooperation – but they differ in which aspect of human nature they emphasize.
Lewis Fry Richardson (1881–1953) was a physicist, meteorologist, psychologist, and applied mathematician.
Arnold Toynbee (1889–1975)
To heed George Santayana’s admonition to remember the past so as not to repeat it, we need to discern patterns in the past, so we can know what to generalize to the predicaments of the present.
Hitler, Mussolini, Stalin, and Imperial Japan; the Holocaust; Stalin’s purges; the Gulag; and two atomic explosions
The first is that while the 20th century certainly had more violent deaths than earlier ones, it also had more people.
The population of the world in 1950 was 2.5 billion, which is about two and a half times the population in 1800, four and a half times that in 1600, seven times that in 1300, and fifteen times that of 1 CE.
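These ratios imply a simple per-capita correction. A back-of-the-envelope sketch, using the population ratios quoted above (anchored at 2.5 billion in 1950); the 55-million figure for World War II deaths is a commonly cited round number, used here only for illustration:

```python
# Per-capita correction: the same absolute death toll is a far larger
# catastrophe, proportionally, in a smaller world. Populations follow
# the ratios quoted in the text, anchored at 2.5 billion in 1950.
POP_1950 = 2.5e9
population = {
    1950: POP_1950,
    1800: POP_1950 / 2.5,   # "two and a half times the population in 1800"
    1600: POP_1950 / 4.5,
    1300: POP_1950 / 7,
    1:    POP_1950 / 15,
}

def deaths_per_100k(deaths, year):
    """Scale an absolute death toll by the population of its era."""
    return deaths / population[year] * 100_000

toll = 55e6  # round illustrative figure for World War II
for year in (1950, 1300):
    print(f"{toll/1e6:.0f}M deaths in {year}: "
          f"{deaths_per_100k(toll, year):,.0f} per 100,000")
```

The same 55 million deaths would represent a per-capita toll seven times greater against the world of 1300 than against the world of 1950, which is the point of comparing rates rather than raw counts.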
The cognitive psychologists Amos Tversky and Daniel Kahneman have shown that people intuitively estimate relative frequency using a shortcut called the availability heuristic: the easier it is to recall examples of an event, the more probable people think it is.
The second illusion is historical myopia: the closer an era is to our vantage point in the present, the more details we can make out.
Today 8 percent of the men who live within the former territory of the Mongol Empire share a Y chromosome that dates to around the time of Genghis Khan.
Maybe the only reason it appears that so many were killed in the past 200 years is that we have more records from that period.
The War of Attrition is one of those paradoxical scenarios in game theory (like the Prisoner’s Dilemma, the Tragedy of the Commons, and the Dollar Auction) in which a set of rational actors pursuing their interests end up worse off than if they had put their heads together and come to a collective and binding agreement.
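Of those scenarios, the Dollar Auction is the easiest to simulate. In this toy version (a generic illustration of the escalation trap, not a model from the source), a $1 prize goes to the top bidder but both the winner and the runner-up pay their bids; under myopic reasoning, raising by a nickel always looks better than forfeiting a sunk bid, so bidding stops only at an arbitrary budget cap, far past the prize’s value:

```python
# Toy Dollar Auction: a $1 prize is auctioned, and both the winner AND
# the runner-up pay their final bids. Once both players have sunk a bid,
# each player's locally rational move is to raise rather than walk away,
# so "rational" play escalates past the value of the prize and is halted
# only by an external budget cap.
PRIZE = 100   # cents
STEP = 5      # minimum raise, in cents

def dollar_auction(max_budget=300):
    bids = {"A": 0, "B": 0}
    bidder, rival = "A", "B"
    while True:
        next_bid = bids[rival] + STEP
        # Drop out only if winning at next_bid loses at least as much as
        # forfeiting the current bid, or the budget cap is reached. Once
        # both players are invested, the first condition never triggers:
        # the marginal loss of raising always looks smaller than the sunk
        # loss of quitting. That is the trap.
        if next_bid - PRIZE >= bids[bidder] or next_bid > max_budget:
            return bids  # current bidder drops out
        bids[bidder] = next_bid
        bidder, rival = rival, bidder

final = dollar_auction()
total_paid = sum(final.values())
print(final, "total paid:", total_paid, "for a prize of", PRIZE)
```

The two players jointly pay several times the prize’s value, exactly the outcome a binding agreement not to bid against each other would have avoided.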
Richardson reached two broad conclusions about the statistics of war: their timing is random, and their magnitudes are distributed according to a power law.
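Both claims have a standard statistical reading: random timing means war onsets form a Poisson process, so the gaps between onsets are exponentially distributed; a power-law magnitude distribution means a war ten times deadlier is rarer by a fixed factor, not vanishingly rare. A sketch with illustrative parameters (these are not Richardson’s fitted values):

```python
# Minimal sketch of Richardson's two statistical claims about wars:
# (1) onsets random in time -> Poisson process -> exponential gaps;
# (2) magnitudes power-law distributed -> scale-invariant tail.
import random

random.seed(42)

# 1. Poisson timing: exponential gaps between war onsets.
rate = 3.0                        # e.g. "three new wars a year"
gaps = [random.expovariate(rate) for _ in range(10_000)]
mean_gap = sum(gaps) / len(gaps)  # should approach 1/rate

# 2. Power-law magnitudes via inverse-transform sampling:
#    P(deaths > x) = (x_min / x) ** alpha
def power_law_sample(x_min=1_000, alpha=1.5):
    return x_min * random.random() ** (-1 / alpha)

deaths = [power_law_sample() for _ in range(100_000)]
# Scale invariance: exceeding 10*x_min is ~10**-alpha as likely as x_min.
frac_over_10x = sum(d > 10_000 for d in deaths) / len(deaths)
print(f"mean gap = {mean_gap:.3f} yr (expect {1/rate:.3f})")
print(f"P(deaths > 10*x_min) = {frac_over_10x:.4f} (expect {10**-1.5:.4f})")
```

The heavy power-law tail is what makes rare, enormous wars statistically expected rather than anomalous, unlike a normal distribution, under which a war a hundred times the typical size would be essentially impossible.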
The time span is the era that began in the late 1400s, when gunpowder, ocean navigation, and the printing press are said to have inaugurated the modern age
What were they fighting over? The motives were the “three principal causes of quarrel” identified by Hobbes: predation (primarily of land), preemption of predation by others, and credible deterrence or honor.
1400 to 1559, the Age of Dynasties.
1559 as the inception of the Age of Religions,
Historians consider the Treaty of Westphalia of 1648 not only to have put out the Wars of Religion but to have established the first version of the modern international order.
Age of Sovereignty
wars were getting less frequent but more damaging.
Richardson showed, when we hold area constant, there are far fewer civil wars within national boundaries than there are interstate wars crossing them. (Just think of England, which hasn’t had a true civil war in 350 years, but has fought many interstate wars since then.)
Military revolution. States got serious about war. This was partly a matter of improved weaponry, especially cannons and guns, but it was more a matter of recruiting greater numbers of people to kill and be killed.
But during the military revolution of the 16th and 17th centuries, states began to form professional standing armies.
A combination of drill, indoctrination, and brutal punishment trained recruits for organized combat.
Another “advance” was the tapping of the Industrial Revolution, beginning in the 19th century, to feed and equip ever larger numbers of soldiers and transport them to the battlefront more quickly.
Powers like Holland, Sweden, Denmark, Portugal, and Spain stopped competing in the great power game and redirected their energies from conquest to commerce.
As we saw in chapter 4, this tranquillity was a part of the Humanitarian Revolution connected with the Age of Reason, the Enlightenment, and the dawn of classical liberalism.
Sovereign states were becoming commercial powers, which tend to favor positive-sum trade over zero-sum conquest. Popular writers were deconstructing honor, equating war with murder, ridiculing Europe’s history of violence, and taking the viewpoints of soldiers and conquered peoples.
Philosophers were redefining government from a means of implementing the whims of a monarch to a means for enhancing the life, liberty, and happiness of individual people, and tried to think up ways to limit the power of political leaders and incentivize them to avoid war.
1789 as the start of the Age of Nationalism.
This new age was populated by states that were better aligned with nations and that competed with other nation-states for preeminence.
The Age of Ideology begins where the Age of Nationalism ends, in 1917, and closes with 1989.
Intervals of peace (1815–54 and 1871–1914) in the middle.
Michael Howard has argued, is to see them as a battle for influence among four forces – Enlightenment humanism, conservatism, nationalism, and utopian ideologies – which
Napoleon… he seized power in a coup, stamped out constitutional government, reinstituted slavery, glorified war, had the Pope crown him emperor, restored Catholicism as the state religion, installed three brothers and a brother-in-law on foreign thrones, and waged ruthless campaigns of territorial aggrandizement with a criminal disregard for human life.
Edmund Burke’s conservatism, which held that a society’s customs were time-tested implementations of a civilizing process that had tamed humanity’s dark side and as such deserved respect alongside the explicit formal propositions of intellectuals and reformers.
The world the century after that.” During those two centuries Burkean conservatism, Enlightenment liberalism, and romantic nationalism played off one another in shifting alliances
Concert of Europe was a forerunner of the League of Nations, the United Nations, and the European Union. This international Leviathan deserves much of the credit for the long intervals of peace in 19th-century Europe.
In fact, as of May 15, 1984, the major powers of the world had remained at peace with one another for the longest stretch of time since the Roman Empire.
Though scores of nations have gained independence since 1945, and several have broken apart, most of the lines on a world map of 1950 are still present on a world map in 2010. This too is an extraordinary development in a world in which rulers used to treat imperial expansion as part of their job description.
it is not surprising that in our universe it was the first half of the 20th century that was shaped by a Princip and a Hitler, and the second half by a Kennedy, a Khrushchev, and a Gorbachev.
“Perpetual Peace,” Immanuel Kant reasoned that three conditions should reduce the incentives of national leaders to wage war without their having to become any kinder or gentler.
The first is democracy. Democratic government is designed to resolve conflicts among citizens by consensual rule of law, and so democracies should externalize this ethic in dealing with other states.
When both countries were fully democratic, the chance of a dispute fell by more than half.226
Robert Wright, who gave reciprocity pride of place in Nonzero,
History suggests many examples in which freer trade correlates with greater peace.
A country that is open to the global economy is less likely to find itself in a militarized dispute.241
The researchers concluded that Kant got it right three out of three times: democracy favors peace, trade favors peace, and membership in intergovernmental organizations favors peace.
An even more Kantian cause: a willingness to resolve conflicts by means that are acceptable to all the affected parties, rather than by the stronger party imposing its will on the weaker one.
Practices that passed from unexceptionable to controversial to immoral to unthinkable to not-thought-about during the Humanitarian Revolution.
Democracies may be better equipped to learn from their catastrophes, because of their openness to information and the accountability of their leaders.256
Countries with an abundance of nonrenewable, easily monopolized resources have slower economic growth, crappier governments, and more violence.
Foreign aid, so beloved of crusading celebrities, can be another poisoned chalice, because it can enrich and empower the leaders through whom it is funneled rather than building a sustainable economic infrastructure.
Civil wars were not more likely to break out in countries that were ethnically or religiously diverse, that had policies which discriminated against minority religions or languages, or that had high levels of income inequality.
No one found much romance in the frumpy institutions of the Civilizing Process, namely a competent government and police force and a dependable infrastructure for trade and commerce. Yet history suggests that these institutions are necessary for the reduction of chronic violence,
We have already seen that the first leg of the peace, democracy, does not reduce the number of civil conflicts, particularly when it comes in the rickety form of an anocracy. But it does seem to reduce their severity. The political scientist Bethany Lacina has found that civil wars in democracies have fewer than half the battle deaths of civil wars in nondemocracies, holding the usual variables constant.
The theory of the Kantian Peace places the weight of peace on three legs, the third of which is international organizations.
She found that the presence of peacekeepers reduced the risk of recidivism into another war by 80 percent. This doesn’t mean that peacekeeping missions are always successful––the genocides in Bosnia and Rwanda are two conspicuous failures––just that they prevent wars from restarting on average.
Why does peacekeeping work? The first reason comes right out of Leviathan: the larger and better-armed missions can retaliate directly against violators of a peace agreement on either side, raising the costs of aggression.
Even small missions can be effective at keeping a peace because they can free the adversaries from a Hobbesian trap in which each side is tempted to attack out of fear of being attacked first. The very act of accepting intrusive peacekeepers is a costly (hence credible) signal that each side is serious about not attacking.
A war that doesn’t even bother to invite the government represents the ultimate failure of the state’s monopoly on violence.
One often reads that a century ago only 10 percent of the deaths in war were suffered by civilians, but that today the figure is 90 percent.
Deaths from malnutrition and hunger in the developing world have been dropping steadily over the years, and the civil wars of today, which are fought by packs of insurgents in limited regions of a country, have not been destructive enough to reverse the tide.
Genocide is killing people because of their race, religion, ethnicity, or other indelible group membership; politicide is killing people because of their political affiliation; democide is any mass killing of civilians by a government or militia.
Most atrocitologists agree that in the 20th century more people were killed by democides than by wars.80
When a dehumanized people is in a position to defend itself or turn the tables, it can set a Hobbesian trap of group-against-group fear. Either side may see the other as an existential threat that must be preemptively taken out. After the breakup of Yugoslavia in the 1990s, Serbian nationalists’ genocide of Bosnians and Kosovars was partly fueled by fears that they would be the victims of massacres themselves.101
Daniel Goldhagen points out that not all genocides have the same causes. He classifies them according to whether the victim group is dehumanized (a target of moralized disgust), demonized (a target of moralized anger), both, or neither.114
As Solzhenitsyn pointed out, to kill by the millions you need an ideology.
Divisive ideologies include Christianity during the Crusades and the Wars of Religion (and in an offshoot, the Taiping Rebellion in China); revolutionary romanticism during the politicides of the French Revolution; nationalism during the genocides in Ottoman Turkey and the Balkans; Nazism in the Holocaust; and Marxism during the purges, expulsions, and terror-famines in Stalin’s Soviet Union, Mao’s China, and Pol Pot’s Cambodia.
Utopian ideologies invite genocide for two reasons. One is that they set up a pernicious utilitarian calculus. In a utopia, everyone is happy forever, so its moral value is infinite.
Suppose it were a hundred million lives one could save by diverting the trolley, or a billion, or––projecting into the indefinite future––infinitely many. How many people would it be permissible to sacrifice to attain that infinite good?
The second genocidal hazard of a utopia is that it has to conform to a tidy blueprint.
If you are designing the perfect society on a clean sheet of paper, why not write these eyesores out of the plans from the start?
Democides are often scripted into the climax of an eschatological narrative, a final spasm of violence that will usher in millennial bliss.
Memoirs of deportations and death camps by Elie Wiesel and Primo Levi were published in the 1960s, and today Anne Frank’s Diary and Wiesel’s Night are among the world’s most widely read books.
Aleksandr Solzhenitsyn, Anchee Min, and Dith Pran shared their harrowing memories of the communist nightmares in the Soviet Union, China, and Cambodia.
Totalitarian regimes were responsible for 138 million deaths, 82 percent of the total, of which 110 million (65 percent of the total) were caused by the communist regimes. Authoritarian regimes, which are autocracies that tolerate independent social institutions such as businesses and churches, came in second with 28 million deaths.
Three-quarters of all the deaths from all 141 democidal regimes were committed by just four governments, which Rummel calls the dekamegamurderers: the Soviet Union with 62 million, the People’s Republic of China with 35 million, Nazi Germany with 21 million, and 1928–49 nationalist China with 10 million.153 Another 11 percent of the total were killed by eleven megamurderers, including Imperial Japan with 6 million, Cambodia with 2 million, and Ottoman Turkey with 1.9 million.
Democracies commit fewer democides because their form of governance, by definition, is committed to inclusive and nonviolent means of resolving conflicts. More important, the power of a democratic government is restricted by a tangle of institutional restraints, so a leader can’t just mobilize armies and militias on a whim to fan out over the country and start killing massive numbers of citizens.
The world has seen nothing close to the bloodletting of the 1940s since then; in the four decades that followed, the rate (and number) of deaths from democide went precipitously, if lurchingly, downward.
On the contrary, the peak in mass killing (putting aside China in the 1950s) is located in the mid-1960s to late 1970s. Those fifteen years saw a politicide against communists in Indonesia (1965–66, “the year of living dangerously,” with 700,000 deaths), the Chinese Cultural Revolution (1966–75, around 600,000), Tutsis against Hutus in Burundi (1965–73, 140,000), Pakistan’s massacre in Bangladesh (1971, around 1.7 million), north-against-south violence in Sudan (1956–72, around 500,000), Idi Amin’s regime in Uganda (1972–79, around 150,000), the Cambodian madness (1975–79, 2.5 million), and a decade of massacres in Vietnam culminating in the expulsion of the boat people (1965–75, around half a million).165 The two decades since the end of the Cold War have been marked by genocides in Bosnia from 1992 to 1995 (225,000 deaths), Rwanda (700,000 deaths), and Darfur (373,000 deaths from 2003 to 2008). These are atrocious numbers, but…
Harff did discover six risk factors that distinguished the genocidal from the nongenocidal crises in three-quarters of the cases.167 One was a country’s previous history of genocide, presumably because whatever risk factors were in place the first time did not vanish overnight. The second predictor was the country’s immediate history of political instability – to be exact, the number of regime crises and ethnic or revolutionary wars it had suffered in the preceding fifteen years. Governments that feel threatened are tempted to eliminate or take revenge on groups they perceive to be subversive or contaminating, and are more likely to exploit the ongoing chaos to accomplish those goals before opposition can mobilize.168 A third was a ruling elite that came from an ethnic minority, presumably because that multiplies the leaders’ worries about the precariousness of their rule.
Democracies are less likely to wage interstate wars, to have large-scale civil wars, and to commit genocides.
Countries that depend more on international trade, Harff found, are less likely to commit genocides, just as they are less likely to fight wars with other countries and to be riven by civil wars.
The last predictor of genocide is an exclusionary ideology.
The appearance of Marxist ideology in particular was a historical tsunami that is breathtaking in its total human impact. It led to the dekamegamurders by Marxist regimes in the Soviet Union and China, and more circuitously, it contributed to the one committed by the Nazi regime in Germany. Hitler read Marx in 1913, and although he detested Marxist socialism, his National Socialism substituted races for classes in its ideology of a dialectical struggle toward utopia, which is why some historians consider the two ideologies “fraternal twins.”170 Marxism also set off reactions that led to politicides by militantly anticommunist regimes in Indonesia and Latin America, and to the destructive civil wars of the 1960s, 1970s, and 1980s stoked by the Cold War superpowers. The point is not that Marxism should be morally blamed for these unintended consequences, just that any historical narrative must acknowledge the sweeping repercussions of this single idea.
The killing by Marxist regimes was justified with the saying “You can’t make an omelet without breaking eggs.”172
With such a small number of data points causing such a large share of the devastation, we will never really know how to explain the most calamitous events of the 20th century. The ideologies prepared the ground and attracted the men, the absence of democracy gave them the opportunity, but tens of millions of deaths ultimately depended on the decisions of just three individuals.
Terrorism is generally understood as premeditated violence perpetrated by a nonstate actor against noncombatants (civilians or off-duty soldiers) in pursuit of a political, religious, or social goal, designed to coerce a government or to intimidate or convey a message to a larger audience.
Cognitive psychologists such as Tversky, Kahneman, Gigerenzer, and Slovic have shown that the perceived danger of a risk depends on two mental hobgoblins. The first is fathomability: it’s better to deal with the devil you know than the devil you don’t. People are nervous about risks that are novel, undetectable, delayed in their effects, and poorly understood by the science of the day. The second contributor is dread. People worry about worst-case scenarios, the ones that are uncontrollable, catastrophic, involuntary, and inequitable (the people exposed to the risk are not the ones who benefit from it).
In How Terrorism Ends, the political scientist Audrey Cronin examined a larger dataset: 457 terrorist campaigns that had been active since 1968.
Terrorist groups die off exponentially over time, lasting, on average, between five and nine years.
No small terrorist organization has ever taken over a state, and 94 percent fail to achieve any of their strategic aims.197
Global terrorism rose in the late 1970s and declined in the 1990s for the same reasons that civil wars and genocides rose and fell during those decades.
The bulge in the late 1970s and early 1980s is mainly the handiwork of terrorists in Latin America (El Salvador, Nicaragua, Peru, and Colombia), who were responsible for 61 percent of the deaths from terrorism between 1977 and 1984.
Most of these attacks were carried out by Islamist groups whose expressed motives were at least partly religious.201 According to the most recent data from the National Counterterrorism Center, in 2008 Sunni Islamic extremists were responsible for almost two-thirds of the deaths from terrorism
Ordinarily a human being is unwilling to die, the legacy of half a billion years of natural selection. How have terrorist leaders overcome this obstacle?
Natural selection works on averages, so a willingness to take a small chance of dying as part of an aggressive coalition that offers a large chance of a big fitness payoff (more land, more women, or more safety) can be favored over the course of evolution.205
Warriors may accept the risk of death in battle for another reason. The evolutionary biologist J.B.S. Haldane, when asked whether he would lay down his life for his brother, replied, “No, but for two brothers or eight cousins.” He was invoking the phenomenon that would later be known as kin selection, inclusive fitness, and nepotistic altruism. Natural selection favors any genes that incline an organism toward making a sacrifice that helps a blood relative, as long as the benefit to the relative, discounted by the degree of relatedness, exceeds the cost to the organism. The reason is that the genes would be helping copies of themselves inside the bodies of those relatives and would have a long-term advantage over their narrowly selfish alternatives.
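The kin-selection condition stated here in prose is conventionally written as Hamilton’s rule; this is the standard textbook formulation, not an equation given in the source:

```latex
% Hamilton's rule: a gene inclining an organism to sacrifice for a
% relative spreads when the discounted benefit exceeds the cost:
%   r B > C
% where r is the coefficient of relatedness (1/2 for a brother,
% 1/8 for a first cousin), B is the fitness benefit to the relative,
% and C is the fitness cost to the altruist.
rB > C
```

Haldane’s quip falls out directly: two brothers (2 × 1/2 = 1) or eight cousins (8 × 1/8 = 1) are just enough to offset the cost of one’s own life.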
Among the contributors to the perception of kinship are the experience of having grown up together, having seen one’s mother care for the other person, commensal meals, myths of common ancestry, essentialist intuitions of common flesh and blood, the sharing of rituals and ordeals, physical resemblance (often enhanced by hairdressing, tattoos, scarification, and mutilation), and metaphors such as fraternity, brotherhood, family, fatherland, motherland, and blood.212 Military leaders use every trick in the book to make their soldiers feel like genetic relatives and take on the biologically predictable risks.
Scott Atran has refuted many common misconceptions about them. Far from being ignorant, impoverished, nihilistic, or mentally ill, suicide terrorists tend to be educated, middle class, morally engaged, and free of obvious psychopathology. Atran concluded that many of the motives may be found in nepotistic altruism.217
Commitment to the group is intensified by religion, not just the literal promise of paradise but the feeling of spiritual awe that comes from submerging oneself in a crusade, a calling, a vision quest, or a jihad.
Young men in all societies do foolish things to prove their courage and commitment, especially in groups, where individuals may do something they know is foolish because they think that everyone else in the group thinks it is cool.
Terrorist cells often begin as gangs of underemployed single young men who come together in cafes, dorms, soccer clubs, barbershops, or Internet chat rooms and suddenly find meaning in their lives by a commitment to the new platoon.
Blackwell and Sugiyama note that 99 percent of Palestinian suicide terrorists are male, that 86 percent are unmarried, and that 81 percent have at least six siblings, a larger family size than the Palestinian average.
What inspires them is not so much the Koran or religious teachings as a thrilling cause and call to action that promises glory and esteem in the eyes of friends, and through friends, eternal respect and remembrance in the wider world that they will never live to enjoy….
Though about a fifth of the world’s population is Muslim, and about a quarter of the world’s countries have a Muslim majority, more than half of the armed conflicts in 2008 embroiled Muslim countries or insurgencies.241 Muslim countries force a greater proportion of their citizens into their armies than non-Muslim countries do, holding other factors constant.
Their leaders receive farcically high percentages of the vote, and they exercise the power to jail opponents, outlaw opposition parties, suspend parliament, and cancel elections.
According to Amnesty International, almost three-quarters of Muslim countries execute their criminals, compared to a third of non-Muslim countries, and many use cruel punishments such as stoning, branding, blinding, amputation of tongues or hands, and even crucifixion. Every year more than a hundred million girls in Islamic countries have their genitals mutilated, and when they grow up they may be disfigured with acid or killed outright if they displease their fathers, their brothers, or the husbands who have been forced upon them.
The authors of the Arab Human Development Report documented that Arab nations were plagued by political repression, economic backwardness, oppression of women, widespread illiteracy, and a self-imposed isolation from the world of ideas.
Why did Islam blow its lead and fail to have an Age of Reason, an Enlightenment, and a Humanitarian Revolution?
It began in 1948, when Harry Truman ended segregation in the U.S. armed forces; accelerated through the 1950s, when the Supreme Court banned segregated schools, Rosa Parks was arrested for refusing to give up her bus seat to a white man, and Martin Luther King organized a boycott in response; climaxed in the early 1960s, when two hundred thousand people marched on Washington and heard King give perhaps the greatest speech in history; and culminated with the passage of the Voting Rights Act of 1965 and the Civil Rights Acts of 1964 and 1968.
The 1990s were a decade in which Oprah Winfrey, Michael Jordan, and Colin Powell were repeatedly named in polls as among the most admired Americans,
Rape was seen as an offense not against the woman but against a man: the woman’s father, her husband, or, in the case of a slave, her owner.
A rape entangles three parties, each with a different set of interests: the rapist, the men who take a proprietary interest in the woman, and the woman herself.
Evolutionary psychologists and many radical feminists agree that rape is governed by the economics of human sexuality.
“A man wants what a woman has––sex. He can steal it (rape), persuade her to give it away (seduction), rent it (prostitution), lease it over the long term (marriage in the United States) or own it outright (marriage in most societies).”
Among humans, the male may use coercion to get sex when certain risk factors line up: when he is violent, callous, and reckless by temperament; when he is a loser who cannot attract sexual partners by other means; when he is an outcast and has little fear of opprobrium from the community; and when he senses that the risks of punishment are low, such as during conquests and pogroms.43
But the costs of a partner’s infidelity are different for the two sexes, and accordingly a man’s jealousy has been found to be more implacable, violent, and tilted toward sexual (rather than emotional) infidelity.46
Well into the 1970s marital rape was not a crime in any state, and the legal system underweighted the interests of women in other rapes.
The human mind thrives on metaphor, and in the case of women’s sexuality the recurring figure of thought is property.47
Property laws entitle owners to sell, exchange, and dispose of their property without encumbrance, and to expect the community to recognize their right to redress if the property is stolen or damaged by others.
Men may also protect their investment by holding the woman strictly liable for any theft or damage of her sexual value.
The same genetic calculus that predicts that men might sometimes be inclined to pressure women into sex, and that the victim’s kin may experience rape as an offense against themselves, also predicts that the woman herself should resist and abhor being raped.51 It is in the nature of sexual reproduction that a female should evolve to exert control over her sexuality. She should choose the time, the terms, and the partner to ensure that her offspring have the fittest, most generous, and most protective father available, and that the offspring are born at the most propitious time.
Our current moral understanding does not seek to balance the interests of a woman not to be raped, the interests of the men who may wish to rape her, and the interests of the husbands and fathers who want to monopolize her sexuality.
The first wave of feminism, bookended in the United States by the Seneca Falls Convention of 1848 and the ratification of the Nineteenth Amendment to the Constitution in 1920, gave women the right to vote, to serve as jurors, to hold property in marriage, to divorce, and to receive an education.
It took the second wave of feminism in the 1970s to revolutionize the treatment of rape.
Against Our Will helped put the reform of rape laws and judicial practices onto the national agenda. When the book was published, marital rape was not a crime in any American state; today it has been outlawed in all fifty, and in most of the countries of Western Europe.58
The legal scholar Francis X. Shen has performed a content analysis of video games dating back to the 1980s and discovered a taboo that was close to absolute: It seems that rape may be the one thing that you can’t put into a video game…. Killing scores of people in a game, often brutally, or even destroying entire cities is clearly worse than rape in real life. But in a video game, allowing someone to press the X-button to rape another character is off-limits. The “it’s just a game” justification seems to fall flat when it comes to rape…. Even in the virtual world of Role Playing Games, rape is taboo.
The data show that in thirty-five years the rate of rape has fallen by an astonishing 80 percent, from 250 per 100,000 people over the age of twelve in 1973 to 50 per 100,000 in 2008. In fact, the decline may be even greater than that, because women have almost certainly been more willing to report being raped in recent years, when rape has been recognized as a serious crime, than they were in earlier years, when rape was often hidden and trivialized.
We learned in chapter 3 that the 1990s saw a decrease in all categories of crime, from homicide to auto theft.
The feminists won the battle against rape partly because there were more women in positions of influence, the legacy of technological changes that loosened the age-old sexual division of labor which had shackled women to hearth and children. But they also won the battle because both sexes had become increasingly feminist.
Pornography for men is visual, anatomical, impulsive, floridly promiscuous, and devoid of context and character. Erotica for women is far more likely to be verbal, psychological, reflective, serially monogamous, and rich in context and character. Men fantasize about copulating with bodies; women fantasize about making love to people.
Male desire can be indiscriminate in its choice of a sexual partner and indifferent to the partner’s inner life; indeed, “object” can be a more fitting term than “partner.”
In the case of rape, the officially correct belief is that rape has nothing to do with sex and everything to do with power.
As Brownmiller put it, “From prehistoric times to the present, I believe, rape has played a critical function. It is nothing more or less than a conscious process of intimidation by which all men keep all women in a state of fear.”66
Human practices such as veiling, chaperoning, chastity belts, claustration, segregation by sex, and female genital cutting appear to be culturally sanctioned mate-guarding tactics.
Adultery was a tort against the husband by his romantic rival, entitling him to damages, divorce (with refund of the bride-price), or violent revenge.
In some couples, one partner threatens the other with force, controls the family finances, restricts the other’s movements, redirects anger and violence against the children or pets, and strategically withholds praise and affection.
Archer found that countries in which women are better represented in government and the professions, and in which they earn a larger proportion of earned income, are less likely to have women at the receiving end of spousal abuse. Also, cultures that are classified as more individualistic, where people feel they are individuals with the right to pursue their own goals…
Selection acts to maximize an organism’s expected lifetime reproductive output, and that requires that it negotiate the tradeoff between investing in a new offspring and conserving its resources for current and future offspring. Mammals are extreme among animals in the amount of time, energy, and food they invest in their young, and humans are extreme among mammals. Pregnancy and birth are only the first chapter in a mother’s investment career, and a mammalian mother faces an expenditure of more calories in suckling the offspring to maturity than she expended in bearing it.108
Eighty-seven percent of the reasons fit the triage theory: the infant was not sired by the woman’s husband, the infant was deformed or ill, or the infant had strikes against its chances of surviving to maturity, such as being a twin, having an older sibling close in age, having no father around, or being born into a family that had fallen on hard economic times.
Female infanticide is biologically mysterious. Every child has a mother and a father, so if people are concerned about posterity, be it for their genes or their dynasty, culling their own daughters is a form of madness.
The biologist Robert Trivers and the mathematician Dan Willard reasoned that even though sons and daughters are expected to yield the same number of grandchildren on average, the maximum number that each sex can promise is different.
On the other hand, a daughter is a safer bet: an unfit son will lose the competition with other men and end up childless, whereas an unfit daughter almost never lacks for a willing sex partner.
What drove down the Western rate of infanticide by more than three orders of magnitude? The first step was to criminalize it.
If abortion counts as a form of violence, the West has made no progress in its treatment of children.
The harsh treatment was not unique to Europe. The beating of children has been recorded in ancient Egypt, Sumeria, Babylonia, Persia, Greece, Rome, China, and Aztec Mexico,
One paradigm shift came from John Locke’s Some Thoughts Concerning Education, which was published in 1693 and quickly went viral.167 Locke suggested that a child was “only as white Paper, or Wax, to be moulded and fashioned as one pleases”––a doctrine also called the tabula rasa (scraped tablet) or blank slate.
In his 1762 treatise Émile, or On Education, Rousseau wrote, “Everything is good as it leaves the hand of the Author of things, and everything degenerates in the hands of man.” Foreshadowing the theories of the 20th-century psychologist Jean Piaget, Rousseau divided childhood into a succession of stages centered on Instinct, Sensations, and Ideas.
We have seen that during periods of humanitarian reform, a recognition of the rights of one group can lead to a recognition of others by analogy,
And as with many declines in violence, it’s hard to disentangle all the changes that were happening at once––enlightened ideas, increasing prosperity, reformed laws, changing norms.
Benjamin Spock’s perennial bestseller Baby and Child Care was considered radical in 1946 because it discouraged mothers from spanking their children, stinting on affection, and regimenting their routines.
Locke, Rousseau, and the 19th-century reformers had set in motion an escalator of gentleness in the treatment of children, and in recent decades its rate of ascent has accelerated.
“Children Should Never, Ever, Be Spanked No Matter What the Circumstances.”176 The expert opinion recommends against spanking for three reasons. One is that spanking has harmful side effects down the line, including aggression, delinquency, a deficit in empathy, and depression.
The second reason not to spank a child is that spanking is not particularly effective in reducing misbehavior compared to explaining the infraction to the child and using nonviolent measures like scolding and time-outs.
Here is Straus’s third reason why children should never, ever be spanked: “Spanking contradicts the ideal of nonviolence in the family and society.”
In the interim, Turing worked for the British decryption unit during World War II and helped to crack the cipher used by the Nazis to communicate with their U-boats, which was instrumental in defeating the German naval blockade and turning around the war. When the war was over, he wrote a paper (still widely read today) that equated thinking with computation, thereby offering an explanation of how intelligence could be carried out by a physical system.217
While many legal systems single out male homosexuality for criminalization, no legal system singles out lesbianism, and hate crimes against gay men outnumber hate crimes against gay women by a ratio of almost five to one.220
Homophobia is an evolutionary puzzle, as is homosexuality itself.221
Montesquieu and Voltaire argued that homosexuality should be decriminalized, though they didn’t go so far as to say that it was morally acceptable. In 1785 Jeremy Bentham took the next step. Using utilitarian reasoning, which equates morality with whatever brings the greatest good to the greatest number, Bentham argued that there is nothing immoral about homosexual acts because they make no one worse off.
Today homosexuality has been legalized in almost 120 countries, though laws against it remain on the books of another 80, mostly in Africa, the Caribbean, Oceania, and the Islamic world. Worse, homosexuality is punishable by death in Mauritania, Saudi Arabia, Sudan, Yemen, parts of Nigeria, parts of Somalia, and all of Iran
As late as 1969, homosexuality was illegal in every state but Illinois,
A 2009 Gallup poll showed that the six in ten Americans who have an openly gay friend, relative, or co-worker are more favorable to legalized homosexual relations and to gay marriage than the four in ten who don’t.
Statistics are available only for the years since 1996, when the FBI started to publish data on hate crimes broken down by the motive, the victim, and the nature of the crime. Even these numbers are iffy, because they depend on the willingness of the victims to report a crime and on the local police to categorize it as a hate crime and report it to the FBI.
Since 1996 there has been no significant change in the incidence of three of the four major kinds of hate crimes against gay people: aggravated assault, simple assault, and homicide. The remaining category, intimidation, has declined.
So while we can’t say for sure that gay Americans have become safer from assault, we do know they are safer from intimidation, safer from discrimination and moral condemnation, and perhaps most importantly, completely safe from violence from their own government.
Unlike the other Rights Revolutions, the movement for animal rights was not advanced by the affected parties themselves: the rats and pigeons were hardly in a position to press their case. Nor has it been a by-product of commerce, reciprocity, or any other positive-sum negotiation; the animals have nothing to offer us in exchange for our treating them more humanely. And unlike the revolution in children’s rights, it does not hold out the promise of an improvement in the makeup of its beneficiaries later in life. The recognition of animal interests was taken forward by human advocates on their behalf, who were moved by empathy, reason, and the inspiration of the other Rights Revolutions.
But any intuition that vegetarianism and humanitarianism go together was shattered in the 20th century by the treatment of animals under Nazism.266 Hitler and many of his henchmen were vegetarians, not so much out of compassion for animals as from an obsession with purity, a pagan desire to reconnect to the soil, and a reaction to the anthropocentrism and meat rituals of Judaism.
Leonardo da Vinci became a vegetarian himself.
Peter Singer’s 1975 book Animal Liberation, the so-called bible of the animal rights movement.
The argument begins with the realization that it is consciousness rather than intelligence or species membership that makes a being worthy of moral consideration. It follows that we should not inflict avoidable pain on animals any more than we should inflict it on young children or the mentally handicapped. And a corollary is that we should all be vegetarians.
In the 1970s it was a good thing to be a socialist, fruit-juice drinker, nudist, sandal-wearer, sex-maniac, Quaker, Nature Cure quack, pacifist, and feminist, sometimes all at the same time.
I have already mentioned that since 2005 the British aristocracy has had to retire its bugles and bloodhounds, and in 2008 Louisiana became the last American state to ban cockfights, a sport that had been popular throughout the world for centuries.
In 2004 the city of Barcelona outlawed the deadly contests between matador and beast, and in 2010 the ban was extended to the entire region of Catalonia.
In his 1932 book Death in the Afternoon, Ernest Hemingway explained the primal appeal of the bullfight:
The faux-meat section of my local supermarket offers Soyburgers, Gardenburgers, Seitanburgers, Veggie Burger Meatless Patties, Tofu Pups, Not Dogs, Smart Dogs, Fakin Bacon, Jerquee, Tofurky, Soy Sausage, Soyrizo, Chik Patties, Meatless Buffalo Wings, Celebration Roast, Tempeh Strips, Terkettes, Veggie Protein Slices, Vege-Scallops, and Tuno.
Veggie Breakfast Strips with Tofu Scramblers, perhaps in an omelet with Soya Kaas, Soymage, or Veganrella.
And for dessert there’s Ice Bean, Rice Dream, and Tofutti, perhaps garnished with Hip Whip and a cherry on top.
Will our 22nd-century descendants be as horrified that we ate meat as we are that our ancestors kept slaves?
One impediment is meat hunger and the social pleasures that go with the consumption of meat.
Animals eat our houses, our crops, and occasionally our children. They make us itch and bleed. They are vectors for diseases that torment and kill us. They kill each other, including endangered species that we would like to keep around. Without their participation in experiments, medicine would be frozen at its current state, and billions of living and unborn people would suffer and die for the sake of mice. An ethical calculus that gave equal weight to any harm suffered by any sentient being, allowing no chauvinism toward our own species, would prevent us from trading off the well-being of animals for an equivalent well-being of humans (for example, shooting a wild dog to save a little girl).
Insofar as violence is immoral, the Rights Revolutions show that a moral way of life often requires a decisive rejection of instinct, culture, religion, and standard practice.
We force ourselves into the shoes (or paws) of other sentient beings and consider their interests, starting with their interest in not being hurt or killed, and we ignore superficialities that may catch our eye such as race, ethnicity, gender, age, sexual orientation, and to some extent, species.
This conclusion, of course, is the moral vision of the Enlightenment and the strands of humanism and liberalism that have grown out of it. The Rights Revolutions are liberal revolutions. Each has been associated with liberal movements,
Money can buy education, police, social science, social services, media penetration, a professional workforce with more women, and better care of children and animals.
If I were to put my money on the single most important exogenous cause of the Rights Revolutions, it would be the technologies that made ideas and people increasingly mobile. The decades of the Rights Revolutions were the decades of the electronics revolutions: television, transistor radios, cable, satellite, long-distance telephones, photocopiers, fax machines, the Internet, cell phones, text messaging, Web video. They were the decades of the interstate highway, high-speed rail, and the jet airplane. They were the decades of the unprecedented growth in higher education and in the endless frontier of scientific research. Less well known is that they were also the decades of an explosion in book publishing. From 1960 to 2000, the annual number of books published in the United States increased almost fivefold.
Remember what went wrong in the Islamic world: it may have been a rejection of the printing press and a resistance to the importation of books and the ideas they contain.
Why should the spread of ideas and people result in reforms that lower violence? There are several pathways. The most obvious is a debunking of ignorance and superstition.
Among these superstitions: that members of other races and ethnicities are innately avaricious or perfidious; that economic and military misfortunes are caused by the treachery of ethnic minorities; that women don’t mind being raped; that children must be beaten to be socialized; that people choose to be homosexual as part of a morally degenerate lifestyle; that animals are incapable of feeling pain.
African Americans and gay people appeared as entertainers on variety shows, then as guests on talk shows and as sympathetic characters on sitcoms and dramas. Their struggles were depicted in real-time footage of fire hoses and police dogs, and in bestselling books
Feminists made their case on talk shows, and their views came out of the mouths of characters in soap operas and sitcoms.
Successful innovators not only stand on the shoulders of giants; they engage in massive intellectual property theft, skimming ideas from a vast watershed of tributaries flowing their way.
We have seen that cultures of honor, whose overriding ethic is tribal loyalty and blood revenge, can survive in mountainous regions long after their lowland neighbors have undergone a civilizing process.
By the standards of history, a striking feature of the late-20th-century Rights Revolutions is how little violence they employed or even provoked.
There must be at least a grain of truth in conceptions of the human mind that grant it more than one part: theories like faculty psychology, multiple intelligences, mental organs, modularity, domain-specificity, and the metaphor of the mind as a Swiss army knife. Human nature accommodates motives that impel us to violence, like predation, dominance, and vengeance, but also motives that, under the right circumstances, impel us toward peace, like compassion, fairness, self-control, and reason.
People spend large amounts of their time and income immersing themselves in any of a number of genres of bloody virtual reality: Bible stories, Homeric sagas, martyrologies, portrayals of hell, hero myths, Gilgamesh, Greek tragedies, Beowulf, the Bayeux Tapestry, Shakespearean dramas, Grimm’s fairy tales, Punch and Judy, opera, murder mysteries, penny dreadfuls, pulp fiction, dime novels, Grand Guignol, murder ballads, films noirs, Westerns, horror comics, superhero comics, the Three Stooges, Tom and Jerry, the Road Runner, video games, and movies starring a certain ex-governor of California.
Even in peaceable societies, people are fascinated by the logic of bluff and threat, the psychology of alliance and betrayal, the vulnerabilities of a human body and how they can be exploited or shielded. The universal pleasure that people take in violent entertainment, always in the teeth of censorship and moralistic denunciation, suggests that the mind craves information on the conduct of violence.
People fantasize about and make art out of illicit sex vastly more often than they engage in it.
Symons suggests that higher consciousness itself is designed for low-frequency, high-impact events.
If violence is stamped into our childhoods, our fantasy lives, our art, and our brains, then how is it possible that soldiers are reluctant to fire their guns in combat, when that is what they are there to do?
That means that the first move toward harming a fellow human simultaneously accomplishes two things: 1. It increases the chance that the target will come to harm. 2. It gives the target an overriding goal of harming you before you harm him. Even if you prevail by killing him, you will have given his kin the goal of killing you in revenge. It stands to reason that initiating serious aggression in a symmetrical standoff is something a Darwinian creature must consider very, very carefully––a reticence experienced as anxiety or paralysis. Discretion is the better part of valor; compassion has nothing to do with it.
In The Blank Slate I argued that the modern denial of the dark side of human nature––the doctrine of the Noble Savage––was a reaction against the romantic militarism, hydraulic theories of aggression, and glorification of struggle and strife that had been popular in the late 19th and early 20th centuries.
A brilliant analysis comes from the social psychologist Roy Baumeister in his book Evil.19 Baumeister was moved to study the commonsense understanding of evil when he noticed that the people who perpetrate destructive acts, from everyday peccadilloes to serial murders and genocides, never think they are doing anything wrong. How can there be so much evil in the world with so few evil people doing it?
Most people get angry at least once a week, and nearly everyone gets angry at least once a month, so there was no shortage of material.21 Both perpetrators and victims recounted plenty of lies, broken promises, violated rules and obligations, betrayed secrets, unfair acts, and conflicts over money.
The Moralization Gap is part of a larger phenomenon called self-serving biases. People try to look good. “Good” can mean effective, potent, desirable, and competent, or it can mean virtuous, honest, generous, and altruistic. The drive to present the self in a positive light was one of the major findings of 20th-century social psychology. An early exposé was the sociologist Erving Goffman’s The Presentation of Self in Everyday Life, and recent summaries include Carol Tavris and Elliot Aronson’s Mistakes Were Made (but Not by Me), Robert Trivers’s Deceit and Self-Deception, and Robert Kurzban’s Why Everyone (Else) Is a Hypocrite.
Among the signature phenomena are cognitive dissonance, in which people change their evaluation of something they have been manipulated into doing to preserve the impression that they are in control of their actions, and the Lake Wobegon Effect (named after Garrison Keillor’s fictitious town in which all the children are above average), in which a majority of people rate themselves above average in every desirable talent or trait.
People congregate in groups not because they are robots who are magnetically attracted to one another but because they have social and moral emotions. They feel warmth and sympathy, gratitude and trust, loneliness and guilt, jealousy and anger.
We lie to ourselves so that we’re more believable when we lie to others. At the same time, an unconscious part of the mind registers the truth about our abilities so that we don’t get too far out of touch with reality.
Self-deception is an exotic theory, because it makes the paradoxical claim that something called “the self” can be both deceiver and deceived.
It’s not just that there are two sides to every dispute. It’s that each side sincerely believes its version of the story, namely that it is an innocent and long-suffering victim and the other side a malevolent and treacherous sadist. And each side has assembled a historical narrative and database of facts consistent with its sincere belief.
The United States had imposed a hostile embargo of oil and machinery on Japan, had anticipated possible attacks, had sustained relatively minor military damage, eventually sacrificed 100,000 American lives in response to the 2,500 lost in the attack, forced innocent Japanese Americans into concentration camps, and attained victory with incendiary and nuclear strikes on Japanese civilians that could be considered among history’s greatest war crimes. Even in matters where no reasonable third party can doubt who’s right and who’s wrong, we have to be prepared, when putting on psychological spectacles, to see that evildoers always think they are acting morally.
Yet Hitler, like all sentient beings, had a point of view, and historians tell us that it was a highly moralistic one. He experienced Germany’s sudden and unexpected defeat in World War I and concluded that it could be explained only by the treachery of an internal enemy. He was aggrieved by the Allies’ murderous postwar food blockade and their vindictive reparations. He lived through the economic chaos and street violence of the 1920s. And Hitler was an idealist: he had a moral vision in which heroic sacrifices would bring about a thousand-year utopia.35
Evil is the intentional and gratuitous infliction of harm for its own sake, perpetrated by a villain who is malevolent to the bone, inflicted on a victim who is innocent and good. The reason that this is a myth (when seen through psychological spectacles) is that evil in fact is perpetrated by people who are mostly ordinary, and who respond to their circumstances, including provocations by the victim, in ways they feel are reasonable and just.
The myth of pure evil gives rise to an archetype that is common in religions, horror movies, children’s literature, nationalist mythologies, and sensationalist news coverage. In many religions evil is personified as the Devil––Hades, Satan, Beelzebub, Lucifer, Mephistopheles––or as the antithesis to a benevolent God in a bilateral Manichean struggle. In popular fiction evil takes the form of the slasher, the serial killer, the bogeyman, the ogre, the Joker, the James Bond villain, or depending on the cinematic decade, the Nazi officer, Soviet spy, Italian gangster, Arab terrorist, inner-city predator, Mexican druglord, galactic emperor, or corporate executive. The evildoer may enjoy money and power, but these motives are vague and ill formed; what he really craves is the infliction of chaos and suffering on innocent victims. The evildoer is an adversary––the enemy of good––and the evildoer is often foreign. Hollywood villains, even if they are stateless, speak with a generic foreign accent.
Because the standpoint of the scientist resembles the standpoint of the perpetrator, while the standpoint of the moralizer resembles the standpoint of the victim, the scientist is bound to be seen as “making excuses” or “blaming the victim,” or as trying to vindicate the amoral doctrine that “to understand all is to forgive all.”
While violence is certainly common in the animal kingdom, to think of it as arising from a single impulse is to see the world through a victim’s eyes. Consider all the destructive things that members of our species do to ants. We eat them, poison them, accidentally trample them, and deliberately squish them. Each category of formicide is driven by an utterly distinct motive. But if you were an ant, you might not care about these fine distinctions. We are humans, so we tend to think that the terrible things that humans do to other humans come from a single, animalistic motive. But biologists have long noted that the mammalian brain has distinct circuits that underlie very different kinds of aggression.
One of the oldest discoveries in the biology of violence is the link between pain or frustration and aggression.
Scientists have known that the orbital cortex is involved with regulating the emotions since 1848, when a railroad foreman named Phineas Gage tamped down some blasting powder in a hole in a rock and ignited an explosion that sent the tamping iron up through his cheekbone and out the top of his skull.
Today’s descriptions of patients with damage to their orbital cortex could have applied to Phineas Gage: “Disinhibited, socially inappropriate, susceptible to misinterpreting others’ moods, impulsive, unconcerned with the consequences of their actions, irresponsible in everyday life, lacking in insight into the seriousness of their condition, and prone to weak initiative.”64
People with antisocial personality disorder make up a large proportion of violent felons, and a subset of them, who possess glibness, narcissism, grandiosity, and a superficial charm, are called psychopaths (or sometimes sociopaths).
The orbital cortex, then (together with its ventromedial neighbor), is involved in several of the pacifying faculties of the human mind, including self-control, sympathy to others, and sensitivity to norms and conventions.
The first category of violence may be called practical, instrumental, exploitative, or predatory. It is the simplest kind of violence: the use of force as a means to an end.
The first category of violence is not really a category at all, because its perpetrators have no destructive motive like hate or anger. They simply take the shortest path to something they want, and a living thing happens to be in the way.
The second root of violence is dominance––the drive for supremacy over one’s rivals (Baumeister calls it “egotism”).
The third root of violence is revenge––the drive to pay back a harm in kind.
The fourth root is sadism, the joy of hurting.
The fifth and most consequential cause of violence is ideology, in which true believers weave a collection of motives into a creed and recruit other people to carry out its destructive goals.
The second formulation of Immanuel Kant’s Categorical Imperative holds that an act is moral if it treats a person as an end in itself and not as a means to an end.
Here are a few examples: Romans suppressing provincial rebellions; Mongols razing cities that resist their conquest; free companies of demobilized soldiers plundering and raping; colonial settlers expelling or massacring indigenous peoples; gangsters whacking a rival, an informant, or an uncooperative official; rulers assassinating a political opponent or vice versa; governments jailing or executing dissidents; warring nations bombing enemy cities; hoodlums injuring a victim who resists a robbery or carjacking; criminals killing an eyewitness to a crime; mothers smothering a newborn they feel they cannot raise. Defensive and preemptive violence––doing it to them before they do it to you––is also a form of instrumental violence.
Psychopaths make up 1 to 3 percent of the male population, depending on whether one uses the broad definition of antisocial personality disorder, which embraces many kinds of callous troublemakers, or a narrower definition that picks out the more cunning manipulators.80 Psychopaths are liars and bullies from the time they are children, show no capacity for sympathy or remorse, make up 20 to 30 percent of violent criminals, and commit half the serious crimes.81 They also perpetrate nonviolent crimes like bilking elderly couples out of their life savings and running a business with ruthless disregard for the welfare of the workforce or stakeholders.
Thanks to the Moralization Gap, they will minimize their own first strike as necessary and trivial while magnifying the reprisal as unprovoked and devastating. Each side will count the wrongs differently––the perpetrator tallying an even number of strikes and the victim an odd number––and the difference in arithmetic can stoke a spiral of revenge, a dynamic we will explore in a later section.
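The “different arithmetic” can be illustrated with a toy sketch (an illustration of the idea only, not anything from the source): each side discounts the blow it considers justified, so the original aggressor tallies an even number of strikes (“we’re even”) while the victim tallies an odd number (“I’m owed one”).

```python
# Toy illustration of the Moralization Gap's tally. A strikes first;
# A's ledger omits A's own opening blow (which A considers justified
# and forgettable), while B's ledger counts every blow from the start.

strikes = ["A hits B", "B hits A", "A hits B", "B hits A", "A hits B"]

a_ledger = strikes[1:]   # A discounts the first strike that started it all
b_ledger = strikes[:]    # B counts every blow, beginning with A's first

print(len(a_ledger))  # 4: an even count, so A feels the score is settled
print(len(b_ledger))  # 5: an odd count, so B feels owed one more reprisal
```

However long the exchange runs, the two ledgers always differ by that one opening blow, which is why each reprisal can feel like justice to one side and fresh provocation to the other.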
Why should people be so deluded? Positive illusions make people happier, more confident, and mentally healthier, but that cannot be the explanation for why they exist, because it only begs the question of why our brains should be designed so that only unrealistic assessments make us happy and confident, as opposed to calibrating our contentment against reality. The most plausible explanation is that positive illusions are a bargaining tactic, a credible bluff. In recruiting an ally to support you in a risky venture, in bargaining for the best deal, or in intimidating an adversary into backing down, you stand to gain if you credibly exaggerate your strengths.
As Winston Churchill noted, “Always remember, however sure you are that you can easily win, that there would not be a war if the other man did not think he also had a chance.”
Richard Wrangham, inspired by Barbara Tuchman’s The March of Folly: From Troy to Vietnam and by Robert Trivers’s theory of self-deception, suggested that military incompetence is often a matter not of insufficient data or mistakes in strategy but of overconfidence.
In Overconfidence and War: The Havoc and Glory of Positive Illusions, Dominic Johnson vindicated Wrangham’s hypothesis by looking at the predictions made by leaders on the verge of war and showing that they were unrealistically optimistic and contradicted by information available to them at the time.
Johnson expected that wars stoked by overconfidence should be less common in democracies, where the flow of information is more likely to expose the illusions of leaders to cold splashes of reality. But he found that it was the flow of information itself, rather than just the existence of a democratic system, that made the difference.
Dominance tends to erupt in violence within small groups like gangs and isolated workplaces, where a person’s rank within the clique determines the entirety of his social worth. If people belong to many groups and can switch in and out of them, they are more likely to find one in which they are esteemed, and an insult or slight is less consequential.104
The primatologist Frans de Waal discovered that in most primate species, after two animals have fought, they will reconcile.105 They may touch hands, kiss, embrace, and in the case of bonobos, have sex. This makes one wonder why they bother to fight in the first place if they were just going to make up afterward, and why they make up afterward if they had reason to fight. The reason is that reconciliation occurs only among primates whose long-term interests are bound together.
Chimpanzees, for example, reconcile after a fight within their community, but they never reconcile after a battle or raid with members of a different community.107
Surveys of personal values in men and women find that the men assign a lopsided value to professional status compared to all the other pleasures of life.108 They take greater risks, and they show more confidence and more overconfidence.109 Most labor economists consider these sex differences to be a contributor to the gender gap in earnings and professional success.110
And men are, of course, by far the more violent sex. Though the exact ratios vary, in every society it is the males more than the females who play-fight, bully, fight for real, carry weapons, enjoy violent entertainment, fantasize about killing, kill for real, rape, start wars, and fight in wars.111 Not only is the direction of the sex difference universal, but the first domino is almost certainly biological.
We have already seen why the sex difference evolved: mammalian males can reproduce more quickly than females, so they compete for sexual opportunities, while females tilt their priorities toward ensuring the survival of themselves and their offspring. Men have more to gain in violent competition, and also less to lose, because fatherless children are more likely to survive than motherless ones.
In men, testosterone levels rise in the presence of an attractive female and in anticipation of competition with other men, such as in sports. Once a match has begun, testosterone rises even more, and when the match has been decided, testosterone continues to rise in the winner but not in the loser. Men who have higher levels of testosterone play more aggressively, have angrier faces during competition, smile less often, and have firmer handshakes.
Violence is a problem not of too little self-esteem but of too much, particularly when it is unearned. Self-esteem can be measured, and surveys show that it is the psychopaths, street toughs, bullies, abusive husbands, serial rapists, and hate-crime perpetrators who are off the scale.
Psychohistory also has a legacy of fanciful psychoanalytic conjectures about what made Hitler Hitler: he had a Jewish grandfather, he had only one testicle, he was a repressed homosexual, he was asexual, he was a sexual fetishist.
Psychopaths and other violent people are narcissistic: they think well of themselves not in proportion to their accomplishments but out of a congenital sense of entitlement.
Tin-pot tyrants like Saddam Hussein, Mobutu Sese Seko, Moammar Khaddafi, Robert Mugabe, Idi Amin, Jean-Bedel Bokassa, and Kim Jong-il have immiserated their people on a scale that is smaller but still tragic.
… narcissistic personality disorder as “a pervasive pattern of grandiosity, need for admiration, and a lack of empathy.”
Psychopathy (“a pervasive pattern of disregard for, and violation of, the rights of others”) and with borderline personality disorder (“instability in mood; black and white thinking; chaotic and unstable interpersonal relationships, self-image, identity, and behavior”).
Men’s testosterone level rises when their team defeats a rival in a game, just as it rises when they personally defeat a rival in a wrestling match or in singles tennis.130 It also rises or falls when a favored political candidate wins or loses an election.131 The dark side of our communal feelings is a desire for our own group to dominate another group, no matter how we feel about its members as individuals.
A preference for one’s group emerges early in life and seems to be something that must be unlearned, not learned. Developmental psychologists have shown that preschoolers profess racist attitudes that would appall their liberal parents, and that even babies prefer to interact with people of the same race and accent.
The psychologists Jim Sidanius and Felicia Pratto have proposed that people, to varying degrees, harbor a motive they call social dominance, though a more intuitive term is tribalism: the desire that social groups be organized into a hierarchy, generally with one’s own group dominant over the others.134 A social dominance orientation, they show, inclines people to a sweeping array of opinions and values, including patriotism, racism, fate, karma, caste, national destiny, militarism, toughness on crime, and defensiveness of existing arrangements of authority and inequality. An orientation away from social dominance, in contrast, inclines people to humanism, socialism, feminism, universal rights, political progressivism, and the egalitarian and pacifist themes in the Christian Bible.
The phenomenon of nationalism can be understood as an interaction between psychology and history. It is the welding together of three things: the emotional impulse behind tribalism; a cognitive conception of the “group” as a people sharing a language, territory, and ancestry; and the political apparatus of government.
Many queens and empresses, including Isabella of Spain, Mary and Elizabeth I of England, and Catherine the Great of Russia, acquitted themselves well in internal oppression and external conquest, and several 20th-century heads of state, such as Margaret Thatcher, Golda Meir, Indira Gandhi, and Chandrika Kumaratunga, led their nations in war.149
Several ethnographic surveys of traditional cultures have found that the better a society treats its women, the less it embraces war.156
We don’t know what causes what, but biology and history suggest that all else being equal, a world in which women have more influence will be a world with fewer wars.
Anything that deflates the concept of dominance is likely to drive down the frequency of fights between individuals and wars between groups.
The Prisoner’s Dilemma has been called one of the great ideas of the 20th century, because it distills the tragedy of social life into such a succinct formula.183
The dilemma arises in any situation in which the best individual payoff is to defect while the partner cooperates, the worst individual payoff is to cooperate while the other defects, the highest total payoff is when both cooperate, and the lowest total payoff is when both defect.
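The four payoff conditions above can be made concrete with a small sketch. The numbers below are illustrative placeholders, not from the source; any values satisfying the same ordering define a Prisoner’s Dilemma.

```python
# A minimal sketch of the Prisoner's Dilemma payoff structure.
# Payoffs are (row player, column player); "C" = cooperate, "D" = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation: highest total payoff
    ("C", "D"): (0, 5),  # cooperating against a defector: worst individual payoff
    ("D", "C"): (5, 0),  # defecting against a cooperator: best individual payoff
    ("D", "D"): (1, 1),  # mutual defection: lowest total payoff
}

def total(moves):
    a, b = PAYOFFS[moves]
    return a + b

# The four conditions stated in the text:
assert PAYOFFS[("D", "C")][0] > PAYOFFS[("C", "C")][0]  # best individual: defect vs. cooperator
assert PAYOFFS[("C", "D")][0] < PAYOFFS[("D", "D")][0]  # worst individual: cooperate vs. defector
assert total(("C", "C")) == max(total(m) for m in PAYOFFS)  # highest total: both cooperate
assert total(("D", "D")) == min(total(m) for m in PAYOFFS)  # lowest total: both defect
```

The tragedy falls out of the ordering itself: whatever the partner does, defecting pays more for the individual, yet when both follow that logic they land on the jointly worst outcome.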
The modeling of the evolution of cooperation has become increasingly byzantine, because so many worlds can be simulated so cheaply. But in the most plausible of these worlds, we see the evolution of the all-too-human phenomena of exploitation, revenge, forgiveness, contrition, reputation, gossip, cliques, and neighborliness.
Experiments have shown that people punish more severely, even at a price that is greater than the amount out of which they have been cheated, when they think an audience is watching.196 And as we saw, men are twice as likely to escalate an argument into a fight when spectators are around.197
These impulses implement what judicial theorists call specific deterrence: a punishment is targeted at a wrongdoer to prevent him from repeating a crime. The psychology of revenge also implements what judicial theorists call general deterrence: a publicly decreed punishment that is designed to scare third parties away from the temptations of crime.
Why doesn’t revenge work like the nuclear arsenals in the Cold War, creating a balance of terror that keeps everyone in line? Why should there ever be cycles of vendetta, with vengeance begetting vengeance? A major reason is the Moralization Gap. People consider the harms they inflict to be justified and forgettable, and the harms they suffer to be unprovoked and grievous.
In many parts of this book I have credited the Leviathan––a government with a monopoly on the legitimate use of force––as a major reducer of violence. Feuding and anarchy go together. We can now appreciate the psychology behind the effectiveness of a Leviathan. The law may be an ass, but it is a disinterested ass, and it can weigh harms without the self-serving distortions of the perpetrator or the victim.
In The Blank Slate I argued that even the elements of our judicial practices that seem to be motivated by just deserts may ultimately serve a deterrent function, because if a system ever became too narrowly utilitarian, malefactors would learn to game it. Just deserts can close off that option.206
The law of a country significantly predicted the degree to which its citizens indulged in antisocial revenge: the people in countries with an iffy Rule of Law were more destructively vengeful. With the usual spaghetti of variables, it’s impossible to be certain what caused what, but the results are consistent with the idea that the disinterested justice of a decent Leviathan induces citizens to curb their impulse for revenge before it spirals into a destructive cycle.
In Beyond Revenge: The Evolution of the Forgiveness Instinct, the psychologist Michael McCullough shows that we do have this dimmer switch for revenge.209
We are apt to forgive our kin and close friends for trespasses that would be unforgivable in others. And when our circle of empathy expands (a process we will examine in the next chapter), our circle of forgivability expands with it.
A second circumstance that cranks down revenge is a relationship with the perpetrator that is too valuable to sever. We may not like them, but we’re stuck with them, so we had better learn to live with them.
The third modulator of revenge kicks in when we are assured that the perpetrator has become harmless. For all the warmth and fuzziness of forgiveness, you can’t afford to disarm if the person who harmed you is likely to do it again.
For the first time in history, the leaders of nations have elevated the ideals of historical truth and international reconciliation above self-serving claims of national infallibility and rectitude. In 1984 Japan sort of apologized for occupying Korea when Emperor Hirohito told the visiting South Korean president, “It is regrettable that there was an unfortunate period in this century.” But subsequent decades saw a string of ever-more-forthcoming apologies from other Japanese leaders. In the ensuing decades Germany apologized for the Holocaust, the United States apologized for interning Japanese Americans, the Soviet Union apologized for murdering Polish prisoners during World War II, Britain apologized to the Irish, Indians, and Maori, and the Vatican apologized for its role in the Wars of Religion, the persecution of Jews, the slave trade, and the oppression of women.
Do apologies and other conciliatory gestures in the human social repertoire actually avert cycles of revenge? The political scientists William Long and Peter Brecke took up the question in their 2003 book War and Reconciliation: Reason and Emotion in Conflict Resolution.
The prototype for reconciliation after a civil conflict is South Africa. Invoking the Xhosa concept of ubuntu or brotherhood, Nelson Mandela and Desmond Tutu instituted a system of restorative rather than retributive justice to heal the country after decades of violent repression and rebellion under the apartheid regime.
They identify four ingredients of the successful elixir. The first is a round of uncompromised truth-telling and acknowledgment of harm. It may take the form of truth and reconciliation commissions, in which perpetrators publicly own up to the harms they did, or of national fact-finding committees, whose reports are widely publicized and officially endorsed.
A second theme in successful reconciliations is an explicit rewriting of people’s social identities. People redefine the groups with which they identify. The perpetual victims of a society may take responsibility for running it. Rebels become politicians, bureaucrats, or businesspeople. The military surrenders its claim to embody the nation and demotes itself to their security guards.
The third theme appears to be the most important: incomplete justice. Rather than settling every score, a society has to draw a line under past violations and grant massive amnesty while prosecuting only the blatant ringleaders and some of the more depraved foot soldiers. Even then the punishments take the form of hits to their reputation, prestige, and privileges rather than blood for blood. There may, in addition, be reparations, but their restorative value is registered more on an emotional balance sheet than a financial one.
Harold Schechter’s authoritative compendium The Serial Killer Files.
Caligula, Nero, Bluebeard (probably based on the 15th-century knight Gilles de Rais), Vlad the Impaler, and Jack the Ripper are celebrity examples, and scholars have speculated that legends of werewolves, robber bridegrooms, and demon barbers may have been based on widely retold tales of actual serial killers.
Donatien Alphonse François, also known as the Marquis de Sade.
The development of sadism requires two things: motives to enjoy the suffering of others, and a removal of the restraints that ordinarily inhibit people from acting on them.
There are at least four motives to take satisfaction in the pain of others. One is a morbid fascination with the vulnerability of living things, a phenomenon perhaps best captured by the word macabre.
Another appeal of feeling someone’s pain is dominance.
The ultimate form of power over someone is the power to cause them pain at will.229
A third occasion for sadism is revenge, or the sanitized third-party version we call justice.
Revenge literally turns off the empathic response in the brain (at least among men), and it is consummated only when the avenger knows that the target knows that his suffering is payback for his misdeeds.231
Finally, there is sexual sadism.
With so many sources for sadism, why are there so few sadists?
The first that comes to mind is empathy. If people feel each other’s pain, then hurting someone else will be felt as hurting oneself. That is why sadism is more thinkable when the victims are demonized or dehumanized beings that lie outside one’s circle of empathy.
Baumeister points out that an additional emotion has to kick in for sympathy to inhibit behavior: guilt.
Another brake on sadism is a cultural taboo: the conviction that deliberate infliction of pain is not a thinkable option, regardless of whether it engages one’s sympathetic inhibitions.
But perhaps the most powerful inhibition against sadism is more elemental: a visceral revulsion against hurting another person. Most primates find the screams of pain of a fellow animal to be aversive, and they will abstain from food if it is accompanied by the sound and sight of a fellow primate being shocked.246
Sadism is literally an acquired taste.256 Government torturers such as police interrogators and prison guards follow a counterintuitive career trajectory.
The experience had not been as arousing as it had been in their imaginations. But as time passes and their appetite is rewhetted, they find the next one easier and more gratifying, and then they escalate the cruelty to feed what turns into an addiction.
The first hit of heroin is euphoric, and the withdrawal mild. But as the person turns into a junkie, the pleasure lessens and the withdrawal symptoms come earlier and are more unpleasant, until the compulsion is less to attain the euphoria than to avoid the withdrawal. According to Baumeister, sadism follows a similar trajectory.
The psychologist Paul Rozin has identified a syndrome of acquired tastes he calls benign masochism. These paradoxical pleasures include consuming hot chili peppers, strong cheese, and dry wine, and partaking in extreme experiences like saunas, skydiving, car racing, and rock climbing. All of them are adult tastes, in which a neophyte must overcome a first reaction of pain, disgust, or fear on the way to becoming a connoisseur. And all are acquired by controlling one’s exposure to the stressor in gradually increasing doses. What they have in common is a coupling of high potential gains (nutrition, medicinal benefits, speed, knowledge of new environments) with high potential dangers (poisoning, exposure, accidents). The pleasure in acquiring one of these tastes is the pleasure of pushing the outside of the envelope: of probing, in calibrated steps, how high, hot, strong, fast, or far one can go without bringing on disaster.
Like predatory or instrumental violence, ideological violence is a means to an end. But with an ideology, the end is idealistic: a conception of the greater good.
They include the Crusades, the European Wars of Religion, the French Revolutionary and Napoleonic Wars, the Russian and Chinese civil wars, the Vietnam War, the Holocaust, and the genocides of Stalin, Mao, and Pol Pot.
It allows any number of eggs to be broken to make the utopian omelet. And it renders opponents of the ideology infinitely evil and hence deserving of infinite punishment.
The cognitive prerequisite is our ability to think through long chains of means-ends reasoning, which encourage us to carry out unpleasant means as a way to bring about desirable ends.
Means-ends reasoning becomes dangerous when the means to a glorious end include harming human beings.
Let these ingredients brew in the mind of a narcissist with a lack of empathy, a need for admiration, and fantasies of unlimited success, power, brilliance, and goodness, and the result can be a drive to implement a belief system that results in the deaths of millions.
But the puzzle in understanding ideological violence is not so much psychological as epidemiological: how a toxic ideology can spread from a small number of narcissistic zealots to an entire population willing to carry out its designs.
Another dynamic is what Irving Janis called groupthink. Groups are apt to tell their leaders what they want to hear, to suppress dissent, to censor private doubts, and to filter out evidence that contradicts an emerging consensus. A third is animosity between groups. Imagine being locked in a room for a few hours with a person whose opinions you dislike––say, you’re a liberal and he or she is a conservative or vice versa, or you sympathize with Israel and the other person sympathizes with the Palestinians or vice versa. Chances are the conversation between the two of you would be civil, and it might even be warm. But now imagine that there are six on your side and six on the other. There would probably be a lot of hollering and red faces and perhaps a small riot. The overall problem is that groups take on an identity of their own in people’s minds, and individuals’ desire to be accepted within a group, and to promote its standing in comparison to other groups, can override their better judgment.
What did matter was the physical proximity of other people and how they behaved. When the experimenter was absent and his instructions were delivered over the telephone or in a recorded message, obedience fell. When the victim was in the same room instead of an adjacent booth, obedience fell. And when the participant had to work in tandem with a second participant (a confederate of the experimenter), then if the confederate refused to comply, so did the participant. But when the confederate complied, more than 90 percent of the time the participant did too.
The psychologists suspected that groups of people might fail to respond to an emergency that would send an isolated person leaping to action because in a group, everyone assumes that if no one else is doing anything, the situation couldn’t be all that dire.
Decades later Zimbardo wrote a book that analogized the unplanned abuses in his own faux prison to the unplanned abuses at the Abu Ghraib prison in Iraq, arguing that a situation in which a group of people is given authority over another group can bring out barbaric behavior in individuals who might never display it in other circumstances.
In a psychology-experiment-cum-morality-play conducted in 1971, Philip Zimbardo set up a mock prison in the basement of the Stanford psychology department, divided the participants at random into “prisoners” and “guards,” and even got the Palo Alto police to arrest the prisoners and haul them to the campus hoosegow.268
During the Holocaust, soldiers and policemen rounded up unarmed civilians, lined them up in front of pits, and shot them to death, not out of animus to the victims or a commitment to Nazi ideology but so that they would not shirk their responsibilities or let down their brothers-in-arms.
The question is: after four decades of fashionable rebellion, bumper stickers that advise the reader to Question Authority, and a growing historical consciousness that ridicules the excuse “I was only following orders,” do people still follow the orders of an authority to inflict pain on a stranger? The answer is that they do.
Also, conformity can be a virtue in what game theorists call coordination games, where individuals have no rational reason to choose a particular option other than the fact that everyone else has chosen it.
According to some theories, these “network externalities” explain the success of English spelling, the QWERTY keyboard, VHS videocassettes, and Microsoft software (though there are doubters in each case).
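The logic of network externalities can be sketched in a few lines: in a pure coordination game, an option’s payoff is simply how many others have chosen it, so rational imitation locks a population onto one standard regardless of the options’ intrinsic merits. A minimal simulation (the option names, population size, and update rule are illustrative assumptions, not from the text):

```python
import random

def coordination_payoff(choice, others):
    """In a pure coordination game, an option's payoff is just the number
    of others who picked the same option (a network externality)."""
    return sum(1 for o in others if o == choice)

def best_response(others, options=("VHS", "Betamax")):
    # Each agent rationally picks whichever option more others use,
    # regardless of the options' intrinsic merits.
    return max(options, key=lambda opt: coordination_payoff(opt, others))

random.seed(0)
agents = [random.choice(["VHS", "Betamax"]) for _ in range(100)]
for _ in range(10):  # repeated best-response updating
    agents = [best_response(agents) for _ in agents]
print(set(agents))  # the population locks in on a single standard
```

Whichever option happens to hold an early majority wins, which is why the winner need not be the better product.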
Whether you call it herd behavior, the cultural echo chamber, the rich get richer, or the Matthew Effect, our tendency to go with the crowd can lead to an outcome that is collectively undesirable.
There is a maddening phenomenon of social dynamics variously called pluralistic ignorance, the spiral of silence, and the Abilene paradox, after an anecdote in which a Texan family takes an unpleasant trip to Abilene one hot afternoon because each member thinks the others want to go.274 People may endorse a practice or opinion they deplore because they mistakenly think that everyone else favors it. A classic example is the value that college students place on drinking till they puke. In many surveys it turns out that every student, questioned privately, thinks that binge drinking is a terrible idea, but each is convinced that his peers think it’s cool.
In a hall-of-fame experiment, Solomon Asch placed his participants in a dilemma right out of the movie Gaslight.276 Seated around a table with seven other participants (as usual, stooges), they were asked to indicate which of three very different lines had the same length as a target line, an easy call. The six stooges who answered before the participant each gave a patently wrong answer. When their turn came, three-quarters of the real participants defied their own eyeballs and went with the crowd.
During witch hunts and purges, people get caught up in cycles of preemptive denunciation. Everyone tries to out a hidden heretic before the heretic outs him. Signs of heartfelt conviction become a precious commodity. Solzhenitsyn recounted a party conference in Moscow that ended with a tribute to Stalin. Everyone stood and clapped wildly for three minutes, then four, then five . . . and then no one dared to be the first to stop. After eleven minutes of increasingly stinging palms, a factory director on the platform finally sat down, followed by the rest of the grateful assembly. He was arrested that evening and sent to the gulag for ten years.278 People in totalitarian regimes have to cultivate thoroughgoing thought control lest their true feelings betray them.
James Payne documented a common sequence in the takeover of Germany, Italy, and Japan by fascist ideologies in the 20th century. In each case a small group of fanatics embraced a “naïve, vigorous ideology that justifies extreme measures, including violence,” recruited gangs of thugs willing to carry out the violence, and intimidated growing segments of the rest of the populations into acquiescence.281
Once again the sociologists had demonstrated that people not only endorse an opinion they do not hold if they mistakenly believe everyone else holds it, but they falsely condemn someone else who fails to endorse the opinion. The extra step in this experiment was that Macy et al. got a new group of participants to rate whether the first batch of participants had sincerely believed that the nonsensical essay was good. The new raters judged that the ones who condemned the honest rater were more sincere in their misguided belief than the ones who chose not to condemn him. It confirms Macy’s suspicion that enforcement of a belief is perceived as a sign of sincerity, which in turn supports the idea that people enforce beliefs they don’t personally hold to make themselves look sincere. And that, in turn, supports their model of pluralistic ignorance, in which a society can be taken over by a belief system that the majority of its members do not hold individually.
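The mechanism Macy et al. describe, insincere endorsement plus enforcement-as-sincerity-signal, can be caricatured in a toy simulation: a few sincere zealots enforce a belief, dissenting neighbors comply to avoid punishment, and compliers enforce in turn to make their endorsement look heartfelt, until nearly everyone publicly endorses a belief almost no one privately holds. The ring topology, counts, and update rule below are illustrative assumptions, not the published model:

```python
N = 60
zealot = [i < 3 for i in range(N)]   # a few true believers who enforce the norm
complies = list(zealot)              # everyone else starts out an honest dissenter
enforces = list(zealot)

def neighbors(i):
    return [(i - 1) % N, (i + 1) % N]  # agents arranged on a ring

for _ in range(N):  # iterate until the cascade has run its course
    for i in range(N):
        if zealot[i]:
            continue
        # Dissenters comply when a neighbor enforces the belief...
        if any(enforces[j] for j in neighbors(i)):
            complies[i] = True
            # ...and compliers enforce in turn, to signal that their
            # (insincere) endorsement is heartfelt: false enforcement.
            enforces[i] = True

print(sum(complies), "of", N, "publicly endorse a belief that",
      N - sum(zealot), "privately reject")
```

The cascade runs to fixation: public unanimity emerges even though the private majority never changes its mind.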
How could ordinary people, even if they were acquiescing to what they thought was a popular ideology, overcome their own consciences and perpetrate such atrocities? The answer harks back to the Moralization Gap. Perpetrators always have at their disposal a set of self-exculpatory stratagems that they can use to reframe their actions as provoked, justified, involuntary, or inconsequential.
This process is called cognitive dissonance reduction, and it is a major tactic of self-deception.285 Social psychologists like Milgram, Zimbardo, Baumeister, Leon Festinger, Albert Bandura, and Herbert Kelman have documented that people have many ways of reducing the dissonance between the regrettable things they sometimes do and their ideal of themselves as moral agents.286
Euphemisms can be effective for several reasons. Words that are literal synonyms may contrast in their emotional coloring, like slender and skinny, fat and Rubenesque, or an obscene word and its genteel synonym. In The Stuff of Thought I argued that most euphemisms work more insidiously: not by triggering reactions to the words themselves but by engaging different conceptual interpretations of the state of the world.
Collateral damage implies that a harm was an unintended by-product rather than a desired end, and that makes a legitimate moral difference. One could almost use collateral damage with a straight face to describe the hapless worker on the side track who was sacrificed to prevent the runaway trolley from killing five workers on the main one. All of these phenomena––emotional connotation, plausible deniability, and the ascription of motives––can be exploited to alter the way an action is construed.
A second mechanism of moral disengagement is gradualism. People can slide into barbarities a baby step at a time that they would never undertake in a single plunge, because at no point does it feel like they are doing anything terribly different from the current norm. An infamous historical example is the Nazis’ euthanizing of the handicapped and mentally retarded and their disenfranchisement, harassment, ghettoization, and deportation of the Jews, which culminated in the events referred to by the ultimate euphemism, the Final Solution.
It’s unlikely that any participant in the Milgram experiment would have zapped the victim with a 450-volt shock on the first trial; the participants were led to that level in an escalating series, starting with a mild buzz. Milgram’s experiment was what game theorists call an Escalation game, which is similar to a War of Attrition.
A third disengagement mechanism is the displacement or diffusion of responsibility. Milgram’s mock experimenter always insisted to the participants that he bore full responsibility for whatever happened. When the patter was rewritten and the participant was told that he or she was responsible, compliance plummeted. We have already seen that a second willing participant emboldens the first; Bandura showed that the diffusion of responsibility is a critical factor.
A fourth way of disabling the usual mechanisms of moral judgment is distancing. We have seen that unless people are in the throes of a rampage or have sunk into sadism, they don’t like harming innocent people directly and up close.
Paul Slovic has confirmed the observation attributed to Stalin that one death is a tragedy but a million deaths is a statistic.
A fifth means of jiggering the moral sense is to derogate the victim. We have seen that demonizing and dehumanizing a group can pave the way toward harming its members.
Social psychologists have identified other gimmicks of moral disengagement, and Bandura’s participants rediscovered most of them. They include minimizing the harm (“It would not hurt them too bad”), relativizing the harm (“Everyone is punished for something every day”), and falling back on the requirements of the task (“If doing my job as a supervisor means I must be a son of a bitch, so be it”). The only one they seem to have missed was a tactic called advantageous comparison: “Other people do even worse things.”
Dangerous ideologies erupt when these faculties fall into toxic combinations. Someone theorizes that infinite good can be attained by eliminating a demonized or dehumanized group. A kernel of like-minded believers spreads the idea by punishing disbelievers. Clusters of people are swayed or intimidated into endorsing it. Skeptics are silenced or isolated. Self-serving rationalizations allow people to carry out the scheme against what should be their better judgment.
When people moralize, they take a victim’s perspective and assume that all perpetrators of harm are sadists and psychopaths. Moralizers are thereby apt to see historical declines of violence as the outcome of a heroic struggle of justice over evil. The greatest generation defeated the fascists; the civil rights movement defeated the racists; Ronald Reagan’s buildup of arms in the 1980s forced the collapse of communism.
The second half of the 20th century was an age of psychology. Academic research increasingly became a part of the conventional wisdom, including dominance hierarchies, the Milgram and Asch experiments, and the theory of cognitive dissonance. But it wasn’t just scientific psychology that filtered into public awareness; it was the general habit of seeing human affairs through a psychological lens.
Increasingly we see our affairs from two vantage points: from inside our skulls, where the things we experience just are, and from a scientist’s-eye view, where the things we experience consist of patterns of activity in an evolved brain, with all its illusions and fallacies.
People, especially men, are overconfident in their prospects for success; when they fight each other, the outcome is likely to be bloodier than any of them thought. People, especially men, strive for dominance for themselves and their groups; when contests of dominance are joined, they are unlikely to sort the parties by merit and are likely to be a net loss for everyone. People seek revenge by an accounting that exaggerates their innocence and their adversary’s malice; when two sides seek perfect justice, they condemn themselves and their heirs to strife. People can not only overcome their revulsion to hands-on violence but acquire a taste for it; if they indulge it in private, or in cahoots with their peers, they can become sadists. And people can avow a belief they don’t hold because they think everyone else avows it; such beliefs can sweep through a closed society and bring it under the spell of a collective delusion.
An expansion of empathy may help explain why people today abjure cruel punishments and think more about the human costs of war. But empathy today is becoming what love was in the 1960s––a sentimental ideal, extolled in catchphrases (what makes the world go round, what the world needs now, all you need) but overrated as a reducer of violence.
My mind doesn’t stop and ponder what it would be like to be the victims of these kinds of violence and then recoil after feeling their pain. My mind never goes in these directions in the first place: they are absurd, ludicrous, unthinkable.
This chapter is about the better angels of our nature: the psychological faculties that steer us away from violence, and whose increased engagement over time can be credited for declines in violence. Empathy is one of these faculties, but it is not the only one.
What Hume wrote in 1751 is certainly true today:
The word empathy is barely a century old.
The term was borrowed from the German Einfühlung (“feeling into”) and originally labeled a kind of aesthetic appreciation: a “feeling or acting in the mind’s muscles,”
The meteoric rise of empathy coincided with its taking on a new meaning, one that is closer to “sympathy” or “compassion.”
That beneficence toward other people depends on pretending to be them, feeling what they are feeling, walking a mile in their moccasins, taking their vantage point, or seeing the world through their eyes.8
So the sense of empathy that gets valorized today––an altruistic concern for others––cannot be equated with the ability to think what they are thinking or feel what they are feeling.
The original and most mechanical sense of empathy is projection––the ability to put oneself into the position of some other person, animal, or object, and imagine the sensation of being in that situation.
Jean Piaget famously showed that children younger than about six cannot visualize the arrangement of three toy mountains on a tabletop from the viewpoint of a person seated across from them, a kind of immaturity he called egocentrism.
Mind-reading, theory of mind, mentalizing, or empathic accuracy is the ability to figure out what someone is thinking or feeling from their expressions, behavior, or circumstances.
Mind-reading does not require that we experience the person’s experiences ourselves, nor that we care about them, only that we can figure out what they are. Mind-reading may in fact comprise two abilities, one for reading thoughts (which is impaired in autism), the other for reading emotions (which is impaired in psychopathy).12
If a child has been frightened by a barking dog and is howling in terror, my sympathetic response is not to howl in terror with her, but to comfort and protect her.
In one of the odder experiments in the field of behavioral economics, Ernst Fehr and his collaborators had people play a Trust game, in which they hand over money to a trustee, who multiplies it and then returns however much he feels like to the participant.31 Half the participants inhaled a nasal spray containing oxytocin, which can penetrate from the nose to the brain, and the other half inhaled a placebo. The ones who got the oxytocin turned over more of their money to the stranger, and the media had a field day with fantasies of car dealers misting the hormone through their showroom ventilating systems to snooker innocent customers.
It seems likely that the oxytocin network is a vital trigger in the sympathetic response to other people’s beliefs and desires.
So what are the prospects that we can expand the circle of sympathy outward from babies, fuzzy animals, and the people bound to us in communal relationships, to lasso in larger and larger sets of strangers?
Let’s start with the sympathy-altruism hypothesis and compare it to the cynical alternative in which people help others only to reduce their own distress.
Together with other studies, the experiment suggests that by default people help others egoistically, to relieve their own distress at having to watch them suffer. But when they sympathize with a victim, they are overcome by a motive to reduce her suffering whether it eases their distress or not.
The most powerful exogenous sympathy trigger would be one that is cheap, widely available, and already in place, namely the perspective-taking that people engage in when they consume fiction, memoir, autobiography, and reportage.
The delayed influence is what researchers in persuasion call a sleeper effect. When people are exposed to information that changes their attitudes in a way they don’t approve of––in this case, warmer feelings toward murderers––they are aware of the unwanted influence and consciously cancel it out. Later, when their guard is down, their change of heart reveals itself.
Batson found that when people empathized with Sheri, a ten-year-old girl with a serious illness, they also opted for her to jump a queue for medical treatment ahead of other children who had waited longer or needed it more.
Mirror neurons notwithstanding, empathy is not a reflex that makes us sympathetic to everyone we lay eyes upon. It can be switched on and off, or thrown into reverse, by our construal of the relationship we have with a person. Its head is turned by cuteness, good looks, kinship, friendship, similarity, and communal solidarity.
The Old Testament tells us to love our neighbors, the New Testament to love our enemies. The moral rationale seems to be: Love your neighbors and enemies; that way you won’t kill them. But frankly, I don’t love my neighbors, to say nothing of my enemies. Better, then, is the following ideal: Don’t kill your neighbors or enemies, even if you don’t love them.
For empathy to matter, it must goad changes in policies and norms that determine how the people in those groups are treated.
The ubiquity of homicidal fantasies shows that we are not immune to the temptations of violence, but have learned to resist them. Self-control has been credited with one of the greatest reductions of violence in history, the thirtyfold drop in homicide between medieval and modern Europe.
People refrained from stabbing each other at the dinner table and amputating each other’s noses at the same time as they refrained from urinating in closets, copulating in public, passing gas at the dinner table, and gnawing on bones and returning them to the serving dish. A culture of honor, in which men were respected for lashing out against insults, became a culture of dignity, in which men were respected for controlling their impulses.
Lapses of self-control can also cause violence on larger scales.
The burning and looting of African American neighborhoods by their own residents following the assassination of Martin Luther King in 1968, and Israel’s pulverizing of the infrastructure of Lebanon following a raid by Hezbollah in 2006, are just two examples.74
Does the brain really contain competing systems for impulse and self-control? Is self-control a single faculty in charge of taming every vice, from overeating to promiscuity to procrastination to petty crime to serious aggression? If so, are there ways for individuals to boost their self-control? And could these adjustments proliferate through a society, changing its character toward greater restraint across the board?
First we must set aside pure selfishness––doing something that helps oneself but hurts others––and focus on self-indulgence––doing something that helps oneself in the short term but hurts oneself in the long term. Examples abound. Food today, fat tomorrow. Nicotine today, cancer tomorrow. Dance today, pay the piper tomorrow. Sex today, pregnancy, disease, or jealousy tomorrow. Lash out today, live with the damage tomorrow.
There is an optimum rate of discounting the future––mathematically, an optimum interest rate––which depends on how long you expect to live, how likely you will get back what you saved, how long you can stretch out the value of a resource, and how much you would enjoy it at different points in your life (for example, when you’re vigorous or frail).
Many experiments on many species have shown that when two rewards are far away, organisms will sensibly pick a large reward that comes later over a small reward that comes sooner.
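The reversal described here, sensible patience at a distance that collapses into impulsiveness up close, falls out of hyperbolic discounting but not exponential discounting. A minimal sketch (the amounts, delays, and rate constants are illustrative, not values from any study):

```python
import math

def hyperbolic(amount, delay, k=1.0):
    # Hyperbolic discounting: value drops steeply over short delays, then
    # flattens -- the curve shape behind "myopic" preference reversals.
    return amount / (1 + k * delay)

def exponential(amount, delay, r=0.1):
    # Exponential discounting (a constant interest rate): the ranking of
    # two rewards never changes as both draw nearer.
    return amount * math.exp(-r * delay)

# One unit arriving soon versus two units arriving two days later.
for t in (10, 0):  # viewed from far away, then up close
    waits_hyp = hyperbolic(2, t + 2) > hyperbolic(1, t)
    waits_exp = exponential(2, t + 2) > exponential(1, t)
    print(f"{t} days out: hyperbolic waits? {waits_hyp}, "
          f"exponential waits? {waits_exp}")
```

Viewed from ten days out, the hyperbolic discounter sensibly prefers the larger-later reward; with the rewards imminent, the same discounter grabs the smaller-sooner one, while the exponential discounter’s choice never flips.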
The psychologist Walter Mischel, who conducted classic studies of myopic discounting in children (the kids are given the agonizing choice between one marshmallow now and two marshmallows in fifteen minutes), proposed, with the psychologist Janet Metcalfe, that the desire for instant gratification comes from a “hot system” in the brain, whereas the patience to wait comes from a “cool system.”
Even better, the neuroimagers could literally read the participants’ minds. When their lateral prefrontal cortex was more active than their limbic regions, the participants held off for the larger-later reward; when the limbic regions were as active or more active, they succumbed to the smaller-sooner one.
The rearmost margin, which abuts the parietal lobe, is called the motor strip, and it controls the muscles. Just in front of it are premotor areas that organize motor commands into more complex programs; these are the regions in which mirror neurons were first discovered. The portion in front of them is called the prefrontal cortex, and it includes the dorsolateral and orbital/ventromedial regions we have already encountered many times, together with the frontal pole at the very front of the brain.
Traditional neurologists (the doctors who treat patients with brain damage rather than sliding undergraduates into scanners) were not surprised by the discovery that it is the frontal lobes that are most involved in self-control.
Patients with frontal lobe damage are said to be stimulus-driven.
Intact frontal lobes are necessary to liberate behavior from stimulus control––to bring people’s actions into the service of their goals and plans.
Phineas Gage’s freak accident, which sent a spike cleanly up through his orbital and ventromedial cortex and largely spared the lateral and frontmost parts, tells us that different parts of the frontal lobes implement different kinds of self-control. Gage, recall, was said to have lost the equilibrium “between his intellectual faculties and animal propensities.” Neuroscientists today agree that the orbital cortex is a major interface between emotion and behavior. Patients with orbital damage, recall, are impulsive, irresponsible, distractible, socially inappropriate, and sometimes violent.
The neuroscientist Etienne Koechlin summarizes the functioning of the frontal lobe in the following way. The rearmost portions respond to the stimulus; the lateral frontal cortex responds to the context; and the frontal pole responds to the episode. Concretely, when the phone rings and we pick it up, we are responding to the stimulus. When we are at a friend’s house and let it ring, we are responding to the context. And when the friend hops into the shower and asks us to pick up the phone if it rings, we are responding to the episode.
Adrian Raine, who previously showed that psychopaths and impulsive murderers have a small or unresponsive orbital cortex, recently carried out a neuroimaging experiment that supports the idea that violence arises from an imbalance between impulses from the limbic system and self-control from the frontal lobes.91
Walter Mischel began his studies of delay of gratification (in which he gave children the choice between one marshmallow now and two marshmallows later) in the late 1960s, and he followed the children as they grew up.93 When they were tested a decade later, the ones who had shown greater willpower in the marshmallow test had now turned into adolescents who were better adjusted, attained higher SAT scores, and stayed in school longer. When they were tested one and two decades after that, the patient children had grown into adults who were less likely to use cocaine, had higher self-esteem, had better relationships, were better at handling stress, had fewer symptoms of borderline personality disorder, obtained higher degrees, and earned more money.
They found that the students with higher scores on the self-control scale got better grades, had fewer eating disorders, drank less, had fewer psychosomatic aches and pains, were less depressed, anxious, phobic, and paranoid, had higher self-esteem, were more conscientious, had better relationships with their families, had more stable friendships, were less likely to have sex they regretted, were less likely to imagine themselves cheating in a monogamous relationship, felt less of a need to “vent” or “let off steam,” and felt more guilt but less shame.96 Self-controllers are better at perspective-taking and are less distressed when responding to others’ troubles, though they are neither more nor less sympathetic in their concern for them.
And contrary to the conventional wisdom that says that people with too much self-control are uptight, repressed, neurotic, bottled up, wound up, obsessive-compulsive, or fixated at the anal stage of psychosexual development, the team found that the more self-control people have, the better their lives are.
Are people with low self-control more likely to perpetrate acts of violence? Circumstantial evidence suggests they are.
This change is partly driven by the physical maturation of the brain. The wiring of the prefrontal cortex is not complete until the third decade of life, with the lateral and polar regions developing last.
No one has done the twin and adoption studies that would be needed to show that performance on standard tests of self-control, such as the marshmallow test or the adult equivalent, is heritable. But it’s a good bet that it is, because pretty much every psychological trait has turned out to be partly heritable.
The Baumeister team addressed these questions and over the following decade accumulated a raft of experiments showing that just about any task that requires an exercise of willpower can impede performance on any other task that requires willpower.
If we combine (1) Baumeister’s experiments, which found that reducing self-control in the lab can increase tendencies toward impulsive sex and violence; (2) the correlations across individuals between low self-control on the one hand and childhood misconduct, dissolute behavior, and crime on the other; (3) the neuroimaging studies that showed correlations between frontal lobe activity and self-control; and (4) the neuroimaging studies showing correlations between impulsive violence and impaired frontal lobe function, then we get an empirical picture that supports Elias’s conjecture that violence may be caused by weakness in an overarching neural mechanism of self-control.
We avoid shopping on an empty stomach. We throw out the brownies or the cigarettes or the booze at a time when we aren’t craving them, to foil ourselves at a time when we are. We put our alarm clock on the other side of the bedroom so we don’t turn it off and fall back asleep. We authorize our employers to invest a part of every paycheck for our retirement. We refrain from buying a magazine or a book or a gadget that will divert our attention from a work project until it is complete. We hand over money to a company like Stickk.com that returns it a fraction at a time if we meet certain goals, or donates it to a political organization we detest if we don’t. We make a public resolution to change, so our reputation takes a hit if we don’t.
Walter Mischel showed that even four-year-olds can wait out a long interval for a double helping of marshmallows if they cover the alluring marshmallow in front of them, look away from it, distract themselves by singing, or even reframe it in their minds as a puffy white cloud rather than a sweet tasty food.119
It is not implausible to suppose that real-world conditions that impair the frontal lobes––low blood sugar, drunkenness, druggedness, parasite load, and deficiencies of vitamins and minerals––could sap the self-control of people in an impoverished society and leave them more prone to impulsive violence.
If willpower is like a muscle that fatigues with use, drains the body of energy, and can be revived by a sugary pick-me-up, can it also be bulked up by exercise? Can people develop their raw strength of will by repeatedly flexing their determination and resolve? The metaphor shouldn’t be taken too literally––it’s unlikely that the frontal lobes literally gain tissue like bulging biceps––but it’s possible that the neural connections between the cortex and limbic system may be strengthened with practice. It’s also possible that people can learn strategies of self-control, enjoy the feeling of mastery over their impulses, and transfer their newfound tricks of discipline from one part of their behavioral repertoire to another.
In addition to being modulated by Ulyssean constraints, cognitive reframing, an adjustable internal discount rate, improvements in nutrition, and the equivalent of muscle gain with exercise, self-control might be at the mercy of whims in fashion.124 In some eras, self-control defines the paragon of a decent person: a grown-up, a person of dignity, a lady or a gentleman, a mensch.
The exogenous first domino is a change in law enforcement and opportunities for economic cooperation that objectively tilt the payoffs so that a deferral of gratification, in particular, an avoidance of impulsive violence, pays off in the long run. The knock-on effect is a strengthening of people’s self-control muscles that allow them (among other things) to inhibit their violent impulses, above and beyond what is strictly necessary to avoid being caught and punished. The process could even feed on itself in a positive feedback loop, “positive” in both the engineering and the human-values sense. In a society in which other people control their aggression, a person has less of a need to cultivate a hair trigger for retaliation, which in turn takes a bit of pressure off everyone else, and so on.
An interest rate is just such an index, because it reveals how much compensation people demand for deferring consumption from the present to the future. To be sure, an interest rate is partly determined by objective factors like inflation, expected income growth, and the risk that the investment will never be returned. But it partly reflects the purely psychological preference for instant over delayed gratification.
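The idea that an interest rate prices impatience can be made concrete with a toy calculation. This is a minimal sketch with made-up rates (not figures from the source): it shows how much future compensation savers with different discount rates demand before they will defer consuming $100 today.

```python
# Toy illustration (hypothetical numbers): an interest rate as an index of
# how much compensation a person demands for deferring consumption.

def required_future_amount(present_amount: float, annual_rate: float, years: int) -> float:
    """Amount a saver must be promised in the future before willingly
    deferring consumption of `present_amount` today, given their
    personal discount (interest) rate."""
    return present_amount * (1 + annual_rate) ** years

# A patient saver (5% rate) vs. an impatient one (20% rate):
print(round(required_future_amount(100, 0.05, 1), 2))  # → 105.0
print(round(required_future_amount(100, 0.20, 1), 2))  # → 120.0
```

The impatient saver's higher required payoff is the "purely psychological preference for instant over delayed gratification" the note describes, over and above inflation and risk.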
According to Hofstede’s data, countries differ along six dimensions. One of them is Long-Term versus Short-Term Orientation: “Long-term oriented societies foster pragmatic virtues oriented towards future rewards, in particular saving, persistence, and adapting to changing circumstances. Short-term oriented societies foster virtues related to the past and present such as national pride, respect for tradition, preservation of ‘face,’ and fulfilling social obligations.” Another dimension is Indulgence versus Restraint: “Indulgence stands for a society that allows relatively free gratification of basic and natural human drives related to enjoying life and having fun. Restraint stands for a society that suppresses gratification of needs and regulates it by means of strict social norms.” Both, of course, are conceptually related to the faculty of self-control, and not surprisingly, they are correlated with each other (with a coefficient of 0.45 across 110 countries). Elias would have predicted that both of these national traits should correlate with the countries’ homicide rates, and the prediction turns out to be true. The citizens of countries with more of a long-term orientation commit fewer homicides, as do the citizens of countries that emphasize restraint over indulgence.
Throughout this book I have assumed that human nature, in the sense of the cognitive and emotional inventory of our species, has been constant over the ten-thousand-year window in which declines of violence are visible, and that all differences in behavior among societies have strictly environmental causes. That is a standard assumption in evolutionary psychology, based on the fact that the few centuries and millennia in which societies have separated and changed are a small fraction of the period in which our species has been in existence.130
Nothing rules out the possibility that human populations have undergone some degree of biological evolution in recent millennia or even centuries, long after races, ethnic groups, and nations diverged.
As in most studies in behavioral genetics, the effects of being brought up in a given family were tiny to nonexistent, though other aspects of the environment that are not easily measurable by these techniques, such as effects of the neighborhood, subculture, or idiosyncratic personal experiences, undoubtedly do have effects.
Oxytocin, the so-called cuddle hormone which encourages sympathy and trust, acts on receptors in several parts of the brain, and the number and distribution of those receptors can have dramatic effects on behavior.
Testosterone. A person’s response to a challenge of dominance depends in part on the amount of testosterone released into the bloodstream and on the distribution of receptors for the hormone in his or her brain.150 The gene for the testosterone receptor varies across individuals, so a given concentration of testosterone can have a stronger effect on the brains of some people than others.
Neurotransmitters are the molecules that are released from a neuron, seep across a microscopic gap, and lock onto a receptor on the surface of another neuron, changing its activity and thereby allowing patterns of neural firing to propagate through the brain. One major class of neurotransmitters is the monoamines, which include dopamine, serotonin, and norepinephrine (also called noradrenaline, and related to the adrenaline that triggers the fight-or-flight response). The monoamines are used in several motivational and emotional systems of the brain, and their concentration is regulated by proteins that break them down or recycle them. One of those enzymes is monoamine oxidase-A, MAO-A for short, which helps to break down these neurotransmitters, preventing them from building up in the brain. When they do build up, the organism can become hyperreactive to threats and more likely to engage in aggression.
A more common kind of variation is found in the part of the gene that determines how much MAO-A is produced. People with a low-activity version of the gene build up higher levels of dopamine, serotonin, and norepinephrine in the brain. They also are more likely to have symptoms of antisocial personality disorder, to report that they have committed acts of violence, to be convicted of a violent crime, to have amygdalas that react more strongly and an orbital cortex that reacts less strongly to angry and fearful faces, and, in the psychology lab, to force a fellow participant to drink hot sauce if they think he has exploited them.
The low-activity version of the MAO-A gene makes people more prone to aggression primarily when they have grown up with stressful experiences, such as having been abused or neglected by their parents or having been held back in school.
So is there evidence that the Pacification or Civilizing Process ever rendered the pacified or civilized peoples constitutionally less susceptible to violence? Casual impressions can be misleading. History offers many examples in which one nation has considered another to be peopled by “savages” or “barbarians,” but the impressions were motivated more by racism and observations of differences in societal type than by any attempt to tease apart nature and nurture.
What about hard evidence from the revolution in evolutionary genomics? In their manifesto The 10,000 Year Explosion: How Civilization Accelerated Human Evolution, Gregory Cochran and Henry Harpending argue that human genetic evolution has continued, and even sped up, over the past ten millennia.
So while recent biological evolution may, in theory, have tweaked our inclinations toward violence and nonviolence, we have no good evidence that it actually has. At the same time, we do have good evidence for changes that could not possibly be genetic, because they unfolded on time scales that are too rapid to be explained by natural selection, even with the new understanding of how recently it has acted. The abolition of slavery and cruel punishments during the Humanitarian Revolution; the reduction of violence against minorities, women, children, homosexuals, and animals during the Rights Revolutions; and the plummeting of war and genocide during the Long Peace and the New Peace, all unfolded over a span of decades or even years, sometimes within a single generation.
Second, moralized beliefs are actionable. While people may not unfailingly carry out Socrates’ dictum that “To know the good is to do the good,” they tacitly aspire to it. People see moral actions as intrinsically worthy goals, which need no ulterior motive.
One difference is that disapproval of a moralized act is universalized.
If you think that murder and torture and rape are immoral, then you cannot simply avoid these activities yourself and be indifferent to whether other people indulge in them. You have to disapprove of anyone committing such acts.
The psychologist Jonathan Haidt has underscored the ineffability of moral norms in a phenomenon he calls moral dumbfounding. Often people have an instant intuition that an action is immoral, and then struggle, often unsuccessfully, to come up with reasons why it is immoral.169
In the modern West, as we have seen, the avoidance of some kinds of violence, such as mercy-killing an abandoned child, retaliating for an insult, and declaring war on another developed state, consists not in weighing the moral issues, empathizing with the targets, or restraining an impulse, but in not having the violent act as a live option in the mind at all.
Anthropologist Richard Shweder and several of his students and collaborators have found that moral norms across the world cluster around a small number of themes.170 The intuitions that we in the modern West tend to think of as the core of morality––fairness, justice, the protection of individuals, and the prevention of harm––are just one of several spheres of concern that may attach themselves to the cognitive and emotional paraphernalia of moralization.
Shweder organized the world’s moral concerns in a threefold way.171 Autonomy, the ethic we recognize in the modern West, assumes that the social world is composed of individuals and that the purpose of morality is to allow them to exercise their choices and to protect them from harm. The ethic of Community, in contrast, sees the social world as a collection of tribes, clans, families, institutions, guilds, and other coalitions, and equates morality with duty, respect, loyalty, and interdependence. The ethic of Divinity posits that the world is composed of a divine essence, portions of which are housed in bodies, and that the purpose of morality is to protect this spirit from degradation and contamination. If a body is merely a container for the soul, which ultimately belongs to or is part of a god, then people do not have the right to do what they want with their bodies. They are obligated to avoid polluting them by refraining from unclean forms of sex, food, and other physical pleasures. The ethic of Divinity lies behind the moralization of disgust and the valorization of purity and asceticism.
When Haidt asked participants, for example, whether it would be all right for a brother and sister to have voluntary protected sex, for a person to clean a toilet with a discarded American flag, for a family to eat a pet dog that had been killed by a car, for a man to buy a dead chicken and have sex with it, or for a person to break a deathbed vow to visit the grave of his mother, they said no in each case. But when asked for justifications, they floundered ineffectually before giving up and saying, “I don’t know, I can’t explain it, I just know it’s wrong.”
Haidt took Shweder’s trichotomy and cleaved two of the ethics in two, yielding a total of five concerns that he called moral foundations.172 Community was bifurcated into In-group Loyalty and Authority/Respect, and Autonomy was sundered into Fairness/Reciprocity (the morality behind reciprocal altruism) and Harm/Care (the cultivation of kindness and compassion, and the inhibition of cruelty and aggression). Haidt also gave Divinity the more secular label Purity/Sanctity.
The system I find most useful was developed by the anthropologist Alan Fiske.
The first model, Communal Sharing (Communality for short), combines In-group Loyalty with Purity/Sanctity.
Under Communality, people freely share resources within the group, keeping no tabs on who gives or takes how much. They conceptualize the group as “one flesh,” unified by a common essence, which must be safeguarded against contamination. They reinforce the intuition of unity with rituals of bonding and merging such as bodily contact, commensal meals, synchronized movement, chanting or praying in unison, shared emotional experiences, common bodily ornamentation or mutilation, and the mingling of bodily fluids in nursing, sex, and blood rituals. They also rationalize it with myths of shared ancestry, descent from a patriarch, rootedness in a territory, or relatedness to a totemic animal. Communality evolved from maternal care, kin selection, and mutualism, and it may be implemented in the brain, at least in part, by the oxytocin system.
Fiske’s second relational model, Authority Ranking, is a linear hierarchy defined by dominance, status, age, gender, size, strength, wealth, or precedence. It entitles superiors to take what they want and to receive tribute from inferiors, and to command their obedience and loyalty. It also obligates them to a paternalistic, pastoral, or noblesse oblige responsibility to protect those under them. Presumably it evolved from primate dominance hierarchies, and it may be implemented, in part, by testosterone-sensitive circuits in the brain. Equality Matching embraces tit-for-tat reciprocity and other schemes to divide resources equitably, such as turn-taking, coin-flipping, matching contributions, division into equal portions, and verbal formulas like eeny-meeny-miney-moe. Few animals engage in clear-cut reciprocity,
Recall that the logic of reciprocal altruism, which implements our sense of fairness, is to be “nice” by cooperating on the first move, by not defecting unless defected on, and by conferring a large benefit to a needy stranger when one can do so at a relatively small cost to oneself. When care and harm are extended outside intimate circles, then they are simply a part of the logic of fairness.)176
Fiske’s final relational model is Market Pricing: the system of currency, prices, rents, salaries, benefits, interest, credit, and derivatives that powers a modern economy. Market Pricing depends on numbers, mathematical formulas, accounting, digital transfers, and the language of formal contracts. Unlike the other three relational models, Market Pricing is nowhere near universal, since it depends on literacy, numeracy, and other recently invented information technologies.
One can line up the models, Fiske notes, along a scale that more or less reflects their order of emergence in evolution, child development, and history: Communal Sharing > Authority Ranking > Equality Matching > Market Pricing.
The theories of Shweder, Haidt, and Fiske agree on how the moral sense works. No society defines everyday virtue and wrongdoing by the Golden Rule or the Categorical Imperative. Instead, morality consists in respecting or violating one of the relational models (or ethics or foundations): betraying, exploiting, or subverting a coalition; contaminating oneself or one’s community; defying or insulting a legitimate authority; harming someone without provocation; taking a benefit without paying the cost; peculating funds or abusing prerogatives.
Some social norms are merely solutions to coordination games, such as driving on the right-hand side of the road, using paper currency, and speaking the ambient language.180
The emotional response to a relational mismatch depends on whether it is accidental or deliberate, which model is substituted for which, and the nature of the resource.
Tetlock distinguishes three kinds of tradeoffs. Routine tradeoffs are those that fall within a single relational model, such as choosing to be with one friend rather than another, or to purchase one car rather than another. Taboo tradeoffs pit a sacred value in one model against a secular value in another, such as selling out a friend, a loved one, an organ, or oneself for barter or cash. Tragic tradeoffs pit sacred values against each other, as in deciding which of two needy transplant patients should receive an organ, or the ultimate tragic tradeoff, Sophie’s choice between the lives of her two children.
The art of politics, Tetlock points out, is in large part the ability to reframe taboo tradeoffs as tragic tradeoffs (or, when one is in the opposition, to do the opposite). A politician who wants to reform Social Security has to reframe it from “breaking our faith with senior citizens” (his opponent’s framing) to “lifting the burden on hardworking wage-earners” or “no longer scrimping in the education of our children.” Keeping troops in Afghanistan is reframed from “putting the lives of our soldiers in danger” to “guaranteeing our nation’s commitment to freedom” or “winning the war on terror.” The reframing of sacred values, as we will see, may be an overlooked tactic in the psychology of peacemaking.
Communism envisioned a Communal Sharing of resources (“From each according to his ability, to each according to his need”), an Equality Matching of the means of production, and an Authority Ranking of political control (in theory, the dictatorship of the proletariat; in practice, a nomenklatura of commissars under a charismatic dictator). A kind of populist socialism seeks Equality Matching for life’s necessities, such as land, medicine, education, and child care. At the other pole of the continuum, libertarians would allow people to negotiate virtually any resource under Market Pricing, including organs, babies, medical care, sexuality, and education.
Haidt has shown that liberals believe that morality is a matter of preventing harm and enforcing fairness (the values that line up with Shweder’s Autonomy and Fiske’s Equality Matching). Conservatives give equal weight to all five foundations, including In-group Loyalty (values such as stability, tradition, and patriotism), Purity/Sanctity (values such as propriety, decency, and religious observance), and Authority/Respect (values such as respect for authority, deference to God, acknowledgment of gender roles, and military obedience).189
Communal Sharing, for all its cuddly connotations, supports the mindset behind genocidal ideologies based on tribe, race, ethnicity, and religion.
In a similar way, it furnishes the moral rationalizations employed by slaveholders, colonial overlords, and benevolent despots. But Authority Ranking also justifies violent punishment for insolence, insubordination, disobedience, treason, blasphemy, heresy, and lèse-majesté.
Equality Matching supplies the rationale for tit-for-tat retaliation: an eye for an eye, a tooth for a tooth, a life for a life, blood for blood. As we saw in chapter 8, even people in modern societies are apt to conceive of criminal punishment as just deserts rather than as general or specific deterrence.194
Unless all people are explicitly enfranchised and granted ownership of their own bodies and property, the amoral pursuit of profit in a market economy can exploit them in slave markets, human trafficking, and the opening of foreign markets with gunboats. And the deployment of quantitative tools can be used to maximize kill ratios in the waging of high-tech war. Yet Rational-Legal reasoning, as we shall see, can also be put in the service of a utilitarian morality that calculates the greatest good for the greatest number, and that titrates the amount of legitimate police and military force to the minimum necessary to reduce the aggregate amount of violence.195
In judging the importance of moral concerns, recall, social liberals place little weight on In-group Loyalty and Purity/Sanctity (which Fiske lumps under Communal Sharing), and they place little weight on Authority/Respect. Instead they invest all their moral concern in Harm/Care and Fairness/Reciprocity. Social conservatives spread their moral portfolio over all five.197 The trend toward social liberalism, then, is a trend away from communal and authoritarian values and toward values based on equality, fairness, autonomy, and legally enforced rights.
Why might a disinvestment of moral resources from community, sanctity, and authority militate against violence? One reason is that communality can legitimize tribalism and jingoism, and authority can legitimize government repression. But a more general reason is that a retrenchment of the moral sense to smaller territories leaves fewer transgressions for which people may legitimately be punished.
If the recent theories of moral psychology are on the right track, then intuitions of community, authority, sacredness, and taboo are part of human nature and will probably always be with us, even if we try to sequester their influence.
In theory, peace negotiations should take place within a framework of Market Pricing. A surplus is generated when adversaries lay down their arms––the so-called peace dividend––and the two sides get to yes by agreeing to divide it. Each side compromises on its maximalist demand in order to enjoy a portion of that surplus, which is greater than what they would end up with if they walked away from the table and had to pay the price of continuing conflict.
What exogenous causes are shifting the allocation of moral intuitions away from community, authority, and purity and toward fairness, autonomy, and rationality? One obvious force is geographic and social mobility. People are no longer confined to the small worlds of family, village, and tribe, in which conformity and solidarity are essential to daily life, and ostracism and exile are a form of social death.
Another subverter of community, authority, and purity is the objective study of history. The mindset of Communality, Fiske notes, conceives of the group as eternal: the group is held together by an immutable essence, and its traditions stretch back to the dawn of time.203 Authority Rankings too are naturally portrayed as everlasting. They were ordained by the gods, or are inherent in a great chain of being that organizes the universe. And both models boast an abiding nobility and purity as part of their essential nature.
He analyzed a dataset of 42 presidents, from George Washington to George W. Bush, and found that both raw intelligence and openness to new ideas and values are significantly correlated with presidential performance as it has been assessed by nonpartisan historians.209
It’s more accurate to say that the three smartest postwar presidents, Kennedy, Carter, and Clinton, kept the country out of destructive wars.
Nazi ideology, like the nationalist, romantic militarist, and communist movements of the same era, was a fruit of the 19th-century counter-Enlightenment, not the line of thinking that connects Erasmus, Bacon, Hobbes, Spinoza, Locke, Hume, Kant, Bentham, Jefferson, Madison, and Mill.
When Hume famously wrote that “reason is, and ought to be, only the slave of the passions,” he was not advising people to shoot from the hip, blow their stack, or fall head over heels for Mr. Wrong.216 He was basically making the logical point that reason, by itself, is just a means of getting from one true proposition to the next and does not care about the value of those propositions. Nonetheless there are many reasons why reason, working in conjunction with “some particle of the dove, kneaded into our frame,” must “direct the determinations of our mind, and where every thing else is equal, produce a cool preference of what is useful and serviceable to mankind, above what is pernicious and dangerous.”
It is reason––a deduction of the long-term consequences of an action––that gives the self reasons to control the self. Self-control, moreover, involves more than just avoiding rash choices that will damage one’s future self. It can also mean suppressing some of our base instincts in the service of motives that we are better able to justify.
A real change came in the writings of Grotius, Hobbes, Kant, and other modern thinkers: war was intellectualized as a game-theoretic problem, to be solved by proactive institutional arrangements. Centuries later some of these arrangements, such as Kant’s triad of democratization, trade, and an international community, helped to drive down the rate of war in the Long Peace and the New Peace.
And the Cuban Missile Crisis was defused when Kennedy and Khrushchev consciously reframed it as a trap for the two of them to escape without either side losing face.
General intelligence, moreover, is highly heritable, and mostly unaffected by the family environment (though it may be affected by the cultural environment). We know this because measures of g in adults are strongly correlated in identical twins separated at birth and are not at all correlated in adopted siblings who have been raised in the same family.
Current IQ tests tap abstract, formal reasoning: the ability to detach oneself from parochial knowledge of one’s own little world and explore the implications of postulates in purely hypothetical worlds.
Proportional, percentage, correlation, causation, control group, placebo, representative sample, false positive, empirical, post hoc, statistical, median, variability, circular argument, tradeoff, and cost-benefit analysis.
The cognitive skill that is most enhanced in the Flynn Effect, abstraction from the concrete particulars of immediate experience, is precisely the skill that must be exercised to take the perspectives of others and expand the circle of moral consideration.
But of course there has been an intellectual renaissance in recent decades, perhaps not in culture but certainly in science and technology. Cosmology, particle physics, geology, genetics, molecular biology, evolutionary biology, and neuroscience have made vertiginous leaps in understanding, while technology has given us secular miracles such as replaceable body parts, routine genome scans, stunning photographs of outer planets and distant galaxies, and tiny gadgets that allow one to chat with billions of people, take photographs, pinpoint one’s location on the globe, listen to vast music collections, read from vast libraries, and access the wonders of the World Wide Web.
The other half of the sanity check is to ask whether our recent ancestors can really be considered morally retarded. The answer, I am prepared to argue, is yes. Though they were surely decent people with perfectly functioning brains, the collective moral sophistication of the culture in which they lived was as primitive by modern standards as their mineral spas and patent medicines are by the medical standards of today.
For the same reason, the specific differences in reasoning that I will focus on are not necessarily heritable (even though general intelligence is highly heritable), and I will stick with the assumption that all differences among groups are environmental in origin.
Let me present seven links, varying in directness, between reasoning ability and peaceable values. Intelligence and Violent Crime. The first link is the most direct: smarter people commit fewer violent crimes, and they are the victims of fewer violent crimes, holding socioeconomic status and other variables constant.265
Intelligence and Cooperation. At the other end of the abstractness scale, we can consider the purest model of how abstract reasoning might undermine the temptations of violence, the Prisoner’s Dilemma.
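The Prisoner's Dilemma mentioned above can be sketched in a few lines of code. This is an illustrative toy model using the standard textbook payoffs (3 for mutual cooperation, 1 for mutual defection, 5 and 0 for unilateral defection), not numbers taken from the source. It shows the two facts abstract reasoning must grasp: defection dominates any single round, yet a reciprocating strategy like tit-for-tat sustains cooperation across repeated rounds.

```python
# Toy Prisoner's Dilemma (standard textbook payoffs, assumed for illustration).

PAYOFF = {  # (my move, their move) -> my payoff; "C" = cooperate, "D" = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def play(rounds=10):
    """Two tit-for-tat players; returns their total scores."""
    a_hist, b_hist, a_score, b_score = [], [], 0, 0
    for _ in range(rounds):
        a, b = tit_for_tat(b_hist), tit_for_tat(a_hist)
        a_score += PAYOFF[(a, b)]
        b_score += PAYOFF[(b, a)]
        a_hist.append(a)
        b_hist.append(b)
    return a_score, b_score

# In one round, "D" beats "C" whatever the other player does (5 > 3, 1 > 0).
# Over ten rounds, mutual tit-for-tat earns 30 each, against 10 each
# for mutual defection.
print(play())  # (30, 30)
```

The single-round logic is the temptation of violence; the iterated logic is the long-term payoff of restraint that a reasoner can deduce.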
Intelligence and Liberalism. Now we get to a finding that sounds more tendentious than it is: smarter people are more liberal.
But the key qualification is that the escalator of reason predicts only that intelligence should be correlated with classical liberalism, which values the autonomy and well-being of individuals over the constraints of tribe, authority, and tradition.
Intelligence need not correlate with other ideologies that get lumped into contemporary left-of-center political coalitions, such as populism, socialism, political correctness, identity politics, and the Green movement. Indeed, classical liberalism is sometimes congenial to the libertarian and antipolitical-correctness factions in today’s right-of-center coalitions.
Intelligence and Economic Literacy.
Smarter people tend to think more like economists (even after statistically controlling for education, income, sex, political party, and political orientation). They are more sympathetic to immigration, free markets, and free trade, and less sympathetic to protectionism, make-work policies, and government intervention in business. Of course none of these positions is directly related to violence.
That sets it in opposition to populist, nationalist, and communist mindsets that see the world’s wealth as zero-sum and infer that the enrichment of one group must come at the expense of another. The historical result of economic illiteracy has often been ethnic and class violence, as people conclude that the have-nots can improve their lot only by forcibly confiscating wealth from the haves and punishing them for their avarice.
Education, Intellectual Proficiency, and Democracy. Speaking of the Kantian Peace, the democracy leg of the tripod may also be fortified by reasoning.
In one arena, however, politicians really do seem to be swimming against the Flynn Effect: American presidential debates. To those who followed these debates in 2008, three words are enough to make the point: Joe the Plumber. The psychologists William Gorton and Janie Diels quantified the trend by scoring the sophistication of candidates’ language in the debates from 1960 through 2008.292 They found that the overall sophistication declined from 1992 to 2008, and the quality of remarks on economics began a free fall even earlier, in 1984. Ironically, the decrease in sophistication in presidential debates may be the product of an increase in the sophistication of political strategists. Televised debates in the waning weeks of a campaign are aimed at a sliver of undecided voters who are among the least informed and least engaged sectors of the electorate. They are apt to make their choice based on sound bites and one-liners, so the strategists advise the candidates to aim low. The level of sophistication cratered in 2000 and 2004, when Bush’s Democratic opponents matched him platitude for platitude. This exploitable vulnerability of the American political system might help explain how the country found itself in two protracted wars during an era of increasing peace.
Empathy is a circle that may be stretched, but its elasticity is limited by kinship, friendship, similarity, and cuteness. It reaches a breaking point long before it encircles the full set of people that reason tells us should fall within our moral concern.
It is reason that teaches us the tricks for expanding our empathy, and it is reason that tells us how and when we should parlay our compassion for a pathetic stranger into an actionable policy.
If you detect a flaw in this argument, it is reason that allows you to point it out and defend an alternative.
The technology of weaponry has obviously changed the course of history many times by determining winners and losers, making deterrence credible, and multiplying the destructive power of certain antagonists. No one would argue, for example, that the proliferation of automatic weapons in the developing world has been good for peace. Yet it’s hard to find any correlation over history between the destructive power of weaponry and the human toll of deadly quarrels. Over the millennia weapons, just like every technology, got better and better, yet rates of violence have not gone steadily up but rather have lurched up and down the slope of an inclined sawtooth.
The spears and arrows of pre-state peoples notched up higher proportional body counts than has anything since (chapter 2), and the pikemen and cavalry of the Thirty Years’ War did more human damage than the artillery and gas of World War I
The tungsten theory of the Vietnam War is an example of resource determinism, the idea that people inevitably fight over finite resources like land, water, minerals, and strategic terrain.
The most destructive eruptions of the past half millennium were fueled not by resources but by ideologies, such as religion, revolution, nationalism, fascism, and communism
Evolutionary psychologists tell us that no matter how rich or poor men are, they can always fight over women, status, and dominance. Economists tell us that wealth originates not from land with stuff in it but from the mobilization of ingenuity, effort, and cooperation to turn that stuff into usable products. When people divide the labor and exchange its fruits, wealth can grow and everyone wins. That means that resource competition is not a constant of nature but is endogenous to the web of societal forces that includes violence. Depending on their infrastructure and mindset, people in different times and places can choose to engage in positive-sum exchanges of finished products or in zero-sum contests over raw materials––indeed, negative-sum contests, because the costs of war have to be subtracted from the value of the plundered materials.
The poor correlation could have been predicted by the police blotters, which show that homicides are driven by moralistic motives like payback for insults and infidelity rather than by material motives such as cash or food.
The tangled relationship between wealth and violence reminds us that humans do not live by bread alone. We are believing, moralizing animals, and a lot of our violence comes from destructive ideologies rather than not enough wealth. For better or worse––usually worse––people are often willing to trade off material comfort for what they see as spiritual purity, communal glory, or perfect justice.
The scriptures present a God who delights in genocide, rape, slavery, and the execution of nonconformists, and for millennia those writings were used to rationalize the massacre of infidels, the ownership of women, the beating of children, dominion over animals, and the persecution of heretics and homosexuals.
The theory that religion is a force for peace, often heard among the religious right and its allies today, does not fit the facts of history.
Fascism happily coexisted with Catholicism in Spain, Italy, Portugal, and Croatia, and though Hitler had little use for Christianity, he was by no means an atheist, and professed that he was carrying out a divine plan.
So the subtitle of Christopher Hitchens’s atheist bestseller, How Religion Poisons Everything, is an overstatement. Religion plays no single role in the history of violence because religion has not been a single force in the history of anything.
The second part of the tragedy is that the costs to a victim (−100, in this case) are vastly disproportionate to the benefits to the aggressor (10). Unless two adversaries are locked in a fight to the death, aggression is not zero-sum but negative-sum; they are collectively better off not doing it, despite the advantage to the victor. The advantage to a conqueror in gaining a bit more land is swamped by the disadvantage to the family he kills in stealing it, and the few moments of drive reduction experienced by a rapist are obscenely out of proportion to the suffering he causes his victim. The asymmetry is ultimately a consequence of the law of entropy: an infinitesimal fraction of the states of the universe are orderly enough to support life and happiness, so it’s easier to destroy and cause misery than to cultivate and cause happiness.
THE LEVIATHAN
A state that uses a monopoly on force to protect its citizens from one another may be the most consistent violence-reducer that we have encountered in this book.
If a government imposes a cost on an aggressor that is large enough to cancel out his gains––say, a penalty that is three times the advantage of aggressing over being peaceful––it flips the appeal of the two choices of the potential aggressor, making peace more attractive than war (figure 10–2).
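The arithmetic behind this flip is simple enough to sketch. The numbers below are illustrative only (the 10-point gain and −100 loss echo the earlier example, and the 3× penalty is the multiplier mentioned above); they are not taken from the book’s actual figure:

```python
# Illustrative payoffs, relative to staying peaceful (payoff 0).
# These specific numbers are assumptions for the sketch, not the book's data.
GAIN_FROM_AGGRESSION = 10   # aggressor's benefit from attacking
LOSS_TO_VICTIM = -100       # victim's cost
PENALTY_MULTIPLIER = 3      # Leviathan fines 3x the aggressor's advantage

def aggressor_payoff(leviathan_present: bool) -> int:
    """Net payoff of aggressing, compared with a peaceful payoff of 0."""
    payoff = GAIN_FROM_AGGRESSION
    if leviathan_present:
        payoff -= PENALTY_MULTIPLIER * GAIN_FROM_AGGRESSION
    return payoff

# Without a state, aggression beats peace: 10 > 0.
assert aggressor_payoff(False) > 0
# A credible 3x penalty flips the ordering: 10 - 30 = -20 < 0.
assert aggressor_payoff(True) < 0
# And collectively the encounter is negative-sum either way: 10 + (-100) < 0.
assert GAIN_FROM_AGGRESSION + LOSS_TO_VICTIM < 0
```

The point is that the penalty does not need to undo the victim’s loss; it only needs to exceed the aggressor’s private gain for peace to become the rational choice.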
Leviathan––or his female counterpart Justitia, the goddess of justice––is a disinterested third party whose penalties are not inflated by the self-serving biases of the participants, and who is not a deserving target of revenge.
When bands, tribes, and chiefdoms came under the control of the first states, the suppression of raiding and feuding reduced their rates of violent death fivefold (chapter 2). And when the fiefs of Europe coalesced into kingdoms and sovereign states, the consolidation of law enforcement eventually brought down the homicide rate another thirtyfold (chapter 3).
Inept governance turns out to be among the biggest risk factors for civil war, and is perhaps the principal asset that distinguishes the violence-torn developing world from the more peaceful developed world.
GENTLE COMMERCE
The idea that an exchange of benefits can turn zero-sum warfare into positive-sum mutual profit was one of the key ideas of the Enlightenment, and it was revived in modern biology as an explanation of how cooperation among nonrelatives evolved.
Though gentle commerce does not eliminate the disaster of being defeated in an attack, it eliminates the adversary’s incentive to attack (since he benefits from peaceful exchange too) and so takes that worry off the table. The profitability of mutual cooperation is at least partly exogenous because it depends on more than the agents’ willingness to trade: it depends as well on whether each one specializes in producing something the other one wants, and on the presence of an infrastructure that lubricates their exchange, such as transportation, finance, record-keeping, and the enforcement of contracts.
Beginning in the late Middle Ages, expanding kingdoms not only penalized plunder and nationalized justice, but supported an infrastructure of exchange, including money and the enforcement of contracts.
Bureaucrats displaced knightly warriors.
Yamaguchi was invoking the most fundamental empirical generalization about violence, that it is mainly committed by men. From the time they are boys, males play more violently than females, fantasize more about violence, consume more violent entertainment, commit the lion’s share of violent crimes, take more delight in punishment and revenge, take more foolish risks in aggressive attacks, vote for more warlike policies and leaders, and plan and carry out almost all the wars and genocides.
Historically, women have taken the leadership in pacifist and humanitarian movements out of proportion to their influence in other political institutions of the time, and recent decades, in which women and their interests have had an unprecedented influence in all walks of life, are also the decades in which wars between developed states became increasingly unthinkable.
Female-friendly values may be expected to reduce violence because of the psychological legacy of the basic biological difference between the sexes, namely that males have more of an incentive to compete for sexual access to females, while females have more of an incentive to stay away from risks that would make their children orphans.
Zero-sum competition, whether it takes the form of the contests for women in tribal and knightly societies or the contests for honor, status, dominance, and glory in modern ones, is more a man’s obsession than a woman’s.
Feminization need not consist of women literally wielding more power in decisions on whether to go to war. It can also consist in a society moving away from a culture of manly honor, with its approval of violent retaliation for insults, toughening of boys through physical punishment, and veneration of martial glory.
One of these arrangements is marriage, in which men commit themselves to investing in the children they sire rather than competing with each other for sexual opportunities. Getting married reduces men’s testosterone and their likelihood of living a life of crime, and we saw that American homicide rates plunged in the marriage-happy 1940s and 1950s, rose in the marriage-delaying 1960s and 1970s, and remain high in African American communities that have particularly low rates of marriage.
Violence is a problem not just of too many males but of too many young males. At least two large studies have suggested that countries with a larger proportion of young men are more likely to fight interstate and civil wars.
The reproductive biologist Malcolm Potts, writing with the political scientist Martha Campbell and the journalist Thomas Hayden, has amassed evidence that when women are given access to contraception and the freedom to marry on their own terms, they have fewer offspring than when the men of their societies force them to be baby factories. And that, in turn, means that their countries’ populations will be less distended by a thick slab of young people at the bottom. (Contrary to an earlier understanding, a country does not have to become affluent before its rate of population growth comes down.) Potts and his coauthors argue that giving women more control over their reproductive capacity (always the contested territory in the biological battle of the sexes) may be the single most effective way of reducing violence in the dangerous parts of the world today.
THE EXPANDING CIRCLE
Suppose that living in a more cosmopolitan society, one that puts us in contact with a diverse sample of other people and invites us to take their points of view, changes our emotional response to their well-being. Imagine taking this change to its logical conclusion: our own well-being and theirs have become so intermingled that we literally love our enemies and feel their pain.
Smaller increments in the valuation of other people’s interests––say, a susceptibility to pangs of guilt when thinking about enslaving, torturing, or annihilating others––could shift the likelihood of aggressing against them.
More people read books, including fiction that led them to inhabit the minds of other people, and satire that led them to question their society’s norms. Vivid depictions of the suffering wrought by slavery, sadistic punishments, war, and cruelty to children and animals preceded the reforms that outlawed or reduced those practices. Though chronology does not prove causation, the laboratory studies showing that hearing or reading a first-person narrative can enhance people’s sympathy for the narrator at least make it plausible.
THE ESCALATOR OF REASON
The expanding circle (as I have been using the term) and the escalator of reason are conceptually distinct (chapter 9). The first involves occupying another person’s vantage point and imagining his or her emotions as if they were one’s own. The second involves ascending to an Olympian, superrational vantage point––the perspective of eternity, the view from nowhere––and considering one’s own interests and another person’s as equivalent.
The escalator of reason has an additional exogenous source: the nature of reality, with its logical relationships and empirical facts that are independent of the psychological makeup of the thinkers who attempt to grasp them.
The limited reach of empathy, with its affinity for people like us and people close to us, suggests that empathy needs the universalizing boost of reason to bring about changes in policies and norms that actually reduce violence in the world.
When cosmopolitan currents bring diverse people into discussion, when freedom of speech allows the discussion to go where it pleases, and when history’s failed experiments are held up to the light, the evidence suggests that value systems evolve in the direction of liberal humanism.
What has technology given us, they say, but alienation, despoliation, social pathology, the loss of meaning, and a consumer culture that is destroying the planet to give us McMansions, SUVs, and reality television? Lamentations of a fall from Eden have a long history in intellectual life, as the historian Arthur Herman has shown in The Idea of Decline in Western History. And ever since the 1970s, when romantic nostalgia became the conventional wisdom, statisticians and historians have marshaled facts against it. The titles of their books tell the story: The Good News Is the Bad News Is Wrong, It’s Getting Better All the Time, The Good Old Days––They Were Terrible!, The Case for Rational Optimism, The Improving State of the World, The Progress Paradox, and most recently, Matt Ridley’s The Rational Optimist and Charles Kenny’s Getting Better.
These defenses of modernity recount the trials of daily living before the advent of affluence and technology. Our ancestors, they remind us, were infested with lice and parasites and lived above cellars heaped with their own feces. Food was bland, monotonous, and intermittent. Health care consisted of the doctor’s saw and the dentist’s pliers. Both sexes labored from sunrise to sundown, whereupon they were plunged into darkness. Winter meant months of hunger, boredom, and gnawing loneliness in snowbound farmhouses.
Until recently most people never traveled more than a few miles from their place of birth. Everyone was ignorant of the vastness of the cosmos, the prehistory of civilization, the genealogy of living things, the genetic code, the microscopic world, and the constituents of matter and life. Musical recordings, affordable books, instant news of the world, reproductions of great art, and filmed dramas were inconceivable, let alone available in a tool that can fit in a shirt pocket. When children emigrated, their parents might never see them again, or hear their voices, or meet their grandchildren. And then there are modernity’s gifts of life itself: the additional decades of existence, the mothers who live to see their newborns, the children who survive their first years on earth.
On top of all the benefits that modernity has brought us in health, experience, and knowledge, we can add its role in the reduction of violence.
To review the history of violence is to be repeatedly astounded by the cruelty and waste of it all, and at times to be overcome with anger, disgust, and immeasurable sadness. I know that behind the graphs there is a young man who feels a stab of pain and watches the life drain slowly out of him, knowing he has been robbed of decades of existence. There is a victim of torture whose contents of consciousness have been replaced by unbearable agony, leaving room only for the desire that consciousness itself should cease. There is a woman who has learned that her husband, her father, and her brothers lie dead in a ditch, and who will soon “fall into the hand of hot and forcing violation.” It would be terrible enough if these ordeals befell one person, or ten, or a hundred. But the numbers are not in the hundreds, or the thousands, or even the millions, but in the hundreds of millions––an order of magnitude that the mind staggers to comprehend, with deepening horror as it comes to realize just how much suffering has been inflicted by the naked ape upon its own kind.