Emotion and the Law

Every society has rules for which emotions are acceptable, when they are acceptable, and how to express them. In my American culture, it’s appropriate to feel grief when someone dies, and inappropriate to chuckle as the casket is lowered into the ground. A surprise party is a time to feel surprised and then joyful, and if you know about your own party in advance, it’s appropriate to feign surprise when you arrive. Members of the Ilongot tribe in the Philippines may feel the emotion liget when acting as a team to behead an enemy, in celebration of a job well done.1

If you violate your culture’s rules of social reality, punishment may follow. Laughter at a funeral may get you shunned. Failure to be surprised at your own party may yield disappointed guests. And most cultures no longer prize decapitation.

The ultimate rules for emotion in any society are set by its legal system.* That might seem like a surprising claim, but consider this. In the United States, if your accountant steals your life savings, or a banker sells you a bad mortgage, it’s considered unacceptable to kill them; but if you murder your spouse in a fit of rage for cheating on you with a secret lover, the law might cut you some slack, especially if you’re a man. It’s unacceptable to make your neighbor feel fear that you will harm him bodily—that is considered a form of assault—but in some states it’s okay for you to “stand your ground” and harm someone first, even if you kill the person. It’s acceptable for you to profess romantic love, but not (at various times in U.S. history) toward people whose sex is the same as yours or whose skin color isn’t. Violate these norms, and you might lose your money, your freedom, or your life.

For centuries, laws in the United States have been shaped by the classical view of emotion, steeped in the essentialist view of human nature. Judges, for example, attempt to set emotion aside to render a decision by pure reason, a belief that assumes emotion and reason are distinct entities. Violent defendants plead that they were hijacked by their anger, assuming that anger is one single, unitary cauldron that, when unconstrained by clear thought, bubbles over to unleash a torrent of aggression. Juries look for remorse in a defendant, as if remorse had a single, detectable expression in the face and body. Expert witnesses testify that a defendant’s bad behavior was caused by one errant brain blob, an example of baseless blob-ology.

The law is a social contract that exists in a social world. Are you responsible for your actions? Yes, says the essentialist view of human nature, as long as you haven’t been commandeered by your emotions. Are other people responsible for your actions? No, you are an individual with free will. How do you determine what a defendant is feeling? By detecting his or her emotions in expressions. How do you make a just, moral decision? By setting your emotions aside. What is the nature of harm? Physical harm, that is, tissue damage, is worse than emotional harm, which is considered to be separate from the body and less tangible. All of these assumptions—born of essentialism—are baked into the law at its deepest levels, driving verdicts of guilt and innocence and gauging punishments on a massive scale, even as neuroscience has been quietly debunking them as myths.2

Simply put, some people are punished undeservedly, and others escape punishment, based on an outdated theory of the mind that is rooted in belief rather than science. In this chapter, we’ll explore some common myths about emotion in the legal system and ask whether a biologically richer theory of the mind, especially one that is grounded in realistic neuroscience, can improve society’s pursuit of justice.

As every budding adolescent discovers, freedom is great. You can decide to stay out past midnight with your friends. You can decide not to do your homework. You can choose to eat cake for dinner. But as we all learn, choices come with consequences. The law is founded on the simple idea that you can choose to treat others well or badly. Choice bestows responsibility. If you treat others badly and consequently they suffer some harm, then you must be punished, particularly if you intended that harm. This is how society shows its respect for you as an individual. Your value as a human being, some legal scholars say, is rooted in the fact that you choose your actions and are responsible for them.3

If something interferes with your ability to choose your actions freely, the law says that you might be less responsible for the harm you caused. Take the case of Gordon Patterson, who caught his wife, Roberta, “in a state of semiundress” with her boyfriend, John Northrup. Patterson shot Northrup twice in the head, killing him. Patterson confessed to the shooting but argued that he was less culpable due to his “extreme emotional disturbance” at the time of the crime. According to U.S. law, Patterson’s sudden burst of rage caused him not to be fully in control of his actions, and he was therefore found guilty of second-degree murder—rather than first-degree murder, which requires premeditation and carries a harsher punishment. In other words, rational killing is considered worse than emotional killing, all other circumstances being equal.4

The U.S. legal system assumes that emotions are part of our supposed animal nature and cause us to perform foolish and even violent acts, unless we control them with our rational thoughts. Centuries ago, legal minds decided that people, when provoked, sometimes kill because they haven’t “cooled off” yet, and anger erupts unbidden. Anger steams, boils, explodes, and leaves a wake of destruction in its path. Anger makes people unable to conform their actions to the law, and so partially mitigates a person’s responsibility for his actions. The argument is known as a heat-of-passion defense.5

The heat-of-passion defense depends on some familiar assumptions from the classical view of emotion. The first assumption is that there is one universal type of anger, with a specific fingerprint, that justifies such a defense to a charge of murder. It supposedly includes a flushed face, clenched jaw, flared nostrils, and increased heart rate, blood pressure, and perspiration. As you’ve already learned, this alleged fingerprint is merely a Western cultural stereotype that’s not supported by data. On average, people’s heart rates go up when angry, but there’s tremendous variation, and similar increases are also part of the stereotypes for happiness, sadness, and fear. And yet, most killings are not committed in happiness or sadness; and if they were, the law does not consider these emotional episodes to be a mitigating factor.6

What’s more, most instances of anger do not lead to killing. I can state quite definitively that in twenty years of creating anger in my lab, we’ve never seen a test subject kill anybody. We see a far greater repertoire of action: swearing, threatening, pounding the table, leaving the room, crying, trying to resolve whatever conflict they’re having, or even smiling while wishing ill upon their oppressor. So the idea of anger as a trigger for uncontrolled murder is at best questionable.7

When I explain to people in the legal profession that anger has no biological fingerprint, they often assume I am claiming emotions don’t exist. That’s not at all the case. Of course anger exists. You just can’t point to a spot in a defendant’s brain, face, or EKG, and say, “Look, anger is right here,” let alone draw legal conclusions.

The legal system’s second assumption behind the heat-of-passion defense is that “cognitive control” in the brain is synonymous with rational thought, deliberate actions, and free will. For you to be considered culpable, it is not enough that you performed a harmful action (known by the legal term actus reus). You also had to mean it. You caused harm of your own free will with a guilty mind (mens rea). Emotions, on the other hand, are seen as rapid, automatically triggered reactions spewing from your ancient, inner beast. The human mind is considered a battleground for reason and emotion, so when you fail to exercise sufficient cognitive discipline, emotions are said to burst forth to hijack your behavior. They interfere with your choice of action, and therefore make you less culpable. This narrative of emotion as the primitive part of human nature, to be controlled by the more advanced and uniquely human rational parts, is the “triune brain” myth (chapter 4) whose roots go all the way back to Plato.

The distinction between emotion and cognition hinges on their alleged separation in the brain, with one regulating the other. Your emotional amygdala spies an open cash register, but then, as the story goes, you rationally consider your likelihood of jail time, which causes your prefrontal cortex to slam on the brakes and stop your arm from dipping into the drawer. But as you’ve learned by now, thinking and feeling are not distinct in the brain. Your desire for easy cash and your decision to pass it up are both constructed across your entire brain by interacting networks. Whenever you carry out an action—whether it feels automatic, like recognizing an object as a gun, or more deliberate, like aiming one—your brain is always a whirlwind of parallel predictions that compete with one another to determine your actions and your experience.

At different times, you have different experiences of agency. Emotion sometimes can feel uncontrollable, like a burst of anger that arrives without warning, but you can also act in anger with intent, methodically plotting someone’s demise. In addition, non-emotions like memories or ideas can pop into your head unbidden. And yet we never hear of defendants who commit murder “in a fit of thinking.”

You can even work yourself up deliberately into a frothing anger. Accused mass murderer Dylann Roof, who shot nine people in a Bible study meeting in South Carolina in June 2015, appeared to cultivate his anger toward African Americans deliberately for many months before the day he walked into that church. Roof said that he almost didn’t go through with his plan because everyone was so nice to him, and he appeared to work himself up to the heinous deed in the meeting, uttering repeated phrases like “I have to do it” and “You have to go.” So, overall, moments of emotion are not synonymous with moments that you’re out of control.8

Anger is a population of diverse instances, not a single automatic reaction in the true sense of the phrase. The same holds for every other category of emotion, cognition, perception, and other type of mental event. It might seem like your brain has a quick, intuitive process and a slower, deliberative one, and that the former is more emotional and the latter more rational, but this idea is not defensible on neuroscience or behavioral grounds. Sometimes your control network plays a large role in the construction process, and other times its role is less, but it is always involved, and the latter times are not necessarily emotional.9

Why does the fiction of the two-system brain survive, beyond the usual reason of essentialism? Because most psychology experiments unwittingly perpetuate this fiction. In real life, your brain predicts nonstop, with each brain state dependent on those that came before. Laboratory experiments break this dependency. Test subjects view images or listen to sounds presented in random order, responding after each one, say, by pressing a button. Such experiments disrupt the brain’s natural process of prediction. And the results come out looking like the subject’s brain makes a rapid, automatic response, followed by a controlled choice about 150 milliseconds later, as if the two responses came from distinct systems in the brain.10 The illusion of a two-system brain is a byproduct of a century-old, flawed experimental design, and our laws maintain the illusion.*

The legal system, with its essentialized view of the mind and brain, mixes up volition—whether your brain actually played a role in controlling your behavior—and awareness of volition—whether you experience having a choice. Neuroscience has quite a bit to say about this distinction. If you sit in a chair with your legs bent, toes not touching the floor, and tap your knee just below your kneecap, the bottom half of your leg gives a little kick. Hold your hand to a flame and your arm recoils. Present a puff of air to your cornea and you blink. Each of these examples is a reflex: sensation leading directly to motion. Reflexes in your peripheral nervous system have sensory neurons wired directly to motor neurons. We call the resulting actions “involuntary” because there is one, and only one, specific behavior for a specific sensory stimulation due to the direct wiring.11

Your brain, however, is not wired like a reflex. If it were, you’d be at the mercy of the world, like a sea anemone that reflexively stabs whatever fish happens to brush against its tentacles. The anemone’s sensory neurons, which receive input from the world, are directly connected to its motor neurons for movement. It has no volition.

A human brain’s sensory and motor neurons, however, communicate through intermediaries, called association neurons, and they endow your nervous system with a remarkable ability: decision-making. When an association neuron receives a signal from a sensory neuron, it has not one possible action but two. It can stimulate or inhibit a motor neuron. Therefore, the same sensory input can yield different outcomes on different occasions. This is the biological basis of choice, that most prized of human possessions. Thanks to association neurons, if a fish brushes against your skin, you can react with indifference, laughter, violence, or anything in between. You might feel like a sea anemone at times, but you have much more control over your harpoon than you might think.12

Your brain’s control network, which helps select your actions, is composed of association neurons. This network is always engaged, actively selecting your actions; you just don’t always feel in control. In other words, your experience of being in control is just that—an experience.13

Here’s where the law is out of sync with science, thanks to the classical view of human nature. The law defines deliberate choice—free will—as whether you feel in control of your thoughts and actions. It fails to distinguish between your ability to choose—the workings of your control network—and your subjective experience of choice. The two are not the same in the brain.14

Scientists are still trying to figure out how the brain creates the experience of having control. But one thing is certain: there is no scientific justification for labeling a “moment without awareness of control” as emotion.15

What does all this mean for the law? Remember that the legal system decides guilt or innocence based on intent—whether someone meant to commit harm. The law should continue to punish based on how intentional the harm is, not on whether emotion is involved or whether a person experiences himself as an agent with volition.

Emotions are not temporary deviations from rationality. They are not alien forces that invade you without your consent. They are not tsunamis that leave destruction in their wake. They are not even your reactions to the world. They are your constructions of the world. Instances of emotion are no more out of control than thoughts or perceptions or beliefs or memories. The fact is, you construct many perceptions and experiences and you perform many actions, some that you control a lot and some that you don’t.

The legal system has a standard called the reasonable person who represents the norms of society, that is, the social reality within your culture. Defendants are measured against this standard. Consider the legal argument at the heart of the heat-of-passion defense: would a reasonable person have committed the same killing if he’d been similarly provoked without a chance to cool off?

The standard of the reasonable person, and the social norms behind it, is not merely reflected in the law—it is created by the law. It is a way of saying, “Here is what we expect a human person to act like, and we will punish you if you don’t conform.” It’s a social contract, a guide to behavior for the average person in a population of diverse individuals. And like all averages, the reasonable person is a fiction that doesn’t apply exactly to any single individual. It’s a stereotype, and it encompasses stereotyped ideas about emotional “expression,” feeling, and perception that are part of the classical view of emotion and the theory of human nature that supports it.

A legal standard based on emotion stereotypes is especially problematic for the equitable treatment of men and women. The prevailing belief in many cultures is that women are more emotional and empathic, whereas men are more stoic and analytical. Shelves full of popular books portray this stereotype as fact: The Female Brain; The Male Brain; His Brain, Her Brain; The Essential Difference; Brain Sex; Unleash the Power of the Female Brain; and on and on. This stereotype affects even powerful women who are widely respected. Madeleine Albright, the first female U.S. secretary of state, wrote in her memoir that “many of my colleagues made me feel that I was overly emotional, and I worked hard to get over that. In time, I learned to keep my voice flat and unemotional when I talked about issues that I considered important.”16

Take a moment and reflect on your own emotions. Do you tend to feel things intensely or more moderately? When we ask male and female test subjects these kinds of questions in my lab—asking them to describe their feelings from memory—the women report feeling more emotion than the men do on average. That is, the women believe they are more emotional than men, and the men agree. The one exception is anger, as subjects believe that men are angrier. However, when the same people record their emotional experiences as they occur in everyday life, there are no sex differences. Some men and women are very emotional, and some are not. Likewise, the female brain is not hardwired for emotion or empathy, and the male brain is not hardwired for stoicism or rationality.17

Where do these gender stereotypes come from? In the United States at least, women routinely “express” more emotion when compared to men. For example, women move their facial muscles more when watching films than men do, but women don’t report more intense experiences of emotion while watching. This finding, if nothing else, might explain why the stereotypes of the stoic man and the emotional woman leak into the courtroom and have a significant influence on judges and juries.18

Because of these stereotypes, heat-of-passion defenses—and legal proceedings in general—are often applied differently to male versus female defendants. Consider two murder cases that are pretty similar except for the sex of the defendant. In the first case, a man named Robert Elliott was convicted of killing his brother, allegedly because of “extreme emotional disturbance” that included “an overwhelming fear of his brother.” The jury found him guilty of murder, but the Supreme Court of Connecticut overturned the decision on the grounds that Elliott’s “intense feelings” about his brother overwhelmed his “self-control” and “reason.” In the second case, a woman named Judy Norman killed her husband after he had systematically beaten and abused her for years. The Supreme Court of North Carolina rejected the defense’s claim that Norman was acting in self-defense out of “a reasonable fear of imminent death or great bodily harm,” and she remained convicted of voluntary manslaughter.19

These two cases match several stereotypes about emotion in men versus women. Anger is stereotypically normal for men because they are supposed to be aggressors. Women are supposed to be victims, and good victims shouldn’t become angry; they’re supposed to be afraid. Women are punished for expressing anger—they lose respect, pay, and perhaps even their jobs. Whenever I see a savvy male politician play the “angry bitch card” against a female opponent, I take it as an ironic sign that she must be really competent and powerful. (I have yet to meet a successful woman who hasn’t paid her dues as a “bitch” before she was accepted as a leader.)20

In courtrooms, angry women like Ms. Norman lose their liberty. In fact, in domestic violence cases, men who kill get shorter and lighter sentences, and are charged with less serious crimes, than are women who kill their intimate partners. A murderous husband is just acting like a stereotypical husband, but wives who kill are not acting like typical wives, and therefore they are rarely exonerated.21

Emotion stereotyping is even worse when the female victim of domestic violence is African American. The archetypal victim in American culture is fearful, passive, and helpless, but in African American communities, women sometimes violate this stereotype by defending themselves vigorously against their alleged batterers. By fighting back, they reinforce a different stereotype of female emotion, the “angry black woman,” which is also pervasive in the U.S. legal system. These women are more likely to be charged with domestic violence themselves, even when their actions were in self-defense and were less severe than the original assault. (No “stand your ground” allowed here!) And if they injure or kill their alleged batterer, they usually fare worse than a European American woman in the same situation.22

For example, consider the case of Jean Banks, an African American woman who stabbed and killed her live-in partner, James “Brother” McDonald, after he had beaten her for years, sometimes so severely that she required medical attention. On this particular day, both had been drinking, and during an argument, McDonald pushed Banks to the ground and attempted to slice her with a glass cutter. Banks grabbed a knife to defend herself and stabbed him through the heart. She claimed self-defense but nonetheless was convicted of second-degree murder. (Compare this to light-skinned Judy Norman, who was convicted of voluntary manslaughter, a lesser charge.)23

Angry women do not fare well outside of domestic violence cases either. Judges infer all sorts of negative personality characteristics in angry female rape victims that they tend not to attribute to angry male crime victims. When a woman has been raped, for instance, judges (and juries and the police) expect to see her express grief on the witness stand, which tends to bring the rapist a heavier sentence. When a female victim expresses anger, judges evaluate her negatively. These judges are falling prey to another version of the “angry bitch” phenomenon. When people perceive emotion in a man, they usually attribute it to his situation, but when they perceive emotion in a woman, they connect it to her personality. She’s a bitch, but he’s just having a bad day.24

Outside the courtroom, we find laws where gender stereotypes prescribe the acceptable emotions we must feel and express. Abortion laws, as written, signal which emotions are appropriate for a woman to feel, namely, remorse and guilt, whereas relief and happiness go unmentioned. The debate over the legality of gay marriage was, in a way, whether the law should sanction the emotion of romantic love between two people of the same sex. Adoption laws governing gay men raise the question of whether a father’s love is equal to that of a mother.25

Overall, there is no scientific justification for the law’s view of men’s and women’s emotions. They are merely beliefs that come from an outdated view of human nature. The examples I’ve chosen represent only a small slice of the issue, both on the legal side and on the science side. I’ve barely scratched the surface of emotion stereotypes of ethnic groups, for example, who face similar struggles in and out of court. As long as the law codifies emotion stereotypes, people will continue to be the target of inconsistent rulings.26

When Stefania Albertani pled guilty to drugging and killing her own sister, not to mention setting the corpse on fire, her defense team took a bold step and blamed her brain.

Brain imaging revealed that two regions of Albertani’s cortex contained fewer neurons than the corresponding regions in a control group of ten healthy women. The regions were the insula, which the defense claimed was associated with aggression, and the anterior cingulate gyrus, which allegedly was associated with lowering one’s inhibitions. Two expert witnesses concluded that a “causal relationship” between her brain structure and her crime was possible. After this testimony, Albertani’s jail sentence was reduced from life imprisonment to twenty years.27

Legal decisions like this one, which was a media sensation in Italy in 2011, are becoming more common as lawyers employ neuroscience findings in their defense strategy. But are these decisions justified? Can brain structure explain why someone committed a crime? Can a region of a certain size or connectivity actually cause murderous behavior, and in the process, make a defendant less responsible for a crime?28

Legal arguments like those made by Albertani’s defense team grossly misrepresent neuroscience findings and the conclusions that can be drawn from them. It is just not possible to localize a complex, psychological category like “Aggression” to one set of neurons, because of degeneracy; “Aggression,” like any other concept, may be implemented differently in the brain each time it’s constructed. Even simple actions like hitting or biting have not been localized to a single set of neurons in the human brain.29

The brain regions mentioned by Albertani’s defense team are among the most highly connected hubs in the entire brain. They show increased activation for just about every mental event you can list, from language to pain to math skills. So, sure, they might play a role in aggression and impulsivity in some instances. But it’s a stretch to claim any specific causal relationship between these regions and the extreme aggression of murder . . . if Albertani’s motive was even aggression in the first place.30

It’s also a stretch to claim that variation in brain size translates into variation in behavior. No two brains are exactly alike. They generally have the same parts, roughly in the same place, connected together in pretty much the same way, but at a fine-grained level, in their microcircuitry, they have vast differences. Some may translate into behavioral differences, but many do not. Your insula might be larger or more highly connected than mine without any discernable effect on your behavior when compared to my behavior. Even if we examine many brains and find a statistically significant difference in insula size between people who are more or less aggressive, that doesn’t mean that a larger insula causes aggression, let alone murder. (Plus, even if a larger insula did cause aggression, how big does it need to be to produce a killer?) In rare cases, a tumor can press against the brain and cause severe personality changes, but in general, it is not scientifically justified to try a brain region for murder.31

Perhaps the most surprising thing about Albertani’s case is that the expert witnesses and the judge thought that the brain was an “extenuating explanation” for Albertani’s murderous behavior. All behavior stems from the brain. No human actions, thoughts, or feelings exist apart from firing neurons. The wrong way to use neuroscience in court is to argue that a biological explanation automatically releases someone from responsibility. You are your brain.32

The law often looks for simple, single causes, so it’s tempting to blame a brain aberration for criminal behavior. But behavior in real life is anything but simple. It’s a culmination of multiple factors, including predictions from your brain, prediction error from your five senses plus interoceptive sensation, and a complex cascade involving billions of prediction loops. And that’s just the story inside a single person. Your brain is also surrounded by other brains in other bodies. Whenever you speak or act, you influence the predictions of others around you, who in turn influence your predictions right back. A whole culture collectively plays a role in the concepts you build and the predictions you make, and therefore in your behavior. People can argue over how large a role culture plays, but the fact of its role is not debatable.

Bottom line: Sometimes a biological problem can interfere with your brain’s ability to choose your actions with intent. Maybe you grow a brain tumor, or some neurons begin to die in just the wrong places. But mere variability in the brain—in its structure, function, chemistry, or genetics—is not an extenuating circumstance for a crime. Variation is the norm.

Dzhokhar Tsarnaev, the Boston Marathon bomber, was convicted in 2015 and sentenced to death. Tsarnaev received a trial by jury, a right guaranteed to all Americans by the U.S. Constitution. According to the BBC, which reported on the sentencing, “Only two of the jurors believed Tsarnaev has felt remorse. The other 10, like many in Massachusetts, think he has no regrets.” Jurors formed these opinions of Tsarnaev’s remorse by observing him closely during the trial, where he reportedly sat “stone-faced” throughout most of the proceedings. Slate.com noted that Tsarnaev’s defense attorney “did not—or could not—present evidence [that] Dzhokhar Tsarnaev has felt any of the remorse that the prosecution says he is devoid of.”33

Trial by jury is considered the gold standard for fairness in a criminal case. Jurors are instructed to make decisions based only on the evidence presented. In a predicting brain, however, this is an impossible task. The jurors perceive every defendant, plaintiff, witness, judge, attorney, courtroom, and iota of evidence through the lens of their own conceptual system, which makes the idea of the impartial juror an implausible fiction. In effect, a jury is a dozen subjective perceptions that are supposed to yield one fair and objective truth.

The idea that jurors can somehow detect remorse in a defendant, from his facial configurations or bodily movements or words, is steeped in the classical view, which assumes that emotions are universally expressed and recognized. The legal system assumes that remorse, like anger and other emotions, has a single, universal essence with a detectable fingerprint. However, remorse is an emotion category composed of many diverse instances, each one made for a specific situation.

A defendant’s construction of remorse depends on his concept for “Remorse,” culled from his prior experiences within his culture, which exists as cascades of predictions that guide his expression and his experience. On the other side of the courtroom, a juror’s perception of remorse is a mental inference—a guess based on cascades of predictions in her brain that make sense of the defendant’s facial movements, body posture, and voice. For that juror’s perceptions to be “accurate,” she and the defendant must categorize with similar concepts. This kind of synchrony, with one person feeling remorse and the other perceiving it, even without words ever being spoken, is more likely to occur when two people have similar backgrounds, age, sex, or ethnicity.34

In the Boston Marathon bombing case, if Tsarnaev felt remorse for his deeds, what would it have looked like? Would he have openly cried? Begged his victims for forgiveness? Expounded on the error of his ways? Perhaps, if he were following American stereotypes for expressing remorse, or if this were a trial in a Hollywood movie. But Tsarnaev is a young man of Muslim faith from Chechnya. He lived in the United States and had close American friends, but he had also (by his defense team’s account) spent a lot of time with his older, Chechen brother. Chechen culture expects men to be stoic in the face of adversity. If they lose a battle, they should bravely accept defeat, a mindset known as the “Chechen wolf.” So if Tsarnaev felt remorse, he might well have remained stony-faced.35

Tsarnaev did reportedly become tearful for a moment when his aunt took the stand to plead for his life. Chechnya has a culture of honor, where it is painful to shame your family. If Tsarnaev saw a loved one publicly shamed, say, an aunt begging on his behalf, a few tears would be consistent with Chechen cultural norms for honor.36

We—and jurors—can only guess when constructing a perception to explain Tsarnaev’s impassive stance. Using our Western cultural concepts of remorse, we perceived him as coolly indifferent or full of bravado, rather than stoic. So it’s possible that our guesswork, in this case, produced a cultural misunderstanding in the courtroom, ultimately leading to his death sentence. Or maybe he really is remorseless.37

As it turns out, Tsarnaev actually did convey remorse for his actions in a letter of apology he wrote in 2013, just a few months after the bombing, two years before he went to trial. Jurors never saw the letter, however. It was sealed as confidential under the U.S. Government’s Special Administrative Measures, citing an “international security issue,” and excluded as evidence from the trial.38

On June 25, 2015, Tsarnaev finally spoke at his sentencing hearing. He confessed to the bombing and stated that he understood the impact of his crime. “I am sorry for the lives that I’ve taken,” he apologized quietly and calmly, “for the suffering that I’ve caused you, for the damage that I’ve done. Irreparable damage.” The range of responses from victims and the press covering the trial was predictably variable. Some were stunned. Some were upset. Some were outraged. Some accepted his apology. And many just could not decide whether it was sincere.

We can never know whether Tsarnaev experienced remorse for his terrible actions, nor if his letter could have affected his sentence. But one thing is certain: At a death penalty proceeding, a defendant’s remorse is a critical feature that jurors must rely on, according to the law, to make a decision between imprisonment and death. And those perceptions of remorse, like all perceptions of emotion, are not detected but constructed.39

At the other end of the spectrum, a show of remorse can mean absolutely nothing. Take the case of Dominic Cinelli, a violent criminal with a thirty-year history of armed robberies, assaults, and prison escapes. Cinelli was serving three consecutive life sentences when he appeared before the Massachusetts Parole Board in 2008. A parole board is made up of psychologists, corrections officers, and other knowledgeable professionals who decide whether an inmate will serve beyond his minimum sentence or be released. They witness a virtual parade of remorse, some genuinely experienced and some faked, and their profound responsibility to the public rests on their ability to tell the difference.

In November 2008, Cinelli convinced the parole board that he was no longer a criminal with darkness in his soul. The board unanimously voted to free him. It didn’t take long for Cinelli to embark on a new series of robberies and fatally shoot a police officer. Cinelli was later killed during a shootout with the police. The governor of Massachusetts, Deval Patrick, saw five of the seven members of the parole board resign. He seemed to think that they lacked the ability to detect authentic remorse.40

It’s possible that Cinelli was putting on an act. It’s also possible that Cinelli authentically felt remorse in the moment while he was testifying, but once he was out of prison, his old model of the world resurfaced, with his old predictions, creating his old self, and his remorse evaporated. Since there is no objective criterion for feelings of remorse, we will never know for sure. There is likewise no objective criterion for anger, sadness, fear, or any other emotion relevant to a trial.

U.S. Supreme Court Justice Anthony Kennedy once said that juries must “know the heart and mind of the offender” in order for a defendant to have a fair trial. Emotions, however, have no consistent fingerprints in facial movements, body posture and gestures, or voice. Jurors and other perceivers make educated guesses about what those movements and sounds mean in emotional terms, but there is no objective accuracy. At best, we can measure whether jurors agree with one another in the emotions they perceive, but when the defendant and the jurors have different backgrounds, beliefs, or expectations, agreement is a poor substitute for accuracy. If a defendant’s demeanor cannot reveal emotion, then the legal system is left to grapple with a difficult question: under what circumstances can a trial be completely fair?41

When jurors or judges see smugness in a defendant’s smile, or when they hear a witness’s quavering voice as fear, they are making a mental inference, employing their emotion concepts to guess that the action (smiling or quavering) was caused by a particular state of mind. Mental inference, you’ll remember, is how your brain gives meaning to other people’s actions through a cascade of predictions (chapter 6).42

Mental inference is so pervasive and automatic, at least in cultures of the West, that we’re usually unaware of doing it. We believe that our senses provide an accurate and objective representation of the world, as if we had X-ray vision for deciphering another person’s behavior to discover his intent (“I can see right through you”). In these moments, we experience our perceptions of other people as an obvious property of them—a phenomenon we’ve called affective realism—rather than a combination of their actions and the concepts in our own brain.

When someone is on trial for a crime, and liberty and life are at stake, there can be a gaping chasm between appearance and reality. Deep down we know this, but at the same time we are supremely confident that we can discern truth from fiction more accurately than the other schmucks in the room. And herein lies the problem in court.

Jurors and judges are charged with an almost impossible task: to be a mind reader, or if you’d rather, a lie detector. They must decide if a person intended to cause harm. According to the legal system, intent is a fact that is as plain as the nose on a defendant’s face. But in a predicting brain, a judgment about someone else’s intent is always a guess you construct based on the defendant’s actions, not a fact you detect; and just as with emotions, there is no objective, perceiver-independent criterion of intent. Seventy years of psychological research confirms that judgments like these are mental inferences, that is, guesses. Even if DNA evidence connects a defendant to the scene of a crime, it does not determine whether he had criminal intent.43

Judges and jurors infer intent, usually in line with their own beliefs, stereotypes, and current body states. Here is just one example of how this works. Test subjects watched a video of protestors being dispersed by police. They were told the protestors were pro-life activists picketing an abortion clinic. Those who were liberal Democrats, who tend to be pro-choice, inferred that the activists had violent intentions, whereas socially conservative subjects inferred peaceful intentions. The researchers also showed the same video to a second set of subjects, describing the protestors this time as gay rights activists objecting to the military’s Don’t Ask, Don’t Tell policy. This time, those who were liberal Democrats, who tend to support gay rights, inferred that the activists had peaceful intentions, whereas socially conservative subjects inferred violent intentions.44

Now imagine that this video were evidence at a trial. All jurors would watch the same scenes, with exactly the same behaviors onscreen, but through affective realism, they would come away with only perceptions, not facts, constructed in line with their own beliefs, entirely without their awareness. My point is that bias is not advertised by a glowing sign worn around jurors’ necks; we are all guilty of it, because the brain is wired for us to see what we believe, and it usually happens outside of everyone’s awareness.

Affective realism decimates the ideal of the impartial juror. Want to increase the likelihood of a conviction in a murder trial? Show the jury some gruesome photographic evidence. Tip their body budgets out of balance and chances are they’ll attribute their unpleasant affect to the defendant: “I feel bad, therefore you must have done something bad. You are a bad person.” Or permit family members of the deceased to describe how the crime has hurt them, a practice known as a victim impact statement, and the jury will tend to recommend more severe punishments. Crank up the emotional impact of a victim impact statement by recording it professionally on video and adding music and narration like a dramatic film, and you’ve got the makings of a jury-swaying masterpiece.45

Affective realism intertwines with the law outside the courtroom as well. Imagine that you are enjoying a quiet evening at home when suddenly you hear loud banging outside. You look out the window and see an African American man attempting to force open the door of a nearby house. Being a dutiful citizen, you call 911, and the police arrive and arrest the perpetrator. Congratulations, you have just brought about the arrest of Harvard professor Henry Louis Gates, Jr., as it happened on July 16, 2009. Gates was trying to force open the front door of his own home, which had become stuck while he was traveling. Affective realism strikes again. The real-life eyewitness in this incident had an affective feeling, presumably based on her concepts about crime and skin color, and made a mental inference that the man outside the window had intent to commit a crime.46

A similar bout of affective realism gave birth to Florida’s controversial “Stand Your Ground” law. This law permits the use of deadly force in self-defense if you reasonably believe you’re in imminent danger of death or great bodily harm. A real-life incident was the catalyst for the law, but not in the way that you might think. Here’s how the story is usually told: In 2004, an elderly couple was asleep in their trailer home in Florida. An intruder tried to break in, so the husband, James Workman, grabbed a gun and shot him. Now here’s the true, tragic backstory: Workman’s trailer was in a hurricane-damaged area, and the man he shot was an employee of the Federal Emergency Management Agency (FEMA). The victim, Rodney Cox, was African American; Workman is white. Workman, most likely under the influence of affective realism, perceived that Cox meant him harm and opened fire on an innocent man. Nevertheless, the inaccurate first story became a primary justification for Florida’s law.47

The very history of stand your ground laws is, ironically, potent evidence against their value. It’s impossible to determine reasonable fear for one’s life in a society where racist stereotypes abound and affective realism literally transforms how people see each other. The whole line of reasoning for stand your ground is gutted by affective realism.

If stand your ground doesn’t scare the crap out of you, think about the impact of affective realism on people who legally carry concealed weapons. Affective realism indisputably influences people’s perceptions of threat; therefore it virtually assures that innocent people will be shot by accident. It’s simple: you predict a threat, sensory information from the world says otherwise, but then your control network downplays the prediction error to maintain the prediction of threat. Bam, you’ve shot a harmless fellow citizen. Human brains are built for this sort of delusion, through the same process that produces daydreams and imagination.

I will not wade any further into the national debate about firearms for now, but from a purely scientific perspective, consider this. The founding fathers of the United States had good reasons for protecting a “right of the people to keep and bear Arms” in the Second Amendment of the Constitution, but they were not neuroscientists. Nobody in 1789 knew that the human brain constructs every perception and is ruled by interoceptive predictions. Right now, over 60 percent of people in the United States believe that crime is on the rise (though it’s historically low), and they also believe that owning a gun will make them safer. These beliefs are ripe to lead people, through affective realism, to genuinely see a deadly threat where there is none and to act accordingly. Now that we know definitively that our senses don’t reveal objective reality, shouldn’t this critical knowledge influence our laws?48

As a general rule, the legal system has had a lot of difficulty coming to grips with the mountains of scientific evidence that our senses don’t provide a literal readout of the world. For hundreds of years, eyewitness reports were considered one of the most reliable forms of evidence. When a witness said, “I saw him do it” or “I heard her say it,” these statements were considered to be facts. The law also treated memories as if they entered the brain pristinely, were stored whole, and were later retrieved and played back like a movie.49

Just as jurors cannot pull back the curtain of their own beliefs for direct access to some unblemished version of reality, witnesses and defendants do not report a collection of facts but a description of their own perceptions. One can glance at Serena Williams’s triumphant face at the beginning of chapter 3 and later, on the witness stand, swear on a Bible that Williams was screaming in terror. Any words spoken by eyewitnesses are based on recollections that are constructed in the moment, using past experiences that were themselves constructed.

Psychologist Daniel L. Schacter, one of the world’s experts on memory, tells the story of a brutal rape that took place in Australia in 1975. The victim told police that she’d seen her attacker’s face clearly, identifying him as Donald Thomson, a scientist. Police picked up Thomson the next day based on this eyewitness evidence, but Thomson had an iron-clad alibi: he was being interviewed on television at the time of the rape. It turned out that the victim’s TV was on when the intruder broke into her house, and it was tuned to Thomson’s interview, which ironically was about Thomson’s research on memory distortion. The poor woman had somehow, in her trauma, fused Thomson’s face and identity onto her attacker.50

Most men falsely accused are not so lucky. Jurors place a lot of weight on eyewitness testimony, yet they accept mistaken identifications just as frequently as correct ones, as long as the witnesses sound confident. In one study of convictions that were later overturned by DNA evidence, 70 percent of the accused were convicted based on eyewitness testimony.51

Eyewitness reports are perhaps the least reliable evidence one can have. Memories are not like a photograph—they are simulations, created by the same core networks that construct experiences and perceptions of emotion. A memory is represented in your brain in bits and pieces as patterns of firing neurons, and “recall” is a cascade of predictions that reconstruct the event. Your memories are therefore highly vulnerable to reshaping by your current circumstances, like having your body all worked up on the witness stand or being badgered by a persistent defense attorney.

The law has been slow to accept that memories are constructed, but the situation is gradually changing. The Supreme Courts of New Jersey, Oregon, and Massachusetts are leading the way in this regard. Their jurors now receive instructions that provide step-by-step details—based on years of psychological research—explaining all the ways in which memory can go wrong in eyewitness testimony. They read how memories are constructed and infused with beliefs that can result in distortions and illusions, how the instructions given by lawyers and police can introduce biases, how confidence is unrelated to accuracy, how stress can impair memory, and how eyewitness testimony was a factor in falsely convicting more than three quarters of the people who were exonerated by DNA evidence for crimes that they did not commit.52

Unfortunately, no such guidelines exist to explain to jurors what an emotional expression is, what a mental inference is, or how they are constructed.

The figure of the dispassionate judge, who renders emotionless decisions in strict accordance with the law, is an archetype in many societies. The law expects judges to be neutral, as emotion would presumably get in the way of fair decisions. “Good judges pride themselves on the rationality of their rulings and the suppression of their personal proclivities,” wrote the late U.S. Supreme Court Justice Antonin Scalia, “including most especially their emotions.”53

In some ways, a purely rational approach to legal decision-making sounds compelling and even noble, but as we’ve seen so far, the brain’s wiring doesn’t divide passion from reason. We needn’t work hard to poke holes in this argument; it comes with its own holes pre-drilled.

Let’s start with the idea that a judge can be dispassionate, which should be interpreted as “having no affect” (rather than “having no emotion”). This idea is a biological impossibility unless that person has suffered brain damage. As we discussed in chapter 4, no decision can ever be free of affect as long as loudmouthed body-budgeting circuitry is driving predictions throughout the brain.

Affectless decision-making from the bench is a fairy tale. Robert Jackson, another former Supreme Court justice, described “dispassionate judges” as “mythical beings” like “Santa Claus or Uncle Sam or Easter bunnies.” Direct scientific evidence shows him to be pretty much on target. Remember how judges’ impartiality was easily swayed in parole cases held right before lunchtime, when they attributed their unpleasant affect to the prisoner instead of to hunger (chapter 4)? In another series of experiments, over 1,800 state and federal judges from the United States and Canada were handed scenarios of civil and criminal cases and asked what their rulings would be. Some scenarios were identical except the defendants were portrayed as more likeable or unlikeable. The experimenters found that judges tended to rule in favor of more likeable or sympathetic people.54

Even the U.S. Supreme Court is not immune to leaking passion from the bench. A team of political scientists examined 8 million words spoken by the members of the Court during oral arguments and questioning over thirty years. They found that when justices direct “more unpleasant language” toward an attorney, that side is more likely to lose. You can predict the loser by simply counting the justices’ negative words during questioning. Not only that, but by examining the affective connotations in the judges’ words during oral arguments, you can predict their votes.55

Common sense dictates that judges experience strong affect in the courtroom. How could they not? They hold people’s futures in their hands. Their working hours are filled with heinous crimes and grievously harmed victims. I know how draining this can be, having been a therapist for victims of rape and childhood sexual abuse, and sometimes working with the perpetrators. Judges also encounter defendants who are more likable than the people they have preyed on, a situation that surely is challenging to grapple with, especially in a courtroom full of whispering spectators and bickering attorneys. And sometimes a judge must shoulder the affect of an entire country. Former U.S. Supreme Court Justice David Souter suffered so much while deciding Bush v. Gore that he wept over its deliberations (along with half of the United States). All this mental effort taxes a judge’s body budget. The judge’s life is one of intense and continual emotional labor under the fiction of equanimity.56

Nevertheless, the law continues to hold dear the fiction of the dispassionate judge, even at the highest levels. When Supreme Court Justice Elena Kagan, as a nominee in 2010, was asked whether it was ever appropriate for feelings to help decide a case, she replied to the contrary, “It’s the law all the way down.” Justice Sonia Sotomayor also ran into opposition during her confirmation hearings, as some senators feared that her emotions and empathy were in direct opposition to her abilities to judge fairly. Her take on all this, for the most part, was that judges do have feelings but should not make decisions based on them.

Nonetheless, the evidence is clear that judges are not affectless in their rulings. The next question is: should they be? Is pure reason really the best way to render a wise decision? Imagine a person who is very calmly and coolly weighing the pros and cons of whether another person should die. There’s not a trace of emotion in sight. Like Hannibal Lecter in The Silence of the Lambs, or Anton Chigurh in No Country for Old Men. I am being a bit facetious here, but this kind of dispassionate decision-making is essentially what the law instructs in the sentencing portion of criminal cases. Rather than pretend that affect is absent, it’s better to use affect wisely. As U.S. Supreme Court Justice William Brennan once expressed, “Sensitivity to one’s intuitive and passionate responses, and awareness of the range of human experience, is therefore not only an inevitable but a desirable part of the judicial process, an aspect more to be nurtured than feared.” The key is emotional granularity: having a wide and deep range of concepts (emotional, physical, or otherwise) to make sense of the onslaught of bodily sensations that are the hazards of the job.57

Consider, for example, a judge faced with a defendant like James Holmes, who murdered twelve moviegoers and injured seventy more during a midnight screening of a Batman movie in Aurora, Colorado, in 2012. Such a judge might reasonably construct an experience of anger, but that feeling alone could be problematic; anger could prompt the judge to punish the defendant too harshly for the sake of retribution, threatening the moral order that the trial is founded on. To balance his view, some legal scholars argue, the judge could try to cultivate empathy for the defendant, who perhaps is insane or a victim of some sort himself. Anger is a form of ignorance; in this case, ignorance of the defendant’s perspective. Holmes clearly struggled with serious mental illness for years. He tried to kill himself for the first time when he was eleven years old, and has attempted suicide several times in jail. Empathy is extremely difficult to cultivate for someone who opens fire on innocents in a movie theater. Even remembering that the defendant is a human being, no matter how severe or gruesome the crime, might be a struggle at times, but this is when empathy might be most important. It may prevent a judge from going too far in punishing the offender during sentencing, and help to ensure the morality of penal decision-making and retributive justice. This is the type of emotional granularity that makes for wise use of emotion in the courtroom.58

When it comes right down to it, the most useful emotions for a judge to feel depend on the judge’s goals during the trial. What, for example, is the goal of punishment? Is it retribution? Deterrence to avoid future harm? Rehabilitation? This depends on the law’s theory of the human mind. Whatever the goal, punishment must be enacted so that the defendant’s humanity is preserved, while the victim’s humanity is honored, even if the defendant commits an unspeakable act. To do otherwise puts the legal system itself in jeopardy.

Why is it that you can sue someone for breaking your leg but not for breaking your heart? The law considers emotional damage to be less serious than physical damage and less deserving of punishment. Think about how ironic this is. The law protects the integrity of your anatomical body but not the integrity of your mind, even though your body is just a container for the organ that makes you who you are—your brain. Emotional harm is not considered real unless accompanied by physical harm. Mind and body are separate. (Let’s all raise a glass to René Descartes here.)

If there is one thing you can take away from this book, it is that the boundaries between mental and physical are porous. Chapter 10 explained a bit about the ways in which emotional harm from chronic stress, parental emotional abuse and neglect, and other psychological ills can ultimately cause physical illness and injury. And we’ve seen how stress and proinflammatory cytokines lead to numerous health problems, including brain atrophy, and increase the likelihood of cancer, heart disease, diabetes, stroke, depression, and a host of other illnesses.59

But that’s not the whole story. Emotional harm can shorten your life. Inside your body, you have little packets of genetic material that sit on the ends of your chromosomes like protective caps. They’re called telomeres. All living things have telomeres—humans, fruit flies, amoebas, even the plants in your garden. Every time one of your cells divides, its telomeres get a little shorter (although they can be repaired by an enzyme called telomerase). So generally their size slowly decreases, and at some point, when they are too short, you die. This is normal aging. But guess what else causes your telomeres to get smaller? Stress does. Children who experience early adversity have shorter telomeres. In other words, emotional harm can do more serious damage, last longer, and cause more future harm than breaking a bone. This means the legal system might be misguided when it comes to understanding and gauging the degree of lasting injury that can come from emotional harm.60
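To make that shortening dynamic concrete, here is a minimal toy sketch, not from the book and not based on biological data: every parameter (starting length, loss per division, occasional telomerase repair, and an extra per-division loss standing in for chronic stress) is an invented number chosen purely for illustration, and the function name is hypothetical.

```python
import random

def divisions_until_critical(start_length=10_000, loss_per_division=50,
                             extra_stress_loss=0, critical_length=4_000,
                             repair_chance=0.1, repair_amount=30):
    """Count cell divisions until telomere length falls below a critical threshold.

    All values are made-up illustration numbers, not biological measurements.
    """
    length = start_length
    divisions = 0
    while length > critical_length:
        length -= loss_per_division + extra_stress_loss  # shortening with each division
        if random.random() < repair_chance:             # occasional repair by telomerase
            length += repair_amount
        divisions += 1
    return divisions

random.seed(0)
print("Divisions to critical length, baseline:", divisions_until_critical())
print("Divisions to critical length, with added stress:",
      divisions_until_critical(extra_stress_loss=20))
```

Under these toy assumptions, the "stressed" cell line reaches the critical length in noticeably fewer divisions, which is the shape of the argument being made here: anything that accelerates telomere loss shortens the runway.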

As another example, consider chronic pain. The law treats chronic pain by and large as “emotional” because there’s no observable tissue damage. In these cases, the law usually concludes that the suffering is not real enough to merit compensation. People who suffer from chronic pain are often diagnosed as mentally ill, and even more so if they opt for an invasive operation to try to reduce their “illusory” suffering. Medical insurance companies deny treatment since chronic pain is considered psychological, not physical. The sufferer cannot work, yet no compensation is provided. But as we saw in the preceding chapter, chronic pain is likely a brain disease of prediction gone wrong. The suffering is real. The law is missing the point that prediction and simulation are the normal way that the brain works, and chronic pain is a difference of degree, not kind.61

Interestingly, the law does accept that other types of harm can be absent now but show up in the future. A prominent example is chemical harm such as Gulf War Syndrome, a chronic, multi-symptom illness whose effects did not appear until later and which is allegedly caused by unknown factors during the Gulf War. Gulf War Syndrome is controversial; there is no consensus on whether it’s actually a distinct medical condition. Regardless, thousands of veterans have taken their claims of Gulf War Syndrome to court. There is no analogous legal avenue for stress or other harm seen as emotional. (Awards for pain and suffering are relatively rare.)

Having made this observation, I must point out that the law is deeply inconsistent and even ironic in its view of emotional harm when you consider international norms for torture. The Geneva Conventions prohibit psychological harm to prisoners of war, and the U.S. Constitution likewise forbids “cruel and unusual punishment.” So it’s illegal for a government to torture a prisoner psychologically, but it’s perfectly legal to place a prisoner in solitary confinement for long periods, even though the stress of confinement may shorten the prisoner’s telomeres and therefore his life.62

It’s also perfectly legal for a high school bully to insult, torment, and humiliate your children even though this will shorten their telomeres and potentially their lifespan. When a group of middle-school girls deliberately excludes another girl, they are acting with intent and motivation to cause suffering, yet legal action is rare. In one highly publicized case, fifteen-year-old Phoebe Prince hanged herself in 2010 after months of verbal aggression and physical threats. Six teenagers were criminally prosecuted for harassment, stalking, assault, and assorted civil rights violations after they bullied her and then posted crude comments on her Facebook memorial page. This case prompted Massachusetts to pass anti-bullying laws. These laws are a start, but they punish only the most extreme cases. How do you regulate the playground in a legal context?63

Bullies intend to cause suffering, but is the intent to cause harm? We cannot know for sure, but in most cases I doubt it. Most kids are unaware that the mental anguish they inflict can translate into physical illness, atrophied brain tissue, reduced IQ, and shortened telomeres. Kids will be kids, we say. But bullying is a national epidemic. In one study, over 50 percent of children nationwide reported being verbally or socially bullied at school, or having participated in bullying another child at school, at least once within a two-month period. Over 20 percent reported being the victim or perpetrator of physical bullying, and over 13 percent reported involvement with electronic bullying. Bullying is considered a serious enough childhood risk, with potential lifelong health consequences, that at press time, the U.S. Institute of Medicine and the National Research Council’s Committee on Law and Justice are producing a comprehensive report on its biological and psychological ramifications.64

If you suffer mental anguish in the moment, whether from bullying or another cause, should your suffering count as harm, and should the perpetrators be punished? A recent legal case implies the answer is sometimes yes. A company in Atlanta demanded DNA samples from its employees because someone was contaminating its warehouse with feces. It’s illegal to take genetic information from someone without his or her consent (it violates the Genetic Information Nondiscrimination Act), but the case was won largely on emotional grounds. The two plaintiffs were awarded about $250,000 each to compensate them for feeling humiliated and bullied, plus a remarkable $1.75 million in punitive damages for “emotional distress and mental anguish.” The large award was not for the plaintiffs’ actual emotional suffering but for their potential emotional suffering in the future. After all, their personal health information could be used against them at any time for the rest of their lives. This fear of the future was easy for jurors to simulate and therefore empathize with. In a chronic pain case, it’s harder: how do you see the invisible? There are no injuries to look at, and nothing to help your brain create the simulation, so empathy suffers and consequently so does compensation.65

The legal system has difficulty dealing with mental anguish for purely practical reasons. How do you measure it objectively if emotions have no essences or fingerprints? Also, physical harm like a broken leg is usually more economically predictable than emotional harm, which is far more variable. And how do you distinguish everyday emotional pain from lasting harm?66

Perhaps the most important question here is: Whose suffering counts as harm? Who deserves our empathy and therefore the full protection of the law? If you negligently or intentionally break my arm, you owe me. But if you negligently or intentionally break my heart, you don’t, even if we were close for a long time, regulating each other’s body budgets, and the breakup will put me through a physical process that can be as excruciating as withdrawal from an addictive drug. You can’t sue someone for heartbreak, no matter how much you might want to (or how much they deserve it). The law is about creating and enforcing social reality. Empathic claims about pain are fundamentally claims about whose rights matter . . . and whose humanity matters.67

As you’ve seen, the law embodies the classical view of emotion and the view of human nature from which it derives. This essentialist story is a folktale that is not respected by the brain and its connection to the body. Therefore, based on today’s scientific view of the brain, I’m going to go out on a limb with some recommendations for jurors, judges, and the legal system in general. I am not a legal scholar, and I realize that the concerns of science are not the same as those of the law. I realize also that it’s one thing to speculate about basic dilemmas of humanity in the pages of a book but quite another to establish legal precedent on them. But it’s important to try to build bridges between disciplines. Neuroscience and the legal system are seriously out of sync on fundamental issues of human nature. These discrepancies must be addressed if the legal system is to remain one of our most impressive achievements of social reality and continue protecting people’s inalienable rights to life, liberty, and the pursuit of happiness.

I’d begin by educating judges and jurors (and other legal actors like attorneys, police officers, and parole officers) about the basic science of emotion and the predictive brain. The New Jersey, Oregon, and Massachusetts Supreme Courts are taking steps in the right direction by formally instructing jurors that human memory is constructed and fallible. We need a similar approach for emotion. Toward that end, I propose a set of five teaching points. You might call it an affective science manifesto for the legal system.

The first teaching point in the manifesto concerns so-called expressions of emotion. Emotions are not expressed, displayed, or otherwise revealed in the face, body, and voice in any objective way, and anyone who determines innocence, guilt, or punishment needs to know this. You cannot recognize or detect anger, sadness, remorse, or any other emotion in another person—you can only guess, and some guesses are more informed than others. A fair trial depends on synchrony between experiencers (defendants and witnesses) and perceivers (jurors and judges), and this can be difficult to achieve in many circumstances. For example, some defendants are better at using their nonverbal movements to communicate information about their emotions, such as remorse. Some jurors will be better at synchronizing their concepts with a defendant than others will. That means jurors might need to work harder to perceive emotions in challenging situations, like when they disagree with a defendant or witness on a political issue, or when the other person is of a different ethnicity. Jurors should try to put themselves in the other person’s shoes to facilitate this synchrony and cultivate empathy.68

The second point is about reality. Your sight, hearing, and other senses are always colored by your feelings. Even the most objective-sounding evidence is colored by affective realism. Jurors and judges must be educated about the predictive brain and affective realism, how their feelings literally alter what they see and hear in court. Perhaps the protestor video study I mentioned, where political beliefs caused people to perceive violent intent or not, could serve as an educational example. Jurors must also understand how affective realism influences eyewitnesses. Even a simple statement like “I saw him holding the knife” is a perception infused with affective realism. Eyewitness testimony does not relay cold, hard facts.

The third point is about self-control. Events that feel automatic are not necessarily completely outside your control and are not necessarily emotional. Your predicting brain provides the same range of control when you construct an emotion as when you construct a thought or a memory. The defendant in a murder trial is not a man-shaped sea anemone at the mercy of his environment, triggered by anger to pursue an inevitable, aggressive act. Most instances of anger, no matter how automatic they feel, don’t lead to murder. Anger can also unfold very deliberately over a long time, so there is nothing inherently automatic about it. You have relatively more responsibility for your actions when you have relatively more control, regardless of whether the event is an emotion or a cognition.

Fourth, beware the “my brain made me do it” defense. Jurors and judges should be skeptical of claims that certain brain regions directly cause bad behavior. That is junk science. Every brain is unique; variation is normal (think degeneracy) and not necessarily meaningful. Unlawful behavior has never been definitively localized to any brain region. I am not referring here to foreign growths like tumors or obvious signs of neurodegeneration, which in some cases, such as certain types of frontotemporal dementia, can make it harder for people to conform their actions to the law. Even so, many tumors and many cases of neurodegenerative damage cause no run-ins with the legal system at all.

The final teaching point is to be mindful of essentialism. Jurors and judges need to know that every culture is full of social categories like sex, race, ethnicity, and religion. These must not be mistaken for physical, biological categories with deep dividing lines in nature. Also, emotion stereotypes don’t belong in a courtroom. Women should not be punished for feeling anger rather than fear toward their aggressors, and men should not be punished for feeling helpless and vulnerable rather than brave and aggressive. The law’s reasonable person standard is a fiction based on stereotypes, and it is inconsistently applied. Perhaps it’s time to bury the reasonable person and conceive some other standard for comparison.69

Beyond the affective science manifesto, we also have the longstanding myth of the dispassionate judge, which is both propagated and questioned by members of the U.S. Supreme Court and other legal experts. Scholars may debate in legal journals about the value of emotion in judicial action, but the anatomy of the human brain makes it implausible for any human, including a judge, to escape the influence of interoception and affect when making decisions. Emotions are neither the enemy nor a luxury but a source of wisdom. Judges need not reveal their emotions (just as therapists learn not to), but they must be aware of them and explicitly use them to the best of their ability.

To employ emotions wisely, I suggest that judges learn to experience emotion with high granularity. When they feel unpleasant, it helps if they can categorize finely enough to experience (say) anger as distinct from irritation or hunger. Anger can be a reminder to cultivate empathy toward an unsympathetic defendant, a gullible plaintiff, a belligerent witness, or a particularly intrusive attorney. Without empathy, anger can foster the type of retributive punishment that risks undermining the very notion of justice at the foundation of the legal system. Judges can cultivate higher granularity using the exercises I recommended in chapter 9: collecting experiences, learning more emotion words, using conceptual combination to invent and explore new emotion concepts, and deconstructing and recategorizing their emotional experiences in the moment. It sounds like a lot of work, but like any skill, it becomes habitual with practice. Also, it would not hurt for judges who face defendants from other cultures to be briefed on the different cultural norms for emotional experience and communication.

Judges might also be educated to reduce the influence of affective realism when selecting jurors (a process known as voir dire). Often, judges and attorneys weed out jurors by asking them direct, transparent questions such as “Can you be objective, fair, and impartial in this case?” or “Do you know the defendant?” They also try to assess superficial similarities between jurors and defendants. For example, if a financial advisor stands accused of embezzling millions of dollars of his clients’ retirement investments, the judge might ask potential jurors whether they themselves have been victims of embezzlement, or whether a close relative works in the financial industry. But surface markers of similarities and differences are only the tip of the iceberg. It might be wise to examine a juror’s affective niche to understand how the juror might predict during a trial, which could indicate biases that shape perception. For example, a judge could ask what magazines the jurors read, what movies they prefer to see, or whether they play first-person shooter games, using standard assessment techniques from psychology. Such information would allow a judge to consider the potential biases of jurors based on how they spend their time, rather than just asking jurors directly about their biases (since such self-reports are not necessarily valid).70

My suggestions so far address low-hanging fruit. Now we’re ready for the really difficult stuff—scientific considerations that could change fundamental assumptions in the law.

We already know that our senses do not reveal reality, and judges and jurors necessarily suffer from affective realism. These factors, along with the rest of our knowledge of mind and brain, lead to a fairly radical idea (I’m almost afraid to say it): perhaps it is time to reevaluate trial by jury as the basis for determining guilt and innocence. Yes, it’s enshrined in the U.S. Constitution, but the writers of that landmark document had no inkling of how the human brain works, nor that one day we could detect a defendant’s DNA under a victim’s fingernails. Before DNA evidence, the law could not say whether a judgment of guilt was true or false. The legal system could only decide whether or not the judgment was rendered fairly, meaning that the rules and procedures of law were followed consistently. The law was therefore not about truth but consistency. Due process was about avoiding procedural errors in rendering a decision of guilt or innocence, not about the validity of the decision itself. Today’s legal system works only if we assume that consistency produces a just outcome. DNA testing is changing all that. It’s not perfect, but it’s immeasurably more objective than the affect-laden perceptions of human jurors.71

When DNA evidence is unavailable or irrelevant, perhaps trials might dispense with a jury and instead feature the collective wisdom of multiple judges working together, randomly drawn from a larger pool of judges. As I’ve said already, I’m not a legal scholar, just a scientist, so perhaps wiser legal minds can construct a balanced judicial panel system in better ways. A panel of skilled judges who are trained to be self-aware and emotionally granular might avoid affective realism more effectively than a jury would. It’s not a perfect solution by any means: in the United States at least, judges tend to be on the older side, predominantly European American, and may overrepresent a particular set of beliefs while maintaining the illusion that they are free of them. Judges are also more likely than juries to hand out maximum sentences. But one thing is certain: every day in America, thousands of people appear before a jury of their peers and hope they will be judged fairly, when in reality they are judged by human brains that always perceive the world from a self-interested point of view. To believe otherwise is a fiction that is not supported by the architecture of the brain.72

And now we get to the toughest issue of all: what it means to control your behavior and therefore be responsible for your actions. The law (like much of psychology) usually considers responsibility in two parts: actions caused by you, where you have more responsibility, and actions caused by the situation, where you have less. This simple dichotomy of internal versus external does not mesh with the reality of the predictive brain.

In a construction view of human nature, every human action involves three types of responsibility, not two. The first is traditional: your behavior in the moment. You pull the trigger. You grab the money and run. (The legal system names this behavior actus reus, the harmful action.)

The second type of responsibility involves your specific predictions that brought about the unlawful act (known as mens rea, the guilty mind). Your behavior is not caused in a single moment; it is always driven by prediction. When you steal money from an open cash register, you are an agent in the moment, but the ultimate cause of your behavior also includes concepts like “Cash Register,” “Money,” “Ownership,” and “Stealing.” Each of these concepts is associated with a large and diverse population of instances in your brain, and based on them, you issued predictions that led to your action. Now, if other people with similar concepts in the same situation (i.e., the reasonable person) would also steal the cash, well, you might be less culpable for your actions. However, they may well have left the cash untouched, in which case your responsibility is greater.

The third type of responsibility relates to the content within your conceptual system, separately from how your brain uses that system to predict when breaking the law. A brain does not compute a mind in a vacuum. Every human being is the sum of his or her concepts, which become the predictions that drive behavior. The concepts in your head are not purely a matter of personal choice. Your predictions come from the cultural influences you were pickled in. When a European American police officer shoots an unarmed African American civilian, and the officer honestly saw a gun in the civilian’s hands due to affective realism, the event has roots in something outside the moment. Even if the officer were not overtly racist, his actions were partly caused by his concepts, formed by a lifetime of experience, which includes American stereotypes about race. The victim’s concepts and actions are likewise informed by a lifetime of experience, which includes American stereotypes of cops. All of your predictions are shaped not just by direct experience but also indirectly by television, movies, friends, and the symbols of your culture. While it’s exciting to escape into a world of urban crime in a movie, or to retreat from the stress of the day by watching an hour or two of a police drama on TV, routine depictions of police conflicts have a cost. They fine-tune our predictions about the danger posed by people of certain ethnicities or socioeconomic status. Your mind is not only a function of your brain but also of the other brains in your culture.73

This third domain of responsibility cuts two ways. Sometimes it’s trivialized as “society is to blame,” a phrase lampooned as bleeding-heart liberal sentiment. I am saying something more nuanced. If you commit a crime, you are indeed to blame, but your actions are rooted in your conceptual system, and those concepts don’t just appear in a puff of magic. They are forged by the social reality you live in, which gets under your skin to turn genes on and off and wire your neurons. You learn from your environment like any other animal. Nevertheless, animals also shape their own environments. So as a human being, you have the ability to shape your environment to modify your conceptual system, which means that you are ultimately responsible for the concepts that you accept and reject.

As we discussed in chapter 8, the predictive brain expands the horizon of self-control beyond the moment of action and therefore broadens your responsibility in a complicated way. Your culture might teach you that people of a certain skin color are more likely to be criminals, but you have the ability to mitigate the harm that such beliefs can cause, and hone your predictions in a different direction. You can befriend people of different skin tones and see for yourself that they’re law-abiding citizens. You can choose not to watch TV shows that reinforce racist stereotypes. Or you can blindly follow the norms of your culture, accept the stereotyped concepts bestowed upon you, and increase the chances that you’ll treat certain people badly.

Dylann Roof, the man who shot African American members of a Bible study group, chose to surround himself with symbols of white supremacy. Sure, he grew up in a society struggling with racism, but so did most adults in the United States, and most of us don’t go around shooting people. So at the level of neurons, you and your society jointly cause certain predictions to become more likely in your brain. However, you still bear responsibility to overcome harmful ideology. The difficult truth is that each of us, ultimately, is responsible for our own predictions.

The law has precedent for this prediction-based view of responsibility. For example, if you drive drunk and hit someone with your car, you are responsible for the harm you caused, even though you could not control your limbs effectively in your inebriated state. You should have known better, because every adult in our society knows that drunkenness carries a risk of bad decision-making, so you are culpable for bad things that happen downstream.

The law calls this a foreseeability argument. It doesn’t matter whether you intended to cause harm or not: you are liable. And we now have enough scientific evidence to extend the foreseeability argument from large-scale common sense to the millisecond predictions of the brain. You know full well that some of your concepts, such as racial stereotypes, can lead you into trouble. If your brain predicts that an African American youth in front of you is holding a weapon, and you perceive a gun where none is present, you have some degree of culpability even in the face of affective realism, because it is your responsibility to change your concepts. If you educate yourself and inoculate yourself against such stereotypes, expanding your conceptual system with the goal to change your predictions, you still might mistakenly see a gun where none is present, and a tragedy still might occur. But your culpability is diminished somewhat, because you’ve acted responsibly to change what you can.

Eventually, the legal system must come to grips with the tremendous influence of culture on people’s concepts and predictions, which determine their experiences and actions. After all, the brain wires itself to the social reality it finds itself in. This ability is one of the most important evolutionary advantages we have as a species. So we bear some responsibility for the concepts we help wire into future generations of little human brains. But this is not an issue for criminal law. It is actually a policy issue relevant to the First Amendment, which guarantees the right to free speech. The First Amendment was founded on the notion that free speech produces a war of ideas, allowing truth to prevail. However, its authors did not know that culture wires the brain. Ideas get under your skin, simply by sticking around for long enough. Once an idea is hardwired, you might not be in a position to easily reject it.

The science of emotion is a convenient flashlight for illuminating some of the law’s long-held assumptions about human nature—assumptions that we now know are not respected by the architecture of the human brain. People don’t have a rational side and an emotional side, with the former regulating the latter. Judges can’t set aside affect to issue rulings by pure reason. Jurors can’t detect emotion in defendants. The most objective-looking evidence is tainted by affective realism. Criminal behavior can’t be isolated to a blob in the brain. Emotional harm is not mere discomfort but can shorten a life. In short, every perception and experience within the courtroom—or anywhere else—is a culturally infused, highly personalized belief, corrected by sensory inputs from the world, rather than the result of an unbiased process.

We’re at a turning point where the new science of mind and brain can begin to shape the law. By educating judges, jurors, attorneys, witnesses, police officers, and other participants in the legal process, we should be able to produce a legal system that is ultimately more fair. Perhaps we cannot move away from trial by jury anytime soon, but even simple steps, like educating jurors that emotions are constructed, can improve the current situation.

For now at least, the legal system still considers you to be an emotional beast enrobed in rational thought. Throughout this book, we’ve systematically challenged this myth by evidence and observation, but there’s one remaining assumption that we haven’t questioned yet: are beasts even emotional? Are the brains of our close primate cousins, such as chimps, capable of constructing emotion? What about dogs: do they have concepts and social reality as we do? Just how unique in the animal kingdom are our emotional abilities? We’ll explore these topics in the next chapter.