
The Deadly Flaw in Our Judgment, with Cass Sunstein (Ep. 73)

Scholar examines persistent decision-making errors in ‘Noise: A Flaw in Human Judgment,’ co-authored with Daniel Kahneman and Olivier Sibony


Show Notes

Many of the most important moments in our lives rely on the judgment of others. We expect doctors to diagnose our illnesses correctly, and judges to hand out rulings fairly. But there’s a massive flaw in human judgment that we’re just beginning to understand, and it’s called “noise.”

In a new book, former University of Chicago law professor Cass Sunstein takes us through the literature on noise, explains how it shows up in our world and what we can do to fight it. From misdiagnoses to unequal treatment in courtrooms, Noise: A Flaw in Human Judgment—co-authored with Daniel Kahneman and Olivier Sibony—examines why noise is the “silent killer” we didn’t even know was there.

Subscribe to Big Brains on Apple Podcasts, Stitcher and Spotify.
(Episode published July 15, 2021)


Transcript:

Paul Rand: Some of the most important moments in our lives rely on the judgment of others. A doctor diagnosing your illness, a judge deciding the harshness of your sentence, or a hiring committee deciding whether you should get that new job. We like to believe these judgments are made fairly and correctly.

Cass Sunstein: Our system should be fair to people. It shouldn’t treat them as beneficiaries of good or bad luck just by virtue of the decision-maker. If the decision-maker is in a good mood, maybe you get promoted. If the decision-maker is in a terrible mood, maybe you don’t. That’s really unfair.

Paul Rand: We all know that human judgment can be flawed. And we’re all familiar with the biggest culprit that causes error in judgment: bias.

Cass Sunstein: Bias is obviously bad. If it’s racial bias, or discriminatory bias on the basis of sex, that’s self-evidently bad.

Paul Rand: But there’s another error-causing offender that most people have never heard of. It’s called noise.

Cass Sunstein: Noise is a more invisible enemy. And noise is a statistical concept, not a causal concept. Bias is a causal concept. For noise, you don’t know what the cause is.

Paul Rand: This is Cass Sunstein. He’s a hometown hero here at the University of Chicago, having served in the law school for 27 years. Now he’s a professor at Harvard Law School, Senior Counselor at the Department of Homeland Security in the Biden administration, and one of three co-authors of a new book, Noise: A Flaw in Human Judgment.

Cass Sunstein: The kind of punchline of the book is: wherever there is judgment, there is noise, and more of it than you think.

Paul Rand: Noise is a little-understood concept.

Cass Sunstein: The human mind doesn’t get all excited about noise. And one of the goals of the book is to get the human mind more excited about noise, as it were.

Paul Rand: Noise is around us all the time. It appears in medical misdiagnoses, when people get higher sentences than they may deserve or than others receive for the same crime, or when the wrong people are hired for crucial jobs.

Cass Sunstein: In many organizations, once you have some kind of audit of noise, it’s scandalous and people say there ought to be a law.

Paul Rand: From the University of Chicago Podcast Network, this is Big Brains, a podcast about the pioneering research and pivotal breakthroughs that are reshaping our world today. Today: noise. I’m your host, Paul Rand. Noise is a tricky concept, but once you understand it, you can’t stop seeing it everywhere. The best place to start is the difference between noise and bias.

Cass Sunstein: So if you have a scale that every morning says you’re five pounds heavier than you actually are, that is a biased scale. If you have a government official who thinks that because we haven’t had a pandemic in the United States in a long time, this is not going to be a pandemic, when in fact the objective analysis suggests that it is, then we have a biased official.

Paul Rand: When we are consistently prejudiced one way or another, for or against something or someone, that’s bias. And consistency is really the key.

Cass Sunstein: A systematic tendency to err in a specific direction, compared to the accurate value.

Paul Rand: If bias is consistent error, noise is the opposite. It’s scattered and random, much like the noise you hear all around you. In all that seeming randomness, it can be hard to pinpoint exactly what’s causing the problem. Unlike bias, which can often be pinned down, noise is all the invisible stuff affecting decisions that we can’t see. If you removed bias completely from a set of judgments and still saw variation, what you’re left with is noise.

Cass Sunstein: So you can think of it this way. If people are at a shooting range, and everyone shoots at the target accurately, there’s no noise and there’s no bias. If everyone shoots to the right of the target, at exactly the same point, then there’s bias, but no noise. If everyone shoots in a random way, all over the target, then it’s noisy, but not biased. And that shooting range example maps onto a lot of organizations.
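To make the shooting-range picture concrete, here is a minimal Python sketch (an illustration, not from the book): bias is the average error of a set of judgments relative to the true value, and noise is how much the judgments scatter around each other. The sentencing numbers and the “true” value of 60 months are invented for the example.

```python
import statistics

def bias_and_noise(judgments, true_value):
    """Return (bias, noise) for a list of numeric judgments.

    bias  = mean error relative to the true value (systematic deviation)
    noise = standard deviation of the judgments (the scatter that remains
            even if the average were exactly right)
    """
    errors = [j - true_value for j in judgments]
    return statistics.mean(errors), statistics.stdev(judgments)

# Hypothetical sentences (in months) from six judges for the same case,
# where 60 months is taken as the "right" answer for illustration.
judgments = [48, 72, 60, 90, 36, 66]
print(bias_and_noise(judgments, true_value=60))
```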

Paul Rand: And bias is something we all kind of think we understand, a concept we recognize and work with on a regular basis.

Cass Sunstein: Now, bias has a kind of charisma. It’s a little like the person on the screen you can’t take your eyes off of. Elvis Presley in the early days. Noise is more like the character in a murder mystery to whom you never pay attention, but who turns out to be the killer. If you see the human mind as a measuring instrument, the human mind is noisy and is often a killer. Sometimes literally a killer, as in hospitals.

Paul Rand: So how does this killer operate? What does noise look like? The easiest place to see it is in the criminal justice system.

Cass Sunstein: If you have judges, let’s start with judges, some are just really tough on criminals and others aren’t. You can have noise in the system where some judges are going to say, let’s say with drug offenders, ‘‘I’m going to give the maximum sentence.’’ And others are going to say, ‘‘I’m going to give the minimum sentence.’’ That will create a noisy system in which criminal defendants are subject to a lottery, depending on whom they draw as their particular judge.

Paul Rand: One study from 1981 found that when 200 judges were given the same 16 hypothetical cases, they unanimously agreed on a prison term in only three of the 16. In one case, terms ranged from a mean of 8.5 years to life in prison. That’s a life-altering difference based purely on the luck of the draw of the judge that you get.

Cass Sunstein: And you could see this in grants of asylum, where some people will be really tough, some people not so much. You could see it in grading in universities where one professor might be very reluctant to give an A and another might be very happy to give an A. And that could create noise across the system.

Paul Rand: Another study of asylum seekers found that one judge admitted only 5% of applicants, while another admitted 88%. That’s a huge difference. That’s noise in the system. But there can also be noise within a single individual.

Cass Sunstein: So if you are really happy, let’s say because the Chicago Bears won and you really care about them, or the Chicago White Sox, then you might make a judgment that is more lenient, or more kind, with respect to, let’s say, a criminal defendant or a job applicant, than you would otherwise make. And we actually have data with respect to sports teams and weather that suggest judgment can be affected by the success of sports teams and by the weather. And that can produce noise within the person, and so noise within the system.

Paul Rand: It sounds absurd, but it’s true. One study of a million and a half judicial decisions found that judges were harsher in the days following a loss by a local football team. Another review of 200,000 immigration court decisions found that people are less likely to get asylum when it’s hot outside. These big moments in people’s lives shouldn’t come down to the temperature.

Cass Sunstein: There are two reasons why noise is bad. The first is unfairness. Arbitrariness sometimes refers to [inaudible 00:08:07] prejudice, but it often refers to being struck by lightning. And our system should be fair to people. It shouldn’t treat them as beneficiaries of good or bad luck just by virtue of the decision-maker. So that’s the first reason. The second [inaudible 00:08:25] is cost. If you have a bunch of doctors in a hospital who are very optimistic about patients’ prospects and therefore tell them to go home, and a bunch who are very pessimistic and then run a zillion tests, those mistakes add up.

Cass Sunstein: And that could cause very serious health problems, where the people who go home might get sick and die, and the people who are tested might spend a lot of money in the long run. Too many tests can cause health problems. We think of medicine, rightly, as something which has expertise and specialization. So the idea that in diagnosing endometriosis, or lung cancer, or heart disease, doctors would be very noisy, that was not expected by me. So it just is the case that in medicine, getting a second opinion is often a really good idea, because the second opinion isn’t going to be the same as the first.

Cass Sunstein: And that is alarming. Now, we don’t want to be too hard on doctors. The level of noise isn’t out of control, except in psychiatry, where it really is, but it’s much higher than one would think. And as many hospitals know, that’s a problem, because if two doctors disagree, maybe because one is biased in one or another way, one of them has to be wrong. And what are we going to do about that?

Paul Rand: So what are we going to do about it? It’s clear that this level of unfairness is dangerous, and maybe even deadly. Solutions for cutting through the noise are after the break. So now that we’ve gotten our arms around what noise is, you also talk about some solutions for it. And there’s an inherent problem: people can recognize bias and maybe work against it, but noise is really hard to recognize. One of the concepts I actually thought was kind of interesting is what you call decision hygiene. I wonder if you can tell me what that means.

Cass Sunstein: The idea of decision hygiene is designed to suggest that when you wash your hands or engage in hygienic practices, you’re counteracting an enemy that you can’t identify. And the idea of decision hygiene is that there are things you can do that can reduce noise, maybe even eliminate it, even if you don’t know exactly what’s responsible for it.

Paul Rand: You also spend some time talking about different remedies. One of the other ones I thought was interesting, and that maybe comes up in some of these groups, is the wisdom of crowds. We’re probably not going to get a crowd of judges to weigh in on something, but tell us what you mean by the wisdom of crowds and how that fits into this.

Cass Sunstein: Okay. So if you get a group of 20 people to decide something, chances are that another group of 20 similar people is going to make the same decision. Maybe I can give an example from jury behavior, where we actually have data, which will make the point clear.

Paul Rand: Sure.

Cass Sunstein: We find that individual judgments about the dollar amounts people should pay for misconduct are really noisy. Crazy noisy. But if you aggregate individuals into groups of six or 12, it will be much less noisy. If you take the average of a group of six, or even better the average of a group of 12, that will be a pretty good predictor of what another group of 12 is going to do. Whereas what an individual is going to do is a terrible predictor of what another individual is going to do.

Cass Sunstein: And if this seems a little abstract: when hospitals ask four doctors or six doctors to make a decision and they take, let’s say, the average view or the majority view, that is likely to cut both bias and noise. Which is why aggregating independent judgments, with ‘‘independent’’ in huge font and in bold letters, is a great noise buster.
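As a rough illustration of that point (a simulation of my own, not the authors’ jury data): if each individual’s dollar award for the same case is an independent, noisy judgment, the averages of 12-person groups vary far less from group to group than individuals do from person to person.

```python
import random
import statistics

random.seed(0)

def individual_award():
    # Hypothetical noisy dollar award for the same case of misconduct.
    return random.gauss(100_000, 40_000)

individuals = [individual_award() for _ in range(1200)]
group_means = [statistics.mean(individual_award() for _ in range(12))
               for _ in range(100)]

print("spread of individual awards: ", round(statistics.stdev(individuals)))
print("spread of 12-person averages:", round(statistics.stdev(group_means)))
# The second number is roughly 1/sqrt(12) of the first: averaging
# independent judgments shrinks noise, though it does nothing for bias.
```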

Paul Rand: One of the other things that was kind of interesting is something you call the cascade effect. I wonder if you can tie that into this whole concept we were just talking about.

Cass Sunstein: Great. That’s an important qualification of what I just said, and it gives some clarity. So let’s talk about a study done a number of years ago on the popularity of songs.

Paul Rand: Okay.

Cass Sunstein: There was a website that had lots of bands and lots of songs. Some of them had eerily perfect titles for our book. I think one is actually called I Am Error.

Paul Rand: Okay.

Cass Sunstein: In the control group, you could get a sense of what songs people liked. But the experimental groups, the treatment groups, were ones in which people could see which songs were popular within their particular group.

Paul Rand: Okay.

Cass Sunstein: And the prediction of, let’s say, the meritocrats is that the best songs will ultimately be regarded as such, and the worst ones as such. But that’s not what happened. What happened was that while the best songs never completely crashed and the worst songs never went right to the top, other than that, anything could happen. So I Am Error could be really popular, a terrific success, in one world if it got early downloads. And it could be really unpopular in another group, where it didn’t get early downloads.

Cass Sunstein: And the coolness of the paper is to show that because of social influences, groups can go in very unpredictable directions, depending on what is seen early on as liked by group members. And it fits with how groups sometimes work, if people aren’t making independent judgments. So suppose you’re in a group of people in which, let’s say, the first two members say, with great confidence and enthusiasm, let’s open an office in Singapore. And the third, who let’s say has her private doubts, thinks, well, I have my private doubts, but the first two think it’s great, so I guess I’ll say so too.

Cass Sunstein: And then the fourth thinks, I have a lot of private doubts, but the first three said so, so I’m going to say, sure, it sounds like a good idea. And that [inaudible 00:14:54] opens the office in Singapore. You could have another group that’s basically identical, where the first two speakers happen to be persons three and four in the group I just described. And they say, ‘‘I don’t think it’s a good idea.’’ And then the third and fourth speakers in that second group, who were the first two speakers in the first group, say, ‘‘I guess it’s not a good idea.’’ And that means that you can see a lot of noise across groups, because of social influences and who speaks first and with the most enthusiasm and authority.
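Here is a toy simulation of that cascade (my illustration, not the song study or the authors’ example): each person has a private view but goes along once two earlier speakers have committed the other way, so two groups with identical members can land in opposite places depending only on speaking order.

```python
def group_decision(private_doubts, speaking_order):
    """Return the votes cast, in speaking order (True = "open the office").

    private_doubts[i] is True if person i privately doubts the plan.
    A speaker follows their private view unless at least two earlier
    speakers have already committed to the opposite view.
    """
    votes = []
    for person in speaking_order:
        my_view = not private_doubts[person]
        yes_so_far = sum(votes)
        no_so_far = len(votes) - yes_so_far
        if yes_so_far >= 2 and not my_view:
            my_view = True       # suppress private doubts, go along
        elif no_so_far >= 2 and my_view:
            my_view = False      # suppress private enthusiasm, go along
        votes.append(my_view)
    return votes

doubts = {0: False, 1: False, 2: True, 3: True}    # the same four people
print(group_decision(doubts, [0, 1, 2, 3]))  # enthusiasts speak first -> all yes
print(group_decision(doubts, [2, 3, 0, 1]))  # doubters speak first -> all no
```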

Paul Rand: Okay. So that’s the wisdom of crowds.

Cass Sunstein: A second form of decision hygiene is guidelines.

Paul Rand: Okay. And one of the examples used in the book was the Sentencing Reform Act of 1984. And that seemed to really lay out this problem pretty starkly. Can you explain that case?

Cass Sunstein: Yes. So one of our goals in the book is to get someone to erect a statue to Judge Marvin Frankel. He’s kind of the hero of the book.

Paul Rand: Cass and his co-authors are trying to bring noise back to the forefront. But this problem was identified years ago. In the 1970s, Judge Frankel discovered the rampant noise in the criminal justice system.

Cass Sunstein: And he thought, this is scandalous. The idea that your life is on the line and it’s random, it’s like being struck by lightning, whether you get leniency or stringency. So that’s horrible.

Paul Rand: He started to argue for clear, concise guidelines as a solution. And his ideas came to the attention of the Supreme Court.

Cass Sunstein: So he helped spur the enactment of the sentencing guidelines, which were intended to reduce unfair disparities among different judges by having guidelines that sharply constrained sentencing.

Paul Rand: In 1984, the Sentencing Reform Act created mandatory guidelines to restrict the range of criminal sentences.

Cass Sunstein: They didn’t eliminate judgment, but they greatly narrowed it. And as a result of the sentencing guidelines, the range of variation across judges did get cut very dramatically.

Paul Rand: The expected difference in sentence lengths dropped by 6%. But ...

Cass Sunstein: A lot of the judges hated that. They hated it because they thought, ‘‘Who do these members of Congress and these people on the commission, the sentencing commission, think they are? I’m a judge. I see the offender. I see the offense. Let me exercise discretion. I know what people deserve more accurately than some guidelines that were issued without any reference to the particular case I’m judging.’’

Paul Rand: In 2005, the Supreme Court changed the guidelines from mandatory, to just advisory.

Cass Sunstein: And the result of the Supreme Court converting the guidelines from mandatory to advisory was a big burst of noise. Like an explosion, a volume of noise in our sense, where variability broke out dramatically after the guidelines became advisory.

Paul Rand: One study found that when guidelines were mandatory, harsh judges gave sentences almost three months longer than other judges did. When the guidelines switched to advisory, that number doubled. It increased bias as well: the disparity between sentences for African Americans and Whites convicted of the same crime increased significantly. When it comes to judgment, it feels counterintuitive to say that simple guidelines lead to less noise. We like to think that the ability to take in and sort through complex variables makes our judgments better. But that’s not always the case.

Cass Sunstein: So we discussed how guidelines did reduce noise in sentencing in the criminal justice system. When an infant is born in the United States and many other countries, there’s something called the Apgar score, where you rate an infant along five measures of health, on a scale of one to four as I recall. And that Apgar score, it’s a little like an algorithm, not a lot like an algorithm because there’s judgment, but it is a real reducer of noise.

Cass Sunstein: You ask people is the kid healthy? They’d be very noisy. Under the Apgar score, not so much. So the basic idea is in many domains of life, medicine, law, engineering, fingerprinting, you move toward guidelines.

Paul Rand: One controversial mechanism for moving these fields toward guidelines could be a greater reliance on machine learning and algorithms.

Cass Sunstein: Algorithms and machine learning often are like the Apgar score. They’re really, really rigid. So they might say: if the following characteristics of a patient show up, then we predict it’s more likely than not that they have the following disease. And no judgment at all. And that is, by definition, going to be a noise eliminator. Algorithms and machine learning are not noisy. That’s because they spit out the same answer every time.
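A minimal sketch of that idea, in the spirit of an Apgar-style checklist (the fields and scoring here are hypothetical, not the real Apgar criteria or any clinical rule): because the rule is a fixed function of its inputs, the same case always gets the same score, so the rule itself contributes no noise.

```python
def health_score(heart_rate_ok: bool, breathing_ok: bool, reflexes_ok: bool,
                 muscle_tone_ok: bool, color_ok: bool) -> int:
    """Score five observations; higher means healthier (hypothetical rule)."""
    checks = [heart_rate_ok, breathing_ok, reflexes_ok, muscle_tone_ok, color_ok]
    return sum(2 if ok else 0 for ok in checks)

# Scoring the same case twice: a human rater might drift, the rule cannot.
case = dict(heart_rate_ok=True, breathing_ok=True, reflexes_ok=False,
            muscle_tone_ok=True, color_ok=True)
assert health_score(**case) == health_score(**case)
print(health_score(**case))  # always 8 for this input
```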

Paul Rand: When they’re designed well, there is some evidence that AIs can be superior to humans. Another University of Chicago scholar, Sendhil Mullainathan, created a study in which an AI was trained to predict flight risk in bail decisions. The AI ended up performing much better than human judges. He also created an AI to diagnose the risk of heart attack in patients and the need for additional tests. Its accuracy was also superior to that of human doctors.

Cass Sunstein: Now, they might be biased, or they might be ridiculous, or they might be otherwise objectionable. But the noise problem is eliminated. The fact is that in many domains, practitioners are finding algorithms helpful as an aid, in medicine for sure. And in some domains, they’re finding that algorithms just outperform human beings. For flights, judgments are often made not by pilots, unless there are special circumstances, but by the instruments, which tell you what to do.

Paul Rand: In terms of this being one of the best solutions to noise, do you feel we’ll find that the incidence of noise gets cut back as AI starts taking a more prominent role in some of our decision-making?

Cass Sunstein: No question. But the reason the book isn’t called Algorithms Now, exclamation point ...

Paul Rand: Yep, yep.

Cass Sunstein: That’s the [inaudible 00:21:54] of the book. First, we don’t think it’s realistic to say that algorithms will displace human judgment in every domain. And we don’t think that’s desirable, for many reasons. But there’s no question that we are moving toward increased reliance on algorithms and machine learning. And it’s all TBD whether they’re improving things or not. We need to know, but they have great promise for doing exactly that.

Cass Sunstein: Now, how exactly to deal with that is a fair question. Do we want better algorithms? Is that the right solution? Or do we want the algorithm to be, A, transparent and, B, merely advisory? So a judge can say, ‘‘Okay, this is what the algorithm says, but I actually know something it doesn’t.’’ And I notice, in talking about the book, something that is extremely interesting: whenever I say something negative about algorithms, people tend to be really happy. When I say something positive about algorithms, people tend to be sad or mad. So there is a bias, I think, for or against algorithms, which outstrips the legitimate reasons for being nervous about algorithms.

Paul Rand: If I’m listening to this podcast or reading your book, and I think I buy the concept of noise, I want to do better. What can I personally do to minimize the impact that noise is having in my life?

Cass Sunstein: Delay your intuition. So we often have an immediate intuition about something: a job to take, a house to buy, whom to marry, which puppy in the window to get. Don’t ignore your intuition, but delay it. After your initial intuition, kind of put it in a box and ask about the various ingredients of the decision, the various components. Assess each of them independently. There may be seven in deciding, let’s say, what job to take, or whether to move. Look at each of the seven, assess them independently, make sure you have them, maybe on a piece of paper, at least in your head, and then let your intuition go.
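One way to picture that advice as a procedure (the components and 1-10 scores below are hypothetical, not from the book): rate each ingredient of the decision on its own, write the ratings down, and only then let intuition weigh in on the combined picture.

```python
def structured_evaluation(component_scores):
    """Average independent 1-10 scores, one per component of the decision."""
    return sum(component_scores.values()) / len(component_scores)

# Hypothetical components for a job offer, each assessed on its own.
job_offer = {
    "compensation": 7,
    "commute": 4,
    "growth": 8,
    "team": 6,
    "mission": 9,
    "stability": 5,
    "location": 6,
}
print(structured_evaluation(job_offer))  # the overall picture, before intuition returns
```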

 
