  What is true for doctors is highly likely to be true for lawyers, engineers, legislators, bureaucrats, judges, investors, and academics as well. It is easy to see how cascades might develop among groups of citizens, especially—but not only—if those groups are small, insulated, and connected by affective ties. If Barry does not know whether climate change is a serious problem, and if Alberta insists that it is not, Barry might well be persuaded, and their friend Charles is likely to go along, making it unlikely that Danielle will be willing to reject the shared judgment of the developing group. When small communities of like-minded people end up fearing a certain risk, or fearing and hating another group, cascades are often responsible.

  Consider a legal analog, which offers lessons for those engaged in activities outside of law: There is a disputed issue under the Endangered Species Act. The question is what exactly the government has to do to protect endangered species. The stakes are high; environmental groups argue that the government has to do much more than the government is now doing. The first court of appeals to decide the question finds the issue genuinely difficult but resolves the issue favorably to the government. The second court of appeals tends to favor, very slightly, the view that the government is wrong, but the holding of the previous court of appeals is enough to tip the scales in the government’s favor. A third court of appeals is also slightly predisposed to rule against the government, but it lacks the confidence to reject the shared view of two circuits. Eventually all circuits come into line, with the final few feeling the great weight of the unanimous position of others, and perhaps insufficiently appreciating the extent to which that weight is a product of an early and somewhat idiosyncratic judgment. Because the courts of appeals are in agreement, the Supreme Court refuses to get involved in the dispute. This can happen a lot—and it makes for bad law.

  To be sure, precedential cascades do not always happen, and splits among courts of appeals are common.12 One reason is that subsequent courts often have sufficient confidence to conclude that predecessor courts have erred. But it is inevitable that cascades will sometimes develop, especially in highly technical areas. It will be hard to detect them after they have occurred.

  In terms of improving current practice, the implication is clear: judicial panels should be cautious about giving a great deal of weight to the shared view of two or more courts of appeals. A patient who seeks a second opinion should not disclose the first opinion to his new doctor; the goal is to obtain an independent view. In a similar vein, a court of appeals should be alert to the possibility that the unanimity of previous courts does not reflect independent agreement. And when the U.S. Supreme Court is asked to reject the unanimous view of a large number of courts of appeals, it might be smart not to give undue weight to that unanimity; a precedential cascade could have been responsible for the consensus.13 For the legal system, the danger is that a cascade, producing agreement among the lower courts, might prove self-insulating as well as self-reinforcing. Unless there is clear error, why should the Supreme Court become involved?

  In informational cascades as discussed thus far, all participants are being entirely rational; they are acting as they should in the face of limited information. But as I have suggested, it is possible that participants in the cascade will fail to see the extent to which the decisions of their predecessors carry little independent information. If most people think that genetically modified foods create risks to health and the environment, can they really be wrong?

  A possible answer is that they might indeed be wrong, especially if they are not relying on their private information and are following the signals sent by other people. And both outsiders and contributors to cascades often seem to mistake a cascade for a series of separate and independent judgments. Sometimes scientists, lawyers, and other academics sign petitions or statements, suggesting that hundreds and even thousands of people share a belief or an opinion. The sheer number of signatures can be extremely impressive. But it is perhaps less so if we consider the likelihood that most signatories lack reliable information on the issue in question and are simply following the apparently reliable but actually uninformative judgment of numerous others.

  Even when those who participate in informational cascades are being entirely rational, there is a serious risk of error. People might converge on an erroneous, damaging, or dangerous path, simply because they are failing to disclose and to act on the basis of all the information that they have.

  Cascades are easy to create in laboratory settings. Some of the experiments are detailed and a bit technical, but four general lessons are clear. First, people will often neglect their own private information and defer to the information provided by their predecessors. Second, people are alert to whether their predecessors are especially informed; more informed people can shatter a cascade. Third, and perhaps most intriguingly, cascade effects are greatly reduced if people are rewarded not for correct individual decisions but for correct decisions by a majority of the group to which they belong. Fourth, cascade effects, and blunders, are significantly increased if people are rewarded not for correct decisions but for decisions that conform to the decisions made by most people. As we shall see, these general lessons have implications for institutional design. They suggest that errors are most likely when people are rewarded for conforming and least likely when people are rewarded for helping groups and institutions to decide correctly.

  The simplest experiment asked subjects to guess whether the experiment was using Urn A, which contained two red balls and one white, or Urn B, which contained two white balls and one red.14 In each period, the contents of the chosen urn were emptied into a container. A randomly selected subject was asked to make one (and only one) private draw of a ball. After that draw, the subject recorded, on an answer sheet, the color of the draw and the subject’s own decision about the urn. The subject’s draw was not announced to the group, but the subject’s decision about the urn was disclosed. Then the urn was passed to the next subject for another private draw, which was not disclosed, and for that subject’s own decision about the urn, which was disclosed. This process continued until all subjects had made decisions, and at that time the experimenter announced the actual urn used. Subjects could earn $2 for a correct decision.
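
  To make the mechanics of this protocol concrete, here is a minimal simulation sketch, under simplifying assumptions: a 50/50 prior over the urns, private draws that match the true urn two-thirds of the time, and subjects who count every earlier announcement as though it were an honest report of a draw (a simplification relative to a fully Bayesian model). The function name and parameters are illustrative, not part of the original study.

```python
import random

def run_round(n_subjects=6, p_correct_signal=2/3, seed=None):
    """Simulate one round of a sequential urn-guessing game (illustrative).

    The true urn is A or B. Each subject privately draws one ball that
    matches the true urn with probability p_correct_signal, sees all
    earlier public decisions, and announces whichever urn the combined
    evidence favors (ties broken by the subject's own draw).
    """
    rng = random.Random(seed)
    true_urn = rng.choice(["A", "B"])
    other = {"A": "B", "B": "A"}
    draws, decisions = [], []

    for _ in range(n_subjects):
        draw = true_urn if rng.random() < p_correct_signal else other[true_urn]
        draws.append(draw)

        # Simplification: every earlier announcement is treated as one
        # signal, just like the private draw. This is what allows a few
        # early announcements to swamp later private information.
        signals = decisions + [draw]
        a, b = signals.count("A"), signals.count("B")
        decision = "A" if a > b else "B" if b > a else draw
        decisions.append(decision)

    return true_urn, draws, decisions

true_urn, draws, decisions = run_round(seed=1)
print("true urn: ", true_urn)
print("draws:    ", draws)
print("decisions:", decisions)
```

  Running a handful of rounds of this sketch quickly produces sequences in which later subjects announce the urn favored by the first announcements despite contrary private draws, which is the cascade pattern reported in the experiment.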

  In this experiment, cascades often developed. After a number of individual judgments were revealed, people sometimes announced decisions that were inconsistent with their private draw but that fit with the majority of previous announcements.15 More than 77 percent of “rounds” resulted in cascades, and 15 percent of announcements did not reveal a “private signal,” that is, the information provided by people’s own draw. Consider cases in which one person’s draw (say, red) contradicted the announcement of that person’s predecessor (say, Urn B). In such cases, the second announcement nonetheless matched the first about 11 percent of the time—far less than a majority but enough to ensure occasional cascades. And when one person’s draw contradicted the announcement of two or more predecessors, the later announcement was likely to follow those who went before. Notably, the majority of decisions followed Bayes’s rule and hence were rationally based on available information16—but erroneous cascades were nonetheless found. Here is an actual example of a cascade producing an entertainingly inaccurate outcome (the urn used was B):17

                 1   2   3   4   5   6
  Private draw   A   A   B   B   B   B
  Decision       A   A   A   A   A   A

  What is noteworthy here, of course, is that the total amount of private information—four whites and two reds!—justified the correct judgment, in favor of Urn B. But the existence of two early signals, producing rational but incorrect judgments, led all others to fall in line. “Initial misrepresentative signals start a chain of incorrect decisions that is not broken by more representative signals received later.”18 It should be simple to see how this result might map onto real-world assessments of factual, moral, and legal issues, especially in insulated groups where external correction is less likely.
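
  The Bayesian arithmetic behind the third subject’s choice is worth spelling out. Assuming a 50/50 prior, draws that match the true urn two-thirds of the time, and that the first two announcements can be read as honest reports of the first two draws, a private B draw after two public A decisions still leaves Urn A more likely. Here is a short sketch of that calculation (the variable names are mine, not the experimenters’):

```python
from fractions import Fraction

prior_A = Fraction(1, 2)      # both urns equally likely at the start
p_match = Fraction(2, 3)      # a draw matches the true urn 2/3 of the time

# Subject 3's evidence: two earlier "A" announcements (read as honest
# reports of A draws) plus the subject's own private "B" draw.
like_A = p_match**2 * (1 - p_match)   # P(A, A, B | Urn A) = 4/27
like_B = (1 - p_match)**2 * p_match   # P(A, A, B | Urn B) = 2/27

posterior_A = prior_A * like_A / (prior_A * like_A + (1 - prior_A) * like_B)
print(posterior_A)  # 2/3: announcing "A" is the rational choice
```

  Because the posterior favors Urn A even against the subject’s own draw, announcing A is individually rational; every later subject faces the same arithmetic, and the cascade continues even as the private evidence for Urn B piles up unseen.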

  How to Make and Break Cascades

  Is the likelihood of cascades affected by institutional arrangements and social norms? Can social, political, or legal arrangements diminish or increase the risk of erroneous cascades, inadvertently or through conscious decision?

  A central point here is that in an informational cascade, everyone is equal. People are simply trying to get the right answer, and they pay attention to the views and acts of others only because they want to be right. But it is easy to imagine slight alterations of the situation, so that some participants know more than others or so that people care about more than simply being right. How would these alterations affect outcomes?

  Fashion Leaders and Informed Cascade Breakers

  In the real world of cascades, “fashion leaders” have unusual importance.19 A prominent scientist might declare that immigration or climate change is a serious problem; a well-respected political leader might urge that a foreign country is run by killers or that war should be waged against it; and a lawyer with particular credibility might conclude that some law violates the Constitution. In any of these cases, the speaker provides an especially loud informational signal, perhaps sufficient to start or to stop a cascade. In 2018, Yale economist William Nordhaus won the Nobel Prize, largely for his work on climate change. Many people hoped that Nordhaus’s increased prominence, courtesy of the prize, would fuel attention to the climate change problem.

  Now turn to the actions of followers. In the hormone therapy case given above, none of the doctors is assumed to have, or believed to have, more information than his or her predecessors. But in many cases, people know, or think they know, a great deal. It is obvious that such people are far less likely to follow those who came before. Whether they will do so should depend on a comparison between the amount of information provided by the behavior of predecessors and the amount of private information that they have. And in theory, the most informed people will often shatter cascades, possibly initiating new and better ones. Whether this will happen, in practice, depends on whether the people who come later know, or believe, that the deviant agent was actually well informed. If so, the most informed people operate as fashion leaders.

  A simple study tested whether more informed people actually shatter cascades.20 The study was essentially the same as the urn experiment just described, except that players had a special option after any sequence of two identical decisions (for example, two “Urn A” decisions): they could make not one but two independent draws before deciding (and thus obtain more information). The other subjects were informed of every case in which a player was making two draws. The simplest finding is that this “shattering mechanism” did indeed reduce the number of cascades—and thus significantly improved decisions.21

  But the mechanism did not work perfectly. In some cases, cascades were nonetheless found. And in some cases, people who were permitted to draw twice, and saw two different balls (say, one red and one white), wrongly concluded that the cascade should be broken. The remarkable and somewhat disturbing outcome is that they initiated an inaccurate cascade. Consider this evidence, in a case in which the actual urn was A:

                 1   2   3     4   5   6
  Private draw   A   A   B, A  B   A   B
  Decision       A   A   B     B   B   B

  This disturbing pattern undoubtedly has real-world analogues, in which people sometimes give excessive weight to their own information, even when that information is ambiguous and it would make more sense to follow the crowd. But the larger point is a simple one: more informed people are less influenced by the signals of others, and they also carry more influence themselves.
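
  The same arithmetic shows why breaking a cascade on the strength of a mixed double draw is an error. Under the assumptions used in the sketches above, one A draw and one B draw cancel each other, so the two earlier A announcements remain decisive. The helper function below is again illustrative:

```python
from fractions import Fraction

def posterior_urn_A(n_a_signals, n_b_signals, p_match=Fraction(2, 3)):
    """Posterior probability of Urn A given counts of A- and B-type signals,
    assuming a 50/50 prior and conditionally independent signals."""
    like_A = p_match**n_a_signals * (1 - p_match)**n_b_signals
    like_B = (1 - p_match)**n_a_signals * p_match**n_b_signals
    return like_A / (like_A + like_B)

# Subject 3 in the table above: two earlier "A" announcements plus a
# double draw of one B and one A. The mixed draw cancels out, so the
# earlier evidence still favors Urn A.
print(posterior_urn_A(n_a_signals=3, n_b_signals=1))  # 4/5: "A" is still the rational call
```

  On this accounting, the subject who drew one ball of each color gained no net new information, and the rational move was to stay with the crowd; overweighting the ambiguous private draw is what launched the inaccurate cascade shown in the table.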

  But what about cases in which fashion leaders are not necessarily more informed or in which they are seen by others as having more information or more wisdom than they actually have? We can imagine self-styled experts—on diets, vaccines, herbal foods, alternative medicine, or economic trends—who successfully initiate cascades. They might be cranks, they might be crazy, or they might be self-promoters. The risk here is that their views will be wrongly taken as authoritative. On social media, that happens all the time. The result can be to lead people to errors and even to illness and death. “Fake news” can spread like wildfire; informational cascades are the culprits. In 2017 and 2018, that was a particular concern for Facebook, whose platform has often been used as a basis for the rapid transmission of falsehoods.

  How can society protect itself? There are no panaceas here, but potential answers lie in good institutional arrangements, civil liberties, free markets, and good social norms, encouraging people to be skeptical of supposed truths. In systems with freedom of speech and free markets, it is always possible to debunk supposedly authoritative sources. And within groups, it is possible to structure decision-making to reduce the relevant risks. Votes might, for example, be taken in reverse order of seniority, to ensure that less experienced people will not be unduly influenced by the judgments of their predecessors; this is in fact the practice of the U.S. Supreme Court.

  The spread of falsehoods on social media raises independent problems, of course. For Facebook, an improved news feed can help; it might reduce the likelihood that damaging or intentional falsehoods will spread. (Facebook continues to test approaches to promote that goal.) Twitter might also experiment with initiatives designed to reduce the likelihood that damaging lies will go viral. It is worth considering whether to dismantle certain kinds of lies. But this is not the place for a treatment of possible reforms of social media.22 The only point is that an understanding of the mechanisms behind informational cascades helps illuminate why social media can be so damaging to democracy.

  Majority Rule: Rewarding Correct Outcomes by Groups Rather than by Individuals

  How would the development of cascades be affected by an institution that rewards correct answers not by individuals but by the majority of the group? In an intriguing variation on the urn experiment, subjects were paid $2 for a correct group decision and penalized $2 for an incorrect group decision, with the group decision determined by majority rule.23

  People were neither rewarded nor punished for correct individual decisions. The result was that only 39 percent of rounds saw cascades! In 92 percent of cases, people’s announcements matched their private draws.24 And because people revealed their private signals, the system of majority rule produced a substantial increase in fully informed decisions—that is, the outcomes that people would reach if they were somehow able to see all private information in the system. A simple way to understand this finding is to assume that a group has a large number of members and that each member makes an announcement that matches that member’s private draw. As a statistical matter, it is overwhelmingly likely that the majority’s position will be correct. As an example, consider this period from the majority rule experiment (the actual urn was A):25

                 1   2   3   4   5   6   7   8   9
  Private draw   A   A   A   A   B   A   A   A   B
  Decision       A   A   A   A   B   A   A   A   B

  What is the explanation for the significantly reduced level of cascade behavior in a system of majority rule? The answer lies in the fact that the individuals know they have nothing to gain from a correct individual decision and everything to gain from a correct group decision. As a result, it is in the individuals’ interest to say exactly what they see, because it is the accurate announcement, from each person, that is most likely to promote an accurate group decision. That finding has large implications for how to structure groups and organizations.
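
  A simple Condorcet-style calculation makes the statistical point precise. Assume, purely as an illustration, nine subjects whose private draws are independent and correct two-thirds of the time; the probability that a majority of honest announcements identifies the true urn can then be computed directly:

```python
from fractions import Fraction
from math import comb

def p_majority_correct(n=9, p_signal=Fraction(2, 3)):
    """Probability that a majority of n independent signals (n odd), each
    correct with probability p_signal, points to the true urn."""
    return sum(
        comb(n, k) * p_signal**k * (1 - p_signal)**(n - k)
        for k in range(n // 2 + 1, n + 1)
    )

print(float(p_majority_correct()))  # about 0.855
```

  Under those assumptions the majority is right roughly 85 percent of the time, far better than the two-thirds accuracy of any single draw, which is why inducing people to reveal their private signals produces so many fully informed decisions.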

  Note that to explain the effect of majority rule in producing better outcomes, it is not necessary or even helpful to say that when people are rewarded for a correct group decision, they become altruistic or less concerned with their self-interest. On the contrary, self-interest provides a fully adequate explanation of people’s behavior. In the individual condition, it is perfectly rational to care little or not at all about whether one is giving an accurate signal to others. That signal is an “informational externality,” affecting others, for better or for worse, but not affecting one’s own likelihood of gain. If a subject’s individual signal misleads others, the subject has no reason to care.

  But under the majority rule condition that I have just described, rewarding accurate group decisions, the subject should care a great deal about producing an accurate signal, simply because an inaccurate signal will reduce the likelihood that the group will get it right. And here the subjects need not care about the accuracy of their individual decisions except insofar as those decisions provide a helpful signal to the group. Hence it is only to be expected that cascades are reduced, and correct outcomes are increased, when people are rewarded for good group decisions.
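
  The incentive contrast can also be illustrated with a small simulation that pits the two reward schemes against each other: subjects who best-respond for themselves (and so cascade) versus subjects who simply report their draws because only the group decision pays. This is a hedged sketch of the logic, not a reconstruction of the actual experiment; the function and parameter names are stand-ins.

```python
import random

def majority_accuracy(n_rounds=10000, n_subjects=9, p_correct=2/3,
                      group_reward=False, seed=0):
    """Share of rounds in which the majority of announcements is correct.

    Illustrative model: under individual reward, each subject counts all
    earlier announcements as signals (and so can cascade); under group
    reward, each subject simply announces his or her own private draw.
    """
    rng = random.Random(seed)
    other = {"A": "B", "B": "A"}
    correct = 0
    for _ in range(n_rounds):
        true_urn = rng.choice(["A", "B"])
        announcements = []
        for _ in range(n_subjects):
            draw = true_urn if rng.random() < p_correct else other[true_urn]
            if group_reward:
                announcement = draw
            else:
                votes = announcements + [draw]
                a, b = votes.count("A"), votes.count("B")
                announcement = "A" if a > b else "B" if b > a else draw
            announcements.append(announcement)
        majority = "A" if announcements.count("A") > announcements.count("B") else "B"
        correct += (majority == true_urn)
    return correct / n_rounds

print("individual reward:", majority_accuracy(group_reward=False))
print("group reward:     ", majority_accuracy(group_reward=True))
```

  In runs of this sketch, the group-reward condition reliably delivers more accurate majority decisions than the cascading condition, which mirrors the experimental finding that rewarding correct group outcomes sharply reduces cascades and increases fully informed decisions.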

  There is a general point here. For most people, it is entirely rational, under plausible assumptions, to participate in a cascade. Participants benefit themselves even if they fail to benefit others (by failing to disclose privately held information) or affirmatively harm others (by giving them the wrong signal). This claim holds even if people are just trying to get it right and even if conformity is not rewarded as such. By contrast, it is not rational, under plausible assumptions, to disclose or act upon private information, even though the disclosure or action will actually benefit others. If other people have decided not to vaccinate their children, and if you think they must know what they are doing (even if you tend to disagree, based on what you know), you might just follow their lead and never voice your doubts.