Many studies have found that political polarization in the United States is rapidly increasing, particularly online, where echo chambers and social media have inflamed partisanship. But new research from the University of Chicago’s Knowledge Lab, which analyzed more than 200,000 Wikipedia pages, finds that collaborations bridging the political spectrum produce higher-quality articles than teams that are ideologically moderate or one-sided.
Wikipedia pages covering politics, social issues and science written by editors with a broader range of political affiliations rank higher on Wikipedia’s own quality scale because of diverse perspectives, increased debate and appeals to community guidelines, the study found. The analysis, published in Nature Human Behaviour, suggests that ideological diversity, in a system with well-defined policies, can actually create more productive and higher-quality collaborations.
“This study doesn’t say we can always get along,” said James Evans, professor of sociology, director of Knowledge Lab and a leading scholar in the quantitative study of how ideas and technologies emerge. “But if we’re diverse along political lines, it actually means that we bring separate perspectives, and when we’re able to work together, then we’re able to produce a more complete and balanced perspective. If we’re imbalanced, then this study also suggests how bad it can be.”
The crowdsourced model of Wikipedia allows any user to edit most pages, as long as they follow the site’s guidelines on providing sources and avoiding bias. These policies are enforced in a decentralized fashion by other users, and discussions over the legitimacy of edits are conducted on each article’s “talk page.” Articles on controversial events, topics or figures, such as the Syrian Civil War, abortion or George W. Bush, attract a higher rate of edits and discussion, and may include an extra level of protection where edits require community approval before appearing.
In the new study, researchers Feng Shi, Misha Teplitskiy, Eamon Duede and Evans first estimated the political affiliation of more than 600,000 Wikipedia contributors based on how often they contributed to liberal or conservative articles. They then measured the overall political alignment of each editing community behind 232,000 different Wikipedia pages, treating editor groups with a broader range of ideological alignments as more “polarized.”
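The paper defines its own alignment and polarization measures; as a rough sketch of the idea, the Python snippet below scores a hypothetical editor on a scale from -1 (edits only liberal-tagged articles) to +1 (edits only conservative-tagged articles) and treats the spread of those scores among a page’s editors as that page’s polarization. The function names, the scaling and the use of variance as the spread measure are illustrative assumptions, not the study’s exact formulas.

```python
from statistics import pvariance

def editor_alignment(liberal_edits: int, conservative_edits: int) -> float:
    """Map an editor's edit counts to a score in [-1, 1].

    -1 means the editor only edits liberal-tagged articles, +1 only
    conservative-tagged ones. Illustrative formula, not the paper's
    exact estimator.
    """
    total = liberal_edits + conservative_edits
    if total == 0:
        return 0.0  # no signal: treat as neutral
    return (conservative_edits - liberal_edits) / total

def page_polarization(alignments: list[float]) -> float:
    """Spread of editor alignments on one page.

    Here: population variance, so a page edited by a balanced mix of
    strongly liberal and strongly conservative editors scores higher
    than one edited by a uniform or uniformly moderate group.
    """
    if len(alignments) < 2:
        return 0.0
    return pvariance(alignments)

# Toy example: a balanced, polarized editing team vs. a one-sided one.
polarized = [editor_alignment(40, 2), editor_alignment(3, 35),
             editor_alignment(30, 1), editor_alignment(2, 28)]
one_sided = [editor_alignment(40, 2), editor_alignment(30, 1),
             editor_alignment(25, 3)]
print(page_polarization(polarized) > page_polarization(one_sided))  # True
```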
When this measure of polarization was compared against Wikipedia’s six-category scale for article quality (ranging from “stub” up to “featured article”), the authors found that higher polarization was associated with higher quality—not just for political articles, but also those on social issues and science topics.
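The published analysis relies on regression models with controls for article length, editing activity and other covariates; as a bare-bones illustration of relating a polarization score to Wikipedia’s ordinal quality classes, the sketch below ranks the six assessment categories and computes a rank correlation on invented toy data. The page data and the use of a Spearman correlation are assumptions for illustration only, not the study’s method.

```python
from scipy.stats import spearmanr

# Wikipedia's assessment classes, ordered from lowest to highest quality.
QUALITY_ORDER = ["Stub", "Start", "C", "B", "Good Article", "Featured Article"]
QUALITY_RANK = {label: rank for rank, label in enumerate(QUALITY_ORDER)}

# Hypothetical (page polarization, quality class) observations.
pages = [
    (0.05, "Stub"),
    (0.30, "Start"),
    (0.20, "C"),
    (0.45, "B"),
    (0.60, "Good Article"),
    (0.75, "Featured Article"),
]

polarization = [p for p, _ in pages]
quality = [QUALITY_RANK[q] for _, q in pages]

# Spearman's rho treats quality as ordinal and asks whether more
# polarized editor groups tend to produce higher-ranked articles.
rho, p_value = spearmanr(polarization, quality)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```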
“While political polarization is now regarded as toxic or brutal, it can work in our favor if it begets diversity of views, balanced engagements and reasoned debates,” said Shi, a data scientist with the Odum Institute for Research in Social Science at the University of North Carolina at Chapel Hill. “Polarization of the editors is positively associated with the quality of their work, even controlling for article length, editing activity, previous editing experience, and other article and talk page attributes.”
Knowledge Lab is a unique research center that combines “science of science” approaches from sociology with the explosion of digital information now available on the history of research and discovery. By using advanced computational techniques and developing new tools, Knowledge Lab researchers reconstruct and examine how knowledge grows over time and influences our world, generating insights that can fuel future innovation.
The analysis also found that polarization drives different styles of discussion on article “talk pages.” By analyzing the content of these pages, researchers found that polarized teams engage in more debate but with less toxic conflict than ideologically uniform editing communities, where the efforts of lone, contrarian editors to “de-bias” articles provoke charged disputes. Polarized teams also refer to Wikipedia’s policies and guidelines more frequently, a structure that protects against the raw emotions and abuse found in many less-regulated online communities.
“Our work suggests that increasing oversight and bureaucracy can be highly beneficial for content,” said Teplitskiy, a postdoctoral fellow at Harvard University’s Laboratory for Innovation Science. “Another way in which Wikipedia is different is its well-known and well-publicized commitment to discourse and consensus. Strongly signaling such a mission upfront may induce self-selection of only those individuals who are willing to cooperate for a common good.”
“It’s important, and perhaps surprising, to note that Wikipedia’s guidelines are generative of not just quality articles, but a sustained culture,” said Duede, a PhD student in the UChicago Committee on the Conceptual and Historical Studies of Science. “These are not just rules concerning what can be said and in what manner in an article. These stipulate acceptable social conduct, how editors treat one another in talk page debate. But, also how we, as researchers, engage with the community. It required enormous effort to earn the right to conduct this study. We had to become members of the community in order to understand the community.”
Though the current study focused solely on Wikipedia polarization, the authors suggested that its conclusions could be expanded to other collaborative sites, or even to the formation of ideologically diverse teams in the offline world.
“Wikipedia works because it has a culture where people can appeal to guidelines and recommendations, and they do, they rely on the laws of the community,” Evans said. “In a community or media environment without laws, or with reducing norms, it becomes potentially a toxic environment where there are shorter conversations, less collaboration and lower quality.”
Citation: “The Wisdom of Polarized Crowds,” March 4, 2019, in Nature Human Behaviour. DOI: 10.1038/s41562-019-0541-6
Funding: National Science Foundation, the John Templeton Foundation, and the Air Force Office of Scientific Research.