Hope for a Healthier Democracy Inspires New Research Center to Confront Misinformation

Nov 1, 2019

By Doug Parry

University of Washington Information School

Misinformation infects the information ecosystem, sowing doubt and confusion. It feeds on openness, transparency and a free exchange of ideas. If misinformation is a virus, democracy is a welcoming host.

“I see misinformation as one of society’s most pressing problems,” said iSchool Associate Professor Jevin West. “Everything we do in society depends on individuals having the best information possible. If none of us trusts anything we’re reading or we’re being duped all the time, I think that’s something to be concerned about.”

West and colleagues have devoted their energies to addressing the problem. This fall, they established the Center for an Informed Public, an interdisciplinary research center led by the iSchool, Human Centered Design & Engineering (HCDE) and the School of Law. The center, CIP for short, will bring together diverse voices from across the UW campus as well as industry, government, media, libraries, nonprofits and other institutions in pursuit of ambitious goals: resisting strategic misinformation, promoting an informed society and strengthening democratic discourse.

Misinformation has been around since the days of stone tablets, and West, the center’s inaugural director, harbors no illusions that the CIP will make it disappear. But through research, the center will learn about the effects of misinformation and how it spreads. And through outreach, the center will educate people about the sources of misinformation and the motivations of those who spread it. The researchers hope a healthier information environment will foster a healthier public discourse.

“The reason we want to devote a lot of effort and thinking to this is that it’s involved with every aspect of society,” West said. “You can’t solve climate change until you solve the information problem. You can’t solve human health issues until you solve the information integrity issue.” 

West has spent much of his career studying how information spreads in the academic world — the “science of science” — and how dubious information can gain traction in the scientific community. Since the mid-2010s, he has turned his attention toward bringing data reasoning and critical thinking skills to the classroom. Along with colleague Carl Bergstrom from the UW Biology Department, he developed “Calling Bullshit,” an undergraduate course that filled within minutes after registration opened. Educators at colleges and high schools across the country adopted the curriculum, equipping students with the tools to identify and call out misinformation.

That was just the beginning. West aimed for a much broader effort when he teamed with fellow faculty members Chris Coward and Emma Spiro from the iSchool, Kate Starbird from HCDE and Ryan Calo from the School of Law to pursue funding for the CIP. In July, the John S. and James L. Knight Foundation and the William and Flora Hewlett Foundation announced investments totaling $5.6 million to establish the center. It’s part of a $50 million Knight Foundation effort to better understand how technology is transforming democracy and how we receive and engage with information.

From the outset, the researchers know they will face misperceptions that they intend to stifle certain political viewpoints, particularly on the conservative end of the spectrum. They emphasize that misinformation, not politics, is the problem. Misinformation and its more sinister form, disinformation, come from all sides, often with the goal of confusing people so they won’t know what to believe and won’t participate in public debates, weakening democracy as a result.

Starbird, an associate professor in HCDE, saw this effect in action as she researched the social media conversation around #BlackLivesMatter. Her initial takeaways were that the conversation around the movement was divided into two polarized camps and that much of it was toxic, adopting an “us-vs.-them” mentality. She later learned that Russian agents posing as Americans were participating in the conversation in large numbers on both sides of the debate. Their goal wasn’t to promote a point of view; it was to rip communities apart.

In studying the use of social media during crisis events, Starbird has seen a dramatic change in the conversation around such platforms. An increasing amount of what she has seen shared is disinformation — the strategic use of false information to cause harm. Social media were hailed as tools for democratic activists during the Arab Spring of 2010-11, but have since been weaponized to undermine democracy.

“Over time in this shift toward misinformation and disinformation, we’re seeing the worst of human behavior,” Starbird said. 

While disinformation works to erode people’s trust, the CIP will aim to make them feel empowered by knowing what tactics are being used and how to counter them. The challenge, Starbird said, will be to help people understand that they’re vulnerable without making them even more distrustful and cynical.

“These are hard problems, not just in the U.S. but globally, about how we access and make sense of information and how we can be manipulated, and what can we do to help defend ourselves,” she said.

Like Starbird, iSchool Assistant Professor Emma Spiro focuses much of her research on social media in crisis situations, but she also brings a sociologist’s perspective to social networks and the dynamics of relationships among people and groups that meet online. She noted that misinformation is a problem across generations and forms of media.

“Younger kids are on YouTube watching videos all the time, and they’re on Snapchat; older people are watching TV,” Spiro said. “They have different ways of interacting with information, and this problem spans all of them.”

Knowing they are tackling a multifaceted problem, the CIP team enlisted collaborators from across the UW campus, including the Communication Leadership Program and the School of Law, where Ryan Calo is an associate professor. Throughout his academic career, he has worked at the intersection of law and technology, studying topics such as bots used to deceive social media users. As part of the CIP’s leadership team, he lends his expertise in how policymakers deal with technology, and how they often struggle to keep up with it.

“The stakes are super high,” Calo said, adding that attempts to prevent misinformation raise challenging legal questions. For example, “The idea that a group of people are intentionally attempting not to convey a particular idea, but to rip apart the fabric of democratic society, really challenges our model of what free-speech protection looks like.” 

Scholars and professionals in many fields are grappling with misinformation and their responsibility to counteract it. Journalists, tech workers, policymakers, educators and librarians are among those who are increasingly tasked with identifying misinformation and preventing it from spreading, and they will need tools and training to do so. In response, public outreach will be a key feature of the center.

Chris Coward, a researcher and director of the iSchool’s Technology & Social Change Group, brings an applied research background to the CIP, meaning that he’s accustomed to research that engages the public in direct problem-solving. Much of his work involves mobilizing libraries as hubs of civic participation around the world. The CIP plans community labs in public libraries, tribal centers and other institutions, as well as in online communities, in an effort to promote civil discussions of civic issues.

“A lot of public libraries and library associations have already been in touch with us,” Coward said. “They’re an avenue to connect directly with the public because they are in both rural and urban areas, they’re in over 16,000 communities, and they span the ideological and political divides that are part of this bigger problem.”

Outreach and research are among many pieces of the puzzle as the CIP ramps up its efforts. Education will be key. Technology could provide answers to certain aspects of the problem, such as detecting “deepfakes,” West said. But most importantly, it will take boots on the ground.

“It’s going to have to include people that actually talk to humans,” he said. “We need humans doing that work, which is kind of what we do in the Information School.”
