Since October, MIF has been hosting a webinar series organized by Kelly Born of the William and Flora Hewlett Foundation on how funders can support remedies for the deluge of false information confronting news consumers.
The first webinar in October examined how to create news that audiences can trust. The second webinar in November explored how social media is being used to spread propaganda and disinformation. Last week, for the third webinar, hosts Elizabeth Christopherson and Jonathan Kartt of the Rita Allen Foundation led a discussion that addressed how people absorb information and what can be done to correct falsehoods and improve critical thinking about media.
Christopherson, the CEO and president of Rita Allen, opened the discussion with a reflection on the “stressed” information environment, noting factors including the “proliferation of false information, bad actors, the human inclination to seek news that supports our worldview, and a dizzying pace of information exchange.” Trust in media is lower than ever and voters are more polarized, she observed, which makes these issues especially timely.
In response, she spoke about various ways in which she and her colleagues and grantees are working to address these concerns. The foundation supports journalism innovation, the creation of networks and collaborations that foster new knowledge and solutions, and projects that encourage public engagement with science and facts. Of particular concern is the disconnect between “expert consensus and public belief.”
Kartt, the foundation’s officer for programs and evaluation, noted that Rita Allen had teamed up with the Knight Foundation and the Democracy Fund in the spring to issue an open call for prototypes designed to improve the flow of accurate information.
Presenters included:
Brendan Nyhan, a professor in the Department of Government at Dartmouth College, spoke about his research into the effectiveness of fact-checking and how misinformation spreads. Fact-checking can work “under certain circumstances,” he said, but the problem is that the people consuming misinformation are not necessarily the same people seeking out fact-checks. Certain misperceptions, such as those surrounding Barack Obama’s birth certificate or the existence of weapons of mass destruction in Iraq, persist despite widespread distribution of corrective information. Elites can play a key role in propagating disinformation, as well as seeding more general mistrust in media. This is particularly pernicious given the polarized political environment. That said, Nyhan noted, not everyone consumes information in “echo chambers” that reinforce false beliefs. Consumption of falsehoods is concentrated among the most conservative news audiences, while others vary their news diets.
Tom Rosenstiel, the executive director of the American Press Institute, noted that concerns over so-called “fake news” did not start with the internet. The biggest decline in trust in journalism began in the 1980s and correlates with the growing choice of news products across platforms, including cable news and talk radio. State-sponsored efforts to promulgate propaganda and disinformation are also old news; half the decline in trust indicated by polling happened before the internet became a key news source.
That said, social media creates powerful conditions for misinformation to spread. He noted several “propulsive forces.” On social platforms, news is broken apart from brand—the sharer becomes more important than the producer. A culture of gatekeeping and news quality has been replaced with the values of connectivity and openness. Polarization is also deepened online by a business model of advertising that encourages targeting and separating audience members. Rosenstiel noted that Facebook and Google control the majority of ad revenue on mobile, further consolidating the information environment.
Why do people believe misinformation? Blaming consumption bias alone oversimplifies the problem, he said. Repetition is itself a powerful force: confronted with repeated assertions, people will begin to doubt even things they know to be true. State-sponsored propagation of falsehoods is also potent; during the most recent elections, more bots than humans were responsible for spreading disinformation.
Eric Newton, the innovation chief at Cronkite News, observed that in the history of communications, “technology has always moved faster than our ability to use it.” Some people still can’t read books, let alone know how to use their mobile phones or social media platforms to best effect when seeking quality information.
He spoke about “supply-side” solutions to the misinformation problem designed to increase audience members’ trust in and connection with news outlets: transparency and engagement. In terms of transparency, the Trust Project has worked with a dozen newsrooms to be “radically clear” about how and why they produce journalism. In this way, audiences learn about how news works—about ethics, corrections, sourcing and next steps—at the same time they absorb the news itself.
Projects such as the Center for Media Engagement at the University of Texas at Austin and Gather at the University of Oregon are tracing the different ways that people use social media to share and interact with news. Organizations such as TEGNA and Hearken enable journalists to bring community members into the reporting and fact-checking process, increasing comprehension of and trust in journalism.
On the “demand side,” projects that teach audience members how to analyze, create and act upon media help to create more informed and enthusiastic news consumers. Funders have supported an array of media, digital and news literacy programs over the years, but it has been difficult to embed these in the formal education system. Notable projects include Checkology, which teaches news literacy to K-12 students; the Center for News Literacy at Stony Brook University, which offers a MOOC (massive open online course) called Making Sense of the News; and the Newseum, which has educated millions in news literacy via its NewseumED initiative.
Newer innovations combine both supply and demand approaches, said Newton, such as the News Co/Lab at Arizona State University’s Walter Cronkite School of Journalism and Mass Communication, where he works. The project works with newsrooms to be radically clear, to engage audiences, and to try other experiments that, he said, “put the wheels on the suitcase.”
During the Q&A, Rosenstiel observed that fact-checkers are very skeptical about efforts to flag disinformation on social platforms. Not only is the amount of fact-checked information “vanishingly small” compared to the wide range of information shared on Facebook or Twitter, but there’s little evidence that flagging information as false is effective. When fact-checkers build infrastructure, he said, politicians adapt, and even “weaponize” the fact-checking routines to spread half-truths or communicate dog whistles. This is why focusing on longer-range checking of facts related to particular issues has more traction than focusing on personalities.
Newton said that he has hope that the platforms might be willing to distribute news literacy materials and learn from experiments in transparency and engagement.
Christopherson asked Nyhan where he’d place his “big bet,” and he suggested that investment in scholar-practitioner partnerships might allow funders to support more rigorous evaluation of which of these interventions works best.