
Update as of Dec. 19, 2017—We have completed our webinar series. Read up on the various ways in which funders are fighting misinformation at the three points of the information system: production, distribution and consumption.

By Kelly Born | Program officer for the Madison Initiative at the William and Flora Hewlett Foundation

“Fake news” continues to spread like wildfire across the western world, and American philanthropy is at the forefront of the battle to address it. From the recent prototype fund sponsored by the Knight and Rita Allen Foundations and the Democracy Fund, to the Gates, Ford, and Knight Foundations’ collaboration with Democracy Fund in support of CUNY’s News Integrity Initiative, to Craig Newmark and others’ investment in Poynter’s fact-checking initiative, many in philanthropy are seeking to help mitigate the effects of disinformation and propaganda.

Broadly speaking, most of these philanthropic initiatives seek to intervene at one of three points in the information system: the production of politically relevant information, its distribution, or its consumption. Each of these points represents a necessary part of any ultimate solution, but none alone is sufficient.

To date, the most prominent philanthropic interventions have focused on the front end, to improve the quality and reliability of journalism produced, or on the back end, via fact-checking and media literacy aimed at information consumers. Political advertising, foreign meddling, and “fake news” on Facebook have dominated recent headlines, but within philanthropy fewer funders have focused on the role that online platforms play in distributing information—perhaps because these large commercial platforms are opaque and hard to influence, or perhaps because these other avenues represent more familiar territory.

To explore these issues, I’m working with Media Impact Funders to organize a three-part webinar series: The Future of Truth: Can Philanthropy Help Mitigate Misinformation?

As a program officer at the William and Flora Hewlett Foundation’s Madison Initiative, I’ve been immersed in this question. Our initiative focuses on making democracy and government—Congress, in particular—more effective in an era of polarization, and it’s clear that the information ecosystem is an important part of the challenge.

This webinar series will explore challenges and opportunities to improve information production, distribution and consumption:

  • October 12: Josh Stearns of the Democracy Fund will lead the first webinar on information production.
  • November 2: I will lead the second webinar on information distribution.
  • December 14: Elizabeth Christopherson of the Rita Allen Foundation will lead the third webinar on information consumption.

Information production: Supporting quality journalism

Challenges in the journalism field are by now well-understood—newsroom employment alone has dropped from 65,000 in 2004 to just 41,000 in 2015. Trust in news is likewise declining. In 2017, the Pew Research Center found that only 34 percent of Democrats considered information from national news organizations “very trustworthy” — and a mere 11 percent of Republicans said the same.

Many have argued that the antidote to fake news is more quality journalism. Numerous funders are supporting efforts to improve journalism funding—directly funding nonprofit newsrooms, advocating for increased funding for public media, or working to improve the business models of traditional for-profit journalism outlets.

Other interventions on the production side focus on press freedom, including efforts to protect journalists and their access to sources, strengthen whistleblower protections, and more. Grantees such as The Coral Project, which the Rita Allen Foundation supports, are educating newsrooms on how to improve audience engagement, better include minority voices, and more. Others are striving (either directly or indirectly) to improve trust and transparency in journalism.

In the first webinar, we’ll hear from speakers focused on increasing quality and trust, including Sarah Alvarez, the founder and manager of Outlier Media; David Bornstein, the co-founder and CEO of the Solutions Journalism Network; Emily Goligoski, the research director of the Membership Puzzle Project; and Joy Mayer, who leads the Trusting News research project at the Donald W. Reynolds Journalism Institute.

The Knight, Ford, Gates, MacArthur, McCormick and Rita Allen Foundations, the Omidyar Network, and the Democracy Fund have all been key players in supporting the journalism field. This work is critical. But it is not enough. There is quality news out there, but now that anyone can create and distribute information online, quality sources are too often drowned out in a sea of partisan noise, dis- or misinformation, and propaganda.

Information distributors: The role of online platforms

Now that anyone can create and distribute content, Facebook and Google have become the main information intermediaries. More than 60 percent of Americans now get news on social media. More than 80 percent of search engine users rely on Google Search, which now receives between 3.5 and 5.5 billion queries per day.

                                 
Source: Pew Research Center, News Use Across Social Media Platforms 2016.

Online information distribution differs from what audiences were accustomed to in the offline age. Original sources are often unclear. Sharers of information are easily anonymized—with one recent study finding that as many as 50 million Twitter users and 137 million Facebook users may be bots. Groups such as Cambridge Analytica are able to create personalized, adaptive, and ultimately addictive propaganda, microtargeted to the level of the individual IP address. And all of this activity is overseen by large tech companies with proprietary algorithms to which the public is not privy.

So what could be done to address these distribution challenges? Stanford Law Professor Nate Persily recently suggested “5 D’s” the platforms could pursue to address disinformation:

  • Disclose—provide information on how underlying platform algorithms work; on who is buying ads and their source; or, as Facebook has begun doing, on whether a story has been tagged as “fake news.”
  • Demote—better identify and down-rank stories from questionable sources, while elevating trustworthy ones.
  • Delay—halt the spread of a story until its accuracy can be verified.
  • Dilute—serve up factual stories on the same topic from verified professional news organizations.
  • Delete—remove content determined to be false or misleading.

Of course, given many platforms’ commitment to an open internet and American free speech norms, most are reluctant to delete content. Legally, however, doing so could be easier than they let on. Platforms already remove content they deem inappropriate (hate speech, incitement, obscenity, child pornography, etc.), alongside copyright infringements and other illegal speech (such as prostitution or drug sales). Alternatively, platforms could give users more control over their own feeds (e.g., Twitter recently expanded its hateful-conduct policy, enabling users to hide tweets containing certain words).

To date, there has been less philanthropic focus on the role of technology platforms, in part due to these companies’ limited openness to collaboration or data sharing. Several foundations are now exploring ways to improve the role of the platforms, including the Ford, Hewlett, and Open Society foundations, and the Omidyar Network and Democracy Fund. Most are still in the research stage.

In the second webinar, we’ll hear from researchers focused on these questions, including Jonathan Albright, the research director at the Tow Center for Digital Journalism; Josh Tucker, the co-director of NYU’s Social Media and Political Participation Lab; and Nate Persily, professor of law at Stanford University.

Information consumption: Audience-facing interventions

Two primary efforts—fact-checking and news literacy—aim to help consumers either correct or avoid dis- or misinformation or propaganda.

Hundreds of independent fact-checking groups have emerged in recent years, many now coordinated by the International Fact-Checking Network (IFCN) at Poynter. Groups such as FactCheck.org, The Washington Post’s Fact Checker (home of the “Pinocchios”), Snopes and PolitiFact remain the most prominent in the U.S., and technology platforms are now leveraging their work. In August, Facebook announced plans to step up its outsourcing of fact-checking to these established organizations, relying on users to flag potentially “fake” news stories for fact-checking. Google is also helping to draw users’ attention to fact-checking articles relevant to significant news stories.

The Democracy Fund, Knight Foundation, Omidyar Network, Craig Newmark Foundation, Rita Allen Foundation and others have supported both the fact-checking itself and, alongside the Hewlett Foundation, some great behavioral science research exploring its impact. But questions persist, including:

  • Scalability: It can take 8 to 15 hours to debunk a falsehood that took only minutes to create.
  • Distribution: The most misinformed audiences are often not the ones seeking out corrections.
  • Impact: Even when fact-checks do reach larger audiences, people often don’t believe the correction or, even if they do, don’t actually change their views. Behavioral scientists note widespread tendencies towards motivated reasoning and confirmation bias—people trust information that reaffirms their pre-existing worldviews.

As a seminal study by the RAND Corporation notes regarding fact-checking: “Don’t expect to counter the firehose of falsehood with the squirt gun of truth.”

As a potentially longer-term solution, many grantees are also working to improve news literacy; these include the News Literacy Project, the Center for News Literacy, the American Press Institute and others. These and other media literacy approaches target students, adults, online news consumers, pop culture consumers and others.

In the third webinar, we’ll hear from experts in fact-checking and media literacy, including Eric Newton, the innovation chief at Cronkite News; Brendan Nyhan, a professor in the Department of Government at Dartmouth College; and Tom Rosenstiel, the executive director of the American Press Institute.

Here, the same challenges around preaching to the choir, motivated reasoning, and confirmation bias apply, as do concerns about the time lag before such education would have real impact. Like fact-checking, news literacy is also easily “weaponized” by any opposing side, which can cast doubt on any fact by claiming a lack of “news literacy” on the part of the believer.

Such approaches also put the onus for problem-solving on citizens who are already busy with their day jobs, overwhelmed with conflicting information, and less sophisticated or motivated than the average propagandist. Many citizens can’t be bothered to vote in U.S. elections or to read past the headline before sharing an article, much less to track down the sources behind it. So while news literacy remains important, it too is unlikely to solve the whole problem.

No silver bullet

There clearly is no simple solution here. Instead, some combination and refinement of the above approaches will likely help to chip away at the worst effects of this latest incarnation of a disinformation and propaganda problem that has plagued us, in one form or another, for centuries.

Given the dizzying amount of activity in the philanthropic space in recent months, our immediate hope is simply that funders can begin to better coordinate efforts with one another, and amongst grantees—to provide more visibility into who is doing what, broker connections between related efforts, identify where the gaps are, research what works, and begin to bring successful efforts to scale.

This post was adapted from a longer analysis on the Hewlett Foundation site.