You Can't Trust Facebook's Search for Trusted News
Wired Magazine - January 2018
https://www.wired.com/story/you-cant-trust-facebooks-search-for-trusted-news/
Do you trust me?
Do you trust what you are about to read, assuming you keep reading? (Keep reading!) Do you believe that I comported myself ethically during my reporting, did not make anything up, did not use the work of others without credit?
Let me put that another way:
Do you trust that this article will make you feel better, or correct, about the world? Do you think that I, as the writer, have some connection to you, as part of a community? That I want you to be informed, sure, but also protected?
Both of those paragraphs define trust, but very differently. Which makes it both troubling and a little weird that last week the social network Facebook—in a news release attributed to Adam Mosseri, head of the company’s newsfeed—announced it would start prioritizing “trusted” news sources. “We surveyed a diverse and representative sample of people across the US to gauge their familiarity with, and trust in, various different sources of news,” the release says. “This data will help to inform ranking in News Feed.”
By one estimate, Facebook has 214 million US users, and is a major disseminator of news produced elsewhere. Some of that news is fake; the social network’s users are prone to spreading extreme content, and some of that content is literal propaganda. Russian agents used Facebook to disrupt the US elections in 2016, exposing 140 million people to their trolling. Even Facebook knows it has a problem—in a corporate post, the company’s product manager for civic engagement acknowledged that social media could “corrode democracy,” and he listed Facebook's efforts to expose untruths and deter people from sharing misinformation.
The relationship between Facebook and the news media is, as the site might put it, complicated. Much of the ad money that used to go to independent news outlets now goes to Facebook—the company generated more than $27 billion in ad revenue in the first nine months of last year, topping Comcast and Disney—while advertising in newspapers and magazines fell off a cliff. Money that used to pay for news now pays for Facebook.
So the question you should ask next is not how Facebook can figure out which news organizations people trust. It’s not even whether that’s possible. The question is whether that’s even the right question.
Facebook plans to gather its data with a poll. “As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source,” writes Facebook founder Mark Zuckerberg on, duh, Facebook.
This has turned out to be literally true. Buzzfeed published the complete poll on Tuesday. It asks users which news outlets on a list they are familiar with and how much they trust those “domains.” That’s it.
The five possible answers range from “entirely” to “not at all,” easy to code as one through five (or five through one). “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly,” Zuckerberg writes.
So, yeah. That’s probably not going to work.
In his 2002 book Trust and Trustworthiness, the late political scientist Russell Hardin writes that trust itself has at best messy definitions, not widely agreed upon. “Quarrels about what it ‘really’ means sound like the worst of Platonic debates,” Hardin writes. “There is no Platonically essential notion of trust.”
That doesn’t stop him from trying, though. “Trustworthiness,” Hardin says, is the raw stuff, the thing that a person or an institution might possess. “Trust” is what someone feels. It’s a three-part relationship: A trusts B to do X. If you only have two elements, that’s not really trust.
When it comes to news, solving for X is the tricky part. What does Facebook think its users trust news organizations to do? The company did not respond to requests for comment.
What Facebook seems to be asking about is not actually trust but trustworthiness—because, frankly, it should not matter whether someone trusts a news outlet. It should matter whether that news outlet is trustworthy. People trust other people and things for all sorts of bad reasons, Hardin points out, to do all sorts of bad things. “Members of a community may trust one another in ways that are commonly all to the good, and yet their trust may enable them to subjugate and brutalize a neighboring community,” he writes.
Of course, the appearance of trustworthiness can be gamed. “The legitimacy part is the one that gets gamed the most,” says Kimberly Elsbach, a professor of management at UC Davis. “Saying that you’re using a legitimate, well-known process, but not actually doing that.”
Worse, people tend to be more trusting of things that are familiar. They’ll distrust an expert but believe a friend or loved one. “A lot of people have a very local view of what they trust,” says Roderick Kramer, a professor of organizational behavior at Stanford. “Their local church, local institutions, local paper, their friends.” (Apparently people share news on Facebook with friends somewhat indiscriminately; an experiment where Facebook fact-checkers marked some stories as “disputed” didn’t cut sharing rates, though appending related news did—somewhat.)
Here’s the even deeper problem: Not only do people not trust the media much in general, but their level of trust emerges predictably from their political orientation.
Using data from an ongoing multi-subject survey out of the University of Michigan, a 2010 study in the journal American Behavioral Scientist found that three things predicted whether someone would trust the news media: how far they leaned to the left, politically; how trusting they were in general; and how well they thought the economy was doing. This was before political polarization reached its current supercharged levels, and the survey asked about the news in general rather than particular sources. It’s safe to assume that people who bottom out on all those metrics still trust some sources of information, and presumably they’d upvote those on the Facebook survey.
Similarly, a Pew Research Center study from May 2017 found that 89 percent of people who identified as Democrats said the news media’s watchdog role kept politicians from doing bad things, compared with just 42 percent of Republicans. Seventy-five percent of Americans say the news media does fairly well or very well at keeping them informed, but that splits on party lines, too: 88 percent of Democrats versus 69 percent of Republicans.
Also last year, a researcher now at the University of Missouri polled audiences from 28 different news organizations about their level of trust. Mike Kearney, a journalism professor, asked the question differently, though. “How likely are you to believe what you read, see, or hear from mainstream journalism organizations (however you define mainstream)?” Granted, these were people already reading news, but more than two-thirds said they were likely or very likely to believe. Kearney, too, found that liberals were more credulous. So were white people.
Kearney also asked about specific outlets, which may offer a preview of the Facebook newsfeed bump. At the bottom: Buzzfeed, Breitbart, social media, and Infowars. Most trusted: Reuters, public television, and The Economist. (WIRED didn’t appear on the list.) “Maybe in a highly salient political time, any type of controversy drives us to the more confirmatory. We choose a news source because it reinforces our pre-existing beliefs,” Kearney says. “What is trust or trustworthiness of a source? We don’t have a universal definition, even though we all understand the underlying concept. But for most of us it gets expressed in a way that reaffirms our worldview.”
That’s a fundamental problem. Unlike most trustworthy institutions, journalism isn’t supposed to reaffirm worldviews. Quite the opposite, in fact. Journalists are supposed to comport themselves according to specific ethical standards, but those standards can seem at odds with societal norms—telling other people’s secrets, for example, or being impertinent to powerful people. Plus, today pretty much anyone can put on a suit and sit in front of a TV set that looks like a traditional newsroom or make radio or a podcast, and it all looks and sounds like Walter freaking Cronkite even if it’s actually Joseph freaking Goebbels.
All of which, at last, brings us back to Facebook. It’s not asking which news sources people believe are operating in good faith, providing relevant analysis, attempting to be fair but not falsely equivalent. And it’s not asking people who consume a lot of news about their experiences. It’s asking one deceptively simple question: Which news outlets do you trust?
It’s also reductive: Facebook users look at Facebook, so will likely name outlets most often seen on Facebook. (Distinguished Competitors, whatever you spent on that social desk is about to pay off!) Perhaps because my profession has done such a terrible job of explaining exactly what it is we do and how we do it, people are likely to distrust the places that do it the best.
Still think this is going to work? Trust an expert: “Facebook and Google have popularized scurrilous news sources through algorithms that are profitable for these platforms but inherently unreliable. Recognition of a problem is one step on the pathway to cure, but the remedial measures that both companies have so far proposed are inadequate, commercially, socially and journalistically.” The source? Rupert Murdoch, the head of Fox News.