Facebook to let users prioritize news sources

SAN FRANCISCO -- Facebook said Friday that it planned to prioritize high-quality news on the social network by allowing its users to rank news sources that they see as the most credible and trustworthy.

"There's too much sensationalism, misinformation and polarization in the world today," Mark Zuckerberg, Facebook's chief executive, wrote in a post Friday. "We decided that having the community determine which sources are broadly trusted would be most objective."

The shift is another signal of Facebook's ability -- this time using the collective power of its more than 2 billion members worldwide -- to play kingmaker with publishers. Many publishers have long relied on Facebook to reach audiences, and they largely reacted with disappointment when the company said it would play down news overall.

The move was also the most recent by Facebook to counter charges that not enough is being done to stamp out fake news and disinformation on its platform. The company was dogged by criticism in late 2016 after the presidential election that too many false stories attacking Hillary Clinton had spread on its site, possibly affecting the election's outcome. Last year, Facebook also acknowledged that Russian agents had used the site to spread divisive and polarizing ads and posts.

The same criticism has also engulfed other social media companies such as Twitter, which said Friday that it was notifying 677,775 people in the United States by email that they had interacted with Russian propaganda accounts around the time of the 2016 election.

Twitter also disclosed Friday thousands of accounts that it said were associated with the Internet Research Agency, a Kremlin-linked troll farm, and with the Russian government, adding to the numbers it released to Congress in October.

In a release Friday afternoon, Twitter said it had identified 3,814 accounts linked to the Internet Research Agency, which posted some 176,000 tweets in the 10 weeks preceding the election, and another 50,258 automated accounts connected to the Russian government, which tweeted more than a million times. The company acknowledged that "such activity represents a challenge to democratic societies everywhere."

U.S. intelligence agencies have concluded that Russia conducted a sophisticated campaign intended to affect the outcome of the election, which included spreading propaganda and incendiary reports on social media about subjects like police brutality, Black Lives Matter, Muslim rights and veterans issues, and hacking Democratic officials to sow discord in the country.

The company also highlighted changes to its internal policing efforts, saying it now blocks about 250,000 logins from automated accounts each day. It said that in December, updated methodology allowed it to identify about 6.4 million suspicious accounts worldwide per week. And the company sought to underline how small a portion of Twitter's overall activity these accounts represented before they were removed.

For publishers, Facebook's new ranking system raised immediate concerns, including whether crowd-sourcing users' opinions on trustworthiness might be open to manipulation.

"It is absolutely a positive move to start to try to separate the wheat from the chaff in terms of reputation and use brands as proxies for trust," said Jason Kint, chief executive of Digital Content Next, a trade group that represents entertainment and news organizations, including The New York Times. "But the devil's in the details on how they're going to actually execute on that."

David Kaye, the United Nations' special rapporteur on freedom of expression, said Facebook would probably face more difficult questions as it rolled out the new ranking program globally.

"What will happen in situations where a community determines that a news source is trustworthy, but that news source is censored, or illegal, in that country?" asked Kaye, noting that in many parts of the world, governments controlled all official news channels while independent news sources were outlawed or forced to publish pseudo-anonymously.

Raju Narisetti, chief executive of the Gizmodo Media Group, the unit of Univision that operates Jezebel and other sites, said the user-generated ranking system "is a massive abdication of its social responsibility, as a vital platform, to be a good custodian of the Fourth Estate globally."

Other experiments Facebook has tried with news have sometimes gone awry or had unintended consequences. For example, a recent test that removed all news publishers from the News Feed and placed them under a separate tab in six countries, including Bolivia and Slovakia, unexpectedly magnified the amount of fake news on the platform.

Information for this article was contributed by Sheera Frenkel and Sapna Maheshwari of The New York Times; and by Eli Rosenberg of The Washington Post.
