Facebook’s Reckoning


John Herrman, reporting in the New York Times Magazine, in August.

Facebook, in the years leading up to this election, hasn’t just become nearly ubiquitous among American internet users; it has centralized online news consumption in an unprecedented way. According to the company, its site is used by more than 200 million people in the United States each month, out of a total population of 320 million. A 2016 Pew study found that 44 percent of Americans read or watch news on Facebook. These are approximate exterior dimensions and can tell us only so much. But we can know, based on these facts alone, that Facebook is hosting a huge portion of the political conversation in America.


Mark Zuckerberg, earlier this week, responding to speculation over the impact of fake news on Facebook, as reported in The Verge.

After a day of criticism over his company’s role in spreading fake news about political candidates, Facebook CEO Mark Zuckerberg rejected the idea that the News Feed had tilted the election in favor of Donald Trump. “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea. Voters make decisions based on their lived experience.”


Mike Isaac in the Times

Even as Facebook has outwardly defended itself as a nonpartisan information source — Mark Zuckerberg, chairman and chief executive, said at a conference on Thursday that Facebook affecting the election was “a pretty crazy idea” — many company executives and employees have been asking one another if, or how, they shaped the minds, opinions and votes of Americans.

In May, the company grappled with accusations that politically biased employees were censoring some conservative stories and websites in Facebook’s Trending Topics section, a part of the site that shows the most talked-about stories and issues on Facebook. Facebook later laid off the Trending Topics team.

In September, Facebook came under fire for removing a Pulitzer Prize-winning photo of a naked 9-year-old girl, Phan Thi Kim Phuc, as she fled napalm bombs during the Vietnam War. The social network took down the photo for violating its nudity standards, even though the picture was an illustration of the horrors of war rather than child pornography.

Both those incidents seemed to worsen a problem of fake news circulating on Facebook. The Trending Topics episode paralyzed Facebook’s willingness to make any serious changes to its products that might compromise the perception of its objectivity, employees said. The “napalm girl” incident reminded many insiders at Facebook of the company’s often tone-deaf approach to nuanced situations.


Michael Nunez in Gizmodo

According to two sources with direct knowledge of the company’s decision-making, Facebook executives conducted a wide-ranging review of products and policies earlier this year, with the goal of eliminating any appearance of political bias. One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear if the update had other deficiencies that caused it to be scrubbed.


Ben Thompson, back in September of this year, writing in Stratechery about the removal of the photo of Phan Thi Kim Phuc.

The truth is that Facebook may not be a media company, but users do read a lot of news there; by extension, the company may not have a monopoly in news distribution, but the impact of so many self-selecting Facebook as their primary news source has significant effects on society. And, as I’ve noted repeatedly, society and its representatives may very well strike back; this sort of stupidity via apathy will only hasten the reckoning.