Facebook Updates Community Standards, Expands Appeals Process

The social media company announced changes to its community standards, releasing internal review guidelines and allowing appeals of content removal decisions.

Facebook announced changes to its content review policy Tuesday, adding an appeals process for removed content and releasing the internal guidelines it relies on to make content determinations.

While the social media giant has listed a set of publicly available community standards for several years, the latest update includes the more detailed guidelines utilized internally by content reviewers when deciding whether to allow or remove posts.

The updated appeals process will allow posters of removed photos, videos or posts to contest determinations they feel were wrongly made. Previously, appeals of community standards determinations were allowed only when a Facebook page, group or profile was removed entirely.

Facebook has been hesitant to reveal details of its content review policy in the past. But the company says Tuesday’s announcement is part of its promise to “do better” and be more transparent about how it decides what stays up and what gets taken down. The changes come just weeks after CEO Mark Zuckerberg was grilled on Capitol Hill about Facebook’s alleged censorship of conservative viewpoints.

Zuckerberg was asked by multiple lawmakers during his marathon testimony about “Diamond and Silk,” two pro-Trump commentators who claim Facebook intentionally limited their presence on the site because of their political views. Zuckerberg apologized, calling the situation an “enforcement error” on Facebook’s part, but the controversy raised questions about what type of content Facebook restricts and how it makes those decisions. Diamond and Silk are themselves set to testify before Congress later this week.

Granular standards

The newly released standards are a stark departure from Facebook’s prior guidance, which had been crafted to express the company’s values and priorities in a way that did not overwhelm readers.

“We’ve always had a set of community standards that the public can see,” Facebook Vice President Monika Bickert told NPR’s Steve Inskeep, “but now we’re actually explaining how we define those terms for our review teams and how we enforce those policies.”

Those new explanations are nothing if not comprehensive. They detail dozens of reasons posts can be removed, and read more like the product of a team of lawyers than the words of an upstart tech company. The standards outline methods for categorizing content and provide specific definitions for terms like “hate speech,” “terrorism,” and “threats of violence.”

“People define those things in different ways,” Bickert said, “and people who are using Facebook want to know how we define it and I think that’s fair.”

Some objectionable content is classified into tiers, with Facebook’s response matching the severity of the violation. Other content is removed if it satisfies multiple conditions under a points-based system. A threat of violence, for example, can be deemed “credible” and removed if it provides a target and “two or more of the following: location, timing, method.”

Still other standards target particular categories of offensive posts. Content that “promotes, encourages, coordinates, or provides instructions” for eating disorders or self-harm is specifically mentioned. And under its “harassment” section, Facebook says it will not tolerate claims that survivors of traumatic events are “paid or employed to mislead people about their role in the event.” Other standards prohibit advertising drugs, revealing the identities of undercover law enforcement officers, and depicting graphic violence.

Context matters

When it comes to judging content, though, context is crucial. Facebook has been criticized in the past for its blundering approach to community moderation. In 2016, for example, the company reversed its decision to remove a post containing the Pulitzer-winning “napalm girl” photo, which depicted a nude and burned child in the Vietnam War.

Bickert says that example proves that exceptions are needed for newsworthy and culturally significant content.

Facebook’s updated standards now list some exceptions for depictions of adult nudity, including “acts of protests,” “breast-feeding,” and “post-mastectomy scarring.”

Still, questions remain over Facebook’s content moderation program. Despite Zuckerberg’s stated desire to utilize artificial intelligence to flag offensive content, the process remains very human. According to Bickert, the company has over 7,500 moderators who are stationed around the globe and work 24/7.

But conversations with those moderators paint a much bleaker image of Facebook’s processes than the one Bickert provides. In 2016, NPR’s Aarti Shahani detailed a workforce composed primarily of subcontractors who are stationed in distant countries and asked to review large quantities of posts every shift.

It’s not hard to imagine how someone located thousands of miles away, who grew up in a different culture, and who is under immense pressure to review as many posts as possible, might mess up.

The appeal of appeals

Facebook is seeking to address that problem with its new appeals system. Now, if your post is removed for “nudity, sexual activity, hate speech, or violence,” you will be presented with a chance to request a review.

Facebook promises that appeals will be reviewed within 24 hours by its Community Operations team. But it remains unclear what relationship the team has with Facebook and with its first-line reviewers. If appeals are reviewed under the same conditions that initial content decisions are made, the process may be nothing more than an empty gesture.

Facebook points out that the content review and appeals process is just one way to clean up your experience on the site. Users have the ability to unilaterally block, unfollow, or hide posts or posters they don’t want to see.

For the social media giant, it’s a question of balance. Balance between free speech and user safety. Balance between curbing “fake news” and encouraging open political discourse. And balance between Facebook’s obligation to serve as a steward of a welcoming environment and the realities of running a for-profit, publicly owned corporation.

“We do try to allow as much speech as possible,” Bickert said, “and we know sometimes that might make people uncomfortable.”

Facebook says that Tuesday’s announcements are just one step in a continuous process of improvement and adjustment to its standards and policies. How much of an improvement this step represents remains to be seen.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.