Facebook 'Supreme Court' Orders Social Network To Restore 4 Posts In 1st Rulings

Jan 28, 2021
Originally published on January 28, 2021 8:35 pm

Updated at 3:16 p.m. ET

Facebook's oversight board on Thursday directed the company to restore several posts that the social network had removed for breaking its rules on hate speech, harmful misinformation and other matters.

The decisions are the first rulings for the board, which Facebook created last year as a kind of supreme court, casting the final votes on the hardest calls the company makes about what it does and does not allow users to post.

The rulings announced on Thursday do not include the most high-profile and high-stakes case on the board's docket: Facebook's suspension of former President Donald Trump from both its namesake platform and Instagram, which the company owns. Facebook banned Trump earlier this month, after a mob of his supporters stormed the U.S. Capitol.

The Trump case, which the board received last week and has 90 days to decide, is seen as a crucial test of the panel's legitimacy. The board will begin taking public comment on it on Friday.

Still, Thursday's decisions offer an important first glimpse into how the board sees its governance role, and they indicate skepticism among members about how Facebook has communicated and enforced several key policies.

"We believe the board has the ability to provide a critical independent check on how Facebook moderates content and to begin reshaping the company's policies over the long term," Helle Thorning-Schmidt, a former prime minister of Denmark who is one of the board's four co-chairs, told reporters on a press call on Thursday.

In its first batch of decisions, the board overturned Facebook's post removals in four of five cases:

  • It overruled Facebook's removal of a post from a user in France criticizing the government for withholding an alleged "cure" for COVID-19. Facebook had removed the post because it said it could lead to imminent harm, but the board said the user's comments were directed at opposing government policy. "Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards," the board said. It also recommended that the company create a new policy specifically about health misinformation, "consolidating and clarifying the existing rules in one place."

  • It overturned Facebook's removal of a post from a user in North America that allegedly quoted Joseph Goebbels, a Nazi official on Facebook's list of "dangerous individuals." The board found the post "was a criticism, not a celebration of the attitude exemplified by the alleged Goebbels quote," board co-chair Michael McConnell, director of Stanford Law School's Constitutional Law Center, told reporters.

  • In a case dealing with nudity, the board overturned the removal of an Instagram post promoting breast cancer awareness in Brazil that showed women's nipples. The board pointed out that Facebook's nudity rules include an exception for posts about breast cancer. Facebook restored the post back in December, after the board announced it would be reviewing the case.

    "I think this is a really good example of how the mere prospect of a board review has already begun to alter how Facebook acts," McConnell said.

    The board also sounded the alarm that the Instagram post was initially removed by automated systems. "The incorrect removal of this post indicates the lack of proper human oversight which raises human rights concerns," it said in its decision. It recommended that Facebook tell users when their posts have been taken down by automated systems and that they can appeal those decisions to a person.

  • The final two cases dealt with hate speech. In the first, the board overruled the removal of a post from a Facebook user in Myanmar that Facebook said violated its rules against hate speech for disparaging Muslims as psychologically inferior. While the board found the post "pejorative," taking into account the full context, it did not "advocate hatred or intentionally incite any form of imminent harm," the board said in its decision.

    The board acknowledged the case was fraught because Facebook has been criticized for its role in the genocide of the country's Muslim minority.

    Muslim Advocates, a civil rights group, slammed the ruling. "Facebook's Oversight Board bent over backwards to excuse hate in Myanmar — a country where Facebook has been complicit in a genocide against Muslims," said Eric Naing, a spokesperson for the group. "Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide."

  • The board upheld Facebook's removal of another post for breaking hate-speech rules, however. It said the company was right to remove a post that used a slur against Azerbaijanis. "The context in which the term was used makes clear it was meant to dehumanize its target," the board said in its decision.

Facebook has said it will abide by the board's decisions and has already reinstated the posts the board said shouldn't have come down.

The board also issued recommendations that Facebook be more transparent, including telling users whose posts are removed which rules they violated and providing clearer definitions for issues such as health misinformation and dangerous individuals. Facebook has 30 days to respond to the board.

"We believe that the board included some important suggestions that we will take to heart. Their recommendations will have a lasting impact on how we structure our policies," Monika Bickert, Facebook's vice president of content policy, said in a blog post.

Evelyn Douek, a Harvard Law School lecturer who studies online content moderation, said the board's inaugural rulings "are a true shot across the bow" because they take aim not just at Facebook's handling of individual posts but at its broader policies and enforcement.

"Now the question is, how seriously will Facebook take those recommendations and how openly will it engage with what the board said?" Douek said.

The board is funded by Facebook through an independent trust and is made up of 20 experts from around the world. They include specialists in law and human rights, a Nobel Peace laureate from Yemen, the vice president of the libertarian Cato Institute and several former journalists.

Each case was reviewed by a group of five randomly selected members, with the final decision approved by the full board.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

AILSA CHANG, HOST:

Facebook has created its own sort of Supreme Court. It's an oversight board that has the final say on some of its hardest decisions over what users can and cannot post. Today, that board issued its first rulings. It ordered the social network to restore several posts that it had removed for breaking Facebook rules. NPR tech correspondent Shannon Bond joins us now to explain.

Hey, Shannon.

SHANNON BOND, BYLINE: Hey, Ailsa.

CHANG: So we should first note Facebook is among NPR's financial supporters. All right. So, Shannon, tell us a little more about some of the cases this board considered.

BOND: Yeah, there were five in total announced today. And in each of these, the board was reviewing posts that Facebook had taken down for violating policies against things like hate speech, nudity and harmful misinformation about COVID-19. And when you dig into the details of these rulings, you know, enforcing these rules is really complicated. And ultimately, the board overturned Facebook's decision to remove the post in four of these first five cases.

CHANG: OK, so give us a quick example.

BOND: Right. So in one case, Facebook had removed a post from a user in Myanmar who had suggested there was something wrong with Muslims. And Facebook says this broke its rules against hate speech. This is an especially fraught issue because, of course, Facebook has been criticized for its role in the genocide of the country's Muslim minority. But the board looked at this and said, you know, if you take into consideration the full context, this post was pejorative, but the board didn't think it crossed the line into hate speech. And so it said Facebook needs more justification if it's going to take down posts like this. And the board told Facebook to reinstate it. Now, Facebook has agreed to abide by these rulings, and the post is already back up.

CHANG: Wait. So who is on this board exactly?

BOND: It's made up of 20 international experts. They're mainly in things like law and human rights, but there's also a Nobel Peace laureate, some journalists and even the former prime minister of Denmark. It was created by Facebook last year, and it's funded by Facebook through an independent trust.

CHANG: And do you think these decisions give us any clues as to how the board sees its overall role?

BOND: Well, I spoke to Evelyn Douek, a Harvard Law School lecturer who's been following the board very closely.

EVELYN DOUEK: These five cases, even though it's only five cases out of the thousands or millions of decisions that Facebook makes in a week, are a true shot across the bow from the oversight board to Facebook.

BOND: And she says it's a shot across the bow because the board is taking aim directly at some of Facebook's policies and enforcement. You know, it warned about the extent to which the company relies on artificial intelligence. It says those systems need more human oversight. It emphasized taking context into account, and it wants Facebook to just be much more clear about its rules on policies like health misinformation or dangerous groups.

And you know, Ailsa, we know Facebook has this immense power over what its billions of users can post. Now it's created this board. And from what we've seen today, the board has ambitions to be a real check on that power. You know, it's kind of flexing its muscles.

CHANG: Yeah, so interesting. Well, what I did notice is we did not hear today about Facebook's decision to suspend former President Trump after the whole insurrection at the Capitol on January 6. What do we know about the board's review of that case?

BOND: Right. Facebook referred the Trump suspension to the board last week. This is the case everyone has their eyes on, of course, right? It's a huge deal. The board is opening up for public comment tomorrow, and it has about three months to make a ruling. And ultimately, it's going to be up to the board to settle this very fraught debate over whether Trump should get his account back. So we'll stay tuned.

CHANG: That is NPR's Shannon Bond.

Thank you, Shannon.

BOND: Thanks, Ailsa.

(SOUNDBITE OF MUSIC) Transcript provided by NPR, Copyright NPR.