
Oversight Board Criticizes Meta for Preferential Treatment

Meta, the owner of Facebook and Instagram, was harshly criticized on Tuesday by a company-appointed oversight board for policies that give celebrities, politicians and business partners special treatment compared with the vast majority of its users.

Under a program called cross-check, people with a high number of followers were able to say and share things on Facebook and Instagram that would otherwise have been quickly removed for violating company policies, according to the Oversight Board, which Meta had created to adjudicate thorny policy questions related to free speech, human rights and content moderation.

“The board is concerned about how Meta has prioritized business interests in content moderation,” the board said in a report. The cross-check program, it said, “provided extra protection for the expression of certain users.”

The oversight board recommended that Meta overhaul its cross-check system by “radically” increasing transparency about who is on the program’s list of V.I.P.s and by hiding their posts while they are under review. Meta should prioritize speech that is “of special public importance,” it added. Recommendations by the board, which includes about 20 academics, human rights experts and lawyers, are nonbinding.

The report was a reminder of the power that social networks have in deciding what posts to keep up, what to take down and how to treat specific accounts. Twitter, Facebook, TikTok and others have long been scrutinized for making unilateral rulings on content on their platforms that can influence political debates and societal issues.

Elon Musk, Twitter’s new owner, is now in the spotlight for how his social media service will moderate content. Twitter had created policies around keeping misinformation and hate speech off the platform, but Mr. Musk has said that he believes in unfettered discourse and has dropped the enforcement of some of those policies.

Meta has de-emphasized its social networking business in recent months after criticism about toxic content on those platforms. Mark Zuckerberg, the company’s chief executive, has instead prioritized a move into the immersive digital world of the metaverse. Meta has spent billions of dollars on the shift, though it is unclear if consumers will embrace metaverse-related products. The company recently laid off more than 11,000 employees, or about 13 percent of its work force.

Nick Clegg, Meta’s vice president of global affairs, said on Tuesday that Meta created the cross-check system to prevent erroneously removed posts from having an outsize impact. He said the company would respond to the oversight board’s report within 90 days.

The oversight board began investigating the cross-check program last year after its existence was reported by The Wall Street Journal and a whistle-blower, Frances Haugen. At the time, the board sharply criticized the company for not being transparent about the program.

On Tuesday, the oversight board found that the cross-check program ensured that high-profile users received additional review from a human moderator before their posts were removed for running afoul of the company’s terms of service. The board criticized the company for a lack of transparency and for “unequal treatment” of Facebook and Instagram’s most influential and powerful users at the expense of its human rights commitments and its own company values. Meta took as long as seven months to reach a final decision on a piece of content posted by an account in the cross-check program, the report said.

Mr. Zuckerberg had pushed for the creation of the oversight board so his company would not be the only entity involved in content moderation decisions. Since the board began hearing cases in the fall of 2020, it has issued a number of objections to Meta’s actions on content.

In 2021, the board recommended that Meta restore photographs of post-surgery breasts that the company’s automated systems had taken down for nudity reasons. The photos, which Meta restored, had been posted by a Brazilian Instagram user who was promoting a breast cancer awareness campaign. The board criticized Meta’s reliance on automated systems to remove posts.

The board also considered Meta’s barring of former President Donald J. Trump from Facebook and Instagram after the riot at the U.S. Capitol in January 2021. In May 2021, the board said Meta should review its decision to bar Mr. Trump, adding that the company did not have appropriate systems in place to issue a permanent suspension of the former president.

Mr. Trump had been part of the cross-check program. The board rebuked Meta for not being “fully forthcoming” in its disclosures about cross-check, including which figures were part of it.

Mr. Clegg has since said that Meta will decide whether to allow Mr. Trump’s accounts to be restored by January 2023.

Thomas Hughes, the director of the oversight board, said Tuesday’s report was “a significant step in the board’s ongoing efforts to bring greater accountability, consistency and fairness across Meta’s platforms.”

Other social media companies have sought to replicate Meta’s oversight board system. Last month, Mr. Musk said he planned to form a “content moderation council” at Twitter following his takeover of that company. Mr. Musk has not followed through on that plan, blaming activists and investors for pressuring him to follow Meta’s model.

Meta is also facing the prospect of not being able to show personalized ads in the European Union without receiving prior consent from users. Decisions approved by a European data protection body this week would require the company to allow users of Facebook and Instagram to opt out of ads based on personal data collected by Meta, according to a person with knowledge of the decision.

A final judgment, which can be appealed, is expected to be announced next month by Irish authorities, which serve as Meta’s main data privacy regulators in Europe because the company’s E.U. headquarters is in Dublin.
