Facebook vows to fight hate speech, but it may be too late for Rohingya Muslims

Rohingya Muslim refugee Mohammad Karim, 26, center, shows a disturbing video of mass graves in Myanmar to other refugees in Kutupalong refugee camp, Bangladesh, on Jan. 14, 2018. (AP/Manish Swarup)

Emily McFarlan Miller

Religion News Service

A social media network set up to connect people is now acknowledging it needs to do more to prevent the spread of hate speech that contributed to a campaign to drive out Myanmar's Rohingya Muslim minority.

That admission came from Monika Bickert, Facebook's vice president of global policy management, this month at "Countering Disinformation and Violent Extremism in the Digital Age," an event hosted by New America in Washington, D.C. It comes as activists in Myanmar urge Facebook to moderate the social network more carefully in countries like theirs, where it has quickly come to dominate how people communicate and access information.

"It's a complicated landscape, and there's a lot more we can do there," Bickert said.

More than 85 percent of Facebook's 2.2 billion users are outside the United States, the company's top policy official said.

That includes 18 million users in Myanmar alone, Facebook Public Policy Director Mia Garlick told Religion News Service. 

Facebook's user base in Myanmar has exploded since 2012, when "major democratic reforms" came to the country, opening it to the rest of the world, to the internet and to Facebook, said Patrick Balazo, a researcher with Dalhousie University and the Canadian Centre on Statelessness.

Cheap cellphones, previously out of reach for virtually everyone in the country, flooded the market almost overnight, all of them preloaded with the app. For people in Myanmar, Balazo said, "Facebook is the world wide web. The world wide web is Facebook."

Religious persecution

The United Nations has said Myanmar's military campaign against the Rohingya "seems a textbook example of ethnic cleansing" that led more than 650,000 Rohingya to flee to neighboring Bangladesh between August 2017 and March 2018.

The Rohingya have been called "foreigners," "illegal immigrants" and, most recently, "terrorists"; have been denied citizenship; and have been taunted and attacked by military and security officials and in religiously motivated mob violence, said Engy Abdelkader, a scholar and researcher at Rutgers University who has written about the history of Rohingya persecution in majority-Buddhist Myanmar.

"Such messages dehumanizes the Rohingya and helps justify official atrocities while reinforcing and spreading anti-Rohingya, anti-Muslim sentiment," Abdelkader said.

Facebook entered an environment in which an existing current of Buddhist "ultranationalism" could easily go viral. Those ultranationalists have portrayed the Rohingya as "bent on destroying the Buddhist nation," Balazo said.

On Facebook, Balazo has written, that ultranationalism took the form of viral racist political cartoons, doctored images and fake news reports. Such content played a "determining role" in the violence against Rohingya Muslims that sent them fleeing into neighboring Bangladesh, U.N. Independent International Fact-Finding Mission on Myanmar Chairman Marzuki Darusman concluded.

This raises a number of ethical questions and responsibilities for Facebook, according to Irina Raicu, director of the Internet Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University.

"If you're a multinational company and you're going to operate in different communities, you do take that on," Raicu said. "You take on responsibility, and I think that was not well understood — that you can't just take your values and your understanding of the world and apply it worldwide."

Handling hate speech

Billions of posts appear on Facebook timelines every day, Bickert said, and the company can't review every one. Instead, it relies on "technical tools" and other users to flag posts that don't meet its community standards, sending those posts on to a team of more than 7,500 Facebook employees to review.
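That flag-and-review pipeline, automated pre-screening plus user reports feeding a queue of human moderators, can be sketched roughly as below. This is an illustrative outline based only on the description above, not Facebook's actual system; every name, threshold and the keyword "classifier" are hypothetical.

```python
# Illustrative sketch of a flag-and-review moderation pipeline.
# All names and thresholds are hypothetical, not Facebook's system.
from dataclasses import dataclass
from collections import deque

@dataclass
class Post:
    post_id: int
    text: str
    flags: int = 0  # number of user reports

class ReviewQueue:
    """Holds posts awaiting a human moderator's decision."""
    def __init__(self):
        self._queue = deque()

    def enqueue(self, post, reason):
        self._queue.append((post, reason))

    def next_case(self):
        return self._queue.popleft() if self._queue else None

def classifier_score(post):
    # Stand-in for an automated "technical tool"; a real system would
    # use a trained model rather than a keyword check.
    return 1.0 if "terror propaganda" in post.text.lower() else 0.0

def triage(post, queue, auto_threshold=0.9, flag_threshold=3):
    if classifier_score(post) >= auto_threshold:
        queue.enqueue(post, "auto-detected")   # caught by software
    elif post.flags >= flag_threshold:
        queue.enqueue(post, "user-flagged")    # escalated by users

queue = ReviewQueue()
triage(Post(1, "ordinary status update"), queue)
triage(Post(2, "repeatedly reported post", flags=5), queue)
print(queue.next_case())  # -> (Post(post_id=2, ...), 'user-flagged')
```

In this sketch, as in Bickert's description, software and user reports only select which posts get reviewed; the removal decision itself still falls to the human team.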

Those community standards were developed with input from a number of religious groups, according to the company. They are "designed to give people the freedom to express themselves while ensuring Facebook remains a safe environment for all," Garlick said.

Technical tools catch more than 99 percent of videos that ultimately are removed from the platform for being "terror propaganda," according to Bickert. But that's harder to do with something like hate speech, which the company defines as any attack on people based on protected characteristics like religious affiliation.

Hate speech demands more context, Bickert noted. A machine can recognize a religious slur, for example, but it can't tell whether a user is hurling that word at someone else or reporting being on the receiving end of it.
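A toy example makes that context problem concrete. Here a context-blind keyword filter, standing in for the "machine" Bickert describes and using a placeholder token rather than a real slur, flags an attack and a victim's account of that attack identically, even though only the first violates policy:

```python
# Why keyword matching alone fails for hate speech: the same word
# appears in an attack and in a victim's report of that attack.
# SLUR is a placeholder; the filter is an illustrative sketch.
SLUR = "<slur>"

messages = [
    f"You people are all {SLUR}s",           # an attack: violates policy
    f"A stranger called me a {SLUR} today",  # a report: does not
]

def keyword_filter(text):
    """Context-blind check: matches the word regardless of intent."""
    return SLUR in text

for msg in messages:
    print(keyword_filter(msg), "->", msg)
# Both lines print True: the filter cannot distinguish attacker from
# target, which is why flagged posts still go to human reviewers.
```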

Rohingya Muslims, who crossed over from Myanmar into Bangladesh, carry an elderly woman in a basket and walk toward a refugee camp in Shah Porir Dwip, Bangladesh, on Sept. 14, 2017. (AP/Dar Yasin)

In an open letter addressed to Facebook CEO Mark Zuckerberg in April, members of several civil society organizations described a particular instance in which they flagged viral posts spreading harmful misinformation on Facebook's Messenger app. Buddhists received messages warning that Muslims were planning to attack them on a certain date, and vice versa, an attempt to incite the two groups to violence against one another.

The activists asked the company to invest in more moderators, engage local groups and be more transparent about what it's doing to solve the problem.

What Facebook is doing

Facebook has increased the number of people working to understand and respond to the situation in Myanmar, and its Community Operations team includes country specialists with insight into the social, cultural and religious context of the countries they cover, including Myanmar.

Its efforts to stop the spread of hate speech and misinformation in the country include designating hate groups and organizations; removing repeat offenders and fake accounts; and giving users the ability to report content shared in Facebook Messenger. 

But Facebook needs to ramp up its ability to review posts in the Burmese language, Bickert said. And, while the company has been working with groups in Myanmar for years, she said, "One of the biggest things we have to do is improve our relationships with civil society groups on the ground."

Some activists are wary about allowing Facebook, a private company, to police itself in such a way, according to Raicu of the Markkula Center.

But if not Facebook, she said, "Then who?"

"Nobody's kind of proposing a solution, and it's too easy to just criticize the current proposition without really offering anything else. At the same time, allowing this to continue risks people losing their lives over incitement to violence very clearly."
