Rohingya seek compensation from Facebook for its role in the massacre

With roosters crowing in the background as he speaks from the crowded refugee camp in Bangladesh that has been his home since 2017, 21-year-old Maung Sawyeddollah describes what happened when violent hate speech and misinformation targeting the Rohingya minority in Myanmar began to spread on Facebook.

“We were fine with most people there. But some very narrow-minded and very nationalistic types escalated the hate against Rohingya on Facebook,” he said. “And the people who were good, in close communication with the Rohingya, changed their minds against the Rohingya and it turned into hatred.”

For years, Facebook, now called Meta Platforms Inc., pushed the narrative that it was a neutral platform in Myanmar that was misused by bad actors, and that despite its efforts to remove violent and hateful content, it unfortunately fell short. That account echoes the company’s response to the role it has played in other conflicts around the world, from the 2020 elections in the United States to hate speech in India.

But a comprehensive new report from Amnesty International says Facebook’s preferred narrative is wrong. The platform, according to Amnesty, was not simply a passive site with insufficient content moderation. Instead, Meta’s algorithms “proactively amplified and promoted content” on Facebook that incited violent hatred against the Rohingya as early as 2012.

Despite years of warnings, Amnesty found, the company not only failed to suppress violent hate speech and misinformation against the Rohingya, it actively disseminated and amplified such content in the run-up to the 2017 massacre. The timing coincided with the growing popularity of Facebook in Myanmar, where for many people it was their only connection to the online world. That effectively made Facebook the internet for a large number of people in Myanmar.

More than 700,000 Rohingya fled to neighboring Bangladesh that year. Myanmar’s security forces have been accused of mass rapes, murders and burning down thousands of Rohingya-owned homes.

“Meta – through its dangerous algorithms and relentless pursuit of profit – has contributed significantly to the serious human rights violations perpetrated against the Rohingya,” the report said.

A spokesperson for Meta declined to answer questions about Amnesty’s report. In a statement, the company said it “stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people.”

“Our security and integrity work in Myanmar continues to be guided by feedback from local civil society organizations and international institutions, including the United Nations Fact-Finding Mission on Myanmar; the human rights impact assessment we commissioned in 2018; as well as our ongoing human rights risk management,” Rafael Frankel, public policy director for emerging markets, Meta Asia-Pacific, said in a statement.

Like Sawyeddollah, who is quoted in the Amnesty report and spoke to the AP on Tuesday, most of those who fled Myanmar – around 80% of the Rohingya living in Myanmar’s western Rakhine state at the time – still reside in refugee camps. And they are asking Meta to pay reparations for its role in the violent crackdown on Rohingya Muslims in Myanmar, which the United States declared a genocide earlier this year.

Amnesty’s report, released on Wednesday, is based on interviews with Rohingya refugees, former Meta staff, academics, activists and others. It also relied on documents leaked to Congress last year by whistleblower Frances Haugen, a former Facebook data scientist. It notes that digital rights activists say Meta has improved its engagement with civil society and some aspects of its content moderation practices in Myanmar in recent years. In January 2021, after a violent coup toppled the government, it banned the country’s military from its platform.

But critics, including some of Facebook’s own employees, have long argued that such an approach will never really work. It means Meta is playing whack-a-mole, trying to remove harmful content while the algorithms designed to push “engaging” content – content more likely to provoke outrage – essentially work against it.

“These algorithms are really dangerous for our human rights. And what happened with the Rohingya and Facebook’s role in that specific conflict is likely to happen again, in many different contexts around the world,” said Pat de Brún, researcher and adviser on artificial intelligence and human rights at Amnesty.

“The company has been completely unwilling or unable to address the root causes of its impact on human rights.”

After the UN’s independent international fact-finding mission on Myanmar highlighted Facebook’s “significant” role in the atrocities against the Rohingya, Meta admitted in 2018 that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.”

In the following years, the company “touted certain improvements in its community engagement and content moderation practices in Myanmar”, Amnesty said, adding that its report “concludes that these measures have proven wholly inadequate”.

In 2020, for example, three years after the violence in Myanmar killed thousands of Rohingya Muslims and displaced 700,000 more, Facebook investigated how a video by a leading anti-Rohingya hate figure, U Wirathu, was circulating on its site.

The investigation found that more than 70% of the video’s views came from “chaining” – that is, it was suggested to people who had watched a different video, showing what to watch next. Facebook users weren’t seeking out the video; it was fed to them by the platform’s algorithms.

Wirathu had been banned from Facebook since 2018.

“Even a well-resourced approach to content moderation, taken in isolation, would likely not have been sufficient to prevent and mitigate this algorithmic harm. This is because content moderation fails to address the root cause of Meta’s algorithmic amplification of harmful content,” Amnesty’s report states.

Rohingya refugees are seeking unspecified reparations from the Menlo Park, Calif.-based social media giant for its role in perpetuating the genocide. Meta, which faces twin lawsuits in the U.S. and the U.K. seeking $150 billion for Rohingya refugees, has so far refused.

“We believe that the genocide against the Rohingya was only possible because of Facebook,” Sawyeddollah said. “They communicated with each other to spread hatred, they organized campaigns via Facebook. But Facebook was silent.”

Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
