Facebook ‘double standard’ on hate speech against Russians

By Rina Chandran and Maya Gebeily, Bangkok/Beirut

Facebook’s decision to allow hate speech against Russians because of the war in Ukraine violates its own rules on incitement and shows a “double standard” that could hurt users caught up in other conflicts, said digital rights experts and activists.
Facebook owner Meta Platforms will temporarily allow Facebook and Instagram users in certain countries to call for violence against Russians and Russian soldiers amid the invasion of Ukraine, Reuters reported last week.
It will also allow praise of a right-wing battalion “strictly within the framework of the defense of Ukraine”, a decision that experts say demonstrates the platform’s bias.
The move represents a “glaring” double standard given Meta’s failure to tackle hate speech in other war zones, said Marwa Fatafta of digital rights group Access Now.
“The disparity in measures compared to Palestine, Syria or any other non-Western conflict reinforces that the inequality and discrimination of tech platforms is a feature, not a bug,” said Fatafta, the group’s policy manager for the Middle East and North Africa.
“Technology platforms have a responsibility to protect the safety of their users, uphold freedom of expression and respect human rights. But this begs the question: Whose security and whose speech? Why have these measures not been extended to other users?” she added.
Last year, hundreds of posts by Palestinians protesting deportations from East Jerusalem were deleted by Instagram and Twitter, which later blamed technical errors.
Digital rights groups have criticized the censorship, calling for greater transparency on how moderation policies are set and ultimately enforced.


One policy for all?
Facebook has been criticized for failing to curb incitement in Ethiopia’s conflict and in Myanmar, where UN investigators say it played a key role in spreading hate speech that fueled violence against Rohingya Muslims.
“Under no circumstances is the promotion of violence and hate speech on social media platforms acceptable, as it may hurt innocent people,” said Nay San Lwin, co-founder of advocacy group Free Rohingya Coalition, who has himself faced abuse on Facebook.
“Meta must have a strict hate speech policy regardless of country and situation – I don’t think it’s okay to decide whether to allow the promotion of hate or calls for violence on a case-by-case basis,” he told the Thomson Reuters Foundation.
Scrutiny of how Facebook tackles abuse on its platforms has intensified since whistleblower Frances Haugen leaked documents showing the problems the company has had policing content in countries where users face the greatest risk.
In December, Rohingya refugees filed a $150 billion class action lawsuit in California, claiming that Facebook’s failure to control the content and design of its platform had contributed to violence against the minority group in 2017.
Meta recently said it would “assess the feasibility” of commissioning an independent human rights review of its work in Ethiopia, after its oversight board recommended a review.


Ukrainian exception
In a report released last Wednesday, Human Rights Watch said tech companies must show their actions in Ukraine are “procedurally fair” and avoid “arbitrary, biased, or selective decisions” by basing them on clear, established, and transparent processes.
In the case of Ukraine, Meta said native Russian and Ukrainian speakers monitor the platform 24 hours a day and the temporary policy change is intended to allow forms of political expression that would “normally violate” its rules.
“This is a temporary decision made under extraordinary and unprecedented circumstances,” Nick Clegg, president of global affairs at Meta, said in a tweet, adding that the company was focused on “protecting people’s right to expression” in Ukraine.
Russia has blocked Facebook, Instagram and Twitter.
And Meta’s new approach highlights how difficult it is to write rules that work universally, said Michael Caster, digital program manager for Asia at Article 19, a human rights organization.
“While global company policies can be expected to vary slightly from country to country, based on ongoing human rights impact assessments, there should also be a degree of transparency, consistency and accountability,” he said.
“Ultimately, Meta’s decisions should be shaped by its responsibilities under the UN Guiding Principles on Business and Human Rights, not by what makes the most economic or logistical sense for the business,” he said in emailed comments.


Unilateral decision
For Wahhab Hassoo, a Yazidi activist who has campaigned to hold social media companies accountable for failing to act against Islamic State (IS) members who used their platforms to trade Yazidi women and girls, Facebook’s measures are deeply troubling.
Hassoo’s family had to pay $80,000 to buy his niece’s release from jihadists, who kidnapped her in 2014 and later offered her “for sale” in a WhatsApp group.
“I’m shocked,” Hassoo, 26, said of Meta’s decision to allow hate speech against Russians.
“When they can make certain decisions unilaterally, they can essentially promote propaganda, hate speech, sexual violence, human trafficking, slavery and other forms of content related to human abuse – or prevent it,” he said.
“The last part is still missing.”
Hassoo and other Yazidi activists wrote a report urging the United States and other countries to investigate the role played by social media platforms, including Facebook and YouTube, in crimes against the Yazidi minority community.
Meta’s actions on Ukraine confirm what their research has shown, said Hassoo, who relocated to the Netherlands in 2012.
“They can promote or ban what suits their interests and what they find important,” Hassoo said. “It’s not fair that a company can decide what’s good and what’s not.” — Thomson Reuters Foundation