Meta’s ‘horrendous’ role in facilitating Rohingya genocide detailed in new report
Meta is facing calls from Amnesty International to pay reparations to the Rohingya people for Facebook’s role in the genocide in Myanmar.
A new report from the human rights organization provides a “first-of-its-kind, in-depth human rights analysis” of Meta’s role in the Rohingya genocide. The report outlines how Meta’s “dangerous algorithms” and “relentless pursuit of profit” led to human rights violations against the Rohingya people.
Amnesty International called on Meta, formerly known as Facebook, to work with survivors of the genocide, which killed more than 25,000 people and displaced more than 700,000, to provide an effective remedy for its role in the violence. Although Amnesty International did not attempt to quantify damages from Meta’s role, it called on the company to provide financial compensation, aid rehabilitation efforts, issue an apology, and guarantee that it would prevent anything like this from happening again.
Meta is also being sued for $150 billion by Rohingya refugees for its role in the violence.
Meta’s persistent defense and prioritization of inflammatory content led to “incitement to hatred, violence and discrimination,” according to the 74-page report. Amnesty International draws heavily on the Facebook Papers, a set of internal documents leaked by a former Meta employee that detail the company’s knowledge of the harm its platforms cause around the world.
The report details repeated attempts by activists in the region to communicate with Meta between 2012 and 2017, including multiple trips to the company’s California headquarters, where Meta was warned again and again that it was contributing to the genocide. The report says Meta refused to act. The company failed to enforce its own policy of removing hate speech from Facebook despite numerous reports from Rohingya users, allowing hate on the platform to fester. At points, it even made the situation worse.
In one instance, Meta backed an initiative called “Panzagar,” or “flower speech,” by releasing stickers that people could attach to posts as reactions. The idea was that users would respond to posts containing hate speech or hateful rhetoric with a Panzagar sticker as a means to counter it and “promote peace.” However, activists began to notice that Facebook counted the flower sticker as just another form of engagement with a post, so instead of diminishing a hateful post’s visibility, the sticker made the post increasingly visible and pushed it to more people.
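To see why counter-speech reactions could backfire this way, consider a minimal, hypothetical sketch of engagement-based ranking. Nothing here is Meta’s actual code; the reaction types, weights, and function names are invented for illustration. The point is simply that if a ranking score treats every reaction as positive engagement, a counter-speech sticker raises a hateful post’s score rather than lowering it.

```python
# Hypothetical illustration only -- NOT Meta's actual ranking code.
# Weights, reaction names, and logic are invented for this sketch.

REACTION_WEIGHTS = {
    "like": 1.0,
    "comment": 2.0,
    "share": 3.0,
    "flower_sticker": 1.0,  # counter-speech reaction still counts as positive engagement
}

def engagement_score(reactions: dict[str, int]) -> float:
    """Sum weighted reaction counts; higher-scoring posts surface to more users."""
    return sum(REACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in reactions.items())

# A hateful post that draws 50 flower stickers outranks the same post with none:
print(engagement_score({"like": 10}))                        # 10.0
print(engagement_score({"like": 10, "flower_sticker": 50}))  # 60.0 -- counter-speech boosted it
```

Under this kind of scheme, the only way a counter-speech sticker could suppress a post is if the ranking assigned it a negative or zero weight, which, according to the activists’ account, Facebook did not do.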
Amnesty International claims Facebook’s news feed, ranking system, and algorithm “all actively amplify and distribute content which incites violence and discrimination, and deliver this content directly to the people most likely to act upon such incitement.”
“Amnesty International’s analysis shows how Meta’s content-shaping algorithms and reckless business practices facilitated and enabled discrimination and violence against the Rohingya,” the report said. “Meta’s algorithms directly contributed to harm by amplifying harmful anti-Rohingya content, including advocacy of hatred against the Rohingya. They also indirectly contributed to real-world violence against the Rohingya, including violations of the right to life, the right to be free from torture, and the right to adequate housing, by enabling, facilitating, and incentivizing the actions of the Myanmar military.”
The report also claims that Meta was well aware of the problems with its algorithms prior to the Myanmar genocide and places the blame for its inaction and unpreparedness squarely at the company’s feet.
“Meta utterly failed to engage in appropriate human rights due diligence in respect of its operations in Myanmar ahead of the 2017 atrocities,” it said. “This analysis leaves little room for doubt: Meta substantially contributed to adverse human rights impacts suffered by the Rohingya and has a responsibility to provide survivors with an effective remedy.”
“Meta stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people,” Rafael Frankel, Director of Public Policy for Emerging Markets at Meta, said to the Daily Dot. “To that end, we have made voluntary, lawful data disclosures to the UN’s Investigative Mechanism on Myanmar and to The Gambia, and are also currently participating in the [Organisation for Economic Co-operation and Development] complaint process.”
The report urges Meta to undertake a “comprehensive review and overhaul of human rights due diligence” and “constant, ongoing, and proactive human rights due diligence throughout the lifecycle of algorithmic technologies, so that risks and abuses can be identified.”
It also calls for the U.S., Ireland, and the European Union to ban targeted advertising that requires user tracking, force companies to address human rights risks in their algorithms, require companies to publicly report on their human rights due diligence, and regulate tech companies so that algorithmic content personalization is opt-in instead of opt-out.