Facebook and Instagram violated Palestinian human rights, internal report finds

California (QNN)- Facebook and Instagram’s speech policies harmed fundamental human rights of Palestinian users during Israel’s aggression on the besieged Gaza Strip in May 2021, according to a study commissioned by the social media sites’ parent company Meta.

“Meta’s actions in May 2021 appear to have had an adverse human rights impact … on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred,” says the report, which was obtained by The Intercept in advance of its publication.

Commissioned by Meta last year and conducted by the independent consultancy Business for Social Responsibility, or BSR, the report focuses on the company’s censorship practices and allegations of bias during Israel’s bombardment of the Gaza Strip last spring, when 256 Palestinians, including 66 children, were killed by Israeli forces.

At the time, many Palestinians attempting to document and protest the aggression on Facebook and Instagram found their posts abruptly removed without recourse, a phenomenon the BSR inquiry attempts to explain.

According to the report, Meta deleted Arabic content related to last year’s aggression at a much greater rate than Hebrew-language posts. This disparity held for posts reviewed both by automated systems and by human employees.

“The data reviewed indicated that Arabic content had greater over-enforcement (e.g., erroneously removing Palestinian voice) on a per user basis,” the report says. “Data reviewed by BSR also showed that proactive detection rates of potentially violating Arabic content were significantly higher than proactive detection rates of potentially violating Hebrew content.”

While BSR credits Meta for taking steps to improve its policies, it further blames “a lack of oversight at Meta that allowed content policy errors with significant consequences to occur.”

Though BSR is clear in stating that Meta harms Palestinian rights with the censorship apparatus it alone has constructed, the report absolves Meta of “intentional bias.”

Rather, BSR points to what it calls “unintentional bias,” instances “where Meta policy and practice, combined with broader external dynamics, does lead to different human rights impacts on Palestinian and Arabic speaking users” — a nod to the fact that these systemic flaws are by no means limited to the events of May 2021.

According to The Intercept, Meta responded to the BSR report in a document to be circulated along with the findings. In a footnote in the response, which was also obtained by The Intercept, the company wrote, “Meta’s publication of this response should not be construed as an admission, agreement with, or acceptance of any of the findings, conclusions, opinions or viewpoints identified by BSR, nor should the implementation of any suggested reforms be taken as admission of wrongdoing.”

BSR attributed the difference in treatment to a lack of expertise. It concluded that Meta lacked staff who understood other cultures, languages and histories – despite having over 70,000 employees and $24bn in cash reserves.

“Potentially violating Arabic content may not have been routed to content reviewers who speak or understand the specific dialect of the content,” the report stated.

It found that Meta’s Dangerous Individuals and Organisations policy, referred to in the report as the DOI, which bars users from praising or representing a number of listed groups, focused mainly on Muslim entities and therefore disproportionately impacted Palestinians.

“Meta’s DOI policy and the list are more likely to impact Palestinian and Arabic-speaking users, both based upon Meta’s interpretation of legal obligations, and in error,” it said.

“Palestinians are more likely to violate Meta’s DOI policy because of the presence of Hamas as a governing entity in Gaza and political candidates affiliated with designated organisations.”

The study concluded with 21 non-binding recommendations, which included increasing staff capacity to analyse Arabic posts and reforming the Dangerous Individuals and Organisations policy.

Meta vaguely committed to implementing 20 of the recommendations, according to The Intercept. The exception is a call to “Fund public research into the optimal relationship between legally required counterterrorism obligations and the policies and practices of social media platforms,” which the company says it will not pursue because it does not wish to provide legal guidance for other companies. Rather, Meta suggests concerned experts reach out directly to the federal government.
