Meta Platforms lifted its blanket ban on the word “shaheed” (English: “martyr”) on Tuesday, after a year-long review by its oversight board found the approach “overbroad.”
The company has faced years of criticism over its handling of content involving the Middle East, including a 2021 study that Meta itself commissioned, which found its practices had an “adverse human rights impact” on Palestinians and other Arabic-speaking users of its services.
Since the commencement of hostilities between Israel and Hamas in October, those criticisms have intensified.
The oversight board, which is funded by Meta but operates independently, began its review last year because the term accounted for more content removals on the company’s platforms than any other single word or phrase. Meta is the parent company of Facebook and Instagram.
In March, the review found that Meta’s rules on “shaheed” failed to account for the word’s varied meanings, leading to the removal of content that did not glorify violence.
On Tuesday, Meta acknowledged the review’s findings. The company said its tests showed that removing posts only when “shaheed” was “paired with otherwise violating content” captured the most potentially harmful material without disproportionately affecting users’ voice.
The oversight board welcomed the change, saying Meta’s previous policy on the term had censored millions of users across its platforms.