
Meta faces $2.4bn legal action in Africa over war content

Meta Platforms Inc., the parent company of Facebook, is currently facing a $2.4 billion lawsuit in Kenya. The case alleges that the company’s platform played a significant role in inciting violence during Ethiopia’s Tigray conflict (2020-2022) by failing to adequately moderate hate speech and inflammatory content.

The lawsuit was initiated by two Ethiopian nationals and the Kenyan civil rights organization Katiba Institute. One plaintiff, Abrham Meareg, claims that his father, Professor Meareg Amare, was murdered in 2021 after being targeted in Facebook posts that shared his personal details and called for his death. Another plaintiff, Fisseha Tekle, a former Amnesty International researcher, alleges he received death threats on Facebook because of his human rights work. Both plaintiffs argue that Facebook’s algorithms amplified harmful content, leading to real-world violence.

The plaintiffs are seeking several remedies:

  • Algorithm Reforms: Modification of Facebook’s algorithms to prevent the promotion of hate speech and incitement to violence.
  • Enhanced Content Moderation: Hiring more content moderators with expertise in African languages and regional issues to effectively monitor and manage harmful content.
  • Restitution Fund: Establishment of a $2.4 billion fund to compensate victims of violence and hate speech propagated through the platform.
  • Formal Apology: Issuance of an official apology from Meta for its role in the harm caused.

Meta has contested the Kenyan court’s jurisdiction, arguing that legal proceedings should occur in the United States, where the company is headquartered. However, the Kenyan High Court has ruled that the case can proceed in Kenya, citing the presence of Facebook’s content moderation operations in Nairobi and the impact of the alleged actions on Kenyan users.

This lawsuit is part of a broader pattern of legal challenges facing Meta in Africa. In addition to the allegations of amplifying hate speech, more than 100 former Facebook content moderators in Kenya have filed a lawsuit against Meta and its subcontractor Sama (formerly Samasource), claiming that exposure to graphic content without adequate psychological support caused them severe mental health problems.

These legal actions highlight growing concerns about the responsibilities of tech companies in moderating content and preventing the spread of harmful material. As social media platforms continue to influence public discourse and societal dynamics, the outcomes of these cases may set significant precedents for corporate accountability and the enforcement of ethical standards in digital content management.
