In April 2025, Kenya's High Court issued an unprecedented ruling that it has jurisdiction to hear a case about harmful content on one of Meta's platforms. The lawsuit was filed in 2022 by Abrham Meareg, the son of an Ethiopian academic who was killed after being doxxed and threatened on Facebook; Fisseha Tekle, an Ethiopian human rights activist who was also doxxed and threatened on Facebook; and the Katiba Institute, a Kenyan non-profit public interest litigation organisation. They contend that the design of Facebook's algorithm and the content moderation decisions made in Kenya caused harm to the two individual claimants, fuelled the conflict in Ethiopia and led to widespread human rights violations inside and outside Kenya.
The content at issue falls within the categories of speech not protected under Article 33 of the Kenyan Constitution: propaganda for war, incitement to violence, hate speech, and advocacy of hatred that constitutes ethnic incitement, vilification of others or incitement to cause harm, or that is based on discrimination.
At the heart of the Kenyan case is the question of whether Meta, a US-based company, can profit financially from unconstitutional content, and whether the company has a positive duty to take down unconstitutional content that also violates its own community standards.
In affirming the Kenyan court's jurisdiction over the case, the judge held that the Kenyan Constitution allows the court to adjudicate Meta's acts or omissions relating to content published on the Facebook platform that may affect the enjoyment of human rights inside and outside Kenya.
The Kenyan decision signals a paradigm shift towards platform accountability, with judges determining responsibility by asking a single question: do the platform's decisions uphold human rights?
The ultimate purpose of a bill of rights, a common feature of African constitutions, is to uphold and protect the inherent dignity of all people. The Kenyan Bill of Rights, for example, exists to preserve the dignity of individuals and communities, promote social justice and realise the potential of all human beings. The supremacy of the constitution also guarantees that, even where safe harbour provisions exist in a country's laws, they will not shield platforms from responsibility if their business decisions ultimately fail to comply with human rights.
That the case concerning algorithmic amplification has progressed to the hearing stage in Kenya is testimony that human rights and constitutional law offer an avenue for those harmed by social media content to seek redress.
Until now, the notion that a social media platform could be held liable for content on its platform could be defeated by the blanket immunity granted under Section 230 of the Communications Decency Act in the United States and, to a lesser extent, by the principle of conditional non-liability in the European Union, with the relevant exceptions detailed in various laws.
For example, Section 230 was one of the grounds on which a district judge in California relied in dismissing a case brought by refugees from Myanmar, who made a similar claim that Meta had failed to curb the hate speech that fuelled the Rohingya genocide.
The ambition to hold platforms accountable was further diluted by the US Supreme Court's decision in Twitter v Taamneh, which ruled against plaintiffs who sought to establish that social media platforms bear responsibility for content published on them.
The platforms' immunity has come at a high cost, especially for victims of harm in places where the platforms have no physical offices.
This is why a decision like that of the Kenyan courts is a welcome development; it restores hope that victims of platform harm will have an alternative avenue of recourse, one that places human rights at the heart of the debate on platform accountability.
The justification for safe harbour provisions such as Section 230 was to protect "nascent" technologies from being stifled by a multiplicity of lawsuits. Today, however, the dominant social media platforms are no longer nascent and do not need such protection. They have both the capital and the technology to prioritise people over profits, but they choose not to.
As the Kenyan cases make their way through the judicial process, there is cautious optimism that constitutional and human rights law in African countries can provide a much-needed check on platform impunity.
Mercy Mutemi represents Fisseha Tekle in the case mentioned in this article.
The views expressed in this article are the author's own and do not necessarily reflect Al Jazeera's editorial stance.