Lawsuit accuses Facebook of instigating violence that led to an Ethiopian professor's death
The son of an Ethiopian chemistry professor killed last year during unrest in the country has filed a lawsuit against Facebook's parent company, Meta, claiming the social media platform allows hatred and incitement to violence to go viral, harming people across eastern and southern Africa.
Abraham Meareg Amare claims in the lawsuit that his father, 60-year-old Tigrayan academic Meareg Amare, was shot dead outside his home in Bahir Dar, the capital of Ethiopia's Amhara region, in November 2021, after hateful posts on Facebook defamed and belittled the professor and called for his murder.
The case is a constitutional petition filed in the High Court of Kenya, which has jurisdiction over the matter because Facebook's content policing centre for eastern and southern Africa is based in Nairobi.
It accuses Facebook's algorithm of prioritising dangerous, hateful and inflammatory content to drive engagement and advertising revenue. The petition claims that Facebook has not invested enough in moderating content for African, Latin American and Middle Eastern countries, including at its Nairobi hub.
Meareg said his father was followed home from Bahir Dar University, where he spent four years managing one of the country's largest laboratories, and was shot twice at close range by a group of men.
He said the men shouted "junta", echoing a false allegation circulating on Facebook that his father was a member of the Tigray People's Liberation Front (TPLF), which has been locked in a two-year war with Ethiopia's federal government.
Meareg said he tried desperately to get Facebook to remove some of the posts, which included a picture of his father and his home address, but received no response until after his father was killed.
An investigation into the killing by the Ethiopian Human Rights Commission, included in the petition and seen by CNN, confirmed that Meareg Amare was killed by armed attackers at his residence, but their identities remain unknown.
The plaintiffs are asking the court to order Meta to demote violent content, increase its content-moderation staff in Nairobi and establish a $1.6 billion compensation fund for victims of hate and violence fuelled by Facebook.
Ethiopia is an ethnically and religiously diverse nation of about 110 million people who speak several languages. Its two largest ethnic groups, the Oromo and Amhara, make up more than 60 percent of the population.
Meta said the company's policy and security work in Ethiopia is guided by feedback from local NGOs and international institutions.
According to Meareg's statement, Meta has only 25 employees moderating content in Ethiopia's main languages. CNN could not independently confirm that number, and Facebook does not disclose exactly how many speakers of local Ethiopian languages review content flagged as potentially violating its standards.
The lawsuit comes after two years of conflict in Ethiopia that have killed thousands, displaced more than 2 million people and unleashed a wave of atrocities including massacres, sexual violence and the use of starvation as a weapon.
This is not the first time Meta has come under scrutiny for user safety issues on its platforms, especially in countries where online hate speech is likely to spread offline and cause harm.
Internal documents, disclosed to Congress by lawyers for whistleblower Frances Haugen and seen by CNN, showed that Facebook employees repeatedly expressed concern about the company's failure to curb the spread of messages inciting violence in "at-risk countries" such as Ethiopia. The documents also showed that the company's modest efforts did not match the volume of inflammatory content on its platform, and that in many cases it failed to add enough staff or local-language resources to protect people in those areas.
Meta's independent supervisory board last year recommended that the company commission a human rights review into how Facebook and Instagram were used to spread hate speech and misinformation that increased the risk of violence in Ethiopia.
The social media company admitted it had not done enough to prevent bloodshed on its platform, with CEO Mark Zuckerberg writing a public letter apologising to activists and vowing to increase its moderation efforts.