A conflict kills thousands, forces millions of people from their homes and involves credible accusations of war crimes.

This is not Ukraine but Ethiopia, where a civil war broke out in November 2020, a conflict that whistleblower Frances Haugen has accused Facebook of exacerbating by ‘literally fanning ethnic violence’.

Our new investigation, conducted in partnership with the legal non-profit Foxglove and independent researcher Dagim Afework Mekonnen, exposes how poor Facebook is at detecting hate speech in the main language of Ethiopia. It follows our previous investigation, which showed the same in Myanmar.

Facebook says that Ethiopia is ‘one of our highest priorities for country-specific interventions to keep people safe’, and that for more than two years it has invested in safety and security measures ‘including building our capacity to catch hateful and inflammatory content in the languages that are spoken most widely in the country’. Specifically, it states that it has employed more staff who speak Amharic, and that it has technology to automatically identify hate speech in Amharic. Its efforts, it says, are ‘industry-leading’.

The adverts contained hate speech
The hate speech examples we used are highly offensive, and we are therefore deliberately not repeating all of the phrases here [iv]. The sentences included violent speech that directly called for people to be killed, starved or ‘cleansed’ from an area, and dehumanising speech that compared people to animals. Several of them amounted to calls for genocide. None of the sentences were dog-whistles or in any way difficult to interpret.

All of the ads fall within Facebook’s definition of hate speech in its community standards, and all would have breached the International Convention on the Elimination of All Forms of Racial Discrimination had they been published.