Algorithms against linguistic biases: the Africa Stereotype Scanner
Overview
International media have largely portrayed the African continent through the lens of underdevelopment, tribalism, crisis, violence, poverty, and humanitarian tragedy. Research has shown that such reporting often fuels harmful stereotypes that dehumanize Africans and hinder the continent’s development. The Africa Stereotype Scanner (ASTRSC) is a tool that seeks to reduce misrepresentations of Africa in the media. This shareable, open-source tool allows authors to instantly identify and address clichés and linguistic biases in their writing. Designed initially for reporters and editors, it should also be useful to staff and volunteers across global civil society, both for professional training and as ongoing support in their work on Africa.
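To make the core idea concrete, here is a minimal sketch of how such an in-text flagging pass might work, assuming a hand-curated lexicon that maps problematic terms to suggested reframings. The terms, suggestions, and function names below are illustrative placeholders, not ASTRSC's actual curated list or implementation.

```python
import re

# Hypothetical, illustrative lexicon; ASTRSC's real term list is curated
# by the project's advisory board and is not reproduced here.
FLAGGED_TERMS = {
    "tribal": "Consider 'ethnic', or name the specific community.",
    "dark continent": "A colonial-era cliché; name the country or region.",
    "war-torn": "Often over-generalizes; describe the specific conflict.",
}

def scan(text: str):
    """Return (matched term, character offset, suggestion) for each hit."""
    findings = []
    for term, suggestion in FLAGGED_TERMS.items():
        # Whole-word, case-insensitive matching so 'Tribal' is also caught.
        for m in re.finditer(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            findings.append((m.group(0), m.start(), suggestion))
    return sorted(findings, key=lambda f: f[1])

if __name__ == "__main__":
    draft = "Reporting from the Dark Continent on a tribal dispute..."
    for term, pos, tip in scan(draft):
        print(f"{pos:>4}  {term!r}: {tip}")
```

Pairing each flagged term with a suggested alternative, rather than only highlighting it, is what lets authors "address" a cliché instantly instead of merely being told it exists.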
Approach
Key questions driving this work include:
- How can technology play a role in confronting social biases?
- What types of collaborations and expertise are needed to develop such a tool?
- How do we create meaningful engagement between humanists, social scientists, and engineers?
- What are the limits and pitfalls of designing such a tool?
Process
The project started in the summer of 2017 as a collaboration between Toussaint Nothias and Zineb Oulmakki at Stanford’s Center for African Studies. It sits at the intersection of scholarly research, technology, and journalism, drawing on decades of research in the humanities and social sciences on the representation and stereotyping of Africa in global media. The project uses natural language processing and computational linguistics, and continuously seeks collaborations with technologists. Lastly, and importantly, an advisory board of journalists, scholars, and bloggers with unique expertise on Africa’s media image contributes to the development and design of the scanner.
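As one hedged illustration of the NLP direction mentioned above, the sketch below uses spaCy's PhraseMatcher to flag multi-word clichés with tokenization-aware, case-insensitive matching. The phrase list and match label are hypothetical stand-ins; ASTRSC's actual pipeline is not documented here.

```python
import spacy
from spacy.matcher import PhraseMatcher

# A blank English pipeline suffices for surface-form phrase matching;
# a full model (e.g. en_core_web_sm) would add lemmas and entities.
nlp = spacy.blank("en")

# Hypothetical phrase list standing in for a curated lexicon.
CLICHES = ["dark continent", "tribal violence", "war-torn country"]

matcher = PhraseMatcher(nlp.vocab, attr="LOWER")  # case-insensitive
matcher.add("AFRICA_CLICHE", [nlp.make_doc(p) for p in CLICHES])

def flag_cliches(text: str):
    """Yield (phrase, start_char, end_char) for every flagged span."""
    doc = nlp(text)
    for _, start, end in matcher(doc):
        span = doc[start:end]
        yield span.text, span.start_char, span.end_char

for phrase, start, end in flag_cliches(
    "Dispatches from the Dark Continent often invoke tribal violence."
):
    print(f"[{start}:{end}] {phrase}")
```

Matching over tokens rather than raw strings is one reason an NLP toolkit helps here: it avoids false hits inside longer words and makes it straightforward to later add lemma- or pattern-based rules alongside fixed phrases.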