The Digital Civil Society Lab (DCSL) and the Center for Comparative Studies in Race and Ethnicity (CCSRE) are pleased to announce a new joint cohort of Non-Resident Fellows.
The cohort includes, for CCSRE: Elizabeth Adams, Renata Avila, Samir Doshi, and Hong Qu. For DCSL: Beatrice Martini, Mutale Nkonde, Julie Owono, Tawana Petty, Zara Rahman, and ‘Gbenga Sesan. The selected projects address disinformation, surveillance, fair lending, artificial intelligence, transparent technology design, algorithmic bias, public interest internet infrastructure, and more. The 2020 fellows were selected from a competitive pool of over 150 applications.
The Non-Resident Fellowship supports social sector leaders in dedicating time to ideas that apply to broad swaths of civil society but may not quite fit into their “day job.” The fellowship provides time, space, expertise, and financial support to help turn ideas into prototypes or action, and builds a cohort of fellows to support ongoing learning and community.
The 2020 fellows hail from three continents and represent a broad range of expertise within civil society. While all the fellows are developing ideas at the intersection of civil society and digital issues, the four fellows hosted by CCSRE are working on challenges specifically related to racial equity and technology. All fellows will be in residence on the Stanford campus for an intensive week in late January to kickstart their projects and will work closely together as part of the same cohort throughout the yearlong fellowship term.
The fellowship offers a number of opportunities to engage with scholarship and with the Stanford community. As appropriate to each project, fellows will have the option to work with undergraduate students through the Comm230ABC seminar series, participate in on-campus events and conferences, receive research assistant support, and assist in designing a Race and Technology Action Summit. All the fellowship activities are designed with the goal of bridging academic scholarship and new developments within the social sector.
Previous DCSL fellows have built online tools for understanding privacy regulations, drafted new data governance mechanisms, and incubated a digital security exchange. Find more information on the 2019 and 2017-2018 cohorts of Non-Resident Fellows here.
Supported by the Public Interest Technology University Network and the Stanford Institute for Human-Centered Artificial Intelligence
Civic Tech: Racial Equity, Technology & the City of Minneapolis
This project seeks to build upon the City of Minneapolis’ existing Strategic and Racial Equity Action Plan by incorporating technology considerations such as transparent tech design in predictive analytics used in bail algorithms, data collected from body cameras, facial recognition, and video surveillance.
A+ Alliance: Affirmative Action Algorithms to Correct Gender and Race Bias in Algorithms
A global alliance to create, pilot, and apply Affirmative Action Algorithms that upturn the current path of automated decision-making (ADM) at a critical point in history, fostering social justice through gender and race equality.
The Role of AgTech in Farmworker Communities
Digital technologies used for agricultural production, often referred to as AgTech, are a burgeoning industry currently valued at $17 billion and expected to grow exponentially. Amid all this excitement around technological growth, there has been virtually no exploration of how AgTech will impact the millions of farmworkers in the US, the majority of whom are immigrants. This project will explore the impact of the emerging AgTech industry on farm labor, and how the industry can be held accountable to support farmworker communities rather than harm them.
Impact of AI on Fair Lending Laws and Practices
The emergence of AI in the consumer credit market has brought new types of opportunities and risks for historically disadvantaged populations seeking loans. This project’s contribution will be to develop a mechanism for monitoring disparate impact by testing for and reporting on lending practices that are unfair, deceptive or abusive to vulnerable groups and communities.
Supported by the Gordon and Betty Moore Foundation
Civil Society Advocacy for a Public Interest Internet Infrastructure
This project seeks to identify pathways for civil society to advocate for a public interest Internet infrastructure. It will do so by identifying research and advocacy opportunities developed through a growing network of civil society actors, scholars, and technologists who cooperate to explore and prototype practices that strengthen civil society’s impact.
Black Disinformation Resisters
The Black disinformation resisters project uses hip-hop culture to educate Black audiences about the way AI systems mediate their access to political information.
Detecting Disinformation and Hate Speech in Sub-Saharan Africa
This project aims to strengthen the capacity of civil society organizations in Africa to detect disinformation and hate speech, and to pair that capacity with their acute knowledge of the local political and social context. This will allow the creation of a reliable system that can spot such content preemptively, before it goes viral on social media platforms. The system will entail multi-stakeholder collaboration between civil society organizations and content platforms.
Legitimizing True Safety
A centuries-long conflation of safety with security has propelled society down a trajectory that forecloses numerous opportunities for visionary resistance to societal ills. This project seeks to create tools and initiatives for systematizing true safety by disentangling safety from security, countering a public safety narrative that has become synonymous with surveillance, and activating opportunities for visionary resistance.
Exploring the social and political consequences of the spread of DNA testing
The use of DNA testing is spreading around the world, but these technologies carry many assumptions and biases, and many uses rest on science with deep xenophobic assumptions at its core. This project will document how these technologies have been developed and their histories, building an evidence base to support civil society in advocating on these issues, and will identify advocacy strategies that can be, and have been, used to mitigate negative impacts.
Ayeta: Proactive Toolkit for African Digital Rights Actors
Like ayeta, the protective gear that ancient Yoruba warriors wore against gunshots during warfare, this toolkit will arm civil society actors — who protect the rights of citizens in their countries — with the information, resources, and community they need should their work put them in harm’s way. The toolkit will also build on existing resources so that Africa’s digital rights advocates and organizations have access to information that can help them run efficient campaigns.