PACS news/January 28, 2022

Digital Civil Society Lab Announces 2022-23 Cohort of Practitioner Fellows

Eight social sector leaders to develop research and tools to benefit global civil society

The Digital Civil Society Lab (DCSL) and the Center for Comparative Studies in Race and Ethnicity (CCSRE) are pleased to announce our new joint cohort of Practitioner Fellows for 2022-23. This highly accomplished cohort includes eight social sector leaders whose critical work ranges from digital rights, privacy, policy, and governance to intersections among race, media, and digital systems; explorations of artificial intelligence (AI) in Africa; and healing justice in evolving democracies around the world.

Bringing a wealth of knowledge and expertise to their fellowship projects, the 2022-23 cohort will develop tools and understanding to inform and benefit civil society at large.


Nila Bala is a Senior Staff Attorney with the Policing Project, where she works on legislation to reform policing and on the intersection of technology and policing. Prior to this role, Nila was the Associate Director of Criminal Justice & Civil Liberties and a Senior Fellow at the R Street Institute, where she developed policies to advance reforms in juvenile justice and reentry. Read full bio.

Nila’s Project: A Comprehensive Framework for Children’s Biometric Privacy 

Parents are increasingly turning to direct-to-consumer (DTC) companies such as 23andMe to test their children’s DNA. Parents then upload children’s DNA data to databases, sharing this biometric information in perpetuity. No applicable legal framework yet exists to respond to this issue. The few American privacy laws that do exist focus on traditional data protection standards (e.g., notice and consent) and don’t treat children’s and parents’ interests as separate. I hope to truly explore what it means to protect children’s rights over their own data.

I hope to draft a regulatory framework that considers all the players: children, parents, private companies, government, schools, and even physicians. I also wish to research and craft policy/soft law solutions responding to the various facets of this issue: a parent-focused education campaign on genetic testing that uses a public health approach, a child-centered campaign that educates children about their own biometric data through the school system, and recommendations to DTC companies, nonprofits, and open-source databases.


Laura Bingham is the inaugural Executive Director of Temple University’s Institute for Law, Innovation & Technology. Laura is a globally recognized expert on nationality and migration law and human rights, with extensive experience in international human rights advocacy. She has led complex investigations and transnational human rights litigation in every major regional system as well as many national courts. Read full bio.

Elizabeth Eagen is Deputy Director of the Citizens and Technology Lab at Cornell University, which works with communities to study the effects of technology on society and test ideas for changing digital spaces to better serve the public interest. Previously, Elizabeth was a Senior Program Officer at the Open Society Foundations’ Information Program, where she ran the Emerging Technology for Evidence and Advocacy portfolio. Read full bio.

Laura and Elizabeth’s Project: Profiling the Digital State

Our project documents incentives and tradeoffs between technical interventions and the rule of law. Our work examines how current and proposed accountability measures fail to meet the challenge of pervasive data systems. 

We track two trends: first, revisions to the delivery of public services are premised on narratives of problem-solving and progress. Experimental, inscrutable technology projects are piloted across critical state functions including education, health, welfare, law enforcement, and borders. Aggressive data-harvesting binds government and corporate interests in identification and electoral systems. 

Second, the onus is on individuals to protect information from massive, multi-country forces. Individuals are pressured to trade asymmetrically: food for data; assistance for surveillance; access to labor markets as test subjects of AI systems. Current and emerging compliance-based approaches to technical overreach, whether regulatory, self-imposed, or via market advantage, invoke public interest objectives but often fail to challenge commodification of data. 

Our project questions whether governance by cycles of harm and redress is an inescapable part of an inevitable transition to a more datafied future, and if that future is the only pathway to public infrastructure and service delivery. We’ll propose diagnostics of compliance, accountability, and remedial approaches, and examine alternative forms of organizing capital in the public interest.


Panthea Lee is a strategist, curator, facilitator, and mediator working for structural justice and collective liberation. She builds and supports coalitions of community leaders, artists, healers, activists, and institutions to win dignity and joy for all. Panthea currently serves as the Executive Director of Reboot, and as a fellow at Arizona State University’s Center for Science and Imagination. She is a pioneer in guiding diverse coalitions to tackle complex social challenges, with experience doing so in over 30 countries with partners including UNDP, CIVICUS, Wikimedia, MacArthur Foundation, the City of New York, as well as civil society groups and governments from the local to federal level. Read full bio.

Panthea’s Project: The Healing Justice Collaborative 

The Healing Justice Collaborative seeks to support communities in collectively healing from systems of oppression, and in shaping the world we deserve. Through research, programming, and advocacy, we seek to bring together communities, artists, healers, mediators, and policymakers in exercises of radical co-creation to imagine bold alternative futures, design viable paths forward, and enact courageous proposals for change. 

Our foundational research seeks to explore the role of healing justice in evolving democracies and in building just systems of global governance. Key questions we will explore include: What are the impacts of unaddressed trauma on our democracies? How have restorative and transformative justice practices been applied at a societal scale? What are concrete ways to bring methods from art, healing, and conflict transformation into the realms of public policy, international development, and global governance?

We will explore these questions through research and writing, and through new programs called People’s Commissions for Justice. These Commissions empower historically marginalized communities to shape a society that honors our dignity and joy. Drawing from healing justice, cultural organizing, and deliberative democracy practices, they’re a model for social transformation that grapples with historical harms, imagines nourishing futures, and co-creates ways to realize these visions.


Yolanda Jinxin Ma is an award-winning cross-disciplinary professional who always finds the space for data and technology to make the most impact in an ethical and sustainable manner. She is currently Head of Digital Policy and Global Partnerships at the United Nations Development Programme, based in New York, where she focuses on transforming the development space with digital technologies. Before this post, Yolanda was with UNDP’s Asia Pacific regional office in Bangkok for five years, focusing on social innovation and impact investment. Read full bio.

Yolanda’s Project: 31 Voices—Different Faces of Digital Rights

The Universal Declaration of Human Rights has 30 articles. A crucial one is missing: the right to access, use, and benefit from the internet. The bigger challenge is the lack of a shared understanding of key concepts that are critical for digital rights, and the lack of global representation. By collecting 31 stories from around the world and giving a channel to 31 voices from different communities and cultures, this project aims to fill that gap in a creative way, so that the needs, opinions, and realities of developing countries are represented, and in a collective way, so that anyone can participate. The result will be a toolkit that not only explains the key concepts of digital rights and responsible technology, but also paints different pictures of how people globally could use and/or build technology in a responsible and inclusive manner.


Evelyn (Wenjie) Mei is a community leader and technologist focusing on ethical AI for racial unity. She passionately engages with and advocates for the Asian American and Pacific Islander communities through events, policy, and civic education as an Emerging Civic Leader at the Asian Pacific American Leadership Institute and a Board Member of the Stanford Asian Pacific American Alumni Club. She is a Professional Mentor at TechWomen, a US Department of State initiative, and a frequent speaker and mentor at AI4ALL and hackathons. Read full bio.

Evelyn’s Project: Assessing AI-induced Interminority Racism in Press and Social Media 

This project analyzes interminority racism and how it is shaped by media and AI, in the context of anti-Blackness and “Asian hate.” We will collect a comprehensive set of news articles and social media posts covering violent incidents related to racial unrest, analyze the role of the media and AI recommendation systems in exacerbating racial animosity and mistrust, and execute an outreach program to educate the broader community.


A. Collier Navaroli’s expertise lies at the intersection of race, media, technology, law, and policy. Her current work within the technology industry examines the traditional balance between free expression and safety by developing and enforcing power-conscious global platform policy. A. Collier’s previous work has included fighting on the front lines for systemic change in big data and internet freedom work at Color of Change, driving Data & Society Research Institute’s inquiries examining race, civil rights, and fairness within emerging technologies, and teaching the principles of law and constitutional freedoms to high school students in Harlem. Read full bio.

A. Collier’s Project: The Impacts of Moderating Speech

My project seeks to begin studying the intersections of race and content moderation that have yet to be investigated. While the literature on the field has shown that exposure to hateful content is among the most harrowing aspects of the work for commercial content moderators, the specific harms to those most impacted have yet to be interrogated. Throughout this fellowship, I hope to better understand the distinct impacts of regulating speech on the humans who are tasked with content moderation and platform policy enforcement.


Berhan Taye is an independent researcher, analyst, and facilitator who investigates the relationship between technology, society, and social justice. She is currently a Senior Advisor at Internews’ Global Internet Freedom Project and co-hosts the Terms and Conditions Podcast. Before joining Internews, Berhan was the Africa Policy Manager at Access Now and led the #KeepItOn campaign, a global campaign fighting internet shutdowns with a coalition of more than 220 member organizations worldwide. Read full bio.

Berhan’s project: AI at the Margins

The AI at the Margins project will create a publicly accessible database of government and private sector use of AI tools and systems in African countries and research the impacts of selected cases.