PACS news/July 26, 2023

2022-23 Practitioner Fellows Final Showcase    

The Digital Civil Society Lab (DCSL) and the Center for Comparative Studies in Race and Ethnicity’s (CCSRE) 2022-23 Practitioner Fellows met on Friday, June 23 to present their current fellowship projects exploring digital rights, privacy, policy, and governance; intersections among race, media, and digital systems; the use and impact of artificial intelligence (AI) in Africa; and the practice of healing justice in evolving democracies around the world.

It’s been an incredible year for our 2022-23 fellow cohort, and the DCSL and CCSRE look forward to following these fellows’ next steps and future endeavors.




Berhan Taye: The AI Harm Pipeline

Berhan Taye is an independent researcher, analyst, and facilitator who investigates the relationship between technology, society, and social justice. She is currently a Research Manager with One Project and co-hosts the Terms and Conditions Podcast. Previously, Berhan was the Africa Policy Manager at Access Now, where she led the #KeepItOn campaign, a global campaign fighting internet shutdowns with a coalition of more than 220 member organizations worldwide.

Berhan’s project, The AI Harm Pipeline, seeks to document AI harms reported throughout Africa through detailed interviews and case studies of those inputting data, those managed by automated systems, and those harmed by the effects of content moderation. The AI Harm Pipeline will create a publicly accessible database of government and private sector use of AI tools and systems, and will research the impacts of selected tools on various stakeholders, yielding feasible recommendations for current and future automated systems. The project will also include a series of exhibitions displaying the stories of those affected by automated systems.

Berhan was inspired to pursue this topic by her own experiences with the health system in Kenya, where risk assessment tools used to detect serious illnesses relied on data and assessment frameworks that were not applicable to the population they served, creating disparities and obscuring the origins and applications of the underlying data.

“What has been fascinating and an important learning experience here is that you might blame technology, but it’s always the humans behind the scenes that are making the decisions. They need to be accountable and there are so many layers to it in the sense that it always preys on the most vulnerable. It’s always the people that don’t have an alternative. I am hoping that we’re able to support this process and figure out a way for those impacted by AI harms to tell their stories, and decide how to engage with the public.”

Evelyn Mei: Anti-Asian News Awareness and Accountability

Evelyn (Wenjie) Mei is a community leader and technologist focusing on ethical AI for racial unity. She passionately engages and advocates for the Asian American and Pacific Islander communities through events, policy, and civic education as an Emerging Civic Leader at the Asian Pacific American Leadership Institute, and a Board Member of the Stanford Asian Pacific American Alumni Club.

In partnership with the Virulent Hate Project, Evelyn’s project analyzed the influence of news media on anti-Asian sentiment and racism during 2020. Collecting over 4,500 pieces of news coverage relating to racial unrest and violence, Evelyn and her team documented key findings about which types of coverage received the most attention and how the media portrayed these events, and offered recommendations for journalists and news media to portray the Asian-American experience accurately and empathetically.

“We found that the news media paid substantial attention to what people said about an issue rather than those who experienced it—giving politicians more of a platform to speak than community-led efforts and creating the narrative of Asian-Americans as victims. This victim narrative received two times more media attention than stories of Asian-Americans trying to change the status quo and build coalitions, with male victims additionally receiving more attention than female victims.”

Panthea Lee: The Healing Justice Collaborative 

Panthea Lee is a strategist, curator, facilitator, and mediator working for structural justice and collective liberation. She builds and supports coalitions of community leaders, artists, healers, activists, and institutions to win dignity and joy for all. Panthea currently serves as the Executive Director of Reboot, and as a fellow at the Stanford Digital Civil Society Lab and at Arizona State University’s Center for Science and Imagination.

Panthea’s project, The Healing Justice Collaborative, explores the impacts of unaddressed trauma on our societies, pathways to collective healing that center structural equity and justice, and how the realms of public policy, international development, and global governance can learn from practices in the arts, healing justice, and conflict transformation. Through research, programming, and advocacy, The Healing Justice Collaborative seeks to bring together communities, artists, healers, mediators, and policymakers in exercises of radical co-creation to imagine bold alternative futures, design viable paths forward, and enact courageous proposals for change.

“I’ve been examining the role of art and culture as spaces to create these communal healing experiences. Art, at its best, can create spaces where we temporarily suspend reality, subvert existing orders, and practice new ways of being.”

Nila Bala: Who Owns Children’s DNA?

Nila Bala is a Senior Staff Attorney with the Policing Project, where she works on legislation to reform policing, as well as the intersection between technology and policing. Prior to this role, she was the Associate Director of Criminal Justice & Civil Liberties and a Senior Fellow at the R Street Institute, where she specifically developed policies to advance reforms in juvenile justice and reentry.

Partly motivated by her experience as a juvenile public defender, Nila’s project challenges the notion that parents should be the absolute owners and controllers of their children’s DNA. Citing various instances where parents—through direct and indirect means—facilitated turning over DNA to law enforcement without adequate safeguards, Nila maps the harms of collecting children’s DNA and potential regulatory frameworks that consider all stakeholders: children, parents, private companies, government, schools, and even physicians, to protect children’s rights over their own data.

“The current safeguard being proposed in many states—and in fact, California just passed legislation last year—is to get parents involved. So California and other states are saying the thing we should do to protect children is to have parental consent, and I’m saying that’s just not enough.”

Laura Bingham and Elizabeth Eagen: Evidence for Flourishing Digital Life

Laura Bingham is the inaugural Executive Director of Temple University’s Institute for Law, Innovation & Technology. Laura is a globally recognized expert on nationality and migration law and human rights, with extensive experience in international human rights advocacy. She has led complex investigations and transnational human rights litigation in every major regional system as well as many national courts.

Elizabeth Eagen is Deputy Director of the Citizens and Technology Lab at Cornell University, which works with communities to study the effects of technology on society and test ideas for changing digital spaces to better serve the public interest. CAT Lab envisions a world where digital power is guided by evidence and accountable to the public.

Laura and Elizabeth’s project tracks public sector narratives of problem-solving and progress, documenting incentives and tradeoffs between technical interventions and the rule of law within digital state technologies. Their work examines how current and proposed accountability measures fail to meet the challenge of pervasive data systems, questioning whether governance by cycles of harm and redress is an inescapable part of an inevitable transition to a more datafied future, and whether that future is the only pathway to public infrastructure and service delivery.

Through case studies and field research, Laura and Elizabeth explored examples of digital state technologies and governance, mapping actors and understanding the objectives behind the creation of these kinds of tools and where the demand for them stems from. With the establishment of the Initiative for Digital Public Interest Fund, Laura and Elizabeth hope to continue fostering and supporting coalition building and to overcome misunderstandings between the different actors in digital ID, collectively asking the question, “What should digital ID look like?”

“What we were looking at here is how do people talk about problems? How do they create the notion that something is a problem that needs to be solved by technology or that needs to be addressed or carved out for exploration where innovation or experimentation might be an appropriate choice for a government to make.”

Anika Collier Navaroli: Black in Moderation

Anika Collier Navaroli’s expertise lies at the intersection of race, media, technology, law, and policy. Her current work within the technology industry examines the traditional balance between free expression and safety by developing and enforcing power-conscious global platform policy. 

Stemming from her own experience as a content moderator at Twitter, Anika’s project seeks to understand the distinct impacts of regulating hate speech on the Black people tasked with content moderation and platform policy enforcement. While the literature has shown exposure to hateful content to be among the most harrowing aspects of commercial content moderation, the specific harms to those most impacted have yet to be interrogated. Based on interviews with six individuals who identify as Black Americans and work in Trust & Safety departments at various tech companies, Anika documented anecdotal evidence of the harms thrust upon Black bodies, sharing the realities and stories of what it is like to work in these spaces and, in turn, how institutions can learn and grow.

“We have this entire world of the internet and social media that’s accessible and we have folks that sacrifice their brains and their bodies and their well being in order to maintain some sort of semblance of safety on these platforms for folks that look like them, love like them, and yet no one has ever asked them their stories. So I think it’s really important that we continue to ask folks to share their stories and that we chronicle them because we wouldn’t be able to experience the world as we do and I continue to hope that these lives, these experiences don’t go in vain and that we’re able to learn from them.”