PACS news / July 31, 2024
DCSL Practitioner Fellows Project Showcase
Technology & Racial Equity Practitioner Fellows share findings from their work to inform and benefit global civil society.
Increasing the quality and equity of AI systems.
Developing a practice and community for data healing.
Strengthening the role of African civil society in interrogating the digitalization of food and agriculture.
Members of the 2022-23 and 2023-24 cohorts of Digital Civil Society Lab Technology & Racial Equity Practitioner Fellows shared findings on these and other research topics in June, when they gathered to showcase their fellowship projects, supported in partnership with the Center for Comparative Studies in Race and Ethnicity.
Marking the end of their fellowship terms, the six highly accomplished practitioners discussed the varied ways in which they are working to spotlight and investigate critical issues for digital civil society, from lifting up community voices amid the push for digitalization to ensuring representation and equity across digital platforms, and more.
Learn about the work of the Fellows and explore related resources below.
Kasia Chmielinski
Building “Dataset Nutrition Labels” for Common Race and Ethnicity Datasets to Mitigate Bias in Algorithmic Systems
Kasia Chmielinski is the co-founder of the Data Nutrition Project and a technologist focused on building responsible data systems across industry, academia, government, and non-profit domains. Previously, they held positions at the United Nations (OCHA), US Digital Service (EOP / OMB), MIT Media Lab, McKinsey & Company, and Google.
Kasia’s project set out with two goals: 1) to build Dataset Nutrition Labels for oft-cited race/ethnicity datasets, used for everything from policy recommendations to voter outreach; and 2) to leverage these labels as data literacy tools to drive conversations with data practitioners, policymakers, and community members about the lasting and real impact of problematic data.
In May 2023, Kasia refocused their project to specifically address the ever-changing landscape of Big Data and AI controlled by private actors, and the harms and lack of transparency in how datasets are obtained for large language models (LLMs). With this new scope, Kasia sought to create guidelines for policymakers on how to document data, models, and AI systems; develop approaches for practitioners to build responsible data and AI systems; increase public awareness and understanding of AI harms and highlight possible solutions; and adapt the Dataset Nutrition Label for large training datasets.
Kasia’s extensive project outcomes have since included the release of The CLeAR Documentation Framework for AI Transparency: Recommendations for Practitioners & Context for Policymakers (co-authored with Sarah Newman, et al.); international AI and dataset discussions with AI Governance Global 2024, the CVPR 2024 Workshop on Responsible Data, and Mozilla and EleutherAI; collaboration with the United Nations to create a Briefing Note on Artificial Intelligence and the Humanitarian Sector; a TED Talk on their work with the Data Nutrition Project and how “nutrition labels” can help bring greater transparency and fairness to AI systems; and the continued evolution of the Dataset Nutrition Label to make it applicable to large language datasets.
Learn more about Kasia’s work via their “Fellow Fellowup” interview and The Data Nutrition Project.
Neema Githere
Data Healing: A Call for Repair
Neema Githere describes themselves as “an artist and guerrilla theorist whose work explores love and indigeneity in a time of algorithmic debris.” Neema’s work directly challenges harms caused by communication technologies, and their method for repair is grounded in love, or what Neema describes as a “practical solution-oriented application of care.”
Using near-future speculative fiction and Guerrilla Theory as a design framework, Neema’s project prototypes what a therapeutic center—funded through reparations from Meta Inc. and geared towards post-social media psychosocial repair—could look like. This fictional center, the Data Healing Recovery Clinic, evolved into the creation and online publication of The Data Healing Workbook, through which individuals can critically and nonjudgmentally process, interrogate, and rewire their relationship to social media. The Workbook takes a holistic approach to digital wellness that centers clients as collaborators and active participants, and includes a series of questionnaires, exercises, daily reflections, digital nomenclature, harm reduction strategies, and more. To stay up-to-date with upcoming editions of the Workbook, follow Neema on Substack.
Learn more about Neema’s work via their “Fellow Fellowup” interview, selected artist interview with WePresent, and website.
Yolanda Jinxin Ma
Women of Color Working in Responsible Technology: Pathways Towards a More Inclusive Community
Yolanda Jinxin Ma is an award-winning cross-disciplinary professional committed to exploring how data and technology can make the most impact in an ethical and sustainable manner. She is currently a lecturer at the Journalism and Media Studies Centre at the University of Hong Kong, where she teaches courses on Media, Technology, and Society, and Storytelling for Social Impact. Previously, Yolanda was Head of Digital Policy and Global Partnerships at the United Nations Development Programme, where she focused on transforming the development space with digital technologies.
Setting out to explore the “different faces of digital rights” as part of the 2022-23 cohort of Technology & Racial Equity Practitioner Fellows, Yolanda refocused her fellowship project to examine the experiences of women of color who work in responsible technology. Observing that current responsible technology (RT) spaces were overwhelmingly U.S.-led and lacking global representation of minorities, especially women of color from the Global South, Yolanda sought to map the landscape of existing RT networks and communities and understand how they reach and support women of color globally; better understand the needs of women of color in the space and the gaps between existing efforts and those needs; and develop pathways that would fill those gaps and make the community more inclusive.
Yolanda’s project methodology included an initial landscape analysis of more than 20 institutions that work within RT and address Diversity & Inclusion (D&I) issues. She then conducted in-depth interviews with four such institutions: Pollicy, All Tech Is Human, A+ Alliance, and Women in AI Ethics. Yolanda additionally created an online survey targeting women of color in the RT space, drawing respondents from 12 different countries, with more than 80 percent self-reporting as either Asian or Black.
Key findings from Yolanda’s research include:
- a widely shared frustration among women of color practitioners in the RT space, who generally report low satisfaction and low inclusivity ratings;
- fragmented efforts towards building a more inclusive RT community, whose effectiveness is unclear due to a lack of proper metrics and tracking;
- continued funding challenges for institutions, with limited donor interest in D&I issues; and
- a stark gap between what women of color need and what existing programs offer.
Following data collection, Yolanda is working to translate her preliminary findings and recommendations into various pathways for RT institutions and individual practitioners to utilize moving forward.
Learn more about Yolanda’s work via LinkedIn, the Journalism and Media Studies Centre at the University of Hong Kong, and a recent interview with Splice.
Daniel Maingi
Strengthening Agroecological Voices in the Digitalization of Kenya’s Agriculture and Food Systems
Daniel Maingi is a science and development practitioner in Kenya with a 15-year career helping to advance learning on appropriate and sustainable technologies for civil society organizations (CSOs) in Eastern Africa.
Daniel’s project sought to create a multi-stakeholder platform (MSP) of CSOs and government to track, study, and respond to the challenges and infractions presented by the corporate digitalization of agriculture, while also guiding agroecology towards beneficial digital tools and serving as a watchdog for the governance of digital data use in Kenya.
Daniel’s work centered on bringing data sovereignty concerns to the forefront of national discussion among citizens and agroecology practitioners who support the social and movement-organizing pillars of sustainable agriculture. He worked with approximately 300 small-holder farmers and CSOs to raise awareness of and educate communities about their digital rights, empower community voices, and spur the development of agroecology-friendly tools.
Learn more about Daniel’s work via his “Fellow Fellowup” interview.
Barbara Ntambirweki
Strengthening the Role of African Civil Society in Interrogating Digitalization of Food and Agriculture
Barbara Ntambirweki is a Ugandan lawyer and researcher working with ETC Group under the African Technology Assessment Platform. She is passionate about promoting technology justice within food systems in Africa, particularly with regard to the emerging developments in modern biotechnology and the digitalization of food and agriculture.
Barbara’s project explores the need to interrogate the digitalization of agriculture in Africa and identify principles for safe and effective governance to ensure food sovereignty. Through research and landscape analysis, identification of demographic working groups, case studies, and discussions with researchers, activists, and social movement leaders, Barbara evaluated the impacts of digital technologies on agriculture and food systems, farmers’ rights, regulatory agendas, and governance to deepen understanding and promote food sovereignty on the continent.
Her research finds that digitalization across African farming systems has resulted in a new form of inequality across agricultural food systems; the promotion of industrial agriculture across the continent; the elimination and erosion of traditional indigenous knowledge; and the dispossession of small-scale food producers. Barbara has released various publications and articles, created educational materials, and held focused discussions and events to spotlight the harms of food and agriculture digitalization and the alternatives to it.
Project outputs have included the release of a “Digitalization Corner” to help educate farmers about their rights in the framework of digital technologies; recent publication of “Datafying African Agriculture: From Data Governance to Farmers’ Rights,” which explores the impact of digital agricultural technologies on African agricultural systems and advocates for a paradigm shift towards a framework that prioritizes farmers’ rights and fosters participatory data governance; and frequent online webinars and speaking engagements.
Learn more about Barbara’s work via her “Fellow Fellowup” interview.
Jasmine Walker
#BlackModsMatter: Exploring the State of Reddit’s Black Protectors
With over a decade of experience as a volunteer moderator on Reddit, Jasmine Walker has been a steadfast advocate for minority voices, collaborating with Reddit’s administration to address various demographic issues on the platform.
Sparked by an interest in how her fellow Black Reddit volunteer moderators (MODs) were approaching their work in spite of Reddit’s history of racism and hate speech, Jasmine sought to gather concrete data on current, former, prospective, and refusing Black MODs, to provide recommendations on how to better recruit and retain Black MODs on the platform.
Through 158 MOD surveys and 8 follow-up interviews, Jasmine’s project has generated new understanding and awareness of Black MOD sentiment surrounding Reddit content moderation and environment safety, including but not limited to: 1) moderators’ desire to be heard, respected, and valued over monetary compensation; 2) an overall lack of trust in Reddit’s focus and capabilities; and 3) general demotivation stemming from Reddit’s inaction on content policy violations, related to inaccurate AI-generated citation reviews.
Jasmine hopes to continue her work on other social media platforms, such as Twitter/X, Facebook, and Nextdoor, to compare MOD experiences and gather further data.
Learn more about Jasmine’s work via her “Fellow Fellowup” interview.