This project is supported by the Stanford One Hundred Year Study on Artificial Intelligence and the Stanford Presence Center.
Smart home devices are diagnosing Alzheimer’s disease and supervising children, pet avatars are watching over dementia patients, and chatbots are helping to treat veterans living with PTSD. Going forward, such “Intimate AI” will continue to supplement and replace human care on the promise that serious social problems can be solved by developing technologies of care that are cheap and accessible.
But Intimate AI affects core human values: it challenges how we think of privacy, compassion, trust, and the very concept of care itself. This workshop addresses conceptual, ethical, and political issues of coding caring before Intimate AI becomes widely implemented. The workshop aims to advance our understanding of the interactions between human values and these powerful emerging technologies, and to inform the debates taking place in policy, industry, the academy, and the public sphere about what role AI can and should play in caregiving.
Key driving questions of this work include:
- What understanding of care, well-being, and flourishing is (implicitly or explicitly) encoded within existing technologies of Intimate AI?
- How, if at all, can technology be equipped with capabilities of care, such as attentiveness, responsiveness, compassion, benevolence, trustworthiness, and emotional support?
- What key values should define an ethics of care for new and emerging forms of Intimate AI, and how can these values be operationalized in practice?
- What forms of care is Intimate AI better positioned to provide than human caregivers?
The workshop will also keep in view a socioeconomic perspective to examine structural challenges arising from the automation of care work, such as:
- New labor relationships that span the Global North and Global South and create intimacy from a distance.
- The perpetuation of racial and gender stereotypes and cultural conceptions of care.
- The continual data collection required for intimate, attentive care.
- The superficiality of treating societal symptoms such as loneliness, trauma, and stress instead of their underlying systemic causes.
To address these issues, the workshop will bring together participants from multidisciplinary backgrounds – practitioners and researchers alike – and diverse analytical frameworks, such as feminist ethics of care, bioethics, political theory, postcolonial theory, and labor theory. Rather than focusing on specific domains of care work, participants will first theorize and articulate the relational, physical, and psychological dimensions of care in order to propose when care can and should be automated. Participants will then articulate these values as design recommendations that amplify the benefits of AI within care work, while reducing the potential for harm. The workshop will prompt answers to these questions through group discussions and a design challenge.
The workshop aims to produce deliverables of three forms:
- Executive Report: A report that details the deep challenges for considering the broader implementation of Intimate AI, in response to the four guiding questions outlined above. The aim is to summarize and sharpen our understanding of core ethical and normative concerns and to illustrate how key values of an ethics of care can be operationalized, drawing on the designs developed during the workshop. The report intends to benefit the 2020 AI100 study panel, as well as being of use to industry, journalists, and the general public as a guide to understanding the development of new and emerging forms of Intimate AI.
- Dissemination Articles: The organizers and selected participants will write at least two articles on specific topics and issues that emerge from the workshop. These will be published in outlets such as The Conversation, The Atlantic, or Aeon, and will vividly describe the core challenges outlined in the report for the general public.
- Academic Publications: The organizers will approach a leading interdisciplinary journal (e.g. Philosophy & Technology) to create a special issue around Coding Caring, inviting participants to contribute.
The workshop organizers are:
- Morgan Currie, Lecturer in Data and Society in Science, Technology and Information Studies at the University of Edinburgh, whose research investigates government- and civic-produced data and democratic participation.
- Jessica Feldman, Assistant Professor of Global Communications, American University of Paris, who researches values-in-design with regard to emotions in AI, as well as the relationship between democratic practice and localized network technologies.
- Johannes Himmelreich, Interdisciplinary Ethics Fellow at the McCoy Family Center for Ethics in Society, Stanford University, and at Apple University, whose work investigates agency and responsibility in contexts of collective collaboration and technological augmentation.
- Fay Niker, Postdoctoral Fellow at the McCoy Family Center for Ethics in Society, Stanford University, whose research in applied political theory investigates the moral bases for using and regulating emerging behavior-modifying technologies.