Care, Autonomy, and Technology Workshop, 25 September 2023

 


 


Institute for Ethics in AI – University of Oxford

 

Overview of schedule

13:50 – 14:00 Welcome (Ekaterina Hertog)

14:00 – 15:30 Panel 1
Care, autonomy, and negotiations of the use of monitoring technologies within families
Zuzana Burikova, Netta Weinstein, Giovanna Mascheroni

15:30 – 15:45 Break

15:45 – 16:45 Panel 2
Potentials and pitfalls of using technology in social care
Caroline Emmer De Albuquerque Green, Jun Zhao

16:45 – 17:00 Break

17:00 – 18:00 Panel 3
Technology and visions of care
James Wright, Hannah Zeavin

18:30 – Thank you (Vicki Nash)

 

 

Speakers


Dr Zuzana Burikova


Title: Between trust and control: digital technologies and care in intimate partner relationships

Abstract: Digital monitoring tools and apps are increasingly used in our family and care practices. In this presentation I will present the results of our team's research on how care, surveillance, and autonomy interact in the use of digital technologies in intimate partner relationships in Slovakia.

Our representative quantitative research focused on practices that are commonly used in intimate partner relationships as expressions of care and trust, yet can also be used to monitor, control, or even abuse a partner. The questionnaire covered both the practices and the opinions of people living in intimate partner relationships. In particular, we asked about shared access/passwords to computers, smartphones, applications, and internet banking; about the ability to track a partner's phone; about sharing erotic images with a partner; as well as about breaches of trust (e.g. looking into a partner's phone without their knowledge, or showing or sharing a partner's erotic image with a third party). The second part of the questionnaire mapped opinions on particular practices in relation to trust, control, care, and autonomy.

In the presentation I will show how these practices and opinions relate to socio-demographic characteristics and illustrate how the relatively high prevalence of sharing and trust contrasts with opinions on care, autonomy, and abuse. I will conclude by relating these data to ethics-of-care theories on care and violence.

Co-authors: Viera Poláková and Veronika Valkovičová

 

Prof Netta Weinstein


Title: Buying in to Rules and Restrictions: Motivational Principles of Caregivers' Technology Regulation

Abstract: Caregivers are increasingly involved in their children's engagement with technology, commonly by restricting (i.e., limiting time) and moderating (i.e., guiding type of use) screen use. Caregivers also rely on technology to monitor children's use and gain information for guiding their decisions. But the processes of tracking, restricting, and moderating do not occur in a vacuum. Both how caregivers convey these activities to children, and the broader relational climate at home, can affect how children respond to parental attempts to help regulate their technology use. This talk will examine how the caregiver-child relationship, and the specific conversations that take place within it, impact adolescents' reactions to parental technology monitoring, restrictions, and moderations. Adolescence is a particularly interesting time in a young person's life to explore these issues because this age is characterized by increasing independence and a greater chance of reactance or defiance. The young person can respond to requests and rules by ignoring or even countering them. Examining these processes, I will explore the role that parental autonomy-supportive (supportive of a sense of choice and demonstrating understanding) and controlling (pressuring or punitive) behaviours play and discuss key caregiver strategies for motivating adolescents' buy-in to healthy technology use. I will present a series of studies conducted on caregiver reactions to adolescents' cyberbullying, on caregiver communications when applying rules and restrictions, and on the importance of high-quality listening to young people. I will also explore future avenues in these literatures for understanding how parents can effectively regulate their young people's technology use.

 

Dr Giovanna Mascheroni


Title: The Datafied Habitus: Sociodigital Inequalities and Lived Experiences of Datafication Among Italian Families


Abstract: The mediatization and datafication of childhood is often addressed as a generalised and homogeneous experience, at least across European countries. However, critical data studies have long warned about the need to contextualise data practices and imaginaries in individuals' everyday lives through a phenomenological approach (Breiter & Hepp, 2018; Couldry & Hepp, 2017; Kennedy & Bates, 2017; Kennedy et al., 2015; Mascheroni & Siibak, 2021). Our qualitative longitudinal research, involving three waves of data collection with 20 Italian families with at least one child aged 8 or younger, including interviews and observations (N=58), app-based media diaries (N=17), and map drawing (Watson et al., 2022), shows how family life is undergoing a process of deep mediatization (Hepp, 2019). Yet, while all families live digital media-rich lives, they variously engage in and make sense of data practices and algorithmic systems. Different datafied habitus emerge from the complex interplay of sociodigital inequalities (Helsper, 2021) (i.e., the family's social, cultural and economic capital; the range and type of digital media available, including IoT and AI-based devices such as smart speakers, smart TVs, smart toys, and wearable devices) and the family's own culture and practices (the specific media practices in which children and parents engage; parental mediation strategies; parenting cultures; and technological imaginaries). Together, these shape different data(fied) habitus, consisting of sets of practices, resources, schemes and classifications (as in Bourdieu's (1986) classical notion of the habitus), that configure different lived experiences of datafication and generate new sociodigital inequalities.
Three main habitus emerge: the dataist, enthusiastically adopting data practices and technological solutions to most everyday problems; the datafied, either resigned to datafication (digital resignation) or ignoring its profound implications; and, last and rarest, the digital ascetic, trying to shield their children from digital media for as long as possible.

 

Dr Caroline Emmer De Albuquerque Green 

Title: Potentials and pitfalls of Large Language Models to support family carers' autonomy: A reflection  

Abstract: Large Language Models (LLMs) like OpenAI's ChatGPT are transforming the way people work, including in the health and social care professions. They also have the potential to support family carers with everyday care tasks, for example by providing quick solutions to care-related challenges. This could have an important impact on carers' sense of autonomy by providing them with round-the-clock virtual assistance in their role. Various organisations that aim to support carers, for example of people living with dementia, are currently working on integrating LLMs into their service offer, in addition to content available on a website and person-to-person support through phone helplines.

But such use of LLMs in social care poses some fundamental questions and challenges regarding the reliability and quality of responses and solutions. There is a real need to understand the ethical, legal, and practical ramifications of applying LLMs to support family carers, especially when it comes to capacity- and healthcare-related questions. This presentation reports on a small study testing responses generated by ChatGPT to some fundamental care tasks, including on mental capacity, and comparing them with advice generated by care professionals. It suggests that generated responses, although filling a gap in round-the-clock service provision, require quality control by trained professionals and regulatory considerations that extend into digital social care provision.

 

Dr Jun Zhao

Title: Protection or Punishment? Relating the Design Space of Current Parental Control Apps to Support Parenting for Children’s Online Safety and Autonomy

Abstract: Parental control apps, which are mobile apps that enable parents to monitor and restrict their children's online activities, are being increasingly adopted by parents as a means to safeguard their children online. However, it is unclear whether these apps are consistently beneficial or effective in achieving their intended goals. For instance, the use of excessive restriction and surveillance has been found to undermine the parent-child relationship and impede children's sense of autonomy.

In this presentation, I will share the results of our recent research, which involves a systematic analysis of the key design features found in popular parental control apps. We explored how these design features may relate to the experiences of parents and children when using these apps. Specifically, we ask how children's and parents' perceptions are influenced by the design of parental control features, and how these design features either support or hinder children's and parents' sense of autonomy. Our findings provide insights into the design landscape of current parental control apps, and shed light on how these apps are perceived by parents and children. This research highlights future design considerations for enhancing children's and parents' autonomy within the context of existing digital parenting theories.

 

Dr James Wright

Title: Monitoring devices and the concept of “leeway” (yoyū) in socio-technical assemblages of care

Abstract: This talk considers monitoring care technologies in relation to the idea of “leeway” (yoyū in Japanese), drawing on my ethnographic fieldwork on care robots in Japan and the “machine theory” developed by anthropologist Michael Fisch based on his research on Tokyo’s commuter train network.

I explore yoyū as a salient concept for thinking about how the development and implementation of monitoring technologies within a socio-technical assemblage of care might avoid an overdetermined, productivist, and instrumentalised vision of care, instead expanding what Fisch terms the “margin of indeterminacy” and increasing opportunities for relational engagement with and between human users. As I aim to demonstrate, such an approach reframes questions of trust, control, and autonomy in collective relationships with technological systems.

 

Dr Hannah Zeavin

Title: “Auto-Intimacy”

Abstract: “Auto-Intimacy” engages with therapeutic and psychiatric treatment by algorithmic and other automated therapies. At the earliest moment of experimentation with automated therapies, two strains of work emerged: the simulation and detection of a disordered mind in the hopes of automating intake, diagnosis, and psychological education, and the wholesale simulation of a therapist toward the dream of automating therapeutic treatment to batch process patients. Each tradition was responsive to pressures on mental health care at mid-century, as well as new understandings of the mind that appeared in the same period. This talk offers an overview of the social and ethical implications of these treatments as they move from care to capture to control across the second half of the 20th century into our present. I will interrogate the democratizing logics that escort these technologies, ask what therapy becomes when the traditional therapist is replaced by a computational actor, and conclude with some thoughts about how the history of these attempts can inform current policy and ethics debates about the deployment of these technologies.