E-verting public life: digital welfare services and the changing boundaries between states and citizens  

Parallel Session 1:
Wednesday 7 June, 11:00-13:00

Seminar room 150, Harriet Holters hus 

Lisa Reutter, University of Copenhagen: Citizen responses to emerging public-private data flows: The case of Statistics Norway

Ruby O'Connor, Monash University (Melbourne): Imagining future relationships between AI, state and civil society 

Nora Germundsson, Department of Social Work, Stockholm University: Automating social assistance: A client-level study of occurrences and outcomes 

Sophie Spitters, Queen Mary, University of London: Decisions at a distance: Digital triage and the mediation of access in UK primary care 

Jackie Walumbe, University of Oxford/UCLH NHS Foundation Trust: Uncovering the Impact of Social Capital in Implementing Digital Health Innovations: An Autoethnographic Account  

Katja de Neergaard, The IT University of Copenhagen: Privacy as Process: Subtle negotiations of privacy in the digital home during lockdown 

Abstracts

Privacy as Process: Subtle negotiations of privacy in the digital home during lockdown

by Katja de Neergaard, The IT University of Copenhagen; with Brit Ross Winthereik 

Philosophical approaches to privacy within information science have demonstrated that privacy emerges in social relations. It is therefore inadequate to consider privacy in finite terms. Nevertheless, Western societies have a long tradition of conceptualizing privacy in terms of boundaries that indicate binaries. In STS, studies of digital technologies have shown that public spaces can have pockets of privacy and vice versa. Others have shown how the home – a key private space – leaks data about everyday life. These studies show the enormous impact of digital technologies on privacy but fail to detail how individuals and their digital technologies engage in continuous negotiation of access to the home by others. This paper shows how such negotiations are inevitable as homes are increasingly opened to the outside through digital networks, and that privacy is not the result of conscious tactics but an effect of ad hoc, subtle negotiations. This means that privacy becomes a process that can be studied with attention to how people seek to establish a sense of control over their private sphere. The paper concludes that technologies are not artefacts we simply have in our homes; rather, they become embedded in what constitutes a home, including privacy. It contributes to STS discussions by considering privacy as the outcome of socio-material practices. Doing so shows that privacy-related risks and inequalities at home pertain to more than data misuse, digital literacy, or faulty design. The wider importance is to bring attention to subtle, material dimensions of privacy beyond data leaks and the regulation of computational practices.

 

Automating social assistance: A client-level study of antecedents and outcomes

by Nora Germundsson, Department of Social Work, Stockholm University; with Hugo Stranz

In recent years, the use of Robotic Process Automation (RPA) in the handling of social assistance (SA) has become increasingly widespread; recent surveys show that over 10% of Swedish municipalities have introduced, or are about to introduce, this particular automation system in their SA handling process. Frequently emphasized arguments for the use of RPA include freeing up time for employees and increasing transparency in the handling process, thereby guaranteeing legal certainty for applicants. Less discussed, however, is whether – and if so, how – clients are affected by the increased use of RPA in the SA handling process. The presentation will therefore focus on a) the actual use of RPA, b) the degree to which RPA use relates to outcomes of decisions, and c) to what extent both use and outcomes relate to factors at the client level. The analysis is based on quantitative data from 800 actual cases collected in four Swedish municipalities (100 + 100 cases before and after the introduction of RPA). The discussion is theoretically linked to perspectives on technologies as socio-technical systems, constructed in and by the specific culture in which they are designed, as well as the context in which they are implemented. Rather than a ‘device’ implemented in an organization to unequivocally improve work processes, RPA is treated as a socially shaped and unpredictable phenomenon related to emerging dynamics of digital exclusion from public life.

 

Imagining future relationships between AI, state and civil society 

by Ruby O'Connor, Monash University 

There is increasing interest in the possibilities for Artificial Intelligence (AI) to facilitate and/or improve governance practices. Many claim AI will be revolutionary in this sphere, and countries around the world are publishing National AI Strategies and beginning to integrate AI into their practices. Yet, as well as being a practical tool in these spaces, AI also functions as a performative concept with accompanying narratives that act as social mechanisms. Consequently, questions about why particular government-imagined futures are seen as desirable, how AI is purportedly meant to facilitate their realisation, and what this might mean for evolving relationships between state and civil society must be assessed through a lens that accounts for power and politics.

Using a multi-level framework that draws on Gramscian ideology (Gramsci 1972/1947), socio-technical imaginaries (Beckert 2017; Jasanoff and Kim 2009), and the sociology of expectations (Borup et al. 2006), this paper undertakes a thematic analysis of National AI Strategies developed by European Union (EU) countries and the United Kingdom (UK). It then applies the same framework to an analysis of the documentation, speeches, and actions accompanying the use of AI for welfare fraud detection and the identification of service needs. By unpacking discourse and action in these cases, the paper concludes that current government interest in AI, far from constituting a revolution, actually serves to reproduce the status quo in governance and public service provision and to reinforce current state/civil society relations.

 

Citizen responses to emerging public-private data flows: The case of Statistics Norway  

by Lisa Reutter, University of Copenhagen

Norwegian citizens are often portrayed in policy papers and reports as having highly positive attitudes towards data-driven systems. High levels of trust and fast-moving digitalization have enabled the welfare state to collect and manage huge amounts of data, which are now imagined as fueling all public sector operations in a quest to become more effective, seamless, and proactive. My earlier research shows, however, that citizens and civil society are rarely engaged in this administrative reform. Instead, we identify a paternalistic, top-down, technocratic approach in which the context, values, and agendas of datafication are obscured from citizens.

In 2022, Statistics Norway, the national statistics agency, announced that it would require the biggest grocers in Norway to hand over all collected receipt data in order to produce high-quality statistics and deepen the understanding of consumer behavior. An online article discussing this emerging practice, written by a public broadcaster’s tech columnist, sparked surprisingly high interest among readers and became NRKbeta’s most read and most commented article of 2022. Grocers expressed concern about the requirement, and the case had to be assessed by the Data Protection Authority in the aftermath of this debate. In this paper, I use the example of Statistics Norway to show how emerging data practices are perceived by citizens and how citizens react to datafication. Although an online comment section is not representative of the whole population’s attitudes toward datafication and often attracts extreme voices, I regard this as an interesting case of citizen discomfort associated with public-private data flows. 

  

Decisions at a distance: Digital triage and the mediation of access in UK general practice

by Sophie Spitters, Queen Mary University of London; with Natassia Brenman, Sara Shaw, Michael Gill, Sara Paparini, Joseph Wherton, Sharon Spooner, Deborah Swinglehurst
 
Whilst there is much concern around the doctor-patient encounter during remote consultations, as well as ‘care at a distance’ more broadly (Pols, 2012), the communication that takes place at the point of accessing this care is less well considered. The boundary between being inside and outside the clinic has been complicated by the digitisation of access: negotiations between staff and patients about the care they receive now often take place within online platforms such as e-consult. This presentation explores new ways of negotiating healthcare access and modality of care at a distance, via a digital ‘buffer zone’.

Drawing on ethnographic observations of how digital triage plays out in day-to-day practices within a post-pandemic landscape of UK primary care, we focus on the way negotiations and decisions about care are mediated by distance and technology. Zeavin (2021) reminds us that distance is not the opposite of presence; absence is. Distance can be mediated differently by different technologies, with the potential to produce certain patterns of in/exclusion. Bypassing the usual encounter with a receptionist, digital triage involves patients requesting care via an online form, which GPs use to decide what kind of care patients will receive. Despite providing an opportunity for patients to communicate with GPs ‘directly’ from the home, this produces a buffer zone which enables some forms of relationality and negotiation whilst inhibiting others. We make visible the workarounds, risk management, negotiations, and even intimacies that emerge as doctors and patients learn new ways to navigate healthcare access at a distance.

 

Uncovering the Impact of Social Capital in Implementing Digital Health Innovations: An Autoethnographic Account

by Jackie Walumbe, University of Oxford

The Topol Digital Health Fellowship is a prominent programme that purportedly aims to equip clinicians with the skills to navigate the ever-evolving landscape of digital technologies in the English National Health Service. As a clinical academic and participant in the programme, I have experienced first-hand the methodological and operational challenges involved in understanding the opaque and elusive nature of digital practices in institutionally complex contexts.

Applying an autoethnographic approach, this paper provides a reflexive account of my experiences of trying to deliver a digital innovation project with limited material resources for a largely digitally naïve population: marginalised groups accessing tertiary-level pain care. I detail the challenges that I encountered as a non-data scientist in accessing and working with a multi-layered technology infrastructure in the form of an established electronic health system. In doing so, I focus on the opaque ways in which digital exclusion is currently the default mode, woven through different layers of this particular digital infrastructure. Though the initial aim of my project was to secure material resources to fund a digital solution to the problem of inequitable access to care and to increase participation, I discuss here the significant role that social capital played in negotiating institutional hierarchies, securing governance approval, and navigating the opaque nature of digital technologies. I highlight the methodological and ethical complexities that arise when concurrently studying, designing, and implementing digital practices in complex institutional contexts. On this basis, I propose an alternative, everted approach to addressing non-participation in pain care through digital inclusion practices and ethos.

Organizers

Marit Haldar, Oslo Metropolitan University; Lars E.F. Johannessen, Oslo Metropolitan University; Gemma Hughes, University of Leicester.  
