Abstract
AI technologies automate many aspects of urban video surveillance, such as data processing and analysis (Andrejevic, 2019). While such automation in surveillance is popularized with promises of better protection against crime and increased safety (Whittaker, 2021), AI-fueled automation also poses the risks of automating biases and shifting the nature of power imbalances in society, perpetuating control and discrimination (Costanza-Chock, 2018; Waelen & Wieczorek, 2022). Despite these concerns, cities worldwide are implementing AI-powered video surveillance technologies, for example, the Bengaluru Safe City Project (BSCP) in India for women's safety (PK, 2022). Such surveillance projects often carry the notion of the caring watcher “who not only watches but also ‘looks out for’” the public (Andrejevic et al., 2021: 569). Heeding these claims of caring, “can [urban] surveillance technologies be tuned to the key of [urban] care” (Bauman & Lyon, 2013: 84)? This paper responds to this question by asking how, if at all, we can reimagine automated video surveillance systems in cities as infrastructure for urban care. Drawing on Davis (2022), Maalsen (2023), and Power & Williams (2019), we define urban care as a relational practice and a collective effort to enhance the well-being, safety, and quality of life for and with all urban residents.
Surveillance and care, two concepts that are “neither oxymoron nor tautology” (Richardson et al., 2017: 110), have been explored together by other scholars. Walsh (2010: 128) shows that surveillance can, conversely, construct “counter-geographies of hope,” challenging control and creating a sense of care in borderlands. Maalsen (2023) argues that AI algorithms – which power automated surveillance – should also be seen as agents of care beyond their harmful potential. In answering the research question, this paper situates AI-powered automation and urban surveillance together within the urban care debate.
To address our research question, we first identify, from the literature, barriers inherent in current AI that keep it from being caring, and then suggest what addressing these barriers might mean for reimagining AI-powered surveillance systems. We use the BSCP to empirically illustrate these barriers, drawing on official communications about the project. We ground our suggestions for reimagining these systems in a care-full justice perspective (Williams, 2017). Care-full justice brings together the ideals of social justice and the ethics of care within the same frame, enabling us to “rethink what cities can be” and how they can foster a collective responsibility in a more-than-human world (Williams, 2017: 821).
This paper thus concludes with reflections on what it takes to reinvent infrastructure by reimagining AI. We link our conclusions to Protective Optimization Technologies (Kulynych et al., 2020), which open the way for further discussions of AI’s harms and political implications for populations and their environments.
Original language | English |
---|---|
Publication status | Published - 4 Sept 2024 |
Event | 5th International Data Power Conference 2024: Situating Data Practices Beyond Data Universalism - University of Graz, Graz, Austria |
Duration | 4 Sept 2024 → 6 Sept 2024 |
Conference number | 5 |
Internet address | https://datapowerconference.org/data-power-2024/about-2024/ |
Conference
Conference | 5th International Data Power Conference 2024 |
---|---|
Country/Territory | Austria |
City | Graz |
Period | 4/09/24 → 6/09/24 |
Internet address | https://datapowerconference.org/data-power-2024/about-2024/ |