Data-intensive technology systems for women safety: Lessons from Bengaluru’s AI-powered CCTV network

Research output: Contribution to conference › Abstract › Academic

Abstract

Ample scholarly work highlights the harmful social effects of CCTV surveillance, such as hypervisibility, biased profiling, and discrimination against certain social groups. Research also shows only weak correlations between the use of CCTV cameras and crime prevention. Despite these observations, CCTV surveillance continues to expand, increasingly relying on advanced video analytics powered by artificial intelligence (AI). These developments further complicate the unequal power structures perpetuated by such technological systems. In Bengaluru, authorities are using CCTV data from AI-powered surveillance systems to ensure women's safety as part of the Bengaluru Safe City Project (BSCP). Given AI surveillance's harmful effects, how safety is understood in such projects remains little studied. In this paper, we unpack how the notion of safety is evoked within BSCP's data-intensive environment. Analyzing key documents from BSCP actors that describe the intended use and technological capabilities of the now partially implemented surveillance system, we identify the following storylines surrounding urban safety: a) women should be protected from men (safety from physical and sexual violence), b) public places are inherently dangerous for women (safety from potential harm), and c) even within a high-tech urban environment, women are responsible for their own safety (safety in your own hands). These storylines reveal that data-intensive technological systems, even when tailored specifically to women's safety, do not provide guaranteed safety but at most conditional safety. These findings nuance our empirical understanding of unequal and pervasive data-intensive technologies in public spaces. We show that rather than delivering on a promise of greater safety, the BSCP and its invocation of safety reinscribe asymmetrical power dynamics infused with patriarchal beliefs. This asymmetry disrupts gendered rights to the city, creating techno-mediated (spatial) injustice. In conclusion, we reflect on these findings and situate them within critical discussions around algorithmic care as a future research direction.
Original language: English
Publication status: Published - 4 Sept 2024
Event: 5th International Data Power Conference 2024: Situating Data Practices Beyond Data Universalism - University of Graz, Graz, Austria
Duration: 4 Sept 2024 – 6 Sept 2024
Conference number: 5
https://datapowerconference.org/data-power-2024/about-2024/

Conference

Conference: 5th International Data Power Conference 2024
Country/Territory: Austria
City: Graz
Period: 4/09/24 – 6/09/24
Internet address: https://datapowerconference.org/data-power-2024/about-2024/
