Abstract
Geospatial data play a central role in disaster risk reduction and management (DRRM), where they are used to decide where, when, and to whom aid should be given. As in other domains, the datafication and digitalization of humanitarian aid raise privacy issues. In our paper, we explore group data-related harms arising from demographically identifiable information (DII) in geospatial data. Group privacy is an increasingly important issue in digital humanitarian work because the algorithms involved classify individuals into groups; consequently, analysis needs to move beyond harms caused by individual re-identification and lack of consent. We explore four harms posed by commonly used geospatial data: (i) biases from missing or underrepresented categories; (ii) the mosaic effect, i.e. the unintentional discovery of sensitive knowledge by combining disparate datasets, together with AI's role in facilitating and accelerating data mosaicking, which obscures minute data problems (e.g., biases) that accumulate in the end result; (iii) misuse of data, whether shared or not; and (iv) cost-benefit trade-offs between the cost of protection and the risk of misuse. Using threat-modelling methods, the paper contributes to the literature on group privacy harms in the humanitarian domain by proposing an appropriate (geo-)data triage and showing how this can also be relevant for the wider use of geodata modelling.
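To make the mosaic effect concrete, the following is a minimal illustrative sketch (not from the paper; all dataset names, column names, and values are hypothetical) of how two individually innocuous geospatial datasets, once joined on a shared spatial key, can surface demographically identifiable information about a group:

```python
# Minimal sketch of the mosaic effect: two datasets that are harmless in
# isolation are joined on a shared spatial key, revealing demographically
# identifiable information (DII) about a group. All names and values here
# are hypothetical.
import pandas as pd

# Dataset A: aggregate damage assessment per map grid cell (no DII on its own).
damage = pd.DataFrame({
    "grid_cell": ["A1", "A2", "B1"],
    "buildings_destroyed_pct": [80, 15, 5],
})

# Dataset B: dominant demographic group per grid cell, e.g. from a survey
# (also no obvious harm on its own).
demographics = pd.DataFrame({
    "grid_cell": ["A1", "A2", "B1"],
    "majority_group": ["minority_X", "group_Y", "group_Y"],
})

# The join links severe damage (and hence likely aid flows or displacement)
# to a specific demographic group -- sensitive knowledge that neither
# dataset's publisher intended to release.
mosaic = damage.merge(demographics, on="grid_cell")
print(mosaic[mosaic["buildings_destroyed_pct"] > 50])
```

The point of the sketch is that the sensitive inference emerges only from the combination; a data triage therefore has to assess datasets against what else they could plausibly be linked with, not in isolation.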
Original language | English |
---|---|
Publication status | Published - 27 Jun 2023 |
Event | Digital Geography Research Group Annual Conference 2023: CAMRI Ethics of the Digital - University of Westminster, London, United Kingdom; Duration: 27 Jun 2023 → 28 Jun 2023 |
Conference
Conference | Digital Geography Research Group Annual Conference 2023 |
---|---|
Country/Territory | United Kingdom |
City | London |
Period | 27/06/23 → 28/06/23 |