Explainable AI for Human-AI Joint Decision-Making in Farming Practice

Joschka A. Hüllmann*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › Peer-reviewed


Abstract

Agriculture is making leaps in digitalization and artificial intelligence (AI) systems with autonomous machines, sensor data, and decision support systems (Liakos et al., 2018; Smith, 2020). Understanding and improving how farmers interact with AI requires research that looks beyond AI in laboratory settings and into the application of AI in the field (Huysman, 2020; Jussupow et al., 2021). One key issue is explainability, which paves the way for successful AI deployments (Gregor & Benbasat, 1999; Thiebes et al., 2021). Explainability refers to the effectiveness of AI's explanations (e.g., user interfaces, documentation, or manuals). This study focuses on the comprehensibility of explanations, specifically user interfaces for end-users. End-users often cannot comprehend how AI systems reach their decisions (Waardenburg et al., 2020). However, explainability is crucial for using AI in joint decision-making (Asatiani et al., 2021).

Human-AI joint decision-making happens through configurations of human-AI agency, which are continuously and mutually shaped (Suchman, 2007, 2012). Recent research found that a translator role is required to mediate between end-users and the AI system (Gal et al., 2020; Jussupow et al., 2021; Waardenburg et al., 2022). The translator role addresses comprehensibility in domain-specific contexts. What remains unclear is how human-AI joint decision-making occurs when explanations influence it. Research into how AI explanations are embedded in the organization and integrated into decision-making procedures is lacking. How humans engage with AI systems and make sense of explanations in the domain context has seen little empirical work until now (Abdul et al., 2018; Benbya et al., 2021). These issues are urgent for small businesses, where human actors rely on AI explanations. Therefore, this study asks: How do configurations of
Original language: English
Title of host publication: Proceedings of the AI@Work Track at Reshaping Work 2022 Conference
Publication status: Published - 2022
Event: Reshaping Work conference 2022: Shaping the Future of Work Together - Amsterdam, Netherlands
Duration: 13 Oct 2022 - 14 Oct 2022

Conference

Conference: Reshaping Work conference 2022
Country/Territory: Netherlands
City: Amsterdam
Period: 13/10/22 - 14/10/22

