A Markov decision process approach for managing medical drone deliveries

Amin Asadi, Sarah Nurre Pinkley, Martijn Mes

Research output: Contribution to journal › Article › Academic › peer-review



Drone delivery is a fast and innovative method for delivering parcels, food, and medical supplies. Furthermore, this low-contact delivery mode helps reduce the spread of pandemic and vaccine-preventable diseases. Focusing on the delivery of medical supplies, this paper studies how to optimize the distribution operations at a drone hub that dispatches drones to hospitals at different geographic locations. Each hospital generates stochastic demand for medical supplies, which we classify by the distance between the hospital and the drone hub. Satisfying the demand requires flying over different ranges, which directly depends on the charge level of the drone batteries. We formulate a stochastic scheduling and allocation problem with multiple classes of demand and model it as a finite-horizon Markov decision process. We solve modest-sized instances exactly using backward induction and show that the problem suffers from the curses of dimensionality. We therefore propose a reinforcement learning method capable of producing near-optimal solutions. We perform a set of computational tests using realistic data representing a prominent drone delivery company. Finally, we analyze the results to provide insights for managing drone hub operations and show that the reinforcement learning method performs well compared with exact and heuristic solution methods.
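To illustrate the exact solution approach mentioned in the abstract, the following is a minimal sketch of backward induction on a toy finite-horizon MDP. It is not the paper's model: the states (battery charge levels), actions ("charge" vs. "dispatch"), rewards, and transition probabilities here are all illustrative assumptions, chosen only to show the recursion V_t(s) = max_a Σ p(s'|s,a)(r + V_{t+1}(s')).

```python
# A minimal sketch (NOT the paper's model): backward induction on a tiny
# finite-horizon MDP where, each epoch, a drone battery is either charged
# or the drone is dispatched. All numbers below are illustrative.

T = 3                      # horizon (number of decision epochs)
states = [0, 1, 2]         # battery charge level: empty, half, full
actions = ["charge", "dispatch"]

def transitions(s, a):
    """Return a list of (probability, next_state, reward) triples."""
    if a == "charge":
        return [(1.0, min(s + 1, 2), 0.0)]          # charging raises the level
    if s == 0:
        return [(1.0, 0, -1.0)]                     # cannot fly empty: penalty
    # dispatching drains the battery; reward depends on stochastic demand
    return [(0.8, s - 1, 5.0), (0.2, s - 1, 2.0)]

# Backward induction: V[t][s] = max_a sum_{s'} p * (r + V[t+1][s'])
V = [{s: 0.0 for s in states} for _ in range(T + 1)]   # V[T] = 0 (terminal)
policy = [{} for _ in range(T)]
for t in range(T - 1, -1, -1):
    for s in states:
        best_a, best_v = None, float("-inf")
        for a in actions:
            q = sum(p * (r + V[t + 1][s2]) for p, s2, r in transitions(s, a))
            if q > best_v:
                best_a, best_v = a, q
        V[t][s], policy[t][s] = best_v, best_a

print(V[0], policy[0])
```

The curses of dimensionality the abstract refers to show up here directly: with many hospitals, demand classes, and drones, the state and action spaces grow combinatorially, so this exact sweep over all (t, s, a) combinations becomes intractable, motivating the paper's reinforcement learning method.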
Original language: English
Article number: 117490
Journal: Expert Systems with Applications
Early online date: 14 May 2022
Publication status: Published - 15 Oct 2022




