Markov Decision Processes in Practice

Research output: Book/Report › Book editing › Academic

Abstract

It is over 30 years since D.J. White started his series of surveys on practical applications of Markov decision processes (MDP), over 20 years since Martin Puterman published his phenomenal book on the theory of MDP, and over 10 years since Eugene A. Feinberg and Adam Shwartz published their Handbook of Markov Decision Processes: Methods and Applications. In the past decades, the practical development of MDP seemed to have come to a halt, owing to the general perception that MDP is computationally prohibitive. Accordingly, MDP is deemed unrealistic and out of scope by many operations research practitioners. In addition, MDP is hampered by its notational complications and its conceptual complexity. As a result, MDP is often only briefly covered in introductory operations research textbooks and courses. Recently developed approximation techniques, supported by vastly increased numerical power, have tackled part of the computational problems; see, e.g., Chaps. 2 and 3 of this handbook and the references therein. This handbook shows that a revival of MDP for practical purposes is justified for several reasons:

1. First and above all, present-day numerical capabilities have enabled MDP to be invoked for real-life applications.
2. MDP allows one to develop and formally support approximate and simple practical decision rules.
3. Last but not least, MDP’s probabilistic modeling of practical problems is a skill, if not an art, in itself.
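As a concrete illustration of the kind of computation referred to above, the sketch below solves a small MDP by value iteration. It is a hypothetical toy example, not taken from the handbook: the states, actions, transition probabilities, and rewards are invented. Real-life applications differ mainly in the size of the state and action spaces, which is where the approximation techniques mentioned in the abstract come in.

```python
# Minimal, hypothetical sketch (not from the handbook): value iteration on a
# toy 3-state, 2-action MDP with invented transition probabilities and rewards,
# illustrating the basic dynamic-programming computation behind MDP models.
import numpy as np

# P[a, s, s'] = probability of moving from state s to s' under action a
P = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.2, 0.2, 0.6]],  # action 0
    [[0.0, 0.9, 0.1], [0.5, 0.3, 0.2], [0.3, 0.3, 0.4]],  # action 1
])
# R[a, s] = expected immediate reward for taking action a in state s
R = np.array([
    [1.0, 0.0, 5.0],  # action 0
    [2.0, 1.0, 0.0],  # action 1
])
gamma = 0.95  # discount factor


def value_iteration(P, R, gamma, tol=1e-8, max_iter=10_000):
    """Return a (near-)optimal value function and a greedy policy."""
    n_states = P.shape[1]
    V = np.zeros(n_states)
    for _ in range(max_iter):
        # Bellman optimality update:
        # Q[a, s] = R[a, s] + gamma * sum_{s'} P[a, s, s'] * V[s']
        Q = R + gamma * (P @ V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    # Greedy policy with respect to the converged value function
    policy = (R + gamma * (P @ V)).argmax(axis=0)
    return V, policy


V, policy = value_iteration(P, R, gamma)
print("Optimal values:", np.round(V, 3))
print("Greedy policy :", policy)
```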
Original language: English
Place of Publication: Cham
Publisher: Springer
ISBN (Electronic): 978-3-319-47766-4
ISBN (Print): 978-3-319-47764-0
DOIs: https://doi.org/10.1007/978-3-319-47766-4
Publication status: Published - 2017

Publication series

Name: International Series in Operations Research & Management Science
Publisher: Springer International Publishing
Volume: 248
ISSN (Print): 0884-8289

Keywords

  • EWI-27922

Cite this

Boucherie, R. J., & van Dijk, N. M. (Eds.) (2017). Markov Decision Processes in Practice. (International Series in Operations Research & Management Science; Vol. 248). Cham: Springer. https://doi.org/10.1007/978-3-319-47766-4