Simulation-Based Algorithms for Markov Decision Processes

Author: Hyeong Soo Chang, Jiaqiao Hu, Michael C. Fu, Steven I. Marcus
ISBN: 9781447150220
Publisher: Springer London
Publication: February 26, 2013
Imprint: Springer
Language: English

Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, making them subject to the curse of dimensionality and rendering exact solution of the resulting models intractable. In other cases, the system of interest is too complex for some of the MDP model parameters to be specified explicitly, but simulation samples are readily available (e.g., for random transitions and costs). For these settings, various sampling and population-based algorithms have been developed to overcome the difficulty of computing an optimal solution in terms of a policy and/or value function. Specific approaches include adaptive sampling, evolutionary policy iteration, evolutionary random policy search, and model reference adaptive search.
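The sampling-based setting described above can be illustrated with a minimal sketch (the two-state MDP, function names, and parameter values below are invented for illustration, not taken from the book): when transition probabilities are unknown but a simulator is available, Q-values under a fixed rollout policy can be estimated purely from sampled trajectories, which is the basic building block underlying adaptive-sampling approaches.

```python
import random

# Toy 2-state, 2-action MDP exposed only through a simulator: given
# (state, action) we can sample a (next_state, cost) pair, but the
# transition matrix itself is never written down explicitly.
def simulate(state, action, rng):
    if action == 0:                      # "safe" action: stay put, unit cost
        return state, 1.0
    next_state = rng.choice([0, 1])      # "risky" action: random jump
    return next_state, 0.0 if next_state == 0 else 3.0

def mc_q_estimate(state, action, policy, rng, horizon=20, gamma=0.9, n=2000):
    """Estimate Q(state, action) by averaging n simulated rollouts that
    take `action` once, then follow `policy` for `horizon` more steps."""
    total = 0.0
    for _ in range(n):
        s, c = simulate(state, action, rng)
        ret, discount = c, gamma
        for _ in range(horizon):
            s, c = simulate(s, policy(s), rng)
            ret += discount * c
            discount *= gamma
        total += ret
    return total / n

rng = random.Random(0)
policy = lambda s: 0                     # baseline policy: always play safe
q0 = mc_q_estimate(0, 0, policy, rng)    # estimated Q(0, safe)
q1 = mc_q_estimate(0, 1, policy, rng)    # estimated Q(0, risky)
greedy_action = 0 if q0 <= q1 else 1     # one-step policy improvement
```

Choosing the action with the smaller estimated Q-value performs one step of sampled policy improvement; the algorithms surveyed in the book refine exactly this idea, e.g., by allocating simulation budget adaptively across actions instead of sampling each one equally.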
This substantially enlarged new edition reflects the latest developments in novel algorithms and their underpinning theories, and presents an updated account of the topics that have emerged since the publication of the first edition. It includes:
innovative material on MDPs, both in constrained settings and with uncertain transition properties;
a game-theoretic method for solving MDPs;
theories for developing rollout-based algorithms; and
details of approximate stochastic annealing, a population-based, on-line, simulation-based algorithm.
The self-contained approach of this book will appeal not only to researchers in MDPs, stochastic modeling and control, and simulation, but will also be a valuable source of tuition and reference for students of control and operations research.



More books from Springer London

Perioperative Medicine
Practical Preimplantation Genetic Diagnosis
Clinical Echocardiography
Evaluation of Cancer Screening
Difficult Decisions in Thoracic Surgery
Weather Modeling and Forecasting of PV Systems Operation
Solar Lighting
3D Histology Evaluation of Dermatologic Surgery
Human Health
Intrauterine Growth Restriction
Shape Perception in Human and Computer Vision
Overactive Bladder in Clinical Practice
Rheumatic Disease
Reflections on the Work of C.A.R. Hoare
Practical SPECT/CT in Nuclear Medicine