
Counterexamples in Markov Decision Processes

Alexey B. Piunovskiy (The University of Liverpool, UK)

$366.95 (sale price $293.44)

Hardback

Markov Decision Processes (MDPs) form a cornerstone of applied probability, with over 50 years of rich research history. Throughout this time, numerous foundational books and thousands of journal articles have shaped the field. The central objective of MDP theory is to identify the optimal control strategy for discrete-time Markov random processes. Interestingly, the best control strategies often display unexpected or counterintuitive behaviors, as documented by a wide array of studies.

This book gathers some of the most compelling examples of such phenomena while introducing new ones. By doing so, it serves as a valuable companion to existing textbooks. While many examples require little to no prior knowledge, others delve into advanced topics and will primarily interest specialists.

In this second edition, extensive revisions have been made, correcting errors and refining the content, with a wealth of new examples added. The range of examples spans from elementary to advanced, requiring background knowledge in areas like measure theory, convex analysis, and advanced probability. A new chapter on continuous time jump processes has also been introduced. The entire text has been reworked for clarity and accessibility.

This book is an essential resource for active researchers and graduate students in the field of Markov Decision Processes.
By:   Alexey B. Piunovskiy
Imprint:   World Scientific Europe Ltd
Country of Publication:   United Kingdom
Language:   English
Volume:   6
ISBN:   9781800616752
ISBN 10:   1800616759
Series:   Series on Optimization and Its Applications
Pages:   508
Publication Date:   11 April 2025
Audience:   College/higher education; Professional and scholarly; Primary; Undergraduate
Format:   Hardback
Publisher's Status:   Active
