Markov Chains and Invariant Probabilities 2003 Edition Contributor(s): Hernández-Lerma, Onésimo (Author), Lasserre, Jean B. (Author)
ISBN: 3764370009 ISBN-13: 9783764370008 Publisher: Birkhäuser OUR PRICE: $52.24 Product Type: Hardcover Published: February 2003 Annotation: This book concerns discrete-time homogeneous Markov chains that admit an invariant probability measure. The main objective is to give a systematic, self-contained presentation of some key issues concerning the ergodic behavior of this class of Markov chains. These issues include, in particular, the various types of convergence of expected and pathwise occupation measures, and ergodic decompositions of the state space.
Additional Information |
BISAC Categories:
- Mathematics | Probability & Statistics - General
- Business & Economics | Operations Research
- Science | Physics - Mathematical & Computational
Dewey: 519.233 |
LCCN: 2003041471 |
Series: Progress in Mathematics |
Physical Information: 0.56" H x 6.14" W x 9.21" (1.10 lbs), 208 pages
Descriptions, Reviews, Etc. |
Publisher Description: This book is about discrete-time, time-homogeneous Markov chains (MCs) and their ergodic behavior. To this end, most of the material is in fact about stable MCs, by which we mean MCs that admit an invariant probability measure. To state this more precisely and give an overview of the questions we shall be dealing with, we will first introduce some notation and terminology. Let (X, B) be a measurable space, and consider an X-valued Markov chain ξ = {ξ_k, k = 0, 1, ...} with transition probability function (t.p.f.) P(x, B), i.e., P(x, B) := Prob(ξ_{k+1} ∈ B | ξ_k = x) for each x ∈ X, B ∈ B, and k = 0, 1, .... The MC ξ is said to be stable if there exists a probability measure (p.m.) μ on B such that (*) μ(B) = ∫_X μ(dx) P(x, B) for all B ∈ B. If (*) holds, then μ is called an invariant p.m. for the MC ξ (or the t.p.f. P).
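To make the invariance condition (*) concrete, the following sketch (not from the book; the transition matrix is an arbitrary illustration) shows the finite-state special case, where the t.p.f. P reduces to a row-stochastic matrix and an invariant p.m. reduces to a row vector π satisfying π = πP:

```python
import numpy as np

# Finite-state illustration of condition (*): on X = {0, 1, 2} the
# t.p.f. P is a row-stochastic matrix, and mu(B) = sum_x mu({x}) P(x, B)
# becomes the linear equation pi = pi @ P.  The matrix below is a
# hypothetical example, not taken from the book.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.5, 0.3],
    [0.0, 0.4, 0.6],
])

# An invariant p.m. is a left eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()  # normalize to a probability measure

# Verify the invariance equation (*) numerically.
assert np.allclose(pi @ P, pi)
print(pi)
```

For a chain with more structure (e.g. reducible state spaces), several such invariant p.m.'s may coexist; the ergodic decompositions discussed in the annotation describe how they fit together.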