
STA 331 2.0 Stochastic Processes: 3. Markov Chains - Classification of States

  1. STA 331 2.0 Stochastic Processes, 3. Markov Chains - Classification of States. Dr Thiyanga S. Talagala, August 18, 2020. Department of Statistics, University of Sri Jayewardenepura.

  2. Example 1: Find the equivalence classes of the Markov chain with transition probability matrix
     P = [ 1/2  1/2  0    0
           1/2  1/2  0    0
           1/4  1/4  1/4  1/4
           0    0    0    1   ]
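
A minimal Python sketch (not part of the original slides) for finding the communication classes of a finite chain from its one-step transition matrix: two states communicate when each can be reached from the other with positive probability. The matrix values used below are those of Example 1 as reconstructed above.

    from fractions import Fraction as F

    # Example 1 transition matrix (as reconstructed above)
    P = [
        [F(1, 2), F(1, 2), F(0),    F(0)],
        [F(1, 2), F(1, 2), F(0),    F(0)],
        [F(1, 4), F(1, 4), F(1, 4), F(1, 4)],
        [F(0),    F(0),    F(0),    F(1)],
    ]

    def reachable(P, i):
        # states j that can be reached from i in zero or more steps
        seen, stack = {i}, [i]
        while stack:
            u = stack.pop()
            for v, p in enumerate(P[u]):
                if p > 0 and v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    n = len(P)
    reach = [reachable(P, i) for i in range(n)]
    classes = []
    for i in range(n):
        cls = {j for j in range(n) if j in reach[i] and i in reach[j]}  # i <-> j
        if cls not in classes:
            classes.append(cls)
    print(classes)  # with the matrix above this prints [{0, 1}, {2}, {3}]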

  3. Theorem: The relation of communication partitions the state space into mutually exclusive and exhaustive classes. (The states in a given class communicate with each other, but states in different classes do not communicate with each other.)

  4. Definition: Let f_i denote the probability that, starting in state i, the process will ever re-enter state i, i.e.,
     f_i = P(X_n = i for some n ≥ 1 | X_0 = i).
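
Because f_i is a probability over whole sample paths, it can be approximated by simulation. Below is a minimal Monte Carlo sketch (not from the slides); the function name estimate_f and its parameters are illustrative. Paths are truncated at max_steps, so the estimate is slightly biased downwards when return times are long.

    import random

    def estimate_f(P, i, n_paths=20_000, max_steps=1_000):
        # Estimate f_i = P(X_n = i for some n >= 1 | X_0 = i) by simulating
        # n_paths trajectories of at most max_steps transitions each.
        states = range(len(P))
        returns = 0
        for _ in range(n_paths):
            x = i
            for _ in range(max_steps):
                x = random.choices(states, weights=P[x])[0]
                if x == i:
                    returns += 1
                    break
        return returns / n_paths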

  5. Example 2: Consider the Markov chain consisting of the states 0, 1, 2, 3 with the transition probability matrix
     P = [ 1/2  1/2  0    0
           1/2  1/2  0    0
           1/4  1/4  1/4  1/4
           0    0    0    1   ]
     Find f_0, f_1, f_2, f_3.
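
As a quick numerical check of Example 2, the estimator sketched in the previous note can be applied to the matrix as reconstructed above (the exact values of f_0, ..., f_3 are left for the in-class discussion):

    # Example 2 transition matrix (as reconstructed above)
    P2 = [
        [0.50, 0.50, 0.00, 0.00],
        [0.50, 0.50, 0.00, 0.00],
        [0.25, 0.25, 0.25, 0.25],
        [0.00, 0.00, 0.00, 1.00],
    ]
    print([round(estimate_f(P2, i), 2) for i in range(4)])  # Monte Carlo estimates of f_0, ..., f_3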

  6. Recurrent and transient states: Let f_i be the probability that, starting in state i, the process will ever re-enter state i. State i is said to be recurrent if f_i = 1 and transient if f_i < 1.

  7. Example 3: Consider the Markov chain consisting of the states 0, 1, 2 with the transition probability matrix
     P = [ 0    1/2  1/2
           1/2  0    1/2
           1/2  1/2  0   ]
     Determine which states are transient and which are recurrent.

  8. Example 4: Consider the Markov chain consisting of the states 0, 1, 2, 3 with the transition probability matrix
     P = [ 0  0  1/2  1/2
           1  0  0    0
           0  1  0    0
           0  1  0    0   ]
     Determine which states are transient and which are recurrent.

  9. Example 5: Consider the Markov chain consisting of the states 0, 1, 2, 3, 4 with the transition probability matrix
     P = [ 1/2  1/2  0    0    0
           1/2  1/2  0    0    0
           0    0    1/2  1/2  0
           0    0    1/2  1/2  0
           1/4  1/4  0    0    1/2 ]
     Determine which states are transient and which are recurrent.
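
For finite chains such as those in Examples 3, 4 and 5, the states can also be classified programmatically using a fact that the corollaries below lead to: a communication class of a finite chain is recurrent exactly when it is closed (no positive-probability transition leaves the class), and transient otherwise. A minimal sketch, assuming the matrices reconstructed above; it is shown here for Example 5, and Examples 3 and 4 can be checked the same way.

    def classify(P):
        # Label each state of a finite chain as recurrent or transient:
        # a communication class is recurrent iff it is closed.
        n = len(P)

        def reachable(i):
            # states reachable from i in zero or more steps
            seen, stack = {i}, [i]
            while stack:
                u = stack.pop()
                for v in range(n):
                    if P[u][v] > 0 and v not in seen:
                        seen.add(v)
                        stack.append(v)
            return seen

        reach = [reachable(i) for i in range(n)]
        labels = {}
        for i in range(n):
            cls = {j for j in range(n) if j in reach[i] and i in reach[j]}
            closed = all(P[u][v] == 0 for u in cls for v in range(n) if v not in cls)
            labels[i] = "recurrent" if closed else "transient"
        return labels

    # Example 5 transition matrix (as reconstructed above)
    P5 = [
        [0.50, 0.50, 0.00, 0.00, 0.00],
        [0.50, 0.50, 0.00, 0.00, 0.00],
        [0.00, 0.00, 0.50, 0.50, 0.00],
        [0.00, 0.00, 0.50, 0.50, 0.00],
        [0.25, 0.25, 0.00, 0.00, 0.50],
    ]
    print(classify(P5))  # with this matrix, states 0-3 come out recurrent and state 4 transient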

  10. Theorem: If state i is recurrent then, starting in state i, the process will re-enter state i again and again and again, in fact infinitely often.

  11. Theorem: For any state i, let f_i denote the probability that, starting in state i, the process will ever re-enter state i. If state i is transient then, starting in state i, the number of time periods that the process will be in state i has a geometric distribution with finite mean 1/(1 − f_i). Proof: In-class.

  13. Theorem: State i is recurrent if ∑_{n=1}^∞ P^n_{ii} = ∞, and transient if ∑_{n=1}^∞ P^n_{ii} < ∞, where P^n_{ii} denotes the n-step transition probability from state i back to state i. Proof: In-class.
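
As a numerical illustration of this criterion (not from the slides), the partial sums ∑_{n=1}^N (P^n)_{ii} can be computed directly for the Example 5 matrix as reconstructed above: they keep growing for the recurrent states but settle to a finite value for the transient state 4.

    import numpy as np

    # Example 5 transition matrix (as reconstructed above)
    P5 = np.array([
        [0.50, 0.50, 0.00, 0.00, 0.00],
        [0.50, 0.50, 0.00, 0.00, 0.00],
        [0.00, 0.00, 0.50, 0.50, 0.00],
        [0.00, 0.00, 0.50, 0.50, 0.00],
        [0.25, 0.25, 0.00, 0.00, 0.50],
    ])

    partial_sums = np.zeros(5)
    Pn = np.eye(5)
    for n in range(1, 201):            # truncate the infinite series at N = 200
        Pn = Pn @ P5
        partial_sums += np.diag(Pn)    # accumulate (P^n)_{ii} for every state i
    print(np.round(partial_sums, 3))   # the sum for state 4 stays near 1; the others grow with N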

  14. Corollary 1: If state i is recurrent and state i communicates with state j (i ↔ j), then state j is recurrent. Proof: In-class.

  15. Corollary 2: In a Markov chain with a finite number of states, not all of the states can be transient (there must be at least one recurrent state). Proof: In-class.

  16. Corollary 3: If one state in an equivalence class is transient, then all other states in that class are also transient. Proof: In-class.

  17. Corollary 4: Not all states in a finite Markov chain can be transient. This leads to the conclusion that all states of a finite irreducible Markov chain are recurrent. Proof: In-class.

  18. Acknowledgement: The contents of the slides are mainly based on Introduction to Probability Models by Sheldon M. Ross.
