STA 331 2.0 Stochastic Processes 3. Markov Chains - Classification of States



SLIDE 1

STA 331 2.0 Stochastic Processes

3. Markov Chains - Classification of States

Dr Thiyanga S. Talagala August 18, 2020

Department of Statistics, University of Sri Jayewardenepura

SLIDE 2

Example 1

Find the equivalence classes.

P =
[ 1/2  1/2   0    0  ]
[ 1/2  1/2   0    0  ]
[ 1/4  1/4  1/4  1/4 ]
[  0    0    0    1  ]
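Equivalence (communicating) classes can be checked mechanically: i and j communicate when each is reachable from the other, so the classes are the strongly connected components of the directed graph with an edge i → j whenever P(i, j) > 0. A minimal Python sketch, assuming the Example 1 matrix as reconstructed above; `communicating_classes` is an illustrative helper, not part of the slides.

```python
from fractions import Fraction as F

def communicating_classes(P):
    n = len(P)
    # reach[i] = set of states reachable from i in zero or more steps
    reach = [{i} for i in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in list(reach[i]):
                for k in range(n):
                    if P[j][k] > 0 and k not in reach[i]:
                        reach[i].add(k)
                        changed = True
    # i and j communicate iff each is reachable from the other
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = sorted(j for j in range(n) if j in reach[i] and i in reach[j])
        classes.append(cls)
        seen.update(cls)
    return classes

# Example 1 transition matrix (as reconstructed from the slide)
P = [[F(1, 2), F(1, 2), 0,       0],
     [F(1, 2), F(1, 2), 0,       0],
     [F(1, 4), F(1, 4), F(1, 4), F(1, 4)],
     [0,       0,       0,       1]]
print(communicating_classes(P))   # -> [[0, 1], [2], [3]]
```

The reachability loop is a simple fixed-point computation; for large chains a linear-time SCC algorithm (e.g. Tarjan's) would be the idiomatic choice.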

SLIDE 3

Theorem

The relation of communication partitions the state space into mutually exclusive and exhaustive classes. (The states in a given class communicate with each other, but states in different classes do not communicate with each other.)

SLIDE 4

Definition

Let fi denote the probability that, starting in state i, the process will ever re-enter state i, i.e.,

fi = P(Xn = i for some n ≥ 1 | X0 = i)

SLIDE 5

Example 2

Consider the Markov chain consisting of the states 0, 1, 2, 3 with the transition probability matrix

P =
[ 1/2  1/2   0    0  ]
[ 1/2  1/2   0    0  ]
[ 1/4  1/4  1/4  1/4 ]
[  0    0    0    1  ]

Find f0, f1, f2, f3.

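First-step analysis gives these probabilities exactly in class; as a numerical cross-check, each f_i can also be estimated by simulating the chain and counting how often a path started at i ever re-enters i. A Monte Carlo sketch in Python, assuming the Example 2 matrix as reconstructed above; `estimate_f`, the seed, and the trial counts are illustrative choices, not from the slides.

```python
import random

# Example 2 transition matrix (as reconstructed from the slide)
P = [[0.50, 0.50, 0.00, 0.00],
     [0.50, 0.50, 0.00, 0.00],
     [0.25, 0.25, 0.25, 0.25],
     [0.00, 0.00, 0.00, 1.00]]

def estimate_f(P, i, trials=5_000, max_steps=100, seed=1):
    """Estimate f_i = P(ever re-enter i | X0 = i) by simulation.

    Paths are truncated at max_steps, which can slightly underestimate
    f_i for recurrent states (here the truncation error is negligible).
    """
    rng = random.Random(seed)
    states = range(len(P))
    returns = 0
    for _ in range(trials):
        x = i
        for _ in range(max_steps):
            x = rng.choices(states, weights=P[x])[0]
            if x == i:          # first re-entry counts; stop this path
                returns += 1
                break
    return returns / trials

for i in range(4):
    print(i, round(estimate_f(P, i), 2))
```

States 0, 1, and 3 should come out at (essentially) 1, while state 2 should land near 1/4, since the chain re-enters 2 only if the very first jump stays there.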
SLIDE 6

Recurrent and transient states

Let fi be the probability that, starting in state i, the process will ever re-enter state i. State i is said to be recurrent if fi = 1 and transient if fi < 1.

SLIDE 7

Example 3

Consider the Markov chain consisting of the states 0, 1, 2 with the transition probability matrix

P =
[  0   1/2  1/2 ]
[ 1/2   0   1/2 ]
[ 1/2  1/2   0  ]

Determine which states are transient and which are recurrent.

SLIDE 8

Example 4

Consider the Markov chain consisting of the states 0, 1, 2, 3 with the transition probability matrix

P =
[ 0  0  1/2  1/2 ]
[ 1  0   0    0  ]
[ 0  1   0    0  ]
[ 0  1   0    0  ]

Determine which states are transient and which are recurrent.

SLIDE 9

Example 5

Consider the Markov chain consisting of the states 0, 1, 2, 3, 4 with the transition probability matrix

P =
[ 1/2  1/2   0    0    0  ]
[ 1/2  1/2   0    0    0  ]
[  0    0   1/2  1/2   0  ]
[  0    0   1/2  1/2   0  ]
[ 1/4  1/4   0    0   1/2 ]

Determine which states are transient and which are recurrent.

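For a finite chain this classification can be automated: a communicating class is recurrent exactly when it is closed (no positive-probability transition leaves the class) and transient otherwise. A Python sketch, assuming the Example 5 matrix as reconstructed above; `classify` is an illustrative helper, not part of the slides.

```python
def classify(P):
    """Return (class, 'recurrent'/'transient') pairs for a finite chain."""
    n = len(P)
    # reach[i] = set of states reachable from i in zero or more steps
    reach = [{i} for i in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in list(reach[i]):
                for k in range(n):
                    if P[j][k] > 0 and k not in reach[i]:
                        reach[i].add(k)
                        changed = True
    seen, result = set(), []
    for i in range(n):
        if i in seen:
            continue
        cls = sorted(j for j in range(n) if j in reach[i] and i in reach[j])
        seen.update(cls)
        # a finite class is recurrent iff it is closed: no mass leaves it
        closed = all(P[a][b] == 0 for a in cls for b in range(n) if b not in cls)
        result.append((cls, "recurrent" if closed else "transient"))
    return result

# Example 5 transition matrix (as reconstructed from the slide)
P = [[0.50, 0.50, 0.00, 0.00, 0.00],
     [0.50, 0.50, 0.00, 0.00, 0.00],
     [0.00, 0.00, 0.50, 0.50, 0.00],
     [0.00, 0.00, 0.50, 0.50, 0.00],
     [0.25, 0.25, 0.00, 0.00, 0.50]]
print(classify(P))
# -> [([0, 1], 'recurrent'), ([2, 3], 'recurrent'), ([4], 'transient')]
```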
SLIDE 10

Theorem

If state i is recurrent then, starting in state i, the process will re-enter state i again and again, in fact, infinitely often.

SLIDE 11

Theorem

For any state i, let fi denote the probability that, starting in state i, the process will ever re-enter state i. If state i is transient then, starting in state i, the number of time periods that the process will be in state i has a geometric distribution with finite mean 1/(1 − fi).

Proof: In-class

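The theorem can be checked empirically on the Example 2 chain (matrix as reconstructed above): state 2 is transient with f_2 = 1/4, since the only way to re-enter 2 is for the first jump to stay there, so the total time spent in state 2 should average 1/(1 − 1/4) = 4/3 periods. A simulation sketch; `avg_visits` and the simulation parameters are illustrative choices, not from the slides.

```python
import random

# Example 2 transition matrix (as reconstructed from the slide)
P = [[0.50, 0.50, 0.00, 0.00],
     [0.50, 0.50, 0.00, 0.00],
     [0.25, 0.25, 0.25, 0.25],
     [0.00, 0.00, 0.00, 1.00]]

def avg_visits(P, i, trials=5_000, max_steps=100, seed=7):
    """Average number of time periods spent in state i, starting from i."""
    rng = random.Random(seed)
    states = range(len(P))
    total = 0
    for _ in range(trials):
        x, visits = i, 1        # the starting period counts as time in state i
        for _ in range(max_steps):
            x = rng.choices(states, weights=P[x])[0]
            if x == i:
                visits += 1
        total += visits
    return total / trials

print(avg_visits(P, 2))   # theory: 1/(1 - 1/4) = 4/3 ≈ 1.333
```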
SLIDE 13

Theorem

State i is recurrent if

∑_{n=1}^∞ P^n_ii = ∞,

and transient if

∑_{n=1}^∞ P^n_ii < ∞.

Proof: In-class

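The criterion is visible numerically in Example 2 (matrix as reconstructed above): state 2 can be revisited only by staying put, so P^n_22 = (1/4)^n and the series converges to 1/3, while for the recurrent states the partial sums grow without bound. A sketch with exact Fraction arithmetic; `matmul` and `partial_sum` are illustrative helpers, not from the slides.

```python
from fractions import Fraction as F

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def partial_sum(P, i, N):
    """Return sum over n = 1..N of (P^n)_ii, exactly."""
    total, Pn = F(0), P
    for _ in range(N):
        total += Pn[i][i]
        Pn = matmul(Pn, P)
    return total

# Example 2 transition matrix (as reconstructed from the slide)
P = [[F(1, 2), F(1, 2), 0,       0],
     [F(1, 2), F(1, 2), 0,       0],
     [F(1, 4), F(1, 4), F(1, 4), F(1, 4)],
     [0,       0,       0,       F(1)]]

for i in range(4):
    # states 0, 1, 3: partial sums keep growing; state 2: bounded by 1/3
    print(i, float(partial_sum(P, i, 30)))
```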
SLIDE 14

Corollary 1

If state i is recurrent, and state i communicates with state j (i ↔ j), then state j is recurrent.

Proof: In-class

SLIDE 15

Corollary 2

In a Markov chain with a finite number of states, not all of the states can be transient (there must be at least one recurrent state).

Proof: In-class

SLIDE 16

Corollary 3

If one state in an equivalence class is transient, then all other states in that class are also transient.

Proof: In-class

SLIDE 17

Corollary 4

Not all states in a finite Markov chain can be transient. It follows that all states of a finite irreducible Markov chain are recurrent.

Proof: In-class

SLIDE 18

Acknowledgement

The content of these slides is mainly based on Introduction to Probability Models by Sheldon M. Ross.
