SLIDE 18 Ergodicity
- Let the return probability $r_{i,j}^t$ be the probability that, starting at state $i$, the first transition to state $j$ occurs at time $t$:
  $r_{i,j}^t = \Pr[\, X_t = j \text{ and } X_s \neq j \text{ for } 1 \le s \le t-1 \mid X_0 = i \,]$
- A state $j$ is recurrent if $\sum_{t \ge 1} r_{j,j}^t = 1$ and transient if $\sum_{t \ge 1} r_{j,j}^t < 1$.
A Markov chain is recurrent if every state in it is recurrent.
→ If a state $j$ is recurrent, once the chain visits $j$, it will return again and again.
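The recurrent/transient distinction can be checked by simulation. Below is a minimal sketch using a hypothetical 3-state chain (the chain, the transition probabilities, and the function names are illustrative, not from the slides): state 0 is transient because the chain can leave it for a class it never escapes, while states 1 and 2 are recurrent.

```python
import random

# Hypothetical 3-state chain: from state 0 the chain either stays (prob 0.5)
# or moves into {1, 2} and never comes back, so state 0 is transient.
# States 1 and 2 form a closed class, so both are recurrent.
P = {
    0: [(0, 0.5), (1, 0.5)],
    1: [(1, 0.5), (2, 0.5)],
    2: [(1, 0.5), (2, 0.5)],
}

def step(state):
    """Take one transition from `state` according to its row of P."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]

def estimate_return_prob(j, trials=10000, horizon=200):
    """Monte Carlo estimate of sum_{t>=1} r_{j,j}^t: the probability that
    the chain started at j ever returns to j (truncated at `horizon` steps)."""
    returns = 0
    for _ in range(trials):
        s = step(j)                      # the first transition (time t = 1)
        for _ in range(horizon):
            if s == j:
                returns += 1
                break
            s = step(s)
    return returns / trials
```

For this chain, `estimate_return_prob(1)` should come out near 1 (recurrent), while `estimate_return_prob(0)` should come out near 0.5: starting at state 0, the chain returns only if its very first step stays at 0.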
- The hitting time $h_{i,j}$ is the expected time to first reach state $j$ from state $i$: $h_{i,j} = \sum_{t \ge 1} t \cdot r_{i,j}^t$.
- A recurrent state $j$ is positive recurrent if $h_{j,j}$ is finite.
→ In a finite Markov chain, all recurrent states are positive recurrent.
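For a finite chain the hitting times can be computed exactly from the first-step equations $h_{i,j} = 1 + \sum_{k \neq j} P_{i,k} \, h_{k,j}$, rather than from the defining sum. A sketch (the example 2-state matrix and the function name are illustrative assumptions):

```python
import numpy as np

def hitting_times(P, j):
    """Expected times h_{i,j} to first reach state j from each state i,
    via the first-step equations h_{i,j} = 1 + sum_{k != j} P[i,k] h_{k,j}."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    others = [i for i in range(n) if i != j]
    # Restrict to states != j and solve (I - Q) x = 1.
    Q = P[np.ix_(others, others)]
    x = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    h = np.empty(n)
    h[others] = x
    # h_{j,j} is the expected return time: one step, then hit j from there.
    h[j] = 1 + P[j, others] @ x if others else 1.0
    return h

# Example: 2-state chain, target state j = 1.
#   h_{0,1} = 1 + 0.9 * h_{0,1}  ->  h_{0,1} = 10
#   h_{1,1} = 1 + 0.5 * h_{0,1}  ->  h_{1,1} = 6
h = hitting_times([[0.9, 0.1], [0.5, 0.5]], 1)
```

Since $h_{1,1} = 6$ is finite, state 1 is positive recurrent, consistent with the fact above that every recurrent state of a finite chain is positive recurrent.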
- A state is ergodic if it is aperiodic and positive recurrent.
- A Markov chain is ergodic if all its states are ergodic.
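The payoff of ergodicity (not stated on this slide, but standard) is that a finite ergodic chain has a unique stationary distribution $\pi$ and $P^t$ converges to it from any start state. A small numerical sketch, with an illustrative 3-state chain that is irreducible and aperiodic (self-loops), hence ergodic:

```python
import numpy as np

# Hypothetical ergodic chain: irreducible (0 -> 1 -> 2 -> 0) and
# aperiodic (self-loops at every state).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])

# For an ergodic chain, every row of P^t converges to the same
# stationary distribution pi, regardless of the starting state.
Pt = np.linalg.matrix_power(P, 100)
pi = Pt[0]
```

After 100 steps the rows of `Pt` agree to high precision, and `pi` satisfies the stationarity condition `pi @ P ≈ pi`.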
4/23/2020
Sofya Raskhodnikova; Randomness in Computing