Chapter 4. Markov Chains
- Prof. Shun-Ren Yang
Department of Computer Science, National Tsing Hua University, Taiwan

Introduction

Consider a stochastic process {X_n, n = 0, 1, 2, ...} that takes on a finite or countable number of possible values, which we label by the nonnegative integers {0, 1, 2, ...}. If X_n = i, the process is said to be in state i at time n. Suppose that whenever the process is in state i, there is a fixed probability P_ij that it will next be in state j, regardless of the history before time n; that is,

    P{X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, ..., X_1 = i_1, X_0 = i_0} = P_ij

for all states i_0, i_1, ..., i_{n−1}, i, j and all n ≥ 0. Such a process is said to be a Markov chain. Since probabilities are nonnegative and the process must make a transition into some state, the transition probabilities satisfy

    P_ij ≥ 0 and ∑_{j=0}^∞ P_ij = 1 for all i.
Chapman-Kolmogorov Equations

Define the n-step transition probability P^n_ij to be the probability that a process in state i will be in state j after n additional transitions:

    P^n_ij = P{X_{n+m} = j | X_m = i}, n ≥ 0, i, j ≥ 0.

Of course, P^1_ij = P_ij. The Chapman-Kolmogorov equations provide a method for computing the n-step transition probabilities:

    P^{n+m}_ij = ∑_{k=0}^∞ P^n_ik P^m_kj for all n, m ≥ 0 and all i, j.

In matrix form, the (n + m)-step transition matrix is the product of the n-step and the m-step transition matrices.
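The Chapman-Kolmogorov equations can be checked numerically: the (n + m)-step transition matrix must equal the product of the n-step and m-step matrices. A minimal sketch, using a hypothetical 3-state transition matrix (not taken from the slides):

```python
# Minimal sketch: verifying the Chapman-Kolmogorov equations on a small
# hypothetical 3-state chain.

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """n-step transition matrix P^n (n >= 1)."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# Chapman-Kolmogorov: P^(n+m)_ij = sum_k P^n_ik * P^m_kj.
lhs = mat_pow(P, 7)
rhs = mat_mul(mat_pow(P, 3), mat_pow(P, 4))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(3) for j in range(3))
```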
Classification of States

State j is said to be accessible from state i if P^n_ij > 0 for some n ≥ 0. Two states i and j that are accessible to each other are said to communicate, written i ↔ j. Communication is an equivalence relation: (i) i ↔ i; (ii) if i ↔ j, then j ↔ i; (iii) if i ↔ j and j ↔ k, then i ↔ k.

To verify transitivity, suppose i ↔ j and j ↔ k; then there exist m and n such that P^m_ij > 0 and P^n_jk > 0. Hence, by the Chapman-Kolmogorov equations,

    P^{m+n}_ik = ∑_{r=0}^∞ P^m_ir P^n_rk ≥ P^m_ij P^n_jk > 0,

so k is accessible from i. By the same argument, i is accessible from k, i.e., P^n_ki > 0 for some n. Two states that communicate belong to the same class, and the Markov chain is said to be irreducible if there is only one class, that is, if all states communicate with each other.

State i is said to have period d, written d(i) = d, if P^n_ii = 0 whenever n is not divisible by d and d is the largest integer with this property. (If P^n_ii = 0 for all n > 0, we set d(i) = ∞.) A state with period 1 is said to be aperiodic.
Periodicity is a class property: if i ↔ j, then d(i) = d(j). To see this, let m and n be such that P^m_ij P^n_ji > 0, and suppose that P^s_ii > 0. Then

    P^{n+m}_jj ≥ P^n_ji P^m_ij > 0 and P^{n+s+m}_jj ≥ P^n_ji P^s_ii P^m_ij > 0.

Hence d(j) divides both n + m and n + s + m, and therefore their difference s, whenever P^s_ii > 0. Therefore, d(j) divides d(i); by symmetry, d(i) divides d(j), so d(i) = d(j).
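Assuming a transition matrix represented as nested lists, the period of a state can be computed as the gcd of the times n at which P^n_ii > 0 (checked up to a cutoff). The deterministic two-state chain below is an illustrative choice, not from the slides; it alternates 0 → 1 → 0, so each state has period 2:

```python
# Minimal sketch: period of a state as the gcd of return times with
# positive probability, checked up to max_n steps.
from math import gcd

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, state, max_n=50):
    d = 0
    M = P  # M holds P^n at step n
    for n in range(1, max_n + 1):
        if M[state][state] > 0:
            d = gcd(d, n)  # gcd(0, n) == n, so the first return initializes d
        M = mat_mul(M, P)
    return d  # 0 means no return observed within max_n steps

P = [[0.0, 1.0],
     [1.0, 0.0]]
print(period(P, 0))  # 2
```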
Recurrence and Transience

For any states i and j, define f^n_ij to be the probability that, starting in i, the first transition into j occurs at time n:

    f^0_ij = 0, f^n_ij = P{X_n = j, X_k ≠ j for k = 1, ..., n − 1 | X_0 = i}.

Let f_ij = ∑_{n=1}^∞ f^n_ij denote the probability of ever making a transition into state j, given that the process starts in i. State j is said to be recurrent if f_jj = 1, and transient otherwise.

Proposition. State j is recurrent if, and only if,

    ∑_{n=1}^∞ P^n_jj = ∞.
Proof sketch. Let

    I_n = 1 if X_n = j, and I_n = 0 otherwise,

so that ∑_{n=0}^∞ I_n denotes the number of visits to j. If f_jj = 1, each visit to j is followed, with probability 1, by another, so the number of visits is infinite; if f_jj < 1, the number of visits starting from j is geometric with finite mean 1/(1 − f_jj). On the other hand,

    E[∑_{n=0}^∞ I_n | X_0 = j] = ∑_{n=0}^∞ P{X_n = j | X_0 = j} = ∑_{n=0}^∞ P^n_jj,

so j is recurrent if, and only if, this sum is infinite.
Corollary. Recurrence is a class property: if i is recurrent and i ↔ j, then j is recurrent.

Proof. Since i ↔ j, there exist m and n such that P^n_ij > 0 and P^m_ji > 0. Now for any s ≥ 0,

    P^{m+n+s}_jj ≥ P^m_ji P^s_ii P^n_ij,

and therefore

    ∑_{s} P^{m+n+s}_jj ≥ P^m_ji P^n_ij ∑_{s} P^s_ii = ∞,

since ∑_s P^s_ii = ∞ by the recurrence of i. Hence j is recurrent by the preceding proposition.
Example (The simple random walk). Consider a Markov chain whose state space is the set of all integers and whose transition probabilities are P_{i,i+1} = p = 1 − P_{i,i−1}, 0 < p < 1. Since all states communicate, they are either all recurrent or all transient; so consider state 0 and determine whether ∑_{n=1}^∞ P^n_00 is finite or infinite.

The chain can return to 0 only in an even number of steps, so P^{2n−1}_00 = 0; a return at time 2n requires n steps to the right and n to the left, whence

    P^{2n}_00 = ((2n)! / (n! n!)) p^n (1 − p)^n, n ≥ 1.

By Stirling's approximation, n! ∼ n^{n+1/2} e^{−n} √(2π), and hence

    P^{2n}_00 ∼ (4p(1 − p))^n / √(πn).

Since, for positive a_n and b_n with a_n ∼ b_n, ∑_n a_n < ∞ if, and only if, ∑_n b_n < ∞, it follows that ∑_{n=1}^∞ P^n_00 will converge if, and only if, ∑_n (4p(1 − p))^n / √(πn) does. But 4p(1 − p) ≤ 1 with equality if, and only if, p = 1/2. Hence ∑_{n=1}^∞ P^n_00 = ∞ if, and only if, p = 1/2. Thus the chain is recurrent when p = 1/2 and transient if p ≠ 1/2.

When p = 1/2, the above process is called a symmetric random walk. We remark that a similar analysis shows that the symmetric random walk in two dimensions (each of the four neighboring sites chosen with probability 1/4) is also recurrent, while the symmetric random walk in three dimensions (each of the six neighbors chosen with probability 1/6) is transient.
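The Stirling estimate above can be checked numerically. A minimal sketch, where the cutoff n = 200 and the values of p are arbitrary illustrative choices:

```python
# Minimal sketch: comparing the exact return probabilities P^{2n}_00 of
# the simple random walk with the asymptotic (4p(1-p))^n / sqrt(pi*n).
from math import comb, pi, sqrt

def p_return(n, p):
    """Exact probability that the walk is back at 0 at step 2n."""
    return comb(2 * n, n) * (p * (1 - p)) ** n

for p in (0.5, 0.6):
    exact = p_return(200, p)
    approx = (4 * p * (1 - p)) ** 200 / sqrt(pi * 200)
    assert abs(exact - approx) / exact < 0.01  # within 1% at n = 200

# For p = 1/2 the terms decay like 1/sqrt(pi*n), whose sum diverges
# (recurrence); for p != 1/2 they decay geometrically (transience).
```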
Limit Theorems

If j is transient, then, starting from any state i, the expected number of transitions into j equals f_ij/(1 − f_jj), and is thus finite. Consequently, ∑_{n=1}^∞ P^n_ij < ∞, and hence P^n_ij → 0 as n → ∞.

For a recurrent state j, let

    µ_jj = E[min{n : X_n = j, n ≥ 1} | X_0 = j]

denote the mean recurrence time of j, and more generally let µ_ij = E[min{n : X_n = j, n ≥ 1} | X_0 = i]. A recurrent state j is said to be positive recurrent if µ_jj < ∞ and null recurrent if µ_jj = ∞.

Let N_j(t) denote the number of transitions into j by time t. If i ↔ j and j is recurrent, then by the strong law of large numbers

    P{lim_{t→∞} N_j(t)/t = 1/µ_jj | X_0 = i} = 1,

and consequently

    lim_{n→∞} (1/n) ∑_{k=1}^n P^k_ij = 1/µ_jj.

If, in addition, j is aperiodic, then

    lim_{n→∞} P^n_ij = 1/µ_jj,

while if j has period d, then

    lim_{n→∞} P^{nd}_jj = d/µ_jj.

In particular, a recurrent state j is positive recurrent if, and only if, lim_{n→∞} P^{nd(j)}_jj > 0. Positive (and null) recurrence are also class properties.
Stationary Distributions

A probability distribution {P_j, j ≥ 0} is said to be stationary for the Markov chain if

    P_j = ∑_{i=0}^∞ P_i P_ij, j ≥ 0, with ∑_{j=0}^∞ P_j = 1.

If the initial state is chosen according to a stationary distribution, then X_n has that same distribution for every n.

Theorem. An irreducible, aperiodic Markov chain belongs to one of the following two classes:
(i) Either the states are all transient or all null recurrent; in this case P^n_ij → 0 as n → ∞ for all i, j and there exists no stationary distribution.
(ii) Or else all states are positive recurrent; in this case π_j = lim_{n→∞} P^n_ij > 0, and {π_j, j ≥ 0} is the unique stationary distribution.

Proof sketch. In case (ii), write π_j = lim_n P^n_ij. For any M, ∑_{j=0}^M P^n_ij ≤ 1, and letting n → ∞ gives ∑_{j=0}^M π_j ≤ 1 for all M, so ∑_j π_j ≤ 1. By the Chapman-Kolmogorov equations,

    P^{n+1}_ij = ∑_{k=0}^∞ P^n_ik P_kj ≥ ∑_{k=0}^M P^n_ik P_kj,

and letting n → ∞ and then M → ∞ yields π_j ≥ ∑_{k=0}^∞ π_k P_kj. If this inequality were strict for some j, then summing over j would give ∑_j π_j > ∑_k π_k ∑_j P_kj = ∑_k π_k, a contradiction. Hence π_j = ∑_{k=0}^∞ π_k P_kj for all j, and, dividing by ∑_{k=0}^∞ π_k, we see that {P_j = π_j / ∑_k π_k, j = 0, 1, 2, ...} is a stationary distribution.

To prove uniqueness, let {P_j} be any stationary distribution. Then, for every n,

    P_j = ∑_{i=0}^∞ P^n_ij P_i ≥ ∑_{i=0}^M P^n_ij P_i,

and letting n → ∞ and then M → ∞ gives P_j ≥ π_j. In the other direction, use P^n_ij ≤ 1 to obtain

    P_j = ∑_{i=0}^M P^n_ij P_i + ∑_{i=M+1}^∞ P^n_ij P_i ≤ ∑_{i=0}^M P^n_ij P_i + ∑_{i=M+1}^∞ P_i.

Since ∑_{i=0}^∞ P_i = 1, we obtain upon letting n → ∞ and then M → ∞ that P_j ≤ π_j. Hence P_j = π_j, which shows in particular that ∑_j π_j = 1, so {π_j} is itself the (unique) stationary distribution.

In case (i), if a stationary distribution {P_j} existed, then P_j = ∑_i P^n_ij P_i would, upon letting n → ∞, yield P_j = 0 for all j (since P^n_ij → 0), which is a contradiction.
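For a finite irreducible aperiodic chain the limits π_j can be approximated by simply iterating π ← πP. A minimal sketch on a hypothetical 3-state chain (the matrix is illustrative, not from the slides):

```python
# Minimal sketch: approximating the stationary distribution by power
# iteration, pi <- pi P, which converges for an irreducible aperiodic
# positive-recurrent chain.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

pi = [1.0, 0.0, 0.0]  # any initial distribution works
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# Check stationarity: pi P = pi, and pi sums to 1.
pi_next = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
assert all(abs(pi[j] - pi_next[j]) < 1e-12 for j in range(3))
assert abs(sum(pi) - 1.0) < 1e-9
```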
Remark. When the chain is irreducible and positive recurrent but periodic with period d, the limits lim_n P^n_ij need not exist; instead

    lim_{n→∞} P^{nd}_jj = d/µ_jj,

and the stationary probabilities are still uniquely given by π_j = 1/µ_jj, interpreted as the long-run proportion of time the process spends in state j.
Example (A queueing chain). Let a_j denote the probability of j arrivals during a service period, and let X_n be the number of customers present as seen by the n-th departure. Then {X_n} is a Markov chain with

    P_{0j} = a_j, P_{ij} = a_{j−i+1} for i ≥ 1, j ≥ i − 1,

and the stationary equations read

    π_j = π_0 a_j + ∑_{i=1}^{j+1} π_i a_{j−i+1}, j ≥ 0.

Let ρ = ∑_j j a_j. Since ρ equals the mean number of arrivals during a service period, we expect the queue to be stable exactly when ρ < 1. To verify this, introduce the generating functions

    π(s) = ∑_{j=0}^∞ π_j s^j, A(s) = ∑_{j=0}^∞ a_j s^j.

Multiplying the stationary equations by s^j and summing over j gives

    π(s) = π_0 A(s) + (π(s) − π_0) A(s)/s,

and hence

    π(s) = π_0 (s − 1) A(s) / (s − A(s)).

Since lim_{s→1} A(s) = ∑_{j=0}^∞ a_j = 1 and lim_{s→1} A′(s) = ρ, L'Hôpital's rule yields

    lim_{s→1} π(s) = lim_{s→1} π_0 (A(s) + (s − 1)A′(s)) / (1 − A′(s)) = π_0 / (1 − ρ).

Since lim_{s→1} π(s) = ∑_{i=0}^∞ π_i, this implies that ∑_{i=0}^∞ π_i = π_0/(1 − ρ); thus stationary probabilities exist if and only if ρ < 1, in which case normalization forces π_0 = 1 − ρ.
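The stationary equations of this chain can be solved recursively once the a_j are specified. A minimal sketch, assuming Poisson-distributed arrivals with mean ρ = 0.5 (a hypothetical choice, not fixed by the slides), checking that ∑ π_i = π_0/(1 − ρ) = 1 when π_0 = 1 − ρ:

```python
# Minimal sketch: solving pi_j = pi0*a_j + sum_{i=1}^{j+1} pi_i a_{j-i+1}
# forward for pi_{j+1}, with Poisson(rho) arrival counts a_j.
from math import exp, factorial

rho = 0.5
a = [exp(-rho) * rho**j / factorial(j) for j in range(40)]  # a_j

pi0 = 1 - rho
pi = [pi0]
# Rearranging the j-th stationary equation gives
#   pi_{j+1} = (pi_j - pi0*a_j - sum_{i=1}^{j} pi_i a_{j-i+1}) / a_0
for j in range(20):
    s = sum(pi[i] * a[j - i + 1] for i in range(1, j + 1))
    pi.append((pi[j] - pi0 * a[j] - s) / a[0])

assert all(p >= 0 for p in pi)
assert abs(sum(pi) - 1.0) < 1e-6  # sum pi_i = pi0 / (1 - rho) = 1
```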
Transitions Among Classes

If j is a recurrent state and j does not communicate with i, then P^n_ji = 0 for all n. Hence if the process starts in state i, there is a positive probability that it enters a recurrent class from which it can never return to i.

Let T denote the set of transient states and let R be a recurrent class. For i ∈ T, let f_i denote the probability that the process ever enters R given that it starts in i. Conditioning on the first transition,

    f_i = ∑_{j∈R} P_ij + ∑_{k∈T} P_ik f_k,

since from a recurrent state k ∉ T that is not in R the process can never enter R. These linear equations determine the absorption probabilities f_i.
Example (The Gambler's Ruin Problem). Consider a gambler who at each play of the game either wins one unit with probability p or loses one unit with probability q = 1 − p, and who quits playing either when broke or upon attaining a fortune of N. The gambler's fortune is a Markov chain on {0, 1, ..., N} with absorbing states 0 and N and transition probabilities P_{i,i+1} = p = 1 − P_{i,i−1} for 0 < i < N.

Let P_i denote the probability that, starting with fortune i, the gambler's fortune reaches N before 0. Conditioning on the outcome of the first play gives P_i = p P_{i+1} + q P_{i−1}, and solving this recursion with the boundary conditions P_0 = 0 and P_N = 1 yields

    P_i = (1 − (q/p)^i) / (1 − (q/p)^N), if p ≠ 1/2,
    P_i = i/N, if p = 1/2.

Letting N → ∞,

    P_i → 1 − (q/p)^i, if p > 1/2,
    P_i → 0, if p ≤ 1/2.

Thus, if p > 1/2, there is a positive probability that the gambler's fortune will increase indefinitely; while if p ≤ 1/2, then, with probability 1, the gambler will eventually go broke when playing against an infinitely rich adversary.
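The closed-form solution can be verified against the first-step equations directly. A minimal sketch, with p = 0.6 and N = 10 as illustrative parameters:

```python
# Minimal sketch: checking P_i = (1-(q/p)^i)/(1-(q/p)^N) against the
# first-step equations P_i = p*P_{i+1} + q*P_{i-1}, P_0 = 0, P_N = 1.
p, N = 0.6, 10
q = 1 - p
r = q / p

P = [(1 - r**i) / (1 - r**N) for i in range(N + 1)]

assert P[0] == 0.0 and abs(P[N] - 1.0) < 1e-12
for i in range(1, N):
    assert abs(P[i] - (p * P[i + 1] + q * P[i - 1])) < 1e-12
```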
Time Reversible Markov Chains

Consider a stationary ergodic Markov chain with transition probabilities P_ij and stationary probabilities π_i, and look at the sequence of states traced backward in time: X_n, X_{n−1}, X_{n−2}, .... The reversed sequence is itself a Markov chain, with transition probabilities Q_ij defined by

    Q_ij = P{X_m = j | X_{m+1} = i} = π_j P_ji / π_i.

If Q_ij = P_ij for all i, j, then the Markov chain is said to be time reversible. Thus the chain is time reversible if, and only if,

    π_i P_ij = π_j P_ji for all i, j,

that is, the rate at which the process goes directly from i to j equals the rate at which it goes directly from j to i.

Proposition. If one can find nonnegative numbers x_i, summing to 1, that satisfy x_i P_ij = x_j P_ji for all i, j, then the chain is time reversible and x_i = π_i for all i. (Indeed, summing x_i P_ij = x_j P_ji over i gives ∑_i x_i P_ij = x_j ∑_i P_ji = x_j, so {x_i} is stationary.)

More generally, suppose one can find a probability distribution {π_i} and a transition probability matrix Q = [Q_ij] such that

    π_i P_ij = π_j Q_ji for all i, j.

Then the Q_ij are the transition probabilities of the reversed chain and the π_i are the stationary probabilities of both chains: summing π_i P_ij = π_j Q_ji over i shows that {π_i} is stationary, and then Q_ji = π_i P_ij / π_j agrees with the definition of the reversed chain. One can therefore guess at the reversed chain and use π_i P_ij = π_j Q_ji to obtain both the stationary probabilities and the reversed transition probabilities at once.
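Birth-death chains are the standard examples of time-reversible chains. A minimal sketch, assuming a hypothetical 5-state birth-death chain (the up/down probabilities are illustrative): the stationary distribution is built from the detailed-balance relations π_i p_i = π_{i+1} q_{i+1}, and then π_i P_ij = π_j P_ji is checked on every edge.

```python
# Minimal sketch: time reversibility (detailed balance) for a small
# hypothetical birth-death chain on states 0..4.
n = 5
p = [0.4] * n  # up-probabilities   p_i = P_{i,i+1} (p_{n-1} unused)
q = [0.6] * n  # down-probabilities q_i = P_{i,i-1} (q_0 unused)

# Build pi recursively from pi_{i+1} = pi_i * p_i / q_{i+1}, then normalize.
pi = [1.0]
for i in range(n - 1):
    pi.append(pi[i] * p[i] / q[i + 1])
total = sum(pi)
pi = [x / total for x in pi]

# Detailed balance pi_i P_{i,i+1} = pi_{i+1} P_{i+1,i} holds on every edge,
# so the chain is time reversible.
for i in range(n - 1):
    assert abs(pi[i] * p[i] - pi[i + 1] * q[i + 1]) < 1e-12
```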
Semi-Markov Processes

A semi-Markov process changes state in accordance with a Markov chain but takes a random amount of time between changes. Specifically, whenever it enters state i, it next enters state j with probability P_ij, and, given that the next state is j, the time until the transition from i to j occurs has distribution F_ij. Let Z(t) denote the state at time t, let

    µ_i = ∑_j P_ij ∫_0^∞ x dF_ij(x)

denote the mean time spent in state i per visit, and let {π_i} be the stationary probabilities of the embedded Markov chain {X_n}.

Theorem. If the semi-Markov process is irreducible and the time between successive returns to some state has finite mean and a non-lattice distribution, then

    P_i = lim_{t→∞} P{Z(t) = i | Z(0) = j} = π_i µ_i / ∑_j π_j µ_j,

independent of the initial state.

Heuristically, let N_i(m) denote the number of visits to state i among the first m transitions of the embedded chain. The process spends a mean time µ_i in state i on each such visit, so the fraction of time spent in state i over the first m transitions is approximately

    N_i(m) µ_i / ∑_j N_j(m) µ_j = (N_i(m)/m) µ_i / ∑_j (N_j(m)/m) µ_j,

which converges to π_i µ_i / ∑_j π_j µ_j as m → ∞, since N_i(m)/m → π_i.

Example. Consider a machine that can be in one of three states: good condition (state 1), fair condition (state 2), or broken down (state 3). A machine in good condition remains so for a mean time µ_1 and then goes to the fair condition or breaks down with respective probabilities 3/4 and 1/4. A machine in fair condition remains so for a mean time µ_2 and then breaks down. A broken machine is repaired, which takes a mean time µ_3; when repaired it is in good condition with probability 2/3 and the fair condition with probability 1/3. The embedded chain has transition probabilities

    P_12 = 3/4, P_13 = 1/4, P_23 = 1, P_31 = 2/3, P_32 = 1/3,

and solving π = πP with ∑ π_i = 1 gives π_1 = 4/15, π_2 = 1/3, π_3 = 2/5. Hence the long-run proportions of time in the three states are

    P_i = π_i µ_i / (π_1 µ_1 + π_2 µ_2 + π_3 µ_3), i = 1, 2, 3.
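The machine example can be checked with exact rational arithmetic. A minimal sketch; the numerical sojourn means µ below are hypothetical, since the slides leave them symbolic:

```python
# Minimal sketch: stationary probabilities of the embedded chain for the
# three-state machine example, and the semi-Markov limiting probabilities
# P_i = pi_i*mu_i / sum_j pi_j*mu_j.
from fractions import Fraction as F

# Embedded-chain transition matrix (good, fair, broken).
P = [[F(0),    F(3, 4), F(1, 4)],
     [F(0),    F(0),    F(1)],
     [F(2, 3), F(1, 3), F(0)]]

# Solve pi = pi P using the chain's structure:
#   pi_1 = (2/3) pi_3,  pi_2 = (3/4) pi_1 + (1/3) pi_3.
pi3 = F(1)
pi1 = F(2, 3) * pi3
pi2 = F(3, 4) * pi1 + F(1, 3) * pi3
total = pi1 + pi2 + pi3
pi = [pi1 / total, pi2 / total, pi3 / total]
assert pi == [F(4, 15), F(1, 3), F(2, 5)]

# Check stationarity against the full matrix.
for j in range(3):
    assert sum(pi[i] * P[i][j] for i in range(3)) == pi[j]

# Long-run fraction of real time in each state, for illustrative mu's.
mu = [F(5), F(2), F(1)]
denom = sum(p * m for p, m in zip(pi, mu))
P_time = [p * m / denom for p, m in zip(pi, mu)]
assert sum(P_time) == 1
```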
Let Y(t) denote the time from t until the next transition of the semi-Markov process, and let S(t) denote the state entered at that transition. Consider the joint limit

    lim_{t→∞} P{Z(t) = i, Y(t) > x, S(t) = j}.

Theorem. If the semi-Markov process is irreducible and non-lattice, then

    lim_{t→∞} P{Z(t) = i, Y(t) > x, S(t) = j | Z(0) = k} = (P_ij / µ_ii) ∫_x^∞ F̄_ij(y) dy,

where F̄_ij = 1 − F_ij and µ_ii is the mean time between successive visits to state i. The proof follows from the theory of alternating renewal processes: a cycle begins each time the process enters state i, and the expected amount of time per cycle that the process is in state i, headed for j, with more than x time units remaining until the transition, is P_ij ∫_x^∞ F̄_ij(y) dy; dividing by the expected cycle length µ_ii gives the long-run proportion. Summing over j yields

    lim_{t→∞} P{Z(t) = i, Y(t) > x | Z(0) = k} = (1/µ_ii) ∑_j P_ij ∫_x^∞ F̄_ij(y) dy.