MATH 20: PROBABILITY
Markov Chain Xingru Chen xingru.chen.gr@dartmouth.edu
XC 2020
§ A random walk is a mathematical object, known as a stochastic or random process, that describes a path consisting of a succession of random steps on some mathematical space, such as the integers.
[Diagram: a random walk on the integers 1, 2, 3, 4, 5]
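As an illustration (a sketch, not part of the original slides; the function name and step choice are mine), a simple symmetric walk on the integers can be simulated in a few lines:

```python
import random

def random_walk(n_steps, start=0, seed=0):
    """Simulate a simple symmetric random walk on the integers."""
    rng = random.Random(seed)
    position = start
    path = [position]
    for _ in range(n_steps):
        position += rng.choice([-1, 1])  # one random step, left or right
        path.append(position)
    return path

print(random_walk(10))
```

Each successive position differs from the previous one by exactly 1, which is the "succession of random steps" in the definition.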
§ We describe a Markov chain as follows: We have a set of states, S = {s_1, s_2, ..., s_r}.
§ The process starts in one of these states and moves successively from one state to another. Each move is called a step.

[Diagram: states s_1, s_2, s_3, s_4, s_5]
§ If the chain is currently in state s_i, then it moves to state s_j at the next step with a probability denoted by p_ij.
§ The probability p_ij does not depend upon which states the chain was in before the current state.
§ These probabilities are called transition probabilities.

[Diagram: transitions among s_1, ..., s_5 labeled with probabilities p_ij]
§ The process can also remain in the state it is in, and this happens with probability p_ii.

[Diagram: self-loops at s_1 and s_5 labeled p_11 and p_55]
§ An initial probability distribution, defined on S, specifies the starting state. Usually this is done by specifying a particular state as the starting state.

[Diagram: starting probability v_i attached to each state s_i]

v = (v_1, v_2, v_3, v_4, v_5), with Σ_{i=1}^{5} v_i = 1
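The setup so far can be sketched in code (a sketch under my own naming, not the course's): draw the starting state from v, then repeatedly draw the next state from the current state's row of transition probabilities.

```python
import random

def sample_index(probs, rng):
    """Draw an index i with probability probs[i]."""
    u, cumulative = rng.random(), 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if u < cumulative:
            return i
    return len(probs) - 1  # guard against floating-point rounding

def simulate_chain(v, P, n_steps, seed=0):
    """Pick the starting state from the initial distribution v, then take
    n_steps; row i of P holds the transition probabilities out of state i."""
    rng = random.Random(seed)
    state = sample_index(v, rng)
    history = [state]
    for _ in range(n_steps):
        state = sample_index(P[state], rng)
        history.append(state)
    return history
```

For example, `simulate_chain([1.0, 0.0], [[0.9, 0.1], [0.5, 0.5]], 5)` starts in state 0 for certain, because v puts all its mass there.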
§ The Land of Oz is blessed by many things, but not by good weather.
§ They never have two nice days in a row. If they have a nice day, they are just as likely to have snow as rain the next day.
§ If they have snow or rain, they have an even chance of having the same the next day.
§ If there is change from snow or rain, only half of the time is this a change to a nice day.
[Diagram: weather transition graph built up rule by rule, with edges labeled 1/2 and 1/4]
§ States:
§ s_1: rain
§ s_2: nice
§ s_3: snow

§ P =
  [ 1/2  1/4  1/4 ]
  [ 1/2   0   1/2 ]
  [ 1/4  1/4  1/2 ]

§ The entries in the first row of the matrix P in the example represent the probabilities for the various kinds of weather following a rainy day.
§ Similarly, the entries in the second and third rows represent the probabilities for the various kinds of weather following nice and snowy days, respectively.
§ Such a square array is called the matrix of transition probabilities, or the transition matrix.
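The matrix can be written down directly; a quick check (a sketch, using 0-based state indices rain = 0, nice = 1, snow = 2) confirms that every row is a probability distribution:

```python
# Land of Oz transition matrix; row i gives tomorrow's weather
# distribution after a day of type i (0 = rain, 1 = nice, 2 = snow).
P = [
    [1/2, 1/4, 1/4],  # after rain
    [1/2, 0.0, 1/2],  # after a nice day: never two nice days in a row
    [1/4, 1/4, 1/2],  # after snow
]

# Each row must sum to 1: the chain goes *somewhere* tomorrow.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)
```

The zero in the middle of the matrix encodes the first rule of the story: a nice day is never followed by another nice day.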
§ Write p_ij^(1) = p_ij for the probability that, given the chain is in state s_i today, it will be in state s_j tomorrow.
§ What is p_ij^(2), the probability that, given the chain is in state s_i today, it will be in state s_j the day after tomorrow?
[Diagram: Day 0 → Day 1 → Day 2, the two-step paths from s_1 to s_3 through each Day-1 state]

§ To go from s_1 (rain) to s_3 (snow) in two days, the chain must pass through one of the three states on Day 1:

p_13^(2) = p_11 p_13 + p_12 p_23 + p_13 p_33 = Σ_{k=1}^{3} p_1k p_k3

§ This sum is exactly the (1, 3) entry of the matrix product P · P = P^2.
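The two-step computation can be checked numerically (a sketch with 0-based indices, so the entry p_13^(2) becomes `[0][2]`):

```python
P = [
    [1/2, 1/4, 1/4],
    [1/2, 0.0, 1/2],
    [1/4, 1/4, 1/2],
]

# Sum over the intermediate Day-1 state k of p_1k * p_k3.
p13_2 = sum(P[0][k] * P[k][2] for k in range(3))

# The same number appears as the (1, 3) entry of P * P.
P2 = [[sum(P[i][k] * P[k][j] for k in range(3)) for j in range(3)]
      for i in range(3)]

print(p13_2, P2[0][2])  # both equal 0.375
```

So the chance that a rainy day is followed by snow two days later is 3/8.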
§ For a chain with r states, the same argument gives

p_13^(2) = p_11 p_13 + p_12 p_23 + ⋯ + p_1r p_r3 = Σ_{k=1}^{r} p_1k p_k3

§ Iterating over Day 1, Day 2, ..., Day n: the (1, 3) entry of P^n is p_13^(n).
§ Let P be the transition matrix of a Markov chain.
§ The ijth entry p_ij^(n) of the matrix P^n gives the probability that the Markov chain, starting in state s_i, will be in state s_j after n steps.
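With this theorem, the n-step probabilities for the Land of Oz chain are just entries of a matrix power (a sketch using NumPy; the exponent 6 is an arbitrary choice):

```python
import numpy as np

P = np.array([
    [1/2, 1/4, 1/4],
    [1/2, 0.0, 1/2],
    [1/4, 1/4, 1/2],
])

# Entry (i, j) of P**n is p_ij^(n), the n-step transition probability.
P6 = np.linalg.matrix_power(P, 6)
print(P6.round(4))
```

For large n every row approaches (0.4, 0.2, 0.4): no matter today's weather, the long-run forecast is the same.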
§ Starting probabilities:
§ rain: v_1
§ nice: v_2
§ snow: v_3

v = (v_1, v_2, v_3)

§ Transition matrix:
P = [ p_11 p_12 p_13 ; p_21 p_22 p_23 ; p_31 p_32 p_33 ] =
  [ 1/2  1/4  1/4 ]
  [ 1/2   0   1/2 ]
  [ 1/4  1/4  1/2 ]

§ The probability that the chain is in state s_j after n steps, for j = 3 and n = 1:

v_3^(1) = v_1 p_13 + v_2 p_23 + v_3 p_33
§ More generally, the probability that the chain is in state s_j after n = 1 step:

v_j^(1) = v_1 p_1j + v_2 p_2j + v_3 p_3j

v^(1) = (v_1^(1), v_2^(1), v_3^(1)) = vP
§ The probability that the chain is in state s_j after n = 2 steps:

v_3^(2) = v_1 p_13^(2) + v_2 p_23^(2) + v_3 p_33^(2),

where p_13^(2) = p_11 p_13 + p_12 p_23 + p_13 p_33 = Σ_{k=1}^{3} p_1k p_k3, and similarly for the other two-step entries. In general,

v_j^(2) = v_1 p_1j^(2) + v_2 p_2j^(2) + v_3 p_3j^(2)

v^(2) = (v_1^(2), v_2^(2), v_3^(2)) = vP^2
§ Let P be the transition matrix of a Markov chain, and let v be the probability vector which represents the starting distribution.
§ Then the probability that the chain is in state s_i after n steps is the ith entry in the vector

v^(n) = vP^n

§ For the Land of Oz example:

v^(n) = v [ 1/2 1/4 1/4 ; 1/2 0 1/2 ; 1/4 1/4 1/2 ]^n = (v_1^(n), v_2^(n), v_3^(n))
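This theorem translates directly into code (a sketch: here the chain is started on a nice day, so v = (0, 1, 0)):

```python
import numpy as np

P = np.array([
    [1/2, 1/4, 1/4],
    [1/2, 0.0, 1/2],
    [1/4, 1/4, 1/2],
])

v = np.array([0.0, 1.0, 0.0])  # start on a nice day for certain

# v P^n is the distribution over (rain, nice, snow) after n days.
for n in (1, 2, 3):
    print(n, v @ np.linalg.matrix_power(P, n))
```

After one day the distribution is (0.5, 0, 0.5), exactly the "nice" row of P, as it should be.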
Final
§ Open book
§ Scope: mostly Chapters 7, 8, 9, 10, and 11
§ sums of random variables
§ LLN and CLT
§ generating functions
§ Markov chains
§ Materials: slides, homework, quizzes, textbook
§ Date & time: 12:00 pm to 10:00 pm, August 30
§ Office hours: August 27, 28
§ Homework due: 11:00 pm, August 28

[Calendar: late August through early September, with the exam dates highlighted]