

  1. Advanced Algorithms (XII) Shanghai Jiao Tong University Chihao Zhang May 25, 2020

  2. Random Walk on a Graph

  3. Random Walk on a Graph [figure: a weighted graph on four vertices, with a transition probability on each edge]

  4. The transition matrix P = [p_ij]_{1 ≤ i, j ≤ n} (the 4 × 4 matrix of the edge probabilities shown in the figure)

  5. p_ij = Pr[X_{t+1} = j | X_t = i]

  6. ∀t ≥ 0, μ_t^T = μ_0^T P^t

  7. Stationary distribution π: π^T P = π^T
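As a minimal sketch of the evolution μ_t^T = μ_0^T P^t and of the stationarity check π^T P = π^T, in pure Python; the 2 × 2 matrix below is an assumed illustrative example, not the four-vertex graph from the figure:

```python
# Distribution dynamics of a Markov chain: mu_t^T = mu_0^T P^t.
# P is an assumed 2-state example; rows sum to 1, P[i][j] = Pr[X_{t+1}=j | X_t=i].

def step(mu, P):
    """One step of mu^T := mu^T P (row vector times matrix)."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

def evolve(mu0, P, t):
    """Return mu_t^T = mu_0^T P^t."""
    mu = list(mu0)
    for _ in range(t):
        mu = step(mu, P)
    return mu

P = [[0.5, 0.5],
     [0.25, 0.75]]
pi = [1/3, 2/3]   # satisfies pi^T P = pi^T for this P

print(evolve([1.0, 0.0], P, 50))   # approaches pi
print(step(pi, P))                 # stays at pi (stationarity)
```

Starting from the point mass on state 1, the distribution is already numerically indistinguishable from π after a few dozen steps.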

  8. Fundamental Theorem of Markov Chains

  9. We study a few basic questions regarding a chain:

  10. • Does a stationary distribution always exist?

  11. • If so, is the stationary distribution unique?

  12. • If so, does every initial distribution converge to it?

  13. Existence of Stationary Distribution

  14. Yes, every finite Markov chain has a stationary distribution

  15. Perron–Frobenius: any positive n × n matrix A has a positive real eigenvalue λ with ρ(A) = λ; moreover, the corresponding eigenvector is positive

  16. Applied to the transition matrix: λ(P^T) = λ(P) = 1

  17. The positive eigenvector of P^T is π
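Since π is the Perron eigenvector of P^T for eigenvalue 1, it can be found by power iteration; a sketch on an assumed 3-state example chain:

```python
# Power iteration on pi^T := pi^T P converges to the Perron eigenvector
# of P^T (eigenvalue 1), i.e. the stationary distribution, when the
# chain is irreducible and aperiodic. Example chain below is assumed.

def stationary_by_power_iteration(P, iters=1000):
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        s = sum(pi)
        pi = [x / s for x in pi]   # renormalize (for stochastic P the sum stays 1 anyway)
    return pi

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = stationary_by_power_iteration(P)
print(pi)   # approximately (1/4, 1/2, 1/4) for this chain
```

For this birth-death example one can verify the limit by detailed balance: π(1) · 0.5 = π(2) · 0.25 and π(2) · 0.25 = π(3) · 0.5 force π = (1/4, 1/2, 1/4).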

  18. Uniqueness and Convergence

  19. [figure: a two-state chain; state 1 moves to state 2 with probability p and stays with probability 1 − p; state 2 moves to state 1 with probability q and stays with probability 1 − q]

  20. P = [[1 − p, p], [q, 1 − q]]

  21. π^T = (q/(p + q), p/(p + q)) is a stationary distribution of P

  22. Start from an arbitrary μ_0^T = (μ_0(1), μ_0(2))

  23. Compute ‖μ_0^T P^t − π^T‖

  24. Let Δ_t = |μ_t(1) − π(1)|

  25. Δ_{t+1} = |μ_{t+1}(1) − q/(p + q)| = |μ_t(1)(1 − p) + (1 − μ_t(1))q − q/(p + q)| = |(1 − p − q)μ_t(1) − (1 − p − q) · q/(p + q)| = |1 − p − q| ⋅ Δ_t

  26. Since p, q ∈ [0, 1], there are only two ways to prevent Δ_t → 0: p = q = 0 or p = q = 1
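The contraction Δ_{t+1} = |1 − p − q| · Δ_t can be checked numerically; the values p = 0.3, q = 0.6 below are arbitrary illustrative choices:

```python
# Numerical check of Delta_{t+1} = |1 - p - q| * Delta_t for the
# two-state chain, with assumed example parameters p and q.

p, q = 0.3, 0.6
pi1 = q / (p + q)                     # pi(1) = q / (p + q)

mu1 = 0.9                             # arbitrary mu_0(1)
for t in range(5):
    delta = abs(mu1 - pi1)
    mu1_next = mu1 * (1 - p) + (1 - mu1) * q   # one step of the chain
    delta_next = abs(mu1_next - pi1)
    assert abs(delta_next - abs(1 - p - q) * delta) < 1e-12
    mu1 = mu1_next
print("contraction factor:", abs(1 - p - q))
```

Each step shrinks the distance to π(1) by exactly the factor |1 − p − q| = 0.1 here, so Δ_t → 0 geometrically.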

  27. Case p = q = 0

  28. [figure: two states, each with a self-loop of probability 1]

  29. ∀t, Δ_t = Δ_0

  30. The graph is disconnected

  31. The chain is called reducible

  32. In this case, the stationary distribution is not unique

  33. The chain is a convex combination of small chains

  34. The stationary distributions are the convex combinations of the "small" chains' stationary distributions
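A tiny sketch of the non-uniqueness: with p = q = 0 the transition matrix is the identity, so every convex combination of the two point masses is stationary.

```python
# For p = q = 0 the two-state chain never moves: P is the identity,
# and every distribution (lam, 1 - lam) satisfies pi^T P = pi^T.

P = [[1.0, 0.0],
     [0.0, 1.0]]           # p = q = 0: each state loops to itself

def is_stationary(pi, P, eps=1e-12):
    n = len(P)
    piP = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return all(abs(piP[j] - pi[j]) < eps for j in range(n))

for lam in [0.0, 0.3, 0.5, 1.0]:
    pi = [lam, 1 - lam]     # convex combination of the two small chains' distributions
    print(lam, is_stationary(pi, P))   # True for every lam
```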

  35. Case p = q = 1

  36. [figure: two states, each moving to the other with probability 1]

  37. ∀t, Δ_t = −Δ_{t−1} (taking the signed version Δ_t = μ_t(1) − π(1))

  38. The graph is bipartite

  39. The chain is called periodic

  40. Formally, ∃v with gcd_{C ∈ C_v} |C| > 1, where C_v is the set of cycles through v

  41. In this case, not every initial distribution converges to the stationary distribution
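The oscillation is easy to see numerically; μ_0(1) = 0.8 below is an arbitrary starting value:

```python
# For p = q = 1 the two states swap deterministically each step, so the
# signed error Delta_t = mu_t(1) - pi(1) flips sign and never decays.

pi1 = 0.5                       # pi = (1/2, 1/2) when p = q = 1
mu1 = 0.8                       # arbitrary initial mu_0(1)
deltas = []
for t in range(6):
    deltas.append(mu1 - pi1)
    mu1 = 1 - mu1               # p = q = 1: the two probabilities swap
print(deltas)                   # alternates +0.3, -0.3, +0.3, ...
```

Unless μ_0 = π exactly, the distribution cycles forever instead of converging.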

  42. Fundamental Theorem of Markov Chains

  43. If a finite chain P is irreducible and aperiodic, then it has a unique stationary distribution π. Moreover, for any initial distribution μ, it holds that lim_{t→∞} μ^T P^t = π^T

  44. (Shown on the board; see the notes for details)
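A quick numerical illustration of the theorem on an assumed 3-state chain that is irreducible (states cycle 0 → 1 → 2 → 0) and aperiodic (it has self-loops): different starting distributions are driven to the same limit.

```python
# For an irreducible, aperiodic chain, every initial distribution
# converges to the same stationary distribution. P below is an
# assumed example chain.

def evolve(mu, P, t):
    n = len(P)
    for _ in range(t):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return mu

P = [[0.2, 0.8, 0.0],
     [0.3, 0.3, 0.4],
     [0.5, 0.0, 0.5]]

for mu0 in ([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1/3, 1/3, 1/3]):
    print([round(x, 6) for x in evolve(list(mu0), P, 200)])
# all three printed rows agree: the common limit is pi
```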

  45. Reversible Chains

  46. We study a special family of Markov chains called reversible chains

  47. Their transition graphs are undirected: x → y ⟺ y → x

  48. A chain P and a distribution π satisfy the detailed balance condition if:

  49. ∀x, y ∈ V, π(x) ⋅ P(x, y) = π(y) ⋅ P(y, x)

  50. Then π is a stationary distribution of P
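A standard example is the simple random walk on an undirected graph, which satisfies detailed balance with π(x) proportional to deg(x); the small 4-vertex graph below is an assumed example, not one from the slides.

```python
# Simple random walk on an undirected graph: from x, move to a uniformly
# random neighbor. Detailed balance holds with pi(x) = deg(x) / (2|E|),
# and detailed balance implies pi^T P = pi^T.

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}   # assumed example graph
nodes = sorted(adj)
deg = {v: len(adj[v]) for v in nodes}
total = sum(deg.values())                  # = 2 * |E|
pi = {v: deg[v] / total for v in nodes}

def P(x, y):
    return 1 / deg[x] if y in adj[x] else 0.0

# detailed balance: pi(x) P(x, y) = pi(y) P(y, x) for all pairs
for x in nodes:
    for y in nodes:
        assert abs(pi[x] * P(x, y) - pi[y] * P(y, x)) < 1e-12

# hence pi is stationary: sum_x pi(x) P(x, y) = pi(y)
for y in nodes:
    assert abs(sum(pi[x] * P(x, y) for x in nodes) - pi[y]) < 1e-12
print("detailed balance and stationarity verified")
```

Both sides of detailed balance equal 1/(2|E|) on every edge, which is why summing over x recovers stationarity.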

  51. We study reversible chains because:

  52. • They are quite general: for any π, one can define a reversible P whose stationary distribution is π

  53. • This is helpful for sampling

  54. • We have powerful tools (the spectral method) to analyze reversible chains
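One standard way to build such a reversible P for a given π is the Metropolis construction; a sketch, where the 4-point target and the "uniform neighbor on a cycle" proposal are assumed illustrative choices:

```python
# Metropolis chain: propose a move from a symmetric proposal, accept
# with probability min(1, pi(y)/pi(x)). The resulting chain satisfies
# detailed balance with pi, so pi is its stationary distribution.

import random

pi = [0.1, 0.2, 0.3, 0.4]        # assumed target distribution
n = len(pi)

def metropolis_step(x):
    y = (x + random.choice([-1, 1])) % n          # symmetric proposal on a cycle
    if random.random() < min(1.0, pi[y] / pi[x]):
        return y                                   # accept
    return x                                       # reject, stay put

random.seed(0)
counts = [0] * n
x = 0
for _ in range(200_000):
    x = metropolis_step(x)
    counts[x] += 1
print([c / 200_000 for c in counts])   # empirical frequencies approach pi
```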

  55. Spectral Decomposition Theorem

  56. An n × n symmetric matrix A has n real eigenvalues λ_1, …, λ_n with corresponding eigenvectors v_1, …, v_n, which are orthogonal. Moreover, A = V Λ V^T

  57. where V = [v_1, …, v_n] and Λ = diag(λ_1, …, λ_n)

  58. Equivalently, A = ∑_{i=1}^n λ_i v_i v_i^T
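The rank-one form A = ∑_i λ_i v_i v_i^T can be checked directly on a small symmetric matrix whose eigenpairs are known in closed form (the 2 × 2 example below is an assumed illustration):

```python
# Rebuild a symmetric matrix from its spectral decomposition.
# A = [[2,1],[1,2]] has eigenvalues 3 and 1 with orthonormal
# eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2).

import math

A = [[2.0, 1.0],
     [1.0, 2.0]]
s = 1 / math.sqrt(2)
eigs = [(3.0, [s, s]),            # lambda_1 = 3, v_1 = (1, 1)/sqrt(2)
        (1.0, [s, -s])]           # lambda_2 = 1, v_2 = (1, -1)/sqrt(2)

# sum of rank-one terms lambda_i * v_i v_i^T
B = [[sum(lam * v[i] * v[j] for lam, v in eigs) for j in range(2)]
     for i in range(2)]
print(B)    # matches A entrywise

# the eigenvectors are orthogonal
assert abs(sum(eigs[0][1][k] * eigs[1][1][k] for k in range(2))) < 1e-12
```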

  59. Spectral Decomposition Theorem for Reversible Chains

  60. Let π be a stationary distribution of a reversible chain P

  61. Define an inner product ⟨⋅, ⋅⟩_π on ℝ^n:
