Scheidegger Networks: A Bonus First Return Random Walk Calculation (Frame 1/11)
Complex Networks, Course 295A, Spring 2008
Prof. Peter Dodds, Department of Mathematics & Statistics, University of Vermont
Licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.

Outline (Frame 2/11)
◮ First return random walk
◮ References

Random walks (Frame 3/11)
◮ We've seen that Scheidegger networks have random walk boundaries [1, 2].
◮ Determining the expected shape of a 'basin' becomes a problem of finding the probability that a 1-d random walk returns to the origin after $t$ time steps.
◮ We solved this with a counting argument for the discrete random walk in the preceding Complex Systems course (a numerical sketch of that result follows this frame).
◮ For fun and the constitution, let's work on the continuous-time Wiener process version.
◮ A classic, delightful problem.
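
Aside (not part of the original slides): a minimal Python sketch of the discrete result referred to above. It simulates symmetric ±1 walks and compares the empirical first-return-time probabilities against the standard counting-argument formula $P(\text{first return at } 2n) = \binom{2n}{n} / \big((2n-1)\,4^n\big)$; the walk count and step cutoff are arbitrary choices.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

# Simulate many symmetric +/-1 walks (rows) for a fixed number of steps (columns).
n_walks, max_steps = 10_000, 1_000
steps = rng.choice([-1, 1], size=(n_walks, max_steps))
positions = np.cumsum(steps, axis=1)            # position after 1, 2, ..., max_steps steps

at_origin = positions == 0                      # True wherever a walk sits at the origin
has_returned = at_origin.any(axis=1)            # walks that return within max_steps
first_return = at_origin.argmax(axis=1) + 1     # index of the first True, shifted to a time

print(" 2n   simulated   counting formula")
for n in range(1, 6):
    exact = comb(2 * n, n) / ((2 * n - 1) * 4 ** n)
    simulated = np.mean(has_returned & (first_return == 2 * n))
    print(f"{2 * n:3d}    {simulated:.4f}      {exact:.4f}")
```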

Random walks (Frame 4/11)
[Figure: the Wiener process.]

Random walking on a sphere... (Frame 5/11)
[Figure: the Wiener process.]

Random walks (Frame 6/11)
◮ Wiener process = Brownian motion.
◮ $x(t_2) - x(t_1) \sim N(0, t_2 - t_1)$, where $N(x,t) = \frac{1}{\sqrt{2\pi t}}\, e^{-x^2/2t}$ (checked numerically below).
◮ Continuous but nowhere differentiable.
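
A small numerical check (my addition, with arbitrary grid and sample sizes): build discretized Wiener paths from independent Gaussian increments and confirm that $x(t_2) - x(t_1)$ has mean 0 and variance $t_2 - t_1$.

```python
import numpy as np

rng = np.random.default_rng(1)

dt, n_steps, n_paths = 1e-3, 2_000, 5_000        # arbitrary discretization and sample size

# Each row is one path: x(t) is the cumulative sum of independent N(0, dt) increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
x = np.cumsum(increments, axis=1)

t1, t2 = 0.5, 2.0                                # sample times (t2 <= n_steps * dt)
i1, i2 = round(t1 / dt), round(t2 / dt)
diff = x[:, i2 - 1] - x[:, i1 - 1]

print("sample mean of x(t2) - x(t1):    ", round(diff.mean(), 4))   # expect ~ 0
print("sample variance of x(t2) - x(t1):", round(diff.var(), 4))    # expect ~ t2 - t1 = 1.5
```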

First return (Frame 7/11)
◮ Objective: find $g(t)$, the probability that the Wiener process first returns to the origin at time $t$.
◮ Use what we know: the probability density for a return (not necessarily the first) at time $t$ is $f(t) = \frac{1}{\sqrt{2\pi t}}\, e^{-0/2t} = \frac{1}{\sqrt{2\pi t}}$.
◮ Observe that $f$ and $g$ are connected like this (a discrete-walk check follows this frame): $f(t) = \int_{\tau=0}^{t} f(\tau)\, g(t-\tau)\, \mathrm{d}\tau + \delta(t)$, where $\delta(t)$ is the Dirac delta function.
◮ In words: the probability of returning at time $t$ equals the integral over $\tau$ of the probability of returning at time $\tau$ and then not returning again until exactly $t - \tau$ time units later.
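
The same convolution structure holds exactly for the discrete walk, which makes it easy to check numerically. A sketch (my addition, not from the slides): with return probabilities $f_{2n} = \binom{2n}{n} 4^{-n}$ and first-return probabilities $g_{2n} = f_{2n}/(2n-1)$, the discrete counterpart of the relation above is $f_t = \sum_{k=1}^{t} g_k f_{t-k}$ for $t \ge 1$, with $f_0 = 1$ playing the role of $\delta(t)$.

```python
import numpy as np
from math import comb

N = 20                          # check even times up to 2N (arbitrary cutoff)

f = np.zeros(2 * N + 1)         # f[t] = P(symmetric +/-1 walk is at the origin at time t)
g = np.zeros(2 * N + 1)         # g[t] = P(first return to the origin occurs at time t)
f[0] = 1.0
for n in range(1, N + 1):
    f[2 * n] = comb(2 * n, n) / 4 ** n
    g[2 * n] = f[2 * n] / (2 * n - 1)

# Discrete analogue of f(t) = int_0^t f(tau) g(t - tau) dtau + delta(t):
for t in range(2, 2 * N + 1, 2):
    convolution = sum(g[k] * f[t - k] for k in range(1, t + 1))
    print(f"t = {t:2d}:  f[t] = {f[t]:.6f}   (g * f)[t] = {convolution:.6f}")
```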

First return (Frame 8/11)
◮ Next, see that the right-hand side of $f(t) = \int_{\tau=0}^{t} f(\tau)\, g(t-\tau)\, \mathrm{d}\tau + \delta(t)$ is a juicy convolution.
◮ So we take the Laplace transform (evaluated numerically for this $f$ after this frame): $\mathcal{L}[f(t)] = F(s) = \int_{t=0^-}^{\infty} f(t)\, e^{-st}\, \mathrm{d}t$
◮ and obtain $F(s) = F(s)\, G(s) + 1$.
◮ Rearrange: $G(s) = 1 - 1/F(s)$.
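
A numerical cross-check (my addition): evaluate $F(s) = \int_0^\infty e^{-st}/\sqrt{2\pi t}\, \mathrm{d}t$ directly, compare with the closed form $(2s)^{-1/2}$ quoted on the next frame, and then form $G(s) = 1 - 1/F(s)$. The substitution $u = \sqrt{t}$ removes the integrable singularity at $t = 0$; the sample values of $s$ are arbitrary.

```python
import numpy as np
from scipy.integrate import quad

def F_numeric(s):
    """Laplace transform of f(t) = 1/sqrt(2*pi*t), computed with the substitution t = u^2."""
    # dt = 2u du, so the integrand becomes 2 * exp(-s u^2) / sqrt(2*pi), smooth at u = 0.
    value, _ = quad(lambda u: 2.0 * np.exp(-s * u * u) / np.sqrt(2.0 * np.pi), 0.0, np.inf)
    return value

for s in (0.1, 1.0, 10.0):
    F = F_numeric(s)
    print(f"s = {s:5.1f}:  F(s) numeric = {F:.6f}   (2s)^(-1/2) = {(2 * s) ** -0.5:.6f}   "
          f"G(s) = 1 - 1/F = {1.0 - 1.0 / F:.6f}")
```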

First return (Frame 9/11)
◮ We are here: $G(s) = 1 - 1/F(s)$.
◮ Now we want to invert $G(s)$ to find $g(t)$ (the inversion is sketched numerically after this frame).
◮ Use the calculation that $F(s) = (2s)^{-1/2}$.
◮ $G(s) = 1 - (2s)^{1/2} \simeq e^{-(2s)^{1/2}}$ (for small $s$, i.e., long times).
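
Bridging to the next frame (my addition): the standard inverse-Laplace pair $\mathcal{L}^{-1}\big[e^{-a\sqrt{s}}\big](t) = \frac{a}{2\sqrt{\pi}}\, t^{-3/2} e^{-a^2/4t}$ with $a = \sqrt{2}$ gives $g(t) = \frac{1}{\sqrt{2\pi}}\, t^{-3/2} e^{-1/2t}$, which behaves like $t^{-3/2}$ for large $t$. The sketch below simply transforms this candidate $g(t)$ back numerically and compares it with $e^{-(2s)^{1/2}}$ at a few arbitrary values of $s$.

```python
import numpy as np
from scipy.integrate import quad

def g(t):
    """Candidate first-return density: t^(-3/2) * exp(-1/(2t)) / sqrt(2*pi)."""
    return t ** -1.5 * np.exp(-1.0 / (2.0 * t)) / np.sqrt(2.0 * np.pi)

def laplace_of_g(s):
    """Numerically transform g back; should reproduce exp(-sqrt(2s))."""
    value, _ = quad(lambda t: g(t) * np.exp(-s * t), 0.0, np.inf)
    return value

for s in (0.1, 1.0, 10.0):
    print(f"s = {s:5.1f}:  L[g](s) = {laplace_of_g(s):.6f}   "
          f"exp(-sqrt(2s)) = {np.exp(-np.sqrt(2.0 * s)):.6f}")
```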

First return (Frame 10/11)
Groovy aspects of $g(t) \sim t^{-3/2}$:
◮ Variance is infinite (weird but okay...).
◮ Mean is also infinite (just plain crazy...; illustrated by simulation after this frame).
◮ Distribution is normalizable, so the process always returns to 0.
◮ For river networks: $P(\ell) \sim \ell^{-\gamma}$, so $\gamma = 3/2$ for Scheidegger networks.
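
A quick illustration of the "infinite mean" point (my addition, with an arbitrary sample size), using the standard fact that if $Z \sim N(0,1)$ then $1/Z^2$ has exactly the density $g(t) = t^{-3/2} e^{-1/2t}/\sqrt{2\pi}$ from the previous sketch: draw samples and watch the running sample mean grow without settling.

```python
import numpy as np

rng = np.random.default_rng(2)

# If Z ~ N(0, 1), then T = 1 / Z^2 has density t^(-3/2) exp(-1/(2t)) / sqrt(2*pi).
n_samples = 1_000_000
t = 1.0 / rng.standard_normal(n_samples) ** 2

running_mean = np.cumsum(t) / np.arange(1, n_samples + 1)
for k in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(f"running mean after {k:>9,} samples: {running_mean[k - 1]:10.1f}")
```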

References (Frame 11/11)

[1] A. E. Scheidegger. A stochastic model for drainage patterns into an intramontane trench. Bull. Int. Assoc. Sci. Hydrol., 12(1):15–20, 1967.

[2] A. E. Scheidegger. Theoretical Geomorphology. Springer-Verlag, New York, third edition, 1991.
