
A line-breaking construction of the stable trees (PowerPoint presentation)

AofA15, Strobl, Austria, 8-12 June 2015. A line-breaking construction of the stable trees. Christina Goldschmidt (Oxford), joint work with Bénédicte Haas (Paris-Dauphine). Uniform random trees: Let T_n be the set of unordered trees on n ...


  1. The infinite-variance case. Theorem (Duquesne & Le Gall (2002); Duquesne (2003)). Suppose that (p_k)_{k ≥ 0} lies in the domain of attraction of a stable law of index α ∈ (1, 2). Then, as n → ∞, n^{−(1−1/α)} T_n^{GW} → c_α T_α in distribution, where T_α is the stable tree of parameter α and c_α is a non-negative constant. (The convergence is in the sense of the Gromov–Hausdorff distance.)

  2. The stable trees [Pictures by Igor Kortchemski]

  3-4. The stable trees. The stable trees also possess a functional encoding (although the excursions concerned are rather more involved to describe). [Pictures by Igor Kortchemski] An important difference between the stable trees for α ∈ (1, 2) and the Brownian CRT is that the Brownian CRT is binary. The stable trees, on the other hand, have only branch-points of infinite degree.

  5-7. A uniform measure. The principal theme of the rest of this talk is how to give a (relatively) simple description of the stable trees (and how to use it to get at their distributional properties). For α ∈ (1, 2], the stable tree T_α is naturally endowed with a “uniform” probability measure µ_α, which is the limit of the discrete uniform measure on T_n^{GW}. It turns out that µ_α is supported by the set of leaves of T_α. Aldous’ theory of continuum random trees tells us that we can characterize the laws of such trees via sampling.

  8-9. Reduced trees. Let X_1, X_2, ... be leaves sampled independently from T_α according to µ_α, and let T_{α,n} be the subtree spanned by the root ρ and X_1, ..., X_n. [Figure: a reduced tree with root ρ and leaves X_1, ..., X_5.]

  10-12. Characterising the law of a stable tree. T_{α,n} can be thought of in two parts: its tree-shape (a rooted unordered tree with n labelled leaves) and its edge-lengths. The laws of (T_{α,n}, n ≥ 1) (the random finite-dimensional distributions) are sufficient to fully specify the law of T_α. Moreover, T_α is the closure of ∪_{n ≥ 1} T_{α,n}.

  13. Reminder: Aldous’ line-breaking construction of the Brownian CRT. Let C_1, C_2, ... be the points of an inhomogeneous Poisson process on R_+ of intensity t dt. [Figure: the half-line marked at the points C_1, ..., C_6.]

  14-19. Line-breaking construction. [Figures: the trees T̃_1, ..., T̃_6, built step by step from the segments of the line broken at C_1, ..., C_6.]
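
To make the construction concrete, here is a minimal Python sketch (the data structures and names are mine, not from the talk). It uses the fact that the points of a Poisson process of intensity t dt can be generated as C_k = √(2(E_1 + ... + E_k)) with E_i i.i.d. standard exponentials, and it assumes the attachment rule of Aldous’ original construction, which the slides show only in pictures: the segment [C_{k−1}, C_k) is glued at a point chosen uniformly on the tree built so far.

```python
import numpy as np

def aldous_line_breaking(n, rng):
    """Build the first n branches of Aldous' line-breaking construction.

    Branch k is the segment [C_{k-1}, C_k) of the half-line; it is glued at a
    uniformly chosen point of the tree made from the first k-1 branches
    (assumed attachment rule, as in Aldous (1991)).  Returns the distances
    from the root to the leaves 1, ..., n (leaf k = far end of branch k).
    """
    # Points of the Poisson process with intensity t dt: C_k = sqrt(2 * Gamma(k, 1)).
    C = np.concatenate(([0.0], np.sqrt(2.0 * np.cumsum(rng.exponential(size=n)))))
    start_depth = np.zeros(n + 1)   # distance from the root to branch k's glue point
    leaf_depth = np.zeros(n + 1)
    leaf_depth[1] = C[1]            # branch 1 starts at the root itself
    for k in range(2, n + 1):
        u = rng.uniform(0.0, C[k - 1])                 # uniform point on the current tree
        j = int(np.searchsorted(C, u, side="right"))   # that point lies on branch j
        start_depth[k] = start_depth[j] + (u - C[j - 1])
        leaf_depth[k] = start_depth[k] + (C[k] - C[k - 1])
    return leaf_depth[1:]

# Sanity check: the root-to-leaf-1 distance is C_1, which is Rayleigh distributed
# (P(C_1 > x) = exp(-x^2/2)), so its mean should be close to sqrt(pi/2), about 1.2533.
rng = np.random.default_rng(0)
print(np.mean([aldous_line_breaking(5, rng)[0] for _ in range(20000)]))
```

Identifying a point of the tree with its position on the original half-line keeps the bookkeeping to two arrays; with a little more work the same representation also gives pairwise leaf distances.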

  20-21. Line-breaking construction. It turns out that the line-breaking construction precisely gives the random finite-dimensional distributions for the Brownian CRT, i.e. (T̃_n, n ≥ 1) has the same distribution as (√2 T_{2,n}, n ≥ 1). Question: does there exist a similar line-breaking construction for the stable trees with α ∈ (1, 2)?

  22-27. Marchal’s algorithm. Marchal (2008) discovered a recursive construction of the tree-shapes. Build (T̃_n, n ≥ 1) as follows (a simulation sketch is given after the list):
  ◮ Start from a single edge, rooted at one end-point and with the other end-point labelled 1.
  ◮ At all subsequent steps, assign edges weight α − 1 and vertices of degree d ≥ 3 weight d − 1 − α.
  ◮ At step n, pick an edge or a vertex with probability proportional to their weights.
  ◮ If we pick an edge, subdivide it into two edges and attach the leaf labelled n to the middle vertex we just created.
  ◮ If we pick a vertex, attach the leaf labelled n to it.
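
As a concrete illustration, here is a minimal Python sketch of these steps (the edge-list bookkeeping and the vertex names 'rho', 'b1', 'b2', ... are my own choices; the weights and update rules are the ones listed above).

```python
import random

def marchal_tree(n, alpha, seed=0):
    """Grow a tree with leaves labelled 1..n following Marchal's algorithm.

    Edges carry weight alpha - 1 and vertices of degree d >= 3 carry weight
    d - 1 - alpha.  At each step an edge or a vertex is chosen with probability
    proportional to its weight; a chosen edge is subdivided and the new leaf is
    hung off the midpoint, while a chosen vertex gets the new leaf directly.
    Returns (edges, degrees); 'rho' is the root, 'b1', 'b2', ... are branch points.
    """
    rng = random.Random(seed)
    edges = [("rho", 1)]
    deg = {"rho": 1, 1: 1}
    n_branch = 0
    for leaf in range(2, n + 1):
        items = [("edge", e) for e in edges]
        weights = [alpha - 1.0] * len(edges)
        for v, d in deg.items():
            if d >= 3:
                items.append(("vertex", v))
                weights.append(d - 1.0 - alpha)
        kind, obj = rng.choices(items, weights=weights, k=1)[0]
        if kind == "edge":
            u, v = obj
            n_branch += 1
            mid = "b%d" % n_branch
            edges.remove(obj)
            edges += [(u, mid), (mid, v), (mid, leaf)]
            deg[mid], deg[leaf] = 3, 1
        else:
            edges.append((obj, leaf))
            deg[obj] += 1
            deg[leaf] = 1
    return edges, deg

# For alpha = 2 the vertex weights d - 1 - alpha vanish, so only edges are ever
# chosen and the algorithm reduces to Remy's: every branch point keeps degree 3.
# For alpha < 2 vertices can be picked again, so high-degree branch points appear.
for a in (2.0, 1.5, 1.2):
    _, degrees = marchal_tree(1000, alpha=a, seed=1)
    print(a, max(degrees.values()))
```

The printed maximal degrees illustrate the contrast drawn earlier: the α = 2 trees stay binary, while for α < 2 larger and larger multifurcations build up.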

  28-34. Marchal’s algorithm. [Figures: the first steps of the algorithm, showing the tree rooted at ρ growing leaves labelled 1 to 7, with edges of weight α − 1 and branch points of weight 2 − α or 3 − α.]

  35-37. Marchal’s algorithm. Then (T̃_n, n ≥ 1) has the same distribution as the sequence of tree-shapes (T_{α,n}, n ≥ 1). (The α = 2 case is Rémy’s algorithm (1985) for building a uniform binary rooted tree with n labelled leaves.) Moreover, n^{−(1−1/α)} T̃_n → c′_α T_α almost surely as n → ∞ [Curien & Haas (2013)]. Our new line-breaking construction gives a nested sequence of continuous trees which converge a.s. to T_α without any need for rescaling.

  38-41. The generalized Mittag-Leffler distribution. For β ∈ (0, 1), let σ_β be a stable random variable with Laplace transform E[exp(−λ σ_β)] = exp(−λ^β), λ ≥ 0. Say that a non-negative random variable M has the generalized Mittag-Leffler distribution with parameters β ∈ (0, 1) and θ > −β, and write M ∼ ML(β, θ), if E[f(M)] = C_{β,θ} E[σ_β^{−θ} f(σ_β^{−β})] for all suitable test-functions f, where C_{β,θ} is a normalizing constant. The law of M is characterized by its moments: E[M^k] = Γ(θ) Γ(θ/β + k) / (Γ(θ/β) Γ(θ + kβ)) for any k ≥ 1. If β = 1/2 and n ≥ 1, ML(1/2, n − 1/2) is the distribution of √(2 Gamma(n, 1)).
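
As a quick numerical check of the moment formula (the helper below and the Monte Carlo set-up are my own, not from the talk), one can use the standard fact that for β = 1/2 the stable variable with Laplace transform exp(−√λ) can be sampled as 1/(2Z²) with Z standard normal, and estimate the tilted expectations in the definition by importance weighting.

```python
import numpy as np
from math import gamma

def ml_moment(beta, theta, k):
    """k-th moment of ML(beta, theta), as given by the formula above."""
    return (gamma(theta) * gamma(theta / beta + k)
            / (gamma(theta / beta) * gamma(theta + k * beta)))

# Monte Carlo check of the tilted-stable definition for beta = 1/2:
# if Z is standard normal, sigma = 1/(2 Z^2) has Laplace transform exp(-sqrt(lambda)).
rng = np.random.default_rng(1)
beta, theta = 0.5, 1.5
sigma = 1.0 / (2.0 * rng.standard_normal(10**6) ** 2)
w = sigma ** (-theta)       # the tilt sigma^(-theta)
m = sigma ** (-beta)        # under the tilt, M is distributed as sigma^(-beta)
for k in (1, 2, 3):
    print(k, np.sum(w * m**k) / np.sum(w), ml_moment(beta, theta, k))
```

The printed pairs should agree to two or three decimal places; for other values of β one would need a sampler for the general one-sided stable law.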

  42-45. A generalized Pólya urn scheme. ML(β, θ) arises as an almost sure limit in the context of a generalized Pólya urn scheme. Start with weight 0 on black and weight θ/β on red. At each step, pick a colour with probability proportional to its weight in the urn.
  ◮ If black is picked, add 1/β to the black weight.
  ◮ If red is picked, add 1/β − 1 to the black weight and 1 to the red weight.
  Let R_n be the weight of red at step n. Then [Janson (2006)], almost surely, n^{−β} R_n → W ∼ ML(β, θ).
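
A short simulation (entirely my own sketch of the scheme just described) makes the convergence visible: run the urn for many steps, rescale the red weight by n^{−β}, and compare the empirical mean with the ML(β, θ) mean computed from the moment formula on the previous slide.

```python
import numpy as np
from math import gamma

def scaled_red_weight(beta, theta, steps, rng):
    """Run the generalized Polya urn and return steps**(-beta) times the red weight."""
    black, red = 0.0, theta / beta
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1.0                   # red picked
            black += 1.0 / beta - 1.0    # so the total weight always grows by 1/beta
        else:
            black += 1.0 / beta          # black picked
    return steps ** (-beta) * red

rng = np.random.default_rng(2)
beta, theta = 0.4, 0.8                   # any admissible parameters will do
samples = [scaled_red_weight(beta, theta, 20000, rng) for _ in range(200)]
ml_mean = gamma(theta) * gamma(theta / beta + 1) / (gamma(theta / beta) * gamma(theta + beta))
print(np.mean(samples), ml_mean)         # the two numbers should be close
```

The step counts are kept modest so that the pure-Python loop runs in a few seconds.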

  46-48. Urns in Marchal’s algorithm. Idea: there are many such urns embedded in Marchal’s algorithm! Consider the distance D_n between the root and the leaf labelled 1. The associated weight is (α − 1) D_n. Let W_n be the remaining weight in the rest of the tree. D_1 = 1 and W_1 = 0. At each subsequent step, with probability proportional to (α − 1) D_n an edge on the path from the root to leaf 1 is subdivided, so that D_{n+1} = D_n + 1 and W_{n+1} = W_n + 1; otherwise D_{n+1} = D_n and W_{n+1} = W_n + α. (We always add weight α to the whole tree.)
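
For instance (my own illustration; the parameter matching is read off from the rules above), dividing every weight by α − 1 turns the pair (D_n, W_n) into exactly the urn of the previous slide with β = θ = 1 − 1/α, which suggests that n^{−(1−1/α)} D_n should converge almost surely to an ML(1 − 1/α, 1 − 1/α) limit. A direct simulation of the two weights is consistent with this.

```python
import numpy as np
from math import gamma

def root_to_leaf1_distance(alpha, steps, rng):
    """Track D_n, the root-to-leaf-1 distance in Marchal's algorithm, via its urn."""
    D, W = 1.0, 0.0                      # D_1 = 1, W_1 = 0
    for _ in range(steps):
        if rng.random() < (alpha - 1.0) * D / ((alpha - 1.0) * D + W):
            D, W = D + 1.0, W + 1.0      # an edge on the root-to-leaf-1 path was subdivided
        else:
            W += alpha                   # the new leaf was attached elsewhere
    return D

alpha = 1.5
beta = 1.0 - 1.0 / alpha                 # = theta here, since the red weight starts at theta/beta = D_1 = 1
rng = np.random.default_rng(3)
scaled = [root_to_leaf1_distance(alpha, 20000, rng) / 20000 ** beta for _ in range(200)]
ml_mean = gamma(beta) * gamma(2.0) / (gamma(1.0) * gamma(2.0 * beta))   # mean of ML(beta, beta)
print(np.mean(scaled), ml_mean)          # the two numbers should be close
```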
