
Choices and Intervals - Pascal Maillard (Weizmann Institute of Science) - PowerPoint PPT Presentation



  1. Choices and Intervals. Pascal Maillard (Weizmann Institute of Science). AofA, Paris, June 17, 2014. Joint work with Elliot Paquette (Weizmann Institute of Science).

  2. Related models. Random structures formed by adding objects one after the other according to some random rule. Examples:
     1. balls-and-bins model: n bins, place balls one after the other into bins; for each ball choose a bin uniformly at random (maybe with size-biasing).
     2. random graph growth: n vertices, add (uniformly chosen) edges one after the other.
     3. interval fragmentation: unit interval [0, 1], add uniformly chosen points one after the other → fragmentation of the unit interval.
     Extensive literature on these models.

  3. Power of choices. Aim: change the behaviour of the model by applying a different rule when adding objects.
     1. balls-and-bins model: n bins, at each step choose two bins uniformly at random and place the ball into the bin with fewer/more balls. Azar, Broder, Karlin, Upfal '99; D'Souza, Krapivsky, Moore '07; Malyshkin, Paquette '13.
     2. random graph growth: n vertices, at each step uniformly sample two possible edges to add, and choose the one that (say) minimizes the product of the sizes of the components of its endvertices. Achlioptas, D'Souza, Spencer '09; Riordan, Warnke '11+'12.
     3. interval fragmentation: unit interval [0, 1], at each step uniformly sample two possible points to add, and choose the one that falls into the larger/smaller fragment determined by the previous points. → this talk
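To make the graph-growth rule concrete, here is a minimal sketch (not from the talk) of one step of an Achlioptas-type process with the product rule, using a union-find structure to track component sizes; all names are illustrative.

```python
import random

class DSU:
    """Minimal union-find tracking component sizes."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, v):
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]  # path halving
            v = self.parent[v]
        return v

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def achlioptas_step(dsu, n):
    """Sample two candidate edges uniformly; add the one whose endpoints'
    component sizes have the smaller product (the 'product rule')."""
    candidates = []
    for _ in range(2):
        u, v = random.randrange(n), random.randrange(n)
        prod = dsu.size[dsu.find(u)] * dsu.size[dsu.find(v)]
        candidates.append((prod, u, v))
    _, u, v = min(candidates)
    dsu.union(u, v)
```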

  4.-7. Balls-and-bins model. n bins, place n balls one after the other into bins.
     Model A: for each ball, choose a bin uniformly at random.
     Model B: for each ball, choose two bins uniformly at random and place the ball into the bin with more balls.
     Model C: for each ball, choose two bins uniformly at random and place the ball into the bin with fewer balls.
     How many balls are in the bin with the largest number of balls?
     Model A: ≈ log n / log log n
     Model B: ≈ log n / log log n
     Model C: O(log log n)
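The contrast between the three rules is easy to see in simulation. The following is a rough sketch, not part of the talk; names are illustrative.

```python
import random

def max_load(n, rule):
    """Throw n balls into n bins and return the maximum bin load.

    rule = "A": one uniform choice per ball;
    rule = "B": two uniform choices, ball goes to the more loaded bin;
    rule = "C": two uniform choices, ball goes to the less loaded bin.
    """
    bins = [0] * n
    for _ in range(n):
        i, j = random.randrange(n), random.randrange(n)
        if rule == "A":
            k = i
        elif rule == "B":
            k = i if bins[i] >= bins[j] else j
        else:
            k = i if bins[i] <= bins[j] else j
        bins[k] += 1
    return max(bins)

if __name__ == "__main__":
    n = 100_000
    for rule in "ABC":
        print(rule, max_load(n, rule))
```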

  8.-14. Ψ-process: definition. X: random variable on [0, 1], Ψ(x) = P(X ≤ x).
     Step 1: empty unit interval [0, 1].
     Step n: n − 1 points in the interval, splitting it into n fragments.
     Step n + 1:
       Order the intervals/fragments according to length.
       Choose an interval according to a (fresh copy of the) random variable X.
       Split this interval at a uniformly chosen point.
     (Figure: four fragments reordered by length, with X selecting one of them.)
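A direct simulation makes the definition concrete. The sketch below is not from the talk; it keeps the fragment lengths in a sorted list and maps a draw of X to a rank among the fragments (an assumed reading of "choose an interval according to X"). All names are illustrative.

```python
import bisect
import random

def psi_process_step(lengths, sample_X):
    """One step of the Psi-process.

    lengths  -- current fragment lengths, kept sorted in increasing order
    sample_X -- callable returning a draw of X on [0, 1] with CDF Psi
    """
    n = len(lengths)
    # map X in [0, 1] to a rank among the fragments ordered by length
    rank = min(int(sample_X() * n), n - 1)
    chosen = lengths.pop(rank)
    # split the chosen fragment at a uniformly chosen point
    u = random.random()
    bisect.insort(lengths, u * chosen)
    bisect.insort(lengths, (1.0 - u) * chosen)

def simulate(n_steps, sample_X):
    lengths = [1.0]              # step 1: the empty unit interval [0, 1]
    for _ in range(n_steps):
        psi_process_step(lengths, sample_X)
    return lengths
```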

  15. Ψ-process: examples. X: random variable on [0, 1], Ψ(x) = P(X ≤ x).
     Ψ(x) = x: uniform process
     Ψ(x) = 1_{x ≥ 1}: Kakutani process
     Ψ(x) = x^k, k ∈ ℕ: max-k-process (maximum of k intervals)
     Ψ(x) = 1 − (1 − x)^k, k ∈ ℕ: min-k-process (minimum of k intervals)
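The four examples correspond to four ways of drawing X. The samplers below (illustrative names, continuing the sketch above) can be passed to simulate().

```python
import random

def sampler_uniform():                 # Psi(x) = x
    return random.random()

def sampler_kakutani():                # Psi(x) = 1_{x >= 1}: always split the largest fragment
    return 1.0

def sampler_max_k(k):                  # Psi(x) = x^k
    return lambda: max(random.random() for _ in range(k))

def sampler_min_k(k):                  # Psi(x) = 1 - (1 - x)^k
    return lambda: min(random.random() for _ in range(k))

# e.g. fragments of the max-2 process after 10_000 points:
# frags = simulate(10_000, sampler_max_k(2))
```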

  16. Main result. I_1^{(n)}, …, I_n^{(n)}: lengths of the intervals after step n. Define the empirical measure of the rescaled lengths
       μ_n = (1/n) ∑_{k=1}^{n} δ_{n·I_k^{(n)}}.
     Main theorem. Assume Ψ is continuous and that 1 − Ψ(x) decays polynomially near x = 1. Then:
       1. μ_n converges weakly, almost surely as n → ∞, to a deterministic probability measure μ_Ψ on (0, ∞).
       2. Set F_Ψ(x) = ∫_0^x y μ_Ψ(dy). Then F_Ψ is C^1 and
          (F_Ψ)'(x) = x ∫_x^∞ (1/z) dΨ(F_Ψ(z)).
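As a sanity check (not on the slides), the fixed-point equation can be verified in the uniform case Ψ(x) = x, assuming the classical fact that the rescaled lengths of uniform fragmentation converge to an Exp(1) distribution, i.e. μ_Ψ(dy) = e^{−y} dy:

```latex
% Uniform case \Psi(x) = x, assuming (classically) \mu_\Psi(dy) = e^{-y}\,dy:
\[
  F_\Psi(x) = \int_0^x y\,e^{-y}\,dy = 1 - (1+x)e^{-x},
  \qquad (F_\Psi)'(x) = x\,e^{-x}.
\]
% Right-hand side of the fixed-point equation, using d\Psi(F_\Psi(z)) = F_\Psi'(z)\,dz:
\[
  x \int_x^\infty \frac{1}{z}\, d\Psi(F_\Psi(z))
    = x \int_x^\infty \frac{1}{z}\, z\,e^{-z}\,dz
    = x \int_x^\infty e^{-z}\,dz
    = x\,e^{-x}
    = (F_\Psi)'(x).
\]
```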

  17. Properties of the limiting distribution. Write μ_Ψ(dx) = f_Ψ(x) dx.
     max-k-process (Ψ(x) = x^k): f_Ψ(x) ∼ C_k exp(−kx) as x → ∞.
     min-k-process (Ψ(x) = 1 − (1 − x)^k): f_Ψ(x) ∼ c_k / x^{2 + 1/(k−1)} as x → ∞.
     Convergence to Kakutani (cf. Pyke '80): if (Ψ_n)_{n≥0} are such that Ψ_n(x) → 1_{x ≥ 1} pointwise, then f_{Ψ_n}(x) → (1/2)·1_{x ∈ [0,2]} as n → ∞.
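A crude Monte Carlo look at the contrasting tails (light for max-k, heavy for min-k) can reuse the simulation sketch above; this is illustrative only and not part of the talk.

```python
# assumes simulate, sampler_max_k, sampler_min_k from the sketches above
def tail_estimate(sample_X, n_points=50_000, threshold=5.0):
    """Estimate P(n * I > threshold) for the rescaled fragment lengths."""
    frags = simulate(n_points, sample_X)
    n = len(frags)
    return sum(1 for L in frags if n * L > threshold) / n

print("max-2 tail:", tail_estimate(sampler_max_k(2)))  # exponential-type tail, small
print("min-2 tail:", tail_estimate(sampler_min_k(2)))  # polynomial tail, larger
```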

  18. Properties of the limiting distribution (2).

  19. Proof of the main theorem: the stochastic evolution.
     Embedding in continuous time: points arrive according to a Poisson process with rate e^t.
     N_t: number of intervals at time t; I_1^{(t)}, …, I_{N_t}^{(t)}: lengths of the intervals at time t.
     Observable: the size-biased distribution function
       A_t(x) = ∑_{k=1}^{N_t} I_k^{(t)} · 1_{I_k^{(t)} ≤ x·e^{−t}}.
     Then A = (A_t)_{t≥0} satisfies the stochastic evolution equation
       A_t(x) = A_0(e^{−t}x) + ∫_0^t (e^{s−t}x)^2 ( ∫_{e^{s−t}x}^∞ (1/z) dΨ(A_s(z)) ) ds + M_t(x),
     for some centered noise M_t.
     Claim: A_t converges almost surely to a deterministic limit as t → ∞.
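For concreteness, the observable can be written as a small helper mirroring the definition above (an illustrative sketch, not code from the talk):

```python
import math

def size_biased_cdf(lengths, x, t):
    """A_t(x): total length of the fragments whose length is at most x * e^{-t}."""
    cutoff = x * math.exp(-t)
    return sum(L for L in lengths if L <= cutoff)
```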

  20. Deterministic evolution. Let F = (F_t)_{t≥0} be a solution of
       F_t(x) = F_0(e^{−t}x) + ∫_0^t (e^{s−t}x)^2 ( ∫_{e^{s−t}x}^∞ (1/z) dΨ(F_s(z)) ) ds =: S_Ψ(F)_t.
     Define the norm ‖f‖_{x^{−2}} = ∫_0^∞ x^{−2} |f(x)| dx.
     Lemma. Let F and G be solutions of the above equation. For every t ≥ 0,
       ‖F_t − G_t‖_{x^{−2}} ≤ e^{−t} ‖F_0 − G_0‖_{x^{−2}}.
     In particular: ∃! F_Ψ such that F_t → F_Ψ as t → ∞.
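One immediate consequence of the lemma, spelled out here (not on the slide): stationary solutions of the evolution, i.e. fixed points of S_Ψ, are unique.

```latex
% If F and G are stationary solutions (F_t \equiv F, G_t \equiv G for all t),
% the lemma gives, for every t \ge 0,
\[
  \|F - G\|_{x^{-2}} = \|F_t - G_t\|_{x^{-2}}
    \le e^{-t}\,\|F_0 - G_0\|_{x^{-2}} = e^{-t}\,\|F - G\|_{x^{-2}},
\]
% so letting t \to \infty forces \|F - G\|_{x^{-2}} = 0, hence F = G.
```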

  21.-22. Stochastic evolution: stochastic approximation.
     Problem: the noise M_t cannot be controlled using the norm ‖·‖_{x^{−2}} ⇒ no quantitative estimates to prove convergence.
     It is still possible to prove convergence by the Kushner–Clark method for stochastic approximation algorithms:
       1. Consider the shifted evolutions A^{(n)} = (A_{n+t})_{t∈ℝ} and show that, almost surely, the family (A^{(n)})_{n∈ℕ} is precompact in a suitable function space.
       2. Show that S_Ψ is continuous on this function space.
       3. Show that A^{(n)} − S_Ψ(A^{(n)}) → 0 almost surely as n → ∞.
     This entails that every subsequential limit A^{(∞)} of (A^{(n)})_{n∈ℕ} is a fixed point of S_Ψ. By the previous lemma, A^{(∞)} ≡ F_Ψ.
     Note: precompactness is shown via entropy bounds, already used by Lootgieter '77 and Slud '78.
