Particle Methods for Rare Event Monte Carlo


  1. Particle Methods for Rare Event Monte Carlo. Paul Dupuis, Division of Applied Mathematics, Brown University (with Thomas Dean, Oxford). ASEAS, Arlington, VA, March 2009.

  2. Particle methods for rare event Monte Carlo: the use of branching processes to estimate small probabilities. Summary: The design of such schemes was (until recently) poorly understood. Design should be based on subsolutions to an associated HJB equation. One obtains necessary and sufficient conditions for asymptotically optimal performance.

  3. Example: a tandem queue with server slow-down (Ethernet control). The quantity to estimate is

     $$p^n = P\{Q_2 \text{ exceeds } n \text{ before } Q = (0,0) \mid Q(0) = (1,0)\}.$$

     Also of interest: an analogous non-Markovian model.
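To make the target quantity concrete, here is a minimal crude Monte Carlo sketch for this hitting probability. The slides do not give rates, so the arrival rate `lam` and service rates `mu1`, `mu2` below are illustrative assumptions, and no slow-down rule is modeled; the sketch simulates the uniformized jump chain of a Markovian M/M/1 -> M/M/1 tandem queue, which works only because `n` is kept small here.

```python
import random

# Crude Monte Carlo for p^n = P{Q2 exceeds n before Q = (0,0) | Q(0) = (1,0)}.
# lam, mu1, mu2 are assumed for illustration; self-loops at empty stations
# leave the hitting probabilities of the uniformized chain unchanged.
lam, mu1, mu2 = 1.0, 2.0, 2.0
total_rate = lam + mu1 + mu2

def hits_level(n):
    """One path from Q(0) = (1, 0): does Q2 exceed n before Q = (0, 0)?"""
    q1, q2 = 1, 0
    while True:
        u = random.random() * total_rate
        if u < lam:
            q1 += 1                    # arrival at station 1
        elif u < lam + mu1:
            if q1 > 0:                 # service completion at station 1
                q1 -= 1
                q2 += 1
        elif q2 > 0:                   # service completion at station 2
            q2 -= 1
        if q2 > n:
            return True
        if q1 == 0 and q2 == 0:
            return False

n, K = 5, 100_000
p_hat = sum(hits_level(n) for _ in range(K)) / K
print(f"estimated p^n for n={n}: {p_hat:.2e}")
```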

  4. Model problem and large deviation scaling. As a general Markov model one can consider iid random vector fields $\{v_i(y),\ y \in \mathbb{R}^d\}$ and the process

     $$X^n_{i+1} = X^n_i + \tfrac{1}{n}\, v_i(X^n_i), \qquad X^n_0 = x.$$

     Define

     $$H(y,\alpha) = \log E \exp\langle \alpha, v_i(y)\rangle, \qquad L(y,\beta) = \sup_{\alpha \in \mathbb{R}^d}\big[\langle \alpha, \beta\rangle - H(y,\alpha)\big],$$

     and set $X^n(i/n) = X^n_i$, with piecewise linear interpolation for $t \neq i/n$.
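The scaling is easy to see in simulation. The sketch below uses an illustrative one-dimensional vector field $v_i(y) = -y + \xi_i$ with $\xi_i$ iid standard Gaussian (my choice; the slides leave $v_i$ abstract): as $n$ grows, the paths concentrate on the zero-noise flow $\dot\phi = -\phi$.

```python
import numpy as np

# Simulate the scaled process X^n_{i+1} = X^n_i + (1/n) v_i(X^n_i) for the
# illustrative field v_i(y) = -y + xi_i, xi_i iid N(0, 1) (an assumption).
rng = np.random.default_rng(0)

def simulate_skeleton(n, x0=1.0, T=1.0):
    """Return the skeleton X^n_0, ..., X^n_{[nT]} (interpolate for X^n(t))."""
    steps = int(n * T)
    path = np.empty(steps + 1)
    path[0] = x0
    for i in range(steps):
        v = -path[i] + rng.standard_normal()   # v_i(X^n_i)
        path[i + 1] = path[i] + v / n
    return path

# X^n(1) should approach phi(1) = x0 * exp(-1) ~ 0.368 as n grows; order-one
# deviations from this flow are the rare events of interest.
for n in (10, 100, 1000, 10000):
    print(n, simulate_skeleton(n)[-1])
```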

  5. Under conditions, $\{X^n(\cdot)\}$ satisfies a Large Deviation Principle with rate function

     $$I_T(\phi) = \int_0^T L(\phi(t), \dot\phi(t))\, dt$$

     if $\phi$ is absolutely continuous and $\phi(0) = x$, and $I_T(\phi) = \infty$ otherwise. Heuristically, for $T < \infty$, given $\phi$, small $\delta > 0$, and large $n$,

     $$P\Big(\sup_{0 \le t \le T} \|X^n(t) - \phi(t)\| \le \delta\Big) \approx e^{-n I_T(\phi)}.$$
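The heuristic can be checked numerically in the one case with a closed form (an illustrative assumption, not from the slides): iid standard Gaussian increments, where Cramér's theorem gives $L(\beta) = \beta^2/2$ and the cheapest path ending at $a$ at time 1 is the straight line $\phi(t) = at$, with $I_1(\phi) = a^2/2$.

```python
import math
from statistics import NormalDist

# With v_i(y) = xi_i iid N(0, 1), X^n(1) = (1/n) sum_i xi_i ~ N(0, 1/n), so
# the tail P(X^n(1) >= a) is exact and -(1/n) log of it should approach the
# rate I_1(phi) = a^2 / 2 of the straight path to a.
a = 1.0
for n in (4, 16, 64, 256, 1024):
    tail = NormalDist().cdf(-a * math.sqrt(n))   # P(X^n(1) >= a)
    print(n, -math.log(tail) / n)                # -> a^2 / 2 = 0.5
```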

  6. Let $C = \{\text{trajectories that hit } B \text{ prior to } A\}$. The quantity to estimate is

     $$p^n(x) = P\{X^n \in C \mid X^n(0) = x\}.$$

  7. Under mild conditions,

     $$-\tfrac{1}{n} \log p^n(x) \to \inf\{I_T(\phi) : \phi \text{ enters } B \text{ prior to } A \text{ before } T,\ T < \infty\} \doteq \gamma(x).$$

  8. Some estimation generalities. For standard Monte Carlo we average iid copies of $1_{\{X^n \in C\}}$. One needs $K \approx e^{n\gamma}$ samples for bounded relative error [std. dev. / $p^n(x)$]. Alternative approach: construct iid random variables $\theta^n_1, \ldots, \theta^n_K$ with $E\theta^n_1 = p^n(x)$ and use the unbiased estimator

     $$\hat q^{n,K}(x) \doteq \frac{\theta^n_1 + \cdots + \theta^n_K}{K}.$$

     Performance is determined by the variance of $\theta^n_1$ and hence, since the estimator is unbiased, by $E(\theta^n_1)^2$. By Jensen's inequality,

     $$-\tfrac{1}{n} \log E(\theta^n_1)^2 \le -\tfrac{2}{n} \log E\theta^n_1 = -\tfrac{2}{n} \log p^n(x) \to 2\gamma(x).$$

     An estimator is called asymptotically efficient if

     $$\liminf_{n \to \infty} -\tfrac{1}{n} \log E(\theta^n_1)^2 \ge 2\gamma(x).$$
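To see why the $K \approx e^{n\gamma}$ requirement bites, the sketch below runs crude Monte Carlo on the same toy Gaussian model used above (again an illustrative assumption) and reports the relative error $\sqrt{(1 - p_n)/(K p_n)}$ of the sample mean, which grows like $e^{n\gamma/2}/\sqrt{K}$ at fixed $K$.

```python
import numpy as np

# Crude Monte Carlo for p_n = P(X^n(1) >= a) in the toy Gaussian model:
# the estimator is unbiased, but at fixed K the relative error explodes
# (and the hit count eventually drops to zero) as n grows.
rng = np.random.default_rng(1)
a, K = 1.0, 100_000

for n in (1, 4, 16, 64):
    hits = rng.normal(0.0, 1.0 / np.sqrt(n), size=K) >= a
    p_hat = hits.mean()
    rel = np.sqrt((1.0 - p_hat) / (K * p_hat)) if p_hat > 0 else float("inf")
    print(f"n={n:3d}  p_hat={p_hat:.2e}  relative error ~ {rel:.2f}")
```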

  9. Splitting type schemes: pure branching methods (also called multi-level splitting); branching with killing [RESTART, DPR]; interacting particle systems (Del Moral et al.).

  10. Construction of splitting estimators. Pure branching: a certain number [proportional to $n$] of splitting thresholds $C^n_r$ are defined which enhance migration. A single particle started at $x$ follows the same law as $X^n$, but branches into a number of independent copies each time a new level is reached.

  11. The number of new particles $M$ can be random (though independent of past data), and a multiplicative weight $w_i$ is assigned to the $i$th descendant, where

     $$E \sum_{i=1}^{M} w_i = 1.$$

     Evolution continues until every particle has reached either $A$ or $B$. Let $M^n$ be the total number of particles generated, $X^n_j(t)$ the trajectory of the $j$th particle, and $W^n_j$ the product of the weights assigned to particle $j$ along its path. Then

     $$\theta^n = \sum_{j=1}^{M^n} W^n_j\, 1_{\{X^n_j \in C\}}.$$
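Here is a runnable sketch of this pure-branching estimator for a toy model chosen for illustration (not from the slides): a random walk on $\{0, \ldots, n\}$ with upward probability $p < 1/2$ started at 1, with $A = \{0\}$, $B = \{n\}$, and a splitting threshold at every integer level. The mean offspring number is taken as $q/p$, the reciprocal of the per-level probability decay, which keeps the particle population roughly stable; each descendant's weight is scaled by $1/E[M]$, so the weights satisfy the condition $E\sum_i w_i = 1$ above and the estimator is unbiased.

```python
import random

p = 0.3
q = 1.0 - p
n = 15
mean_offspring = q / p              # 7/3 here: split into 2 or 3 per level

def random_offspring():
    """Random M, independent of the past, with E[M] = q/p."""
    base = int(mean_offspring)
    return base + (1 if random.random() < mean_offspring - base else 0)

def splitting_estimate():
    """One copy of theta^n = sum_j W^n_j 1{particle j hits n before 0}."""
    total = 0.0
    stack = [(1, 1.0, 1)]           # (position, weight, highest level crossed)
    while stack:
        pos, w, level = stack.pop()
        while 0 < pos < n:
            pos += 1 if random.random() < p else -1
            if level < pos < n:     # first crossing of a new threshold
                level = pos
                w /= mean_offspring           # each descendant's weight
                for _ in range(random_offspring() - 1):
                    stack.append((pos, w, level))
        if pos == n:
            total += w
    return total

K = 2000
est = sum(splitting_estimate() for _ in range(K)) / K
exact = (q / p - 1.0) / ((q / p) ** n - 1.0)   # gambler's ruin from state 1
print(f"splitting: {est:.2e}   exact: {exact:.2e}")
```

Note the design choice: with mean offspring number below $q/p$ the population dies out before reaching $B$, and above it the population explodes; matching the splitting rate to the large deviation decay is exactly the kind of design question the subsolution framework addresses.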

  12. Subsolutions for branching processes. Now consider the asymptotic rate of decay as a function of $y$:

     $$\gamma(y) = \lim_{n \to \infty} -\tfrac{1}{n} \log p^n(y) = \inf\{I_T(\phi) : \phi(0) = y,\ \phi \text{ enters } B \text{ prior to } A \text{ before } T,\ T < \infty\}.$$

     Let

     $$\mathbb{H}(y, \alpha) \doteq -H(y, -\alpha) \qquad [\text{recall } H(y, \alpha) = \log E \exp\langle \alpha, v_i(y)\rangle].$$

  13. $\gamma(y)$ is a weak-sense solution to the PDE $\mathbb{H}(y, D\gamma(y)) = 0$.
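For the biased-random-walk toy model used earlier (my illustrative choice, not from the slides), this can be checked directly: in scaled coordinates with $A = \{0\}$ and $B = \{1\}$, gambler's-ruin asymptotics give $\gamma(y) = (1 - y)\log(q/p)$, and the sketch below verifies numerically that $\mathbb{H}(y, D\gamma(y)) = 0$.

```python
import math

# Check that gamma(y) = (1 - y) log(q/p) solves HH(y, Dgamma(y)) = 0 for the
# walk with increments +1 w.p. p and -1 w.p. q = 1 - p; H is independent of y.
p = 0.3
q = 1.0 - p

def H(alpha):
    """H(alpha) = log E exp(alpha * v), v = +1 w.p. p and -1 w.p. q."""
    return math.log(p * math.exp(alpha) + q * math.exp(-alpha))

def H_rev(alpha):
    """The reversed Hamiltonian HH(alpha) = -H(-alpha)."""
    return -H(-alpha)

d_gamma = -math.log(q / p)          # gamma'(y), constant in y
# H_rev(d_gamma) = -log(p*(q/p) + q*(p/q)) = -log(q + p) = 0, up to rounding.
print(H_rev(d_gamma))
```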
