A variance reduction method for computing VaR


  1. A variance reduction method for computing VaR
     1. Computing Value at Risk by Monte Carlo simulations
     2. Importance Sampling for variance reduction
     3. Interacting Particle Systems for Importance Sampling (IPS-IS)
     4. Simulation results
     Nadia Oudjane - EDF R&D - Journées MAS 2008

  2. Computing Value at Risk by Monte Carlo simulations: Value at Risk and quantile
     ◮ P&L of a portfolio on $[0,T]$: $\Delta V(X) = V_T - V_0 + \int_0^T CF$, where $X \in \mathbb{R}^d$ denotes the risk factors impacting the portfolio value on $[0,T]$.
     ◮ Value at Risk: $VaR_\alpha = |F^-(1-\alpha)| = \big|\inf\{ s \in \mathbb{R} \mid P(\Delta V \le s) \ge 1-\alpha \}\big|$, where $F(s) = P(\Delta V \le s)$ for all $s \in \mathbb{R}$ and $F^-$ denotes the generalized inverse of $F$.

  3. Computing Value at Risk by Monte Carlo simulations: Monte Carlo method for VaR estimation
     ◮ The distribution function $F$ can be viewed as an expectation: $F(s) = E[\mathbf{1}_{\Delta V(X) \le s}]$ for all $s \in \mathbb{R}$.
     ◮ Traditional Monte Carlo method for computing VaR:
       1. Monte Carlo simulations give an approximation of $F(s)$: $\hat{F}_N(s) = \frac{1}{N} \sum_{i=1}^N \mathbf{1}_{\Delta V(X_i) \le s}$ for all $s \in \mathbb{R}$ ⇒ too many evaluations of $\Delta V$ for a given accuracy.
       2. Inversion of $\hat{F}_N$ and interpolation for approximating the VaR.
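     To make the two steps concrete, here is a minimal Python sketch of the crude Monte Carlo estimator; the toy linear P&L `delta_V`, the dimension `d` and the level `alpha` are illustrative assumptions, not taken from the slides.

     ```python
     import numpy as np

     rng = np.random.default_rng(0)
     d = 5                              # dimension of the risk factors (assumed)
     alpha = 0.99                       # VaR confidence level (assumed)
     N = 100_000                        # number of Monte Carlo simulations

     def delta_V(x):
         """Toy linear P&L; a real portfolio revaluation would go here."""
         return x.sum(axis=-1)

     X = rng.standard_normal((N, d))    # X_i i.i.d. ~ p (standard Gaussian)
     pnl = delta_V(X)

     # Step 1 builds F_N implicitly; step 2 inverts it: the empirical
     # (1 - alpha)-quantile of the P&L gives the VaR estimate (absolute value).
     var_alpha = abs(np.quantile(pnl, 1 - alpha))
     print(f"Estimated VaR_{alpha}: {var_alpha:.3f}")
     ```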

  4. Importance Sampling for variance reduction
     ◮ Change of measure $p \to q$, where $q$ dominates $Hp$: $m = E_p[H(X)] = E_q[H(Y)\,\frac{p}{q}(Y)]$, where $X \sim p$ and $Y \sim q$.
     ◮ Optimal change of measure $p \to q^*$, which achieves zero variance if $H \ge 0$: $q^* = \frac{Hp}{\int H(x)p(x)\,dx} = \frac{Hp}{E_p[H(X)]} = \frac{Hp}{H \cdot p}$.
     ◮ Monte Carlo approximation: $E_p[H(X)] \approx \hat{m}^q_M = \frac{1}{M} \sum_{i=1}^M H(Y_i)\,\frac{p}{q}(Y_i)$, where $(Y_1, \cdots, Y_M)$ i.i.d. $\sim q$.
     ⇒ How to simulate from, and evaluate approximately, $q^*$?
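     A minimal sketch of the estimator $\hat{m}^q_M$ in one dimension, with $H = \mathbf{1}_{\cdot \le s}$ and $p$ standard Gaussian; the mean-shifted proposal $q = N(\mu, 1)$ is an illustrative choice, not the optimal $q^*$ of the slide.

     ```python
     import numpy as np
     from scipy.stats import norm

     rng = np.random.default_rng(1)
     s, mu, M = -3.0, -3.0, 10_000           # threshold, shift, sample size (assumed)

     Y = mu + rng.standard_normal(M)         # Y_i i.i.d. ~ q = N(mu, 1)
     H = (Y <= s).astype(float)              # H(Y_i) = 1_{Y_i <= s}
     w = norm.pdf(Y) / norm.pdf(Y, loc=mu)   # likelihood ratio (p/q)(Y_i)
     m_hat = (H * w).mean()                  # hat m_M^q

     print(f"IS estimate: {m_hat:.2e}, exact: {norm.cdf(s):.2e}")
     ```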

  5. Importance Sampling for variance reduction: Variance of the Importance Sampling estimate
     ◮ Let $q$ be a (possibly random) importance probability density dominating $q^*$:
       $Var(\hat{m}^q_M) = E\big[Var(\hat{m}^q_M \mid \mathcal{F}_q)\big] + \underbrace{Var\big(E[\hat{m}^q_M \mid \mathcal{F}_q]\big)}_{=0}$,
       where $\mathcal{F}_q$ denotes the sigma-algebra generated by the random variables involved in $q$.
     ◮ The variance of the IS estimate depends on the "distance" between $q$ and $q^*$:
       $Var(\hat{m}^q_M) = \frac{m^2}{M}\, E\Big[\int \frac{(q^* - q)\,q^*}{q}(x)\,dx\Big]$.
     ◮ Idea: use Interacting Particle Systems for Importance Sampling (IPS-IS) to approximate $q^*$ by $q^N$, based on an $N$-particle system, so as to achieve $Var(\hat{m}^{q^N}_M) \le \frac{C}{M N^\alpha}$ with $0 < \alpha < 1/2$.
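     For a deterministic $q$, the second identity can be checked directly from the definitions on the previous slide, using $q^* = Hp/m$ and $\int q^*(x)\,dx = 1$:

     $$
     M\,Var(\hat{m}^q_M)
     = E_q\Big[\Big(H\,\tfrac{p}{q}\Big)^2(Y)\Big] - m^2
     = m^2 \int \frac{(q^*)^2}{q}(x)\,dx - m^2
     = m^2 \int \frac{(q^* - q)\,q^*}{q}(x)\,dx .
     $$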

  6. Importance Sampling for variance reduction: Some alternative approaches
     ◮ Large deviations approximations for rare event simulation.
     ◮ Approximation of $H$ to obtain a simple form for $q^*$; e.g. [Glasserman & al. 00] for computing VaR with a $\Delta$-$\Gamma$ approximation of the portfolio.
     ◮ Cross-entropy [Homem-de-Mello & Rubinstein 02]: $q_\theta$ is chosen in a parametric family so as to minimize the entropy $K(q_\theta, q^*)$.
     ◮ Interacting Particle Systems without Importance Sampling [Del Moral & Garnier 05], [Cérou & al. 06].
     Interacting Particle Systems for Importance Sampling (IPS-IS) can be viewed as a non-parametric version of the cross-entropy approach.

  7. Importance Sampling for variance reduction: Progressive correction [Musso & al. 01]
     ◮ We introduce a sequence of non-negative functions $(G_k)_{0 \le k \le n}$ such that
       - $G_0(x) = 1$;
       - the product satisfies $G_0(x) \cdots G_n(x) = H(x)$ for all $x \in \mathbb{R}^d$;
       - if $G_k(x) = 0$ then $G_{k+1}(x) = 0$.
     ◮ In our case $H(x) = \mathbf{1}_{\Delta V(x) \le s}$, so we choose $G_k(x) = \mathbf{1}_{\Delta V(x) \le s_k}$ with $s = s_n \le \cdots \le s_0 = +\infty$.
     ◮ Dynamical system on the space of probability measures $(\nu_k)_{0 \le k \le n}$: $\nu_0 = p\,dx$ and, for all $1 \le k \le n$, $\nu_k = \frac{G_k\,\nu_{k-1}}{\int_{\mathbb{R}^d} G_k(x)\,\nu_{k-1}(dx)} = G_k \cdot \nu_{k-1}$, so that $\nu_n = q^*\,dx$.
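     A minimal sketch of one correction step $\nu_k = G_k \cdot \nu_{k-1}$ acting on a weighted sample, with $G_k = \mathbf{1}_{\Delta V \le s_k}$ as above; `delta_V` and the particle arrays are placeholders.

     ```python
     import numpy as np

     def correction_step(particles, weights, delta_V, s_k):
         """Apply G_k = 1_{delta_V <= s_k} to a weighted sample and renormalize."""
         g = (delta_V(particles) <= s_k).astype(float)  # G_k(X_i)
         w = weights * g                                # unnormalized weights of G_k . nu_{k-1}
         if w.sum() == 0.0:
             raise ValueError("no particle satisfies delta_V <= s_k; choose a larger s_k")
         return w / w.sum()                             # normalized weights of nu_k
     ```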

  8. Importance Sampling for variance reduction: Space exploration
     ◮ We introduce a sequence of Markov kernels $(Q_k)_{0 \le k \le n}$ such that $\nu_k \approx \nu_k Q_k$, i.e. $\nu_k(dx) \approx \int_{\mathbb{R}^d} \nu_k(du)\,Q_k(u, dx)$ for all $x \in \mathbb{R}^d$.
     ◮ In our case, where $G_k(x) = \mathbf{1}_{\Delta V(x) \le s_k}$, if $p$ is Gaussian then $Q_k$ is easily obtained from a Gaussian kernel $Q$ reversible for $p$:
       $Q_k(x, dx') = Q(x, dx')\,\mathbf{1}_{\Delta V(x') \le s_k} + \big(1 - Q\big(x, \Delta V^{-1}((-\infty, s_k])\big)\big)\,\delta_x(dx')$.
     ◮ Dynamical system on the space of probability measures $(\nu_k)_{0 \le k \le n}$: $\nu_0 = p\,dx$ and $\nu_k = G_k \cdot (\nu_{k-1} Q_{k-1})$ for all $1 \le k \le n$, so that $\nu_n = q^*\,dx$.
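     A minimal sketch of $Q_k$ for $p = N(0, I_d)$: the autoregressive proposal $x' = \sqrt{1-\beta^2}\,x + \beta\,\xi$, $\xi \sim N(0, I_d)$, is one standard Gaussian kernel reversible for $p$ (the slides do not specify $Q$); the proposal is kept only if it satisfies the constraint, otherwise the particle stays put, exactly as in the formula above.

     ```python
     import numpy as np

     def mutate(X, delta_V, s_k, beta=0.5, rng=np.random.default_rng()):
         """One application of Q_k to an (N, d) array of particles.

         Proposal: x' = sqrt(1 - beta^2) x + beta * xi, reversible for N(0, I_d);
         accepted only on {delta_V(x') <= s_k}, matching
         Q_k(x, .) = Q(x, .) 1_{delta_V <= s_k} + (rejected mass) delta_x.
         """
         prop = np.sqrt(1.0 - beta**2) * X + beta * rng.standard_normal(X.shape)
         accept = delta_V(prop) <= s_k              # constraint check per particle
         return np.where(accept[:, None], prop, X)
     ```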

  9. Interacting Particle Systems for Importance Sampling (IPS-IS): Approximation of the dynamical system
     ◮ The idea is to replace, at each iteration $k$, $\nu_{k-1} Q_{k-1}$ by its $N$-empirical measure $S^N(\nu_{k-1} Q_{k-1})$, such that $S^N(\nu_{k-1} Q_{k-1}) = \frac{1}{N} \sum_{i=1}^N \delta_{X^i_k}$, where $(X^1_k, \cdots, X^N_k)$ are i.i.d. $\sim \nu_{k-1} Q_{k-1}$.
     ◮ Dynamical system on the space of discrete probability measures $(\nu^N_k)_{0 \le k \le n}$: $\nu^N_0 = S^N(\nu_0)$ and $\nu^N_k = G_k \cdot S^N(\nu^N_{k-1} Q_{k-1})$ for all $1 \le k \le n$.
     ⇒ One can show that $\nu^N_n \approx q^*\,dx$ [Del Moral].

  10. Interacting Particle Systems for Importance Sampling (IPS-IS): Algorithm
     ◮ Initialization: generate $(X^1_0, \cdots, X^N_0)$ i.i.d. $\sim p$, then set $\nu^N_0 = \frac{1}{N} \sum_{i=1}^N \delta_{X^i_0}$.
     ◮ Selection: generate $(\tilde{X}^1_k, \cdots, \tilde{X}^N_k)$ i.i.d. $\sim \nu^N_k = \sum_{i=1}^N \omega^i_k\,\delta_{X^i_k}$.
     ◮ Mutation: for each $i \in \{1, \cdots, N\}$, generate independently $X^i_{k+1} \sim Q_k(\tilde{X}^i_k, \cdot)$.
     ◮ Weighting: for each particle $i \in \{1, \cdots, N\}$, compute $\omega^i_{k+1} = \frac{G_{k+1}(X^i_{k+1})}{\sum_{j=1}^N G_{k+1}(X^j_{k+1})}$, then set $\nu^N_{k+1} = \sum_{i=1}^N \omega^i_{k+1}\,\delta_{X^i_{k+1}}$.
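     Putting the four steps together, a minimal end-to-end sketch under the same assumptions as before ($p = N(0, I_d)$, indicator potentials, autoregressive mutation kernel); the adaptive choice of $s_k$ anticipates the next slide, and $N$, $\rho$, $\beta$ are illustrative parameters.

     ```python
     import numpy as np

     def ips_is(delta_V, s, d, N=500, rho=0.5, beta=0.5, max_iter=200,
                rng=np.random.default_rng(2)):
         X = rng.standard_normal((N, d))       # Initialization: X_0^i i.i.d. ~ p
         w = np.full(N, 1.0 / N)
         s_k = np.inf                          # s_0 = +infinity
         for _ in range(max_iter):
             # Selection: resample N particles from nu_k^N = sum_i w_k^i delta_{X_k^i}.
             X = X[rng.choice(N, size=N, p=w)]
             # Mutation: constrained kernel Q_k at the current level s_k.
             prop = np.sqrt(1.0 - beta**2) * X + beta * rng.standard_normal((N, d))
             X = np.where((delta_V(prop) <= s_k)[:, None], prop, X)
             # Adaptive choice of the next level (cf. next slide), floored at s.
             s_k = max(np.quantile(delta_V(X), rho), s)
             # Weighting: w_{k+1}^i proportional to G_{k+1}(X^i) = 1_{delta_V <= s_k}.
             g = (delta_V(X) <= s_k).astype(float)
             w = g / g.sum()
             if s_k <= s:                      # target level s_n = s reached
                 break
         return X, w
     ```

     With $\rho = 0.5$ roughly half of the particles survive each level, so reaching $m = 10^{-6}$ takes about $\log_2 10^6 \approx 20$ iterations, consistent with the $n \approx 10$ to $60$ quoted on the results slide.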

  11. Interacting Particle Systems for Importance Sampling (IPS-IS): Adaptive choice of the sequence $(G_k)_{0 \le k \le n}$ [Musso & al. 01], [Homem-de-Mello & Rubinstein 02], [Cérou & al. 06]
     ◮ The performance of interacting particle systems is known to deteriorate when the quantities $\frac{\max G_k}{S^N(\nu^N_{k-1} Q_{k-1})(G_k)}$ are large. The idea is then to choose $G_k$ such that $\frac{1}{N} \sum_{i=1}^N G_k(X^i_k)$ is not too small.
     ◮ In our case, where $G_k(x) = \mathbf{1}_{\Delta V(x) \le s_k}$, the threshold $s_k$ is chosen as a random variable depending on the current particle system and on a parameter $\rho \in (0, 1)$: $s_k = \inf\big\{ s \;\big|\; \sum_{i=1}^N \mathbf{1}_{\Delta V(X^i) \le s} \ge \rho N \big\}$.
     ◮ This choice of $s_k$ is not proved to guarantee that the algorithm ends in a finite number of iterations, but this point does not seem to be a problem in our simulations.
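     The threshold defined above is just an order statistic of the current P&L values; a minimal sketch:

     ```python
     import numpy as np

     def adaptive_threshold(pnl, rho):
         """s_k = inf{ s : #{i : delta_V(X^i) <= s} >= rho * N },
         i.e. the ceil(rho * N)-th order statistic of the values delta_V(X^i)."""
         order = np.sort(pnl)                      # sorted delta_V(X^i)
         k = int(np.ceil(rho * len(pnl))) - 1      # index of the ceil(rho*N)-th value
         return order[k]
     ```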

  12. Interacting Particle Systems for Importance Sampling (IPS-IS): Density estimation
     ◮ At the end of the algorithm we get $\nu^N_n \approx q^*\,dx$, but Importance Sampling requires a smooth approximation with density $q^N$.
     ◮ Kernel of order 2: $K \ge 0$, $\int K = 1$, $\int x_i K = 0$, $\int |x_i x_j| K < \infty$.
     ◮ Rescaled kernel: $K_h(x) = \frac{1}{h^d} K\big(\frac{x}{h}\big)$.
     ◮ Density estimation: $\nu^N = \sum \omega^i\,\delta_{X^i} \;\longrightarrow\; q^{N,h} = K_h * \nu^N = \sum \omega^i\,K_h(\cdot - X^i)$.
     ◮ Optimal choice of $h$: $E\,\|q^{N,h} - q^*\|_1 \le \frac{C}{N^{\frac{4}{2(d+4)}}}$.
     [Figure: a weighted sample, the kernels $K_h$ centered at the particles with weights $\omega^1, \dots, \omega^5$, and the resulting density estimate.]
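     A minimal sketch of the smoothing step with a Gaussian kernel (one common order-2 choice; the slides do not fix $K$), using the classical bandwidth scaling $h \propto N^{-1/(d+4)}$ with the constant set to 1 for illustration.

     ```python
     import numpy as np

     def kde_density(x, particles, weights):
         """Evaluate q^{N,h}(x) = sum_i w_i K_h(x - X_i) with a Gaussian K."""
         N, d = particles.shape
         h = N ** (-1.0 / (d + 4))                 # bandwidth ~ N^(-1/(d+4)), constant = 1
         u = (x - particles) / h                   # (N, d) scaled differences
         K = np.exp(-0.5 * (u**2).sum(axis=1)) / (2.0 * np.pi) ** (d / 2.0)
         return float(weights @ (K / h**d))        # K_h(x - X_i) = K(u) / h^d
     ```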

  13. Interacting Particle Systems for Importance Sampling (IPS-IS): Some simulation results (variance ratio)
     ◮ Several test cases depending on the form of the function $x \mapsto \Delta V(x)$ have been studied: the results are all comparable.
     ◮ $X$ is a $d$-dimensional Gaussian variable and $m = E_p[\mathbf{1}_{\Delta V(X) \le s}]$.
     ◮ Particles $N = 500$; iterations $n \approx 10$ to $60$; simulations $M = 10\,000$.

                     d = 1     d = 2     d = 3     d = 4     d = 5
     m = 10^-2       150       50        50        30        25
     m = 10^-3       1000      300       300       200       140
     m = 10^-6       2·10^5    10^5      10^5      5·10^4    2·10^4
     (unlabeled row) 200       400       300       460       480

                     d = 6     d = 7     d = 8     d = 9     ...    d = 30
     m = 10^-2       22        14        11        8         ...    5·10^-3
     m = 10^-3       100       70        55        40        ...    10^-3
     m = 10^-6       10^4      2·10^3    2·10^3    4·10^3    ...    1
     (unlabeled row) 250       480       300       300       ...    360
