
Stochastic Grid Bundling Method for Backward Stochastic Differential Equations — Ki Wai Chau, Centrum Wiskunde & Informatica — 17th Winter School on Mathematical Finance, 22 January 2018 (joint work with Kees Oosterlee, CWI & TU Delft)


  1. (Title slide.) K.W. Chau (CWI) — SGBM for BSDEs — 22 January 2018 — 1 / 20

  2. Backward Stochastic Differential Equations. Setting: a filtered complete probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$; $W := (W_t)_{0 \le t \le T}$ is a $d$-dimensional Brownian motion adapted to $\mathbb{F}$. Forward-Backward Stochastic Differential Equation:
  $$dX_t = \mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t, \quad X_0 = x_0,$$
  $$dY_t = -f(t, X_t, Y_t, Z_t)\,dt + Z_t\,dW_t, \quad Y_T = \Phi(X_T),$$
  where $\mu: \Omega \times [0,T] \times \mathbb{R}^q \to \mathbb{R}^q$, $\sigma: \Omega \times [0,T] \times \mathbb{R}^q \to \mathbb{R}^{q \times d}$, $f: \Omega \times [0,T] \times \mathbb{R}^q \times \mathbb{R} \times \mathbb{R}^d \to \mathbb{R}$, and $\Phi: \Omega \times \mathbb{R}^q \to \mathbb{R}$. Solution: a pair $(Y_t, Z_t)$ that satisfies the equation, is adapted to $\mathbb{F}$, and satisfies some regularity requirements.

  3. Discretization. For a given time grid $\pi = \{0 = t_0 < \dots < t_N = T\}$, we define the backward time discretization $(Y^\pi, Z^\pi)$ based on the theta-scheme from [Zhao et al., 2012]:
  $$Y^\pi_{t_N} = \Phi(X^\pi_{t_N}), \qquad Z^\pi_{t_N} = \nabla\Phi(X^\pi_{t_N})\,\sigma(t_N, X^\pi_{t_N}),$$
  $$Z^\pi_{t_k} = -\theta_2^{-1}(1-\theta_2)\,E_{t_k}\!\left[Z^\pi_{t_{k+1}}\right] + \theta_2^{-1}\,E_{t_k}\!\left[Y^\pi_{t_{k+1}} \frac{\Delta W_k^T}{\Delta_k}\right] + \theta_2^{-1}(1-\theta_2)\,E_{t_k}\!\left[f_{k+1}(Y^\pi_{t_{k+1}}, Z^\pi_{t_{k+1}})\,\Delta W_k^T\right], \quad k = N-1, \dots, 0,$$
  $$Y^\pi_{t_k} = E_{t_k}\!\left[Y^\pi_{t_{k+1}}\right] + \Delta_k\,\theta_1\, f_k(Y^\pi_{t_k}, Z^\pi_{t_k}) + \Delta_k (1-\theta_1)\,E_{t_k}\!\left[f_{k+1}(Y^\pi_{t_{k+1}}, Z^\pi_{t_{k+1}})\right], \quad k = N-1, \dots, 0,$$
  where $f_k(y, z) := f(t_k, X^\pi_{t_k}, y, z)$, $0 \le \theta_1 \le 1$ and $0 < \theta_2 \le 1$.
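As a sanity check on the theta-scheme (this worked special case is ours, not on the original slides), setting $\theta_1 = \theta_2 = 1$ makes every $(1-\theta)$ term vanish and recovers the implicit Euler scheme:

```latex
% theta_1 = theta_2 = 1: the (1 - theta) terms drop out, leaving
Z^\pi_{t_k} = E_{t_k}\!\left[ Y^\pi_{t_{k+1}} \frac{\Delta W_k^T}{\Delta_k} \right],
\qquad
Y^\pi_{t_k} = E_{t_k}\!\left[ Y^\pi_{t_{k+1}} \right]
            + \Delta_k\, f_k\!\left( Y^\pi_{t_k}, Z^\pi_{t_k} \right),
\quad k = N-1, \dots, 0.
```

The choice $\theta_1 = \theta_2 = 1/2$ gives a Crank-Nicolson-type scheme, for which [Zhao et al., 2012] report second-order convergence.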

  4. Discretization (cont.) Note that: the globally Lipschitz driver assumption is in force; we use a Markovian approximation $X^\pi$ with
  $$X^\pi_{t_{k+1}} = X^\pi_{t_k} + b(t_k, X^\pi_{t_k})\,\Delta_k + \sigma(t_k, X^\pi_{t_k})\,\Delta W_k, \quad t_k \in \pi;$$
  due to the Markovian setting, there exist functions $y^{(\theta_1,\theta_2)}_k(x)$ and $z^{(\theta_1,\theta_2)}_k(x)$ such that
  $$Y^\pi_{t_k} = y^{(\theta_1,\theta_2)}_k(X^\pi_{t_k}), \qquad Z^\pi_{t_k} = z^{(\theta_1,\theta_2)}_k(X^\pi_{t_k}).$$
  Question: how to approximate $E^x_{t_k}\!\left[y^{(\theta_1,\theta_2)}_{k+1}(X^\pi_{t_{k+1}})\right]$, $E^x_{t_k}\!\left[y^{(\theta_1,\theta_2)}_{k+1}(X^\pi_{t_{k+1}})\,\Delta W_k^T\right]$, and other similar quantities along the time grid?

  5. Stochastic Grid Bundling Method. Non-nested Monte Carlo scheme: it starts with the simulation of M independent samples of $(X^\pi_{t_k})_{0 \le k \le N}$, denoted by $(X^{\pi,m}_{t_k})_{1 \le m \le M,\, 0 \le k \le N}$. The simulation is performed only once. The terminal values for each path are
  $$y^{(\theta_1,\theta_2),R,I}_N(X^{\pi,m}_{t_N}) = \Phi(X^{\pi,m}_{t_N}), \qquad z^{(\theta_1,\theta_2),R}_N(X^{\pi,m}_{t_N}) = \nabla\Phi(X^{\pi,m}_{t_N})\,\sigma(t_N, X^{\pi,m}_{t_N}), \quad m = 1, \dots, M.$$
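The one-off path simulation above can be sketched in a few lines. This is a minimal 1-D Euler-Maruyama illustration (the function name and parameters are ours, not from the slides; the talk's setting is q-dimensional):

```python
import numpy as np

def euler_paths(x0, b, sigma, T, N, M, rng=None):
    """Simulate M independent Euler-Maruyama paths of
    dX = b(t, X) dt + sigma(t, X) dW over N time steps (1-D sketch)."""
    rng = rng or np.random.default_rng(0)
    dt = T / N
    X = np.empty((N + 1, M))
    X[0] = x0
    for k in range(N):
        dW = np.sqrt(dt) * rng.normal(size=M)
        X[k + 1] = X[k] + b(k * dt, X[k]) * dt + sigma(k * dt, X[k]) * dW
    return X

# Example 1 further below has dX_t = dW_t, i.e. b = 0 and sigma = 1:
X = euler_paths(0.0, lambda t, x: 0.0, lambda t, x: 1.0, T=1.0, N=100, M=10000)
print(X.shape)  # (101, 10000)
```

All later regression and bundling steps reuse this single grid of samples, which is what makes the scheme non-nested.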

  6. Recurring steps in time (I). Non-nested Monte Carlo scheme. Regress-later: the least-squares regression of functions is performed on the random variable $X^\pi_{t_{k+1}}$; we then use the (analytical) expectation of the resulting approximation in our algorithm. This removes the "statistical" error at the regression step. To ensure the stability of our algorithm, the regression coefficients must be bounded: the program should raise an error whenever the Euclidean norm of any regression coefficient vector exceeds a predetermined constant L.

  7. Regress now and Regress later.
  Regress now: solve the least-squares problem
  $$\begin{pmatrix} \eta_1(X^{\pi,1}_{t_k}) & \cdots & \eta_Q(X^{\pi,1}_{t_k}) \\ \vdots & \ddots & \vdots \\ \eta_1(X^{\pi,\#B}_{t_k}) & \cdots & \eta_Q(X^{\pi,\#B}_{t_k}) \end{pmatrix} \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_Q \end{pmatrix} = \begin{pmatrix} g(X^{\pi,1}_{t_{k+1}}) \\ \vdots \\ g(X^{\pi,\#B}_{t_{k+1}}) \end{pmatrix}, \qquad E[g(X^\pi_{t_{k+1}}) \mid X^\pi_{t_k} = x] \approx \sum_{l=1}^Q \alpha_l\, \eta_l(x).$$
  Regress later: solve
  $$\begin{pmatrix} \eta_1(X^{\pi,1}_{t_{k+1}}) & \cdots & \eta_Q(X^{\pi,1}_{t_{k+1}}) \\ \vdots & \ddots & \vdots \\ \eta_1(X^{\pi,\#B}_{t_{k+1}}) & \cdots & \eta_Q(X^{\pi,\#B}_{t_{k+1}}) \end{pmatrix} \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_Q \end{pmatrix} = \begin{pmatrix} g(X^{\pi,1}_{t_{k+1}}) \\ \vdots \\ g(X^{\pi,\#B}_{t_{k+1}}) \end{pmatrix}, \qquad E[g(X^\pi_{t_{k+1}}) \mid X^\pi_{t_k} = x] \approx \sum_{l=1}^Q \alpha_l\, E[\eta_l(X^\pi_{t_{k+1}}) \mid X^\pi_{t_k} = x].$$
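The contrast above can be demonstrated numerically. The sketch below (our own illustration, with a Brownian step, $g(x) = x^2$, and basis $\eta(x) = (1, x, x^2)$, for which $E[\eta_l(X_{t_{k+1}}) \mid X_{t_k} = x]$ is known in closed form) shows that regress-later reproduces the conditional expectation $x^2 + \Delta_k$ with no statistical error at the regression step, while regress-now carries Monte Carlo noise:

```python
import numpy as np

rng = np.random.default_rng(0)
M, dt = 20000, 0.1
x_k = rng.normal(size=M)                       # samples of X_{t_k}
x_k1 = x_k + np.sqrt(dt) * rng.normal(size=M)  # one Euler step of dX = dW
g = x_k1 ** 2                                  # target g(X_{t_{k+1}})

def basis(x):  # eta(x) = (1, x, x^2)
    return np.column_stack([np.ones_like(x), x, x ** 2])

# Regress now: basis evaluated at X_{t_k}.
a_now, *_ = np.linalg.lstsq(basis(x_k), g, rcond=None)

# Regress later: basis evaluated at X_{t_{k+1}}, then each eta_l is replaced
# by its analytic conditional expectation:
# E[1 | X_k = x] = 1, E[X_{k+1} | X_k = x] = x, E[X_{k+1}^2 | X_k = x] = x^2 + dt.
a_later, *_ = np.linalg.lstsq(basis(x_k1), g, rcond=None)

def cond_exp_basis(x):
    return np.column_stack([np.ones_like(x), x, x ** 2 + dt])

x = np.array([0.0, 0.5, 1.0])
exact = x ** 2 + dt                         # E[X_{k+1}^2 | X_k = x]
print(cond_exp_basis(x) @ a_later - exact)  # essentially zero
print(basis(x) @ a_now - exact)             # small, but noisy
```

Since $g$ lies in the span of the basis here, the regress-later fit is exact up to floating-point error; in general the remaining error is a pure approximation (projection) error.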

  8. Recurring steps in time (II). Non-nested Monte Carlo scheme. Regress-later. Localization (bundling): at each time step, all paths are bundled into $B_{t_k}(1), \dots, B_{t_k}(B)$, (almost) equal-size, non-overlapping partitions based on the value of $X^{\pi,m}_{t_k}$. We perform the approximation separately in each bundle.
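An equal-size partition of this kind can be built by sorting the sample values and splitting the sorted index array; a minimal sketch (the helper name is ours):

```python
import numpy as np

def bundle(x, n_bundles):
    """Split sample indices into n_bundles (almost) equal-size,
    non-overlapping bundles, ordered by the value of x."""
    order = np.argsort(x)
    return np.array_split(order, n_bundles)

rng = np.random.default_rng(0)
x = rng.normal(size=11)
bundles = bundle(x, 4)
print([len(b) for b in bundles])  # sizes differ by at most one
```

Each bundle holds the indices of the paths whose $X^{\pi,m}_{t_k}$ falls in one slab of the sorted sample, so the bundles are non-overlapping by construction.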

  9. Bundling. (Figure slide: illustration of simulated paths partitioned into bundles.)

  10. Formulation. Specifically, the bundle regression parameters $\alpha_{k+1}(b)$, $\beta_{k+1}(b)$, $\gamma_{k+1}(b)$ are defined as
  $$\alpha_{k+1}(b) = \arg\min_{\alpha \in \mathbb{R}^Q} \frac{\sum_{m=1}^M \left(p(X^{\pi,m}_{t_{k+1}})\,\alpha - y^{(\theta_1,\theta_2),R,I}_{k+1}(X^{\pi,m}_{t_{k+1}})\right)^2 \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})}{\sum_{m=1}^M \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})},$$
  $$\beta_{i,k+1}(b) = \arg\min_{\beta \in \mathbb{R}^Q} \frac{\sum_{m=1}^M \left(p(X^{\pi,m}_{t_{k+1}})\,\beta - z^{(\theta_1,\theta_2),R}_{i,k+1}(X^{\pi,m}_{t_{k+1}})\right)^2 \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})}{\sum_{m=1}^M \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})},$$
  $$\gamma_{k+1}(b) = \arg\min_{\gamma \in \mathbb{R}^Q} \frac{\sum_{m=1}^M \left(p(X^{\pi,m}_{t_{k+1}})\,\gamma - f_{k+1}\!\left(y^{(\theta_1,\theta_2),R,I}_{k+1}, z^{(\theta_1,\theta_2),R}_{k+1}\right)\!(X^{\pi,m}_{t_{k+1}})\right)^2 \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})}{\sum_{m=1}^M \mathbf{1}_{B_{t_k}(b)}(X^{\pi,m}_{t_k})}.$$
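In code, the indicator weights $\mathbf{1}_{B_{t_k}(b)}$ simply restrict each least-squares problem to the paths whose $X^{\pi,m}_{t_k}$ lies in bundle b, while the basis is still evaluated at $X^{\pi,m}_{t_{k+1}}$ (regress-later). A sketch with our own illustrative target function:

```python
import numpy as np

rng = np.random.default_rng(0)
M, dt = 5000, 0.1
x_k = rng.normal(size=M)
x_k1 = x_k + np.sqrt(dt) * rng.normal(size=M)
y_k1 = np.sin(x_k1)            # stand-in for y_{k+1}(X_{t_{k+1}})

def p(x):                      # local basis p(x) = (1, x, x^2)
    return np.column_stack([np.ones_like(x), x, x ** 2])

# Bundle on X_{t_k}; solve one restricted least-squares problem per bundle.
order = np.argsort(x_k)
alphas = [np.linalg.lstsq(p(x_k1[idx]), y_k1[idx], rcond=None)[0]
          for idx in np.array_split(order, 4)]
print(np.array(alphas).shape)  # one coefficient vector per bundle
```

Dividing by the bundle size, as in the definitions above, rescales the objective but does not change the minimizer, so it is omitted here.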

  11. Formulation (cont.) The approximate functions within bundle b at time $t_k$ are defined by
  $$z^{(\theta_1,\theta_2),R}_{r,k}(b, x) = -\theta_2^{-1}(1-\theta_2)\,E^x_{t_k}\!\left[p(X^\pi_{t_{k+1}})\right]\beta_{r,k+1}(b) + \theta_2^{-1}\,E^x_{t_k}\!\left[p(X^\pi_{t_{k+1}})\,\frac{\Delta W_{r,k}}{\Delta_k}\right]\left(\alpha_{k+1}(b) + (1-\theta_2)\,\Delta_k\,\gamma_{k+1}(b)\right),$$
  $$y^{(\theta_1,\theta_2),R,0}_k(b, x) = E^x_{t_k}\!\left[p(X^\pi_{t_{k+1}})\right]\alpha_{k+1}(b),$$
  $$y^{(\theta_1,\theta_2),R,i}_k(b, x) = \Delta_k\,\theta_1\, f_k\!\left(y^{(\theta_1,\theta_2),R,i-1}_k(x),\, z^{(\theta_1,\theta_2),R}_k(x)\right) + h_k(b, x), \quad i = 1, \dots, I,$$
  $$h_k(b, x) = E^x_{t_k}\!\left[p(X^\pi_{t_{k+1}})\right]\left(\alpha_{k+1}(b) + \Delta_k(1-\theta_1)\,\gamma_{k+1}(b)\right),$$
  with
  $$y^{(\theta_1,\theta_2),R,I}_k(x) = \sum_{b=1}^B \mathbf{1}_{x \in B_{t_k}(b)}\; y^{(\theta_1,\theta_2),R,I}_k(b, x),$$
  and similarly for z.
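The index i runs over Picard iterations that resolve the implicit $\theta_1$-term. A scalar sketch of that fixed-point step (the driver, h, and all constants here are illustrative, not from the slides):

```python
def picard(f, h, dt, theta1, y0, n_iter):
    """Fixed-point iteration for the implicit step y = dt*theta1*f(y) + h.
    A contraction when dt*theta1*Lip(f) < 1, i.e. for small enough dt."""
    y = y0
    for _ in range(n_iter):
        y = dt * theta1 * f(y) + h
    return y

# Example driver f(y) = -2y: the fixed point is h / (1 + 2*dt*theta1).
y = picard(lambda y: -2.0 * y, h=1.0, dt=0.1, theta1=0.5, y0=0.0, n_iter=8)
print(abs(y - 1.0 / 1.1) < 1e-6)  # True
```

With dt*theta1*Lip(f) = 0.1 here, each sweep shrinks the error by a factor of ten, so a small fixed I suffices.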

  12. Refined Regression. Theorem 1. Assume that the real function v is bounded on compact sets and $\int v^2(x)\,\nu(dx) < \infty$. Then
  $$E_S\!\left[\iint |v(y) - \hat{\tilde v}(x, y)|^2\,\nu(dx, dy)\right] \le \vartheta(L') \sum_{B \in \mathcal{B}} \hat{E}\!\left[\iint_B \nu(dx, dy)\,\frac{Q\left(\log\!\left(\sum_{m=1}^M \mathbf{1}_B(X_m)\right) + 1\right)}{\sum_{m=1}^M \mathbf{1}_B(X_m)}\,E[\mathbf{1}_S]\right]$$
  $$+\; 8 \sum_{B \in \mathcal{B}} \hat{E}\!\left[\iint_B \nu(dx, dy)\left(\inf_{\phi \in H} \sup_{x \in B} E\!\left[|v(Y) - \phi(Y)|^2 \mid X = x\right] \wedge L'\right) E[\mathbf{1}_S]\right]$$
  $$+\; \hat{E}_S\!\left[\iint |v(y) - \hat{\tilde v}(x, y)|^2\,(1 - \mathbf{1}_A(y))\,\nu(dx, dy)\right].$$
  That is, the regression error splits into a statistical term that vanishes as the number of paths per bundle grows, a best-approximation term for the basis H on each bundle, and a tail term outside the truncation set A.

  13. Example 1. We consider the BSDE
  $$dX_t = dW_t,$$
  $$dY_t = -\left(Y_t Z_t - Z_t + 2.5\,Y_t - \sin(t + X_t)\cos(t + X_t) - 2\sin(t + X_t)\right)dt + Z_t\,dW_t,$$
  with the initial and terminal conditions $x_0 = 0$ and $Y_T = \sin(X_T + T)$. The exact solution is given by
  $$(Y_t, Z_t) = (\sin(X_t + t), \cos(X_t + t)).$$
  The terminal time is set to $T = 1$, so $(Y_0, Z_0) = (0, 1)$. We run the examples with the basis functions $\eta(x) = (1, x, x^2)$ and bundle based on the value of x.
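The stated exact solution can be verified via Ito's formula: with $Y_t = u(t, X_t)$ and $dX_t = dW_t$, we need $u_t + \tfrac{1}{2}u_{xx} = -f(t, x, u, u_x)$ and $Z_t = u_x$. A quick numerical check of this identity (our own, not on the slides), using the closed-form derivatives of $u(t,x) = \sin(x+t)$:

```python
import numpy as np

u  = lambda t, x: np.sin(x + t)
ux = lambda t, x: np.cos(x + t)   # Z_t = u_x, since sigma = 1

def f(t, x, y, z):                # driver of Example 1
    return y*z - z + 2.5*y - np.sin(t + x)*np.cos(t + x) - 2*np.sin(t + x)

t, x = np.meshgrid(np.linspace(0, 1, 5), np.linspace(-2, 2, 5))
u_t  = np.cos(x + t)              # closed-form derivatives of u
u_xx = -np.sin(x + t)
residual = u_t + 0.5*u_xx + f(t, x, u(t, x), ux(t, x))
print(np.abs(residual).max())     # ~ 0 (machine precision)
```

The residual vanishes identically because the driver was constructed so that $-f = \cos(x+t) - \tfrac{1}{2}\sin(x+t) = u_t + \tfrac{1}{2}u_{xx}$.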

  14. Test Case. Test configurations (all for Example 1):

  Example | θ1  | θ2  | I | M      | N   | B   | L
  D       | 0.5 | 0.5 | 4 | 2^{2J} | 2^J | 2^J | 100
  E       | 0.5 | 0.5 | 4 | 2^{2J} | 2^J | 2^J | 10000
  F       | 0.5 | 0.5 | 4 | 2^{2J} | 2^J | 2^J | ∞

  Error $|Y_0 - y^{(\theta_1,\theta_2),R}_0(x_0)|$:

  J | 2                | 3                | 4                | 5
  D | NA               | 9.2870 × 10^{-2} | 1.0114 × 10^{-1} | 8.1415 × 10^{-2}
  E | 29.2228          | 7.8601 × 10^{-1} | 3.9639 × 10^{-1} | 5.2388 × 10^{-2}
  F | 2.2154 × 10^{15} | 1.9059 × 10^{56} | 3.4731 × 10^{-1} | 5.8511 × 10^{-2}

  J | 6                | 7                | 8
  D | NA               | 3.9920 × 10^{-3} | 1.5486 × 10^{-2}
  E | 1.1931 × 10^{-2} | 1.2395 × 10^{-2} | 1.4347 × 10^{-3}
  F | 2.0485 × 10^{-3} | 6.8277 × 10^{-3} | 2.6705 × 10^{-3}

  15. Example 2: European option. We consider a market where the assets satisfy
  $$dS_{i,t} = \mu_i S_{i,t}\,dt + \sigma_i S_{i,t}\,dB_{i,t}, \quad 1 \le i \le q,$$
  with $B_t$ a correlated q-dimensional Wiener process with $dB_{i,t}\,dB_{j,t} = \rho_{ij}\,dt$. The parameters $\rho_{ij}$ form a symmetric matrix
  $$\rho = \begin{pmatrix} 1 & \rho_{12} & \rho_{13} & \cdots & \rho_{1q} \\ \rho_{21} & 1 & \rho_{23} & \cdots & \rho_{2q} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \rho_{q1} & \rho_{q2} & \rho_{q3} & \cdots & 1 \end{pmatrix},$$
  which we assume to be invertible. By performing a Cholesky decomposition on $\rho$ such that $LL^T = \rho$, we relate $B_t$ to a standard Brownian motion via $B_t = L W_t$.
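The Cholesky construction above translates directly into an Euler simulation of the correlated asset dynamics; a small sketch with illustrative parameters of our choosing (not the paper's 5-dimensional test values):

```python
import numpy as np

rng = np.random.default_rng(0)
q, n_steps, dt = 3, 100, 0.01
mu, sigma = np.full(q, 0.05), np.full(q, 0.2)   # illustrative drift and vol
rho = np.full((q, q), 0.25) + 0.75 * np.eye(q)  # a sample correlation matrix
L = np.linalg.cholesky(rho)                     # L L^T = rho, so B_t = L W_t

S = np.empty((n_steps + 1, q))
S[0] = 100.0
for n in range(n_steps):
    dW = np.sqrt(dt) * rng.normal(size=q)  # independent Brownian increments
    dB = L @ dW                            # correlated increments dB = L dW
    S[n + 1] = S[n] * (1.0 + mu * dt + sigma * dB)
print(S.shape)
```

Only the independent increments dW enter the BSDE on the next slide; the correlation is absorbed into the drift term through $L^{-1}$.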

  16. Example 2: European option (cont.) For a European option with terminal payoff $g(S_T)$, the replicating portfolio $Y_t$, containing $\omega_{i,t}$ units of asset $S_{i,t}$, and $Z_t = (\omega_{1,t}\sigma_1 S_{1,t}, \dots, \omega_{q,t}\sigma_q S_{q,t})\,L$ solve the BSDE
  $$dY_t = -\left(-r Y_t - Z_t L^{-1}\,\frac{\mu - r}{\sigma}\right)dt + Z_t\,dW_t, \qquad Y_T = g(S_T),$$
  where $\frac{\mu - r}{\sigma} := \left(\frac{\mu_1 - r}{\sigma_1}, \dots, \frac{\mu_q - r}{\sigma_q}\right)^T$. In this numerical test, we use the 5-dimensional example from [Reisinger and Wittum, 2007].
