

slide-1
SLIDE 1

Introduction Quantization Recursive marginal quantization Results and perspectives

From Martingales in Finance to Quantization for pricing

Giorgia Callegaro

Università di Padova

Workshop on Martingales in Finance and Physics ICTP, 24 May 2019.

Mostly based on recent papers with Lucio Fiorin and Martino Grasselli.

1 / 24

slide-2
SLIDE 2


Martingales in Finance: why?

[Arbitrage] Intuition: a possibility of a riskless profit. An arbitrage is an investment strategy whose cost today is non-positive, whose (portfolio) value tomorrow is non-negative and strictly positive with positive probability (recall today's first talk).

[Viability] A market is viable if there is no arbitrage opportunity: we write AOA (absence of arbitrage opportunities).

[Key theorem] The market is viable if and only if there exists a probability measure Q equivalent to P such that the discounted asset prices are Q-martingales.

P: real-world measure; Q: risk-neutral probability.

slide-3
SLIDE 3

Pricing and hedging in a nutshell

[Call and Put] A Call option gives the holder the opportunity to buy an underlying asset X, at a fixed time T and at a specified cost (strike) K > 0: its value is F^C_T = (X_T − K)^+. The Put option analogously gives the holder the right to sell: F^P_T = (K − X_T)^+.

[Pricing] The price at time t of a European option, whose payoff is F_T = f(X_T), is

E^Q[ e^{−r(T−t)} f(X_T) | F_t ]

where (F_t)_{t∈[0,T]} is the available filtration.

[Hedging] An investment strategy whose portfolio value coincides with (replicates) the option's value at any time.

slide-4
SLIDE 4

What if we discretize XT?

When the random variable X_T takes only finitely many values, the expectation (price) is computed as a finite sum. Quantization: approximating a signal (random variable) admitting a continuum of possible values by a signal that takes values in a discrete set.

Figure: Picture taken from Mc Walter et al [9]

slide-5
SLIDE 5

Quantization: a brief history

[Birth] Goes back to the 50's, to optimize signals' transmission.

[Two worlds]
Vector quantization: random variables.
Functional quantization: stochastic processes.

[Applications] Information theory, cluster analysis, pattern and speech recognition, numerical integration and probability.

[How?] Numerical procedures mostly based on stochastic optimization algorithms, which are very time consuming.

Today’s menu: discretize random variables and stochastic processes in a fast and efficient way via recursive marginal quantization.

slide-6
SLIDE 6

Vector quantization: some Math

Given an R^d-valued random variable X on (Ω, A, P) (or Q), with X ∈ L^r, N-quantizing X on a grid Γ = (x_1, ..., x_N) consists in projecting X on Γ. In order to univocally define the projection function, we need to specify a partition of R^d, (C_i)_{1≤i≤N}, so that

Proj_Γ(X) = Σ_{i=1}^N x_i 1_{C_i}(X).

The induced L^r error

||X − Proj_Γ(X)||_r = E[ min_{1≤i≤N} |X − x_i|^r ]^{1/r}

is called the L^r-mean quantization error.

N.B. For a complete background on optimal quantization: Graf and Luschgy [7]
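To make the projection and the induced error concrete, here is a minimal 1-d sketch in Python/NumPy (my illustration, not part of the talk); the Gaussian test variable and the two grid sizes are arbitrary choices:

```python
import numpy as np

def project(x, grid):
    """Nearest-neighbour projection Proj_Gamma of the samples x onto the grid."""
    idx = np.argmin(np.abs(x[:, None] - grid[None, :]), axis=1)  # Voronoi cell of each sample
    return grid[idx]

def l2_quantization_error(x, grid):
    """Monte Carlo estimate of ||X - Proj_Gamma(X)||_2 from samples of X."""
    return float(np.sqrt(np.mean((x - project(x, grid)) ** 2)))

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)        # samples of X ~ N(0, 1)
coarse = np.array([-1.0, 0.0, 1.0])     # N = 3 grid
fine = np.linspace(-3.0, 3.0, 25)       # N = 25 grid
err_coarse = l2_quantization_error(x, coarse)
err_fine = l2_quantization_error(x, fine)
```

Enlarging the grid shrinks the error, consistently with the Zador theorem quoted later.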

slide-7
SLIDE 7

Quadratic optimal quantization (r = 2)

Let us focus, from now on, on r = 2! How do we choose the N points in Γ? By minimizing the L^2 error!

A grid Γ⋆ minimizing the L^2 quantization error over all the grids with size at most N is the optimal quadratic quantizer.

The projection of X on Γ⋆, written Proj_{Γ⋆}(X), X^{Γ⋆}, or simply X̂, is called the quantization of X, and the associated partition

C_i(Γ⋆) ⊂ { ξ ∈ R^d : |ξ − x_i| = min_{1≤j≤N} |ξ − x_j| }

is called the Voronoï partition, or tessellation, induced by Γ⋆.

Proj_{Γ⋆}(X) is defined as the closest-neighbor projection on Γ⋆.

slide-8
SLIDE 8

Vector quantization: example (N = 50)


Figure: Optimal quantizer and tessellation of a 2-d Gaussian r.v.

slide-9
SLIDE 9

Vector quantization: some useful facts

Theory:

The L^r error goes to zero as N → +∞ (Zador theorem).
The distortion function (the quadratic quantization error squared) always reaches a minimum at an N-tuple Γ⋆ having pairwise distinct components.

Practice:

d = 1: optimal quantizers can be obtained via a standard Newton-Raphson procedure.
d ≥ 2: stochastic gradient descent algorithms are required (or standard gradient descent when the distribution can be easily simulated).
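For d = 1, a closely related alternative to the Newton-Raphson procedure mentioned above is the Lloyd fixed-point iteration, which repeatedly moves each point to the mean of its Voronoi cell. The sample-based sketch below is my illustration (target law, grid size and iteration count are arbitrary), not the talk's implementation:

```python
import numpy as np

def lloyd_1d(samples, n_points, n_iter=60):
    """Sample-based Lloyd iteration for a 1-d quadratic quantizer.

    Each sweep replaces every grid point by the mean of the samples in its
    Voronoi cell, i.e. it enforces the stationarity condition x_i = E[X | C_i].
    """
    grid = np.quantile(samples, (np.arange(n_points) + 0.5) / n_points)
    for _ in range(n_iter):
        idx = np.argmin(np.abs(samples[:, None] - grid[None, :]), axis=1)
        for i in range(n_points):
            cell = samples[idx == i]
            if cell.size:            # keep the old point if its cell is empty
                grid[i] = cell.mean()
        grid.sort()
    return grid

rng = np.random.default_rng(1)
grid = lloyd_1d(rng.standard_normal(200_000), 5)   # 5-point quantizer of N(0, 1)
```

For a symmetric law such as N(0,1), the converged grid is (up to Monte Carlo noise) symmetric around zero, as expected for the optimal quantizer.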

slide-10
SLIDE 10

Vector quantization: fixing the ideas

Optimal quadratic quantization of X gives access to an N-tuple Γ = {x_1, x_2, ..., x_N} which minimizes the L^2 distance between X and X^Γ. This provides the best possible quadratic approximation of a random vector X by a random vector taking (at most) N values.

slide-11
SLIDE 11

Vector quantization: numerical integration

Given an integrable function f, a random variable X and a (hopefully optimal) quantizer Γ = {x_1, ..., x_N}, E[f(X)] can be approximated by the finite sum

E[f(X^Γ)] = Σ_{i=1}^N f(x_i) P(X^Γ = x_i).

If f is Lipschitz continuous, then

| E[f(X)] − E[f(X^Γ)] | ≤ [f]_Lip ||X − X^Γ||_2

and ||X − X^Γ||_2 → 0 as N → ∞ (Zador theorem).

N.B. When f is smoother this error bound can be significantly improved.
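A small illustration of this cubature formula (my example, not from the talk): X ~ N(0,1), the Lipschitz payoff f(x) = (x − K)^+ with K = 0.5, and a deliberately non-optimal uniform grid whose cell weights come from the Gaussian CDF:

```python
import math
import numpy as np

def gaussian_cell_weights(grid):
    """P(X in C_i(Gamma)) for X ~ N(0, 1) and the Voronoi cells of a sorted 1-d grid."""
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    edges = np.concatenate(([-np.inf], (grid[1:] + grid[:-1]) / 2.0, [np.inf]))
    return np.array([cdf(b) - cdf(a) for a, b in zip(edges[:-1], edges[1:])])

grid = np.linspace(-3.0, 3.0, 25)            # illustrative (non-optimal) quantizer
weights = gaussian_cell_weights(grid)
payoff = np.maximum(grid - 0.5, 0.0)         # f(x) = (x - K)^+ with K = 0.5
price = float(np.sum(payoff * weights))      # E[f(X^Gamma)], the finite sum above
# Closed form for comparison: E[(X - K)^+] = phi(K) - K (1 - Phi(K)).
exact = math.exp(-0.125) / math.sqrt(2 * math.pi) \
        - 0.5 * (1.0 - 0.5 * (1.0 + math.erf(0.5 / math.sqrt(2.0))))
```

Even this crude uniform grid reproduces the exact Gaussian expectation to a few parts in a thousand; an optimal grid of the same size does better.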

slide-12
SLIDE 12

Vector quantization: towards stationary quantizers

What do we need in practice to quantize X?

The grid Γ⋆ = {x_1, x_2, ..., x_N}.
The weights of the cells in the Voronoï tessellation:

P(X ∈ C_i(Γ⋆)) = P(X^{Γ⋆} = x_i), i = 1, ..., N.

From a numerical point of view, finding an optimal quantizer may be a very challenging and time-consuming task. This motivates the introduction of sub-optimal criteria: stationary quantizers.

slide-13
SLIDE 13

Stationary quantizers

Definition: Γ = {x_1, ..., x_N} is stationary for X if

E[X | X^Γ] = X^Γ.

Optimal quantizers are stationary.
Stationary quantizers Γ are critical points of the distortion function:

∇D(Γ) = 0    (1)

where the distortion function is the square of the L^2 error:

D(Γ) := Σ_{i=1}^N ∫_{C_i(Γ)} |u − x_i|^2 dP_X(u).
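In dimension one with X ~ N(0,1), both conditions can be checked in closed form. The sketch below (illustrative, not from the talk) evaluates ∇D(Γ) analytically: it vanishes at the known optimal two-point grid ±√(2/π) and not at an arbitrary one:

```python
import math

def distortion_gradient(grid):
    """Gradient of D(Gamma) for X ~ N(0, 1) and a sorted 1-d grid.

    Component i is 2 * integral over C_i of (x_i - u) phi(u) du,
    with phi the standard normal density and Phi its CDF.
    """
    phi = lambda z: math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    edges = [-math.inf] + [(u + v) / 2.0 for u, v in zip(grid, grid[1:])] + [math.inf]
    # integral_a^b (x_i - u) phi(u) du = x_i (Phi(b) - Phi(a)) + phi(b) - phi(a)
    return [2.0 * (xi * (cdf(b) - cdf(a)) + phi(b) - phi(a))
            for xi, a, b in zip(grid, edges, edges[1:])]

c = math.sqrt(2.0 / math.pi)          # the optimal (hence stationary) 2-point grid is {-c, c}
grad_opt = distortion_gradient([-c, c])
grad_bad = distortion_gradient([-1.0, 1.0])
```

The zero-search procedures mentioned on the next slide look for exactly the roots of this gradient.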

slide-14
SLIDE 14

From stationary quantizers to the quantization of a stochastic process

Stationary quantizers are interesting from a numerical point of view: they can be found through zero search recursive procedures like Newton’s algorithm.

We thus settle for stationary (sub-optimal) quantizers. How can we quantize a stochastic process with these ideas?

slide-15
SLIDE 15

“Step by step marginal quantization”: warm up

Recently introduced by Pagès and Sagna [8]

Consider a continuous-time Markov process Y:

dY_t = b(t, Y_t) dt + a(t, Y_t) dW_t,   Y_0 = y_0 > 0,

where W is a standard Brownian motion and a and b satisfy the usual conditions ensuring the existence of a (strong) solution to the SDE.

Given T > 0 and {0 = t_0, t_1, ..., t_M = T}, with Δ_k = t_k − t_{k−1}, k ≥ 1, the Euler scheme is

Ỹ_{t_k} = Ỹ_{t_{k−1}} + b(t_{k−1}, Ỹ_{t_{k−1}}) Δ_k + a(t_{k−1}, Ỹ_{t_{k−1}}) ΔW_k,
Ỹ_{t_0} = Y_0 = y_0,

where ΔW_k := W_{t_k} − W_{t_{k−1}} ∼ N(0, Δ_k).
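The scheme above can be sketched directly (my Python illustration; the geometric-Brownian-motion coefficients at the bottom are an arbitrary sanity check, not the talk's model):

```python
import math
import numpy as np

def euler_paths(b, a, y0, T, M, n_paths, rng):
    """Euler scheme for dY = b(t, Y) dt + a(t, Y) dW on a uniform grid 0 = t_0 < ... < t_M = T."""
    dt = T / M
    y = np.full(n_paths, float(y0))
    out = [y.copy()]
    for k in range(M):
        dW = rng.normal(0.0, math.sqrt(dt), size=n_paths)   # Delta W_k ~ N(0, Delta_k)
        y = y + b(k * dt, y) * dt + a(k * dt, y) * dW
        out.append(y.copy())
    return np.array(out)                                    # shape (M + 1, n_paths)

# Sanity check with b(t, y) = r y, a(t, y) = sigma y (geometric Brownian motion):
rng = np.random.default_rng(3)
r, sigma = 0.05, 0.2
paths = euler_paths(lambda t, y: r * y, lambda t, y: sigma * y,
                    y0=1.0, T=1.0, M=100, n_paths=200_000, rng=rng)
mean_T = float(paths[-1].mean())        # should be close to exp(r T)
```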

slide-16
SLIDE 16

Key remark: for every k = 1, ..., M,

L( Ỹ_{t_k} | Ỹ_{t_{k−1}} = x ) ∼ N( m_{k−1}(x), σ^2_{k−1}(x) )    (2)

where

m_{k−1}(x) = x + b(t_{k−1}, x) Δ_k,
σ^2_{k−1}(x) = [a(t_{k−1}, x)]^2 Δ_k.

Idea: quantize recursively every marginal random variable Ỹ_{t_k} (vector quantization), exploiting (2).

slide-17
SLIDE 17

“Step by step marginal quantization”: stationary quantizers

The distortion function at time t_k, relative to Ỹ_{t_k}, is

D_k(Γ_k) = Σ_{i=1}^N ∫_{C_i(Γ_k)} (y − y^k_i)^2 P( Ỹ_{t_k} ∈ dy )

where N is the (fixed) size of the grid Γ_k = {y^k_1, y^k_2, ..., y^k_N}.

Target: Γ_k ∈ R^N such that ∇D_k(Γ_k) = 0.

Question: can we apply Newton-Raphson now? Answer: NO! We do NOT know the distribution of Ỹ_{t_k}!

slide-18
SLIDE 18

... using the conditional distribution in (2) we have

P( Ỹ_{t_k} ∈ dy ) = dy ∫_R φ_{m_{k−1}(y_{k−1}), σ_{k−1}(y_{k−1})}(y) P( Ỹ_{t_{k−1}} ∈ dy_{k−1} )

where φ_{m,σ} is the density function of a N(m, σ^2).

Replacing Ỹ by its quantization Ŷ, we deduce a recursive procedure to obtain the stationary quantizer at time t_k, based on the quantizer at time t_{k−1}, k ∈ {0, ..., M−1}. The distortion is continuously differentiable, so (via its gradient and Hessian matrix) Newton-Raphson applies: faster computations with respect to stochastic algorithms.
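The recursion can be sketched in a few lines. The procedure in the talk uses Newton-Raphson on the mixture density above; the sketch below (my illustration, with arbitrary coefficients and grid size) replaces it with the equivalent fixed-point update x_i ← E[Ỹ_{t_k} | C_i], evaluated in closed form against the Gaussian mixture:

```python
import math

def rmq_step(prev_grid, prev_w, b, a, t, dt, N, n_iter=100):
    """One recursive marginal quantization step in dimension one.

    Given the quantizer (prev_grid, prev_w) at t_{k-1}, the Euler step has the
    Gaussian-mixture law sum_j prev_w[j] N(m_j, s_j^2), with m_j = y_j + b(t, y_j) dt
    and s_j = |a(t, y_j)| sqrt(dt).  Each sweep moves every grid point to the
    conditional mean of its Voronoi cell under that mixture (a Lloyd-type
    stand-in for the Newton-Raphson update of Pagès and Sagna).
    """
    phi = lambda z: math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    ms = [(y + b(t, y) * dt, abs(a(t, y)) * math.sqrt(dt)) for y in prev_grid]
    mean = sum(w * m for w, (m, _) in zip(prev_w, ms))
    var = sum(w * (s * s + (m - mean) ** 2) for w, (m, s) in zip(prev_w, ms))
    grid = [mean + 2.0 * math.sqrt(var) * (2.0 * (i + 0.5) / N - 1.0) for i in range(N)]
    for _ in range(n_iter):
        edges = [-math.inf] + [(u + v) / 2.0 for u, v in zip(grid, grid[1:])] + [math.inf]
        new_grid, weights = [], []
        for i, (lo, hi) in enumerate(zip(edges, edges[1:])):
            mass = num = 0.0
            for w, (m, s) in zip(prev_w, ms):
                za, zb = (lo - m) / s, (hi - m) / s
                p = Phi(zb) - Phi(za)
                mass += w * p
                num += w * (m * p + s * (phi(za) - phi(zb)))  # partial first moment
            new_grid.append(num / mass if mass > 0.0 else grid[i])
            weights.append(mass)
        grid = new_grid
    return grid, weights

# Two steps for illustrative Black-Scholes coefficients, starting from Y_0 = 1:
b = lambda t, y: 0.05 * y
a = lambda t, y: 0.2 * y
g1, w1 = rmq_step([1.0], [1.0], b, a, t=0.0, dt=0.5, N=5)
g2, w2 = rmq_step(g1, w1, b, a, t=0.5, dt=0.5, N=5)
```

By construction the weighted grid reproduces the mixture mean exactly, so after two steps Σ_i y_i w_i equals (1 + 0.05 · 0.5)^2, the Euler mean of the scheme.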

slide-19
SLIDE 19

The algorithm

At every step k = 1, ..., M−1 of the algorithm:

What we need:
The (stationary) quantizer Ŷ_{k−1} at time t_{k−1}.
The weights.

What we do:
Newton-Raphson iterations until convergence to the stationary grid Γ_k = (y^k_1, ..., y^k_N) at time t_k.

What we get:
The quantization at time t_k:

Ŷ_k = Σ_{i=1}^N y^k_i 1_{C_i(Γ_k)}(Ỹ_k).

The weights.
The transition probabilities from time t_{k−1} to time t_k.

slide-20
SLIDE 20

Recent research and perspectives - 1

Recursive marginal quantization can be safely extended to discretize Y taking values in R^d: (local and) stochastic volatility models (d = 2). Example: the Heston model

dS_t / S_t = r dt + √V_t ( ρ dW^1_t + √(1 − ρ^2) dW^2_t ),
dV_t = κ(θ − V_t) dt + ξ √V_t dW^1_t,

where:

W^1 and W^2 are independent standard Brownian motions;
r is the risk-free interest rate;
θ is the long-run average variance;
κ is the mean-reversion speed;
ρ is the correlation;
ξ is the volatility of the variance process (vol of vol).
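For orientation (the talk quantizes the Euler scheme rather than simulating it), here is a plain Monte Carlo Euler sketch of these dynamics. Flooring V at zero ("full truncation") is a standard fix that the slide does not specify, and all parameter values are illustrative:

```python
import math
import numpy as np

def heston_euler(s0, v0, r, kappa, theta, xi, rho, T, M, n_paths, rng):
    """Euler scheme for dS/S = r dt + sqrt(V)(rho dW1 + sqrt(1 - rho^2) dW2),
    dV = kappa (theta - V) dt + xi sqrt(V) dW1, with V floored at 0."""
    dt = T / M
    s = np.full(n_paths, float(s0))
    v = np.full(n_paths, float(v0))
    for _ in range(M):
        z1 = rng.standard_normal(n_paths)
        z2 = rng.standard_normal(n_paths)
        dw1 = math.sqrt(dt) * z1                                       # drives V
        db = math.sqrt(dt) * (rho * z1 + math.sqrt(1 - rho**2) * z2)   # drives S
        vp = np.maximum(v, 0.0)                                        # full truncation
        s = s * (1.0 + r * dt + np.sqrt(vp) * db)
        v = v + kappa * (theta - vp) * dt + xi * np.sqrt(vp) * dw1
    return s, v

rng = np.random.default_rng(4)
sT, vT = heston_euler(s0=100.0, v0=0.04, r=0.03, kappa=1.5, theta=0.04,
                      xi=0.3, rho=-0.7, T=1.0, M=100, n_paths=100_000, rng=rng)
disc_mean = float(math.exp(-0.03) * sT.mean())   # discounted S_T: a Q-martingale check
```

The discounted simulated price stays close to S_0, reflecting the martingale property under Q from the opening slides.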

slide-21
SLIDE 21

RMQuantization of the Heston model


Quantization of the price process in the Heston model, N = 30.

slide-22
SLIDE 22

Recent research and perspectives - 2

Since transition probabilities are available, it is also possible to easily price exotic options (such as American ones).

Calibration to vanilla and American option prices is possible.

Challenges:

High dimension: machine learning?
Discretizing non-Markovian stochastic processes (e.g. rough-volatility models)?

slide-23
SLIDE 23

Bormetti, G., Callegaro, G., Livieri, G. and Pallavicini, A. (2018). A backward Monte Carlo approach to exotic option pricing. European Journal of Applied Mathematics, 29(1), pp. 146-187.

Callegaro, G., Fiorin, L. and Grasselli, M. (2015). Quantized calibration in local volatility. Risk, pp. 56-67.

Callegaro, G., Fiorin, L. and Grasselli, M. (2016). Pricing via quantization in stochastic volatility models. Quantitative Finance, 17(6), pp. 855-872.

Callegaro, G., Fiorin, L. and Grasselli, M. (2018). American quantized calibration in stochastic volatility. Risk, February 2018, pp. 84-88.

Callegaro, G., Fiorin, L. and Grasselli, M. (2018). Quantization meets Fourier: a new technology for pricing options. Annals of Operations Research, to appear.

Callegaro, G., Fiorin, L. and Pallavicini, A. (2018). Quantization goes polynomial. Preprint.

Graf, S. and Luschgy, H. (2000). Foundations of Quantization for Probability Distributions. Springer, New York.

Pagès, G. and Sagna, A. (2015). Recursive marginal quantization of the Euler scheme of a diffusion process. Applied Mathematical Finance, 22(5), pp. 463-498.

Mc Walter, T.A., Rudd, R., Kienitz, J. and Platen, E. (2018). Recursive marginal quantization of higher-order schemes. Quantitative Finance, 18(4), pp. 693-706.

slide-24
SLIDE 24

Thank you for your attention !!!
