SLIDE 1

WAVELET-PLS REGRESSION: Application to Oil Production Data

Salwa BenAmmou, Zied Kacem, Hédi Kortas and Zouheir Dhifaoui

Computational Mathematics Laboratory

SLIDE 2

Introduction

• Statisticians are often confronted with several problems, such as missing or incomplete data, strong collinearity between the explanatory variables, or cases where the number of variables exceeds the number of observations.

• The PLS method was proposed by Wold in the 1980s to cope with these problems.

SLIDE 3

Introduction

In practical applications, however, we are confronted with noise affecting the dataset. This noise component can strongly degrade the goodness of fit and the predictive performance of the PLS model.

SLIDE 4

Objective

We propose a hybrid data analysis method that combines wavelet thresholding techniques with PLS regression in order to remove or attenuate the effect of the noise.

SLIDE 5

Wavelet Theory: Multiresolution Analysis (MRA)

A MRA is a sequence $\{V_j\}_{j \in \mathbb{Z}}$ of closed subspaces of $L^2(\mathbb{R})$ satisfying:

(i) $V_j \subset V_{j+1}$ for all $j \in \mathbb{Z}$;
(ii) $f(\cdot) \in V_j \Leftrightarrow f(2\,\cdot) \in V_{j+1}$;
(iii) $\bigcap_{j} V_j = \{0\}$ and $\overline{\bigcup_{j} V_j} = L^2(\mathbb{R})$.

There exists a function $\phi \in V_0$ such that $\{\phi(\cdot - k) : k \in \mathbb{Z}\}$ is an ONB of $V_0$; $\phi$ is called the scaling function.

SLIDE 6

Wavelet Theory: Basic concepts

The scaling function $\phi$ generates by dilation and translation the family
$$\phi_{jk}(x) := 2^{j/2}\,\phi(2^{j}x - k), \qquad k \in \mathbb{Z},$$
and $\{\phi_{jk} : k \in \mathbb{Z}\}$ is an ONB of $V_j$. Let $W_j$ be the orthogonal complement of $V_j$ in $V_{j+1}$, so that $V_{j+1} = V_j \oplus W_j$.

There exists a function $\psi$ such that $\{\psi(\cdot - k) : k \in \mathbb{Z}\}$ is an ONB of $W_0$; $\psi$ is called the wavelet function and satisfies
$$\psi_{jk}(x) := 2^{j/2}\,\psi(2^{j}x - k), \qquad k \in \mathbb{Z},$$
where $\{\psi_{jk} : k \in \mathbb{Z}\}$ is an ONB of $W_j$.

SLIDE 7

Wavelet Theory: Basic concepts

Thus a function $f$ has a unique representation in terms of a convergent series in $L^2$:
$$f(x) = \sum_{k} c_{k}\,\phi_{k}(x) + \sum_{j}\sum_{k} d_{jk}\,\psi_{jk}(x) \qquad (1)$$
where
$$c_{k} = \int f(x)\,\phi_{k}(x)\,dx \qquad \text{and} \qquad d_{jk} = \int f(x)\,\psi_{jk}(x)\,dx.$$

SLIDE 8

Wavelet thresholding

The thresholding strategy consists of three steps (a code sketch follows the list):

• Apply the DWT to the observed data sequence to produce a set of scale-wise approximation and detail coefficients.
• Keep the detail coefficients $d_{jk}$ that lie above a fixed threshold level and set to zero the coefficients that lie below it.
• Reconstruct the signal from the thresholded coefficients.
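A minimal sketch of these three steps in Python, using the PyWavelets library (a library choice we assume for illustration; the slides do not name a software package, and the universal threshold used below is a standard default rather than the authors' stated rule):

```python
import numpy as np
import pywt

def denoise(signal, wavelet="db5", level=5, mode="soft"):
    # Step 1: DWT decomposition into approximation + detail coefficients.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]

    # Threshold: universal rule sigma * sqrt(2 log n), with sigma
    # estimated by the MAD of the finest-scale details (an assumption;
    # the slides do not specify the threshold value).
    sigma = np.median(np.abs(details[-1])) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(len(signal)))

    # Step 2: zero the detail coefficients below the threshold.
    details = [pywt.threshold(d, lam, mode=mode) for d in details]

    # Step 3: reconstruct the signal by the inverse DWT.
    return pywt.waverec([approx] + details, wavelet)
```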

SLIDE 9

Thresholding techniques

• Hard thresholding:
$$\delta^{H}_{\lambda}(x) = \begin{cases} x & \text{if } |x| > \lambda \\ 0 & \text{if } |x| \le \lambda \end{cases}$$

• Soft thresholding:
$$\delta^{S}_{\lambda}(x) = \begin{cases} \operatorname{sign}(x)\,(|x| - \lambda) & \text{if } |x| > \lambda \\ 0 & \text{if } |x| \le \lambda \end{cases}$$
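In code, the two rules can be written directly (a plain NumPy sketch; the function names are ours):

```python
import numpy as np

def hard_threshold(x, lam):
    # Keep coefficients whose magnitude exceeds lambda; zero the rest.
    return np.where(np.abs(x) > lam, x, 0.0)

def soft_threshold(x, lam):
    # Additionally shrink the surviving coefficients toward zero by lambda.
    return np.where(np.abs(x) > lam, np.sign(x) * (np.abs(x) - lam), 0.0)
```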

SLIDE 10

The linear wavelet estimator

The linear wavelet estimator $\hat{f}$ of the function $f$ is given by:
$$\hat{f}(x) = \sum_{k} \hat{c}_{jk}\,\phi_{jk}(x) + \sum_{j}\sum_{k} \hat{d}_{jk}\,\psi_{jk}(x) \qquad (2)$$
where the $\hat{d}_{jk}$ are the thresholded wavelet detail coefficients and the $\hat{c}_{jk}$ are the approximation coefficients, computed empirically as
$$\hat{c}_{jk} = \frac{1}{n}\sum_{i=1}^{n} Y_i\,\phi_{jk}(X_i) \qquad \text{and} \qquad \hat{d}_{jk} = \frac{1}{n}\sum_{i=1}^{n} Y_i\,\psi_{jk}(X_i).$$
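To make the empirical coefficients concrete, here is a small NumPy sketch that evaluates them for the Haar basis (an illustrative choice on our part; the application itself uses a Daubechies wavelet, which has no closed form), assuming the design points $X_i$ lie in $[0, 1]$:

```python
import numpy as np

# Haar scaling and wavelet functions, supported on [0, 1).
def phi(x):
    return np.where((x >= 0) & (x < 1), 1.0, 0.0)

def psi(x):
    return np.where(x < 0.5, 1.0, -1.0) * phi(x)

def phi_jk(x, j, k):
    return 2.0 ** (j / 2) * phi(2.0 ** j * x - k)

def psi_jk(x, j, k):
    return 2.0 ** (j / 2) * psi(2.0 ** j * x - k)

def empirical_coeffs(X, Y, j, k):
    # c_hat_jk = (1/n) * sum_i Y_i * phi_jk(X_i); d_hat_jk likewise with psi.
    n = len(X)
    return (Y * phi_jk(X, j, k)).sum() / n, (Y * psi_jk(X, j, k)).sum() / n
```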

SLIDE 11

PLS regression (Partial Least Squares Regression)

  • PLS regression (PLS) is a linear model linking a set of dependent variables Y to a set of numerical or categorical explanatory variables X.
  • It is often utilized to handle highly correlated regressors.
  • It is of great interest when dealing with data sets in which the number of predictors greatly exceeds the number of observations.
  • It allows dealing with the problem of missing data.
SLIDE 12

PLS1 regression

Univariate PLS regression (PLS1) is a linear model linking a dependent variable $Y$ to a set of numerical or categorical explanatory variables $X_1, \dots, X_k$.

The PLS1 regression algorithm involves several steps:

  • Construction of the first PLS component $t_1$:
$$t_1 = w^{*}_{11} X_1 + \dots + w^{*}_{1k} X_k \qquad (3)$$
  • Normalisation of the coefficients $w^{*}_{1j}$:
$$w^{*}_{1j} = \frac{w_{1j}}{\sqrt{\sum_{j=1}^{k} w_{1j}^{2}}}$$

SLIDE 13

PLS1 regression

  • Perform an OLS regression of $Y$ on $t_1$:
$$Y = c_1 t_1 + Y_1,$$
where $c_1$ is the regression coefficient and $Y_1$ is the residual vector. Therefore:
$$\hat{Y} = c_1 w^{*}_{11} X_1 + \dots + c_1 w^{*}_{1k} X_k.$$

If the model has limited explanatory power, we search for a second component $t_2$ that is not correlated with $t_1$ and explains the residual vector $Y_1$ well.

SLIDE 14

PLS1 Regression

  • $t_2$ can be written as:
$$t_2 = w^{*}_{21} x_1 + \dots + w^{*}_{2k} x_k$$
  • We perform a multiple regression of $Y$ on $t_1$ and $t_2$:
$$Y = c_1 t_1 + c_2 t_2 + Y_2,$$
where $c_1$, $c_2$ are the regression coefficients and $Y_2$ is the residual vector.
  • The number of components $t_h$ to be retained is determined by cross-validation. A code sketch of the whole algorithm follows.
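A compact NumPy sketch of the PLS1 steps from the last three slides (a minimal NIPALS-style implementation under the assumption of centered data; names and deflation details are ours):

```python
import numpy as np

def pls1(X, y, n_components):
    """PLS1 on a centered (n, k) predictor matrix X and centered response y."""
    X_h, y_h = X.copy(), y.copy()
    W, T, C = [], [], []
    for _ in range(n_components):
        # Weights proportional to cov(X_j, y), then normalised (eq. 3).
        w = X_h.T @ y_h
        w /= np.sqrt(np.sum(w ** 2))
        t = X_h @ w                    # PLS component t_h
        c = (t @ y_h) / (t @ t)        # OLS coefficient of y on t_h
        y_h = y_h - c * t              # residual vector (Y_1, Y_2, ...)
        # Deflate X so that the next component is uncorrelated with t_h.
        p = (X_h.T @ t) / (t @ t)
        X_h = X_h - np.outer(t, p)
        W.append(w); T.append(t); C.append(c)
    # Fitted values can be recovered as y_hat = sum_h c_h * t_h.
    return np.column_stack(W), np.column_stack(T), np.array(C)
```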

SLIDE 15

Wavelet-PLS
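This slide presumably carried a diagram of the hybrid procedure. A minimal sketch of the combination, reusing the `denoise` and `pls1` helpers sketched earlier (our names; that the response is denoised together with the regressors is our reading of the slides):

```python
import numpy as np

def wavelet_pls(X, y, n_components, wavelet="db5", level=5):
    # Step 1: denoise each variable by wavelet thresholding
    # (the [:n] slice guards against the extra sample that the
    # inverse DWT can return for odd-length inputs).
    n = X.shape[0]
    Xd = np.column_stack([denoise(X[:, j], wavelet, level)[:n]
                          for j in range(X.shape[1])])
    yd = denoise(y, wavelet, level)[:n]
    # Step 2: center the denoised data and run PLS1 on it.
    Xd -= Xd.mean(axis=0)
    yd -= yd.mean()
    return pls1(Xd, yd, n_components)
```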

SLIDE 16

Application

  • The response variable Y: the daily crude oil (petroleum) production, in barrels, of a given oil field composed of four wells, over the period from May 1, 2003 to March 31, 2006, i.e. 1024 observations.
  • The data measurements are made on a daily basis.
  • The response variable Y depends on 16 explanatory variables:

Choke i: the choke valve position in oil well i; i = 1,…, 4.

FTHP i: the Flowing Tubing Head Pressure of well i (in bars); i = 1,…, 4.

SLIDE 17

Pres at Choke i: the pressure at the level of the choke in well i (in bars); i = 1,…, 4.

WC i (Water cut): the percentage of water, i.e. the ratio of water produced to the volume of total liquids extracted from well i; i = 1,…, 4.

SLIDE 18

Wavelet thresholding set-up

  • We use a Daubechies compactly supported wavelet with 5 vanishing moments.
  • The discrete wavelet transform is curtailed at scale j = 5.
  • We opt for soft thresholding.
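In terms of the `denoise` helper sketched earlier (our naming), this set-up corresponds to the call `denoise(y, wavelet="db5", level=5, mode="soft")`; in PyWavelets, "db5" denotes the Daubechies wavelet with 5 vanishing moments. The threshold value itself is not stated on the slide.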

SLIDE 19

Signal before (green) and after thresholding (black)

SLIDE 20

Specification of the number of components by cross-validation

PLS1:
  h (components)   Q²h      limit
  1                0.742    0.0975
  2                0.319    0.0975
  3                0.393    0.0975
  4                0.186    0.0975
  5                0.0663   0.0975

Wavelet-PLS:
  h (components)   Q²h      limit
  1                0.734    0.0975
  2                0.287    0.0975
  3                0.237    0.0975
  4                0.266    0.0975
  5                0.038    0.0975
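For reference, the criterion behind these tables is the standard PLS cross-validation rule (as in Tenenhaus; the slides do not spell it out, so this gloss is ours):

$$Q_h^2 = 1 - \frac{\mathrm{PRESS}_h}{\mathrm{RSS}_{h-1}},$$

and a component $t_h$ is judged significant when $Q_h^2 \ge 1 - 0.95^2 = 0.0975$, which is where the constant in the limit column comes from.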

SLIDE 21

The PLS1 equation before thresholding:
ŷ = 0.14745477 x1 + 0.12351255 x2 + 0.29458188 x3 + 0.16206525 x4 - 0.27695889 x5 + 0.03891265 x6 - 0.1728005 x7 - 0.14108841 x8 + 0.28230372 x9 + 0.27352113 x10 + 0.23676341 x11 + 0.08288938 x12 + 0.01417857 x13 - 0.19398681 x14 - 0.00272167 x15 + 0.00767741 x16

The PLS1 equation after thresholding:
ŷ = 0.076750638 x1 + 0.073312704 x2 + 0.314558779 x3 + 0.116011568 x4 - 0.268962544 x5 + 0.002680218 x6 - 0.124656262 x7 - 0.254468339 x8 + 0.338198727 x9 + 0.317734483 x10 + 0.277136406 x11 + 0.053406536 x12 - 0.028291771 x13 - 0.13101302 x14 - 0.028039241 x15 - 0.01063908 x16

SLIDE 22

Outliers

PLS1 before thresholding

9.6% of the total sample are regarded as outliers

PLS1 after thresholding

8.7% of the observations are regarded as outliers

SLIDE 23

Confidence ellipsoids

(Figure: score plots with confidence ellipsoids for the raw data and the denoised data.)

SLIDE 24

Goodness of fit

The $R^2_1$ values (PLS1 on the raw data) are much closer to zero than the $R^2_2$ values (Wavelet-PLS on the denoised data). This shows the effectiveness of the wavelet techniques for noise removal.

SLIDE 25

Mean Squared Errors

It is clear that the $\mathrm{MSE}_2$ values (Wavelet-PLS) are much smaller than the $\mathrm{MSE}_1$ values (PLS1). This confirms the relevance of the Wavelet-PLS method.

SLIDE 26

Conclusion

The Wavelet-PLS approach allowed us to:

  • reduce the number of outliers;
  • reduce the mean squared error;
  • correct the observations in the score plot;
  • improve the goodness of fit of the model.

SLIDE 27

Thanks