
How to Make Plausibility-Based Forecasting More Accurate - PowerPoint PPT Presentation



How to Make Plausibility-Based Forecasting More Accurate

Kongliang Zhu (1), Nantiworn Thianpaen (2), and Vladik Kreinovich (3)

(1) Faculty of Economics, Chiang Mai University, Thailand, 258zkl@gmail.com
(2) Faculty of Economics, Chiang Mai University, Thailand, and Faculty of Management Sciences, Suratthani Rajabhat University, Thailand, nantiworn@outlook.com
(3) Department of Computer Science, University of Texas at El Paso, El Paso, TX 79968, USA, vladik@utep.edu

1. Outline

• In recent papers, a new plausibility-based forecasting method was proposed.
• This method has been empirically successful.
• One of the steps – selecting a uniform probability distribution for the plausibility level – is heuristic.
• Is this selection optimal, or would a modified selection lead to a more accurate forecast?
• In this talk, we show that the uniform distribution does not always lead to (asymptotically) optimal estimates.
• We show how to modify this step so that the resulting estimates become asymptotically optimal.

2. Need for Prediction

• One of the main objectives of science is:
  – given the available data x_1, ..., x_n,
  – to predict future values of different quantities y.
• The usual approach to solving this problem consists of two stages:
  – first, we find a model that describes the observed data; and
  – then, we use this model to predict the future value of each of the quantities y.
• Often, it is sufficient to have a deterministic model: x_i = f_i(p) and y = f(p) for some parameters p.
• We use the observed values to estimate p; then, we use these estimates to predict the desired future values y (a minimal sketch follows below).
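To make the two-stage scheme concrete, here is a minimal Python sketch under an illustrative assumption: a linear model f_i(p) = p_1 + p_2·t_i with made-up observation times and data (none of these come from the presentation). Stage one estimates p by least squares; stage two uses the estimate to predict a future value y.

```python
import numpy as np

# Hypothetical deterministic model: x_i = f_i(p) = p1 + p2 * t_i
# (a linear trend is used purely for illustration).
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # observation times
x = np.array([1.1, 2.9, 5.2, 7.1, 8.9])   # observed values x_1, ..., x_n

# Stage 1: estimate the parameters p from the observed data
# (least squares on the design matrix [1, t_i]).
A = np.column_stack([np.ones_like(t), t])
p_hat, *_ = np.linalg.lstsq(A, x, rcond=None)

# Stage 2: use the fitted model to predict the future value y = f(p)
# at a future time t_future.
t_future = 6.0
y_pred = p_hat[0] + p_hat[1] * t_future
print("estimated p:", p_hat, "predicted y:", y_pred)
```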

3. Deterministic Prediction and Beyond

• This is how, e.g., solar eclipses can be predicted for centuries ahead; here:
  – the parameters p include the initial locations, initial velocities, and masses of all the celestial bodies;
  – the observations x_i are the visible locations of celestial bodies at different moments of time.

4. Need for Statistical Prediction

• In most practical problems, a fully deterministic prediction is not possible:
  – in addition to the parameters p,
  – both the observed values x_i and the future value y are affected by parameters z_j beyond our control,
  – parameters that can be viewed as random.
• Thus, we have a probabilistic model:
  x_i = f_i(p, z_1, ..., z_m) and y = f(p, z_1, ..., z_m),
  where the z_j are random variables (an illustrative simulation follows below).
• Usually:
  – we do not know the exact probability distribution for the variables z_j, but
  – we know a finite-parametric family of distributions that contains the actual (unknown) distribution.
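For instance, adding a random term to the same illustrative linear model turns it into a probabilistic one; the sketch below (with made-up parameters p and q, assumptions for illustration only) shows why only the distribution of y, not its exact value, can now be predicted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative probabilistic model: x_i = f_i(p, z_i) = p1 + p2 * t_i + z_i,
# where the z_i are random variables (here Gaussian with scale q).
p = (1.0, 2.0)            # parameters p
q = 0.4                   # parameter of the distribution of the z_j
t = np.arange(5.0)
z = rng.normal(0.0, q, size=t.size)
x = p[0] + p[1] * t + z   # observations are no longer a deterministic function of p

# The future value y carries its own random term, so only its distribution,
# not its exact value, can be predicted.
t_future = 6.0
y = p[0] + p[1] * t_future + rng.normal(0.0, q)
print("simulated observations:", x.round(2), "one possible future value:", round(y, 2))
```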

5. Need for Statistical Prediction (cont-d)

• For example, we may know that the distribution is Gaussian, or that it is uniform.
• Let q denote the parameter(s) that describe this distribution.
• In this case, both x_i and y are random variables whose distribution depends on all the parameters θ = (p, q): x_i ∼ f_{i,θ} and y ∼ f_θ.
• In this case, to identify the model:
  – we first estimate the parameters θ based on the observations x_1, ..., x_n, and then
  – we use the distribution f_θ corresponding to these parameter values to predict the values y – or, to be more precise, to predict the probability of different values of y (see the sketch below).
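A minimal sketch of this identification step, under the illustrative assumption that the x_i and y are i.i.d. Gaussian (so that θ consists of the mean and standard deviation): we estimate θ from the observations and then use the fitted distribution f_θ to compute the probability of different values of y. The data and the threshold are made up for the example.

```python
import numpy as np
from scipy import stats

# Illustrative assumption: x_i and y are i.i.d. Gaussian, so theta = (mu, sigma).
x = np.array([2.1, 1.8, 2.5, 2.0, 2.3, 1.9, 2.2])

# Step 1: estimate theta from the observations (maximum likelihood for a Gaussian).
mu_hat = x.mean()
sigma_hat = x.std(ddof=0)

# Step 2: use the fitted distribution f_theta to describe the future value y,
# e.g., the probability that y does not exceed some threshold.
y_threshold = 2.4
prob = stats.norm.cdf(y_threshold, loc=mu_hat, scale=sigma_hat)
print(f"P(y <= {y_threshold}) under the fitted model: {prob:.3f}")
```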

6. Need for a Confidence Interval

• In the statistical case, we cannot predict the exact value of y.
• So, it is desirable to predict the range of possible values of y.
• For many distributions (e.g., for the normal one), it is possible to have arbitrarily small and arbitrarily large values.
• In such situations, there is no guaranteed range of values of y.
• However, we can still try to estimate a confidence interval, i.e.,
  – for a given small value α > 0,
  – an interval [y̲_α, ȳ_α] that contains the actual value y with confidence 1 − α.

7. Confidence Interval (cont-d)

• In other words, we would like to find an interval for which Prob(y ∈ [y̲_α, ȳ_α]) ≥ 1 − α.
• When we know the cdf F(y) = Prob(Y ≤ y), we can take [y̲_α, ȳ_α] = [F⁻¹(α/2), F⁻¹(1 − α/2)] (a small sketch follows below).
• In general, a statistical estimate based on a finite sample is only approximate.
• Thus, based on a finite sample, we can predict the values of the parameters θ only approximately.
• Therefore, we only have an approximate estimate of the probabilities of different values of y.
• So, instead of the actual cdf F(y), we only know bounds on the cdf: F̲(y) ≤ F(y) ≤ F̄(y).
• We want to select [y̲_α, ȳ_α] so that the probability of being outside this interval is guaranteed to be ≤ α.
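When the cdf F is known, the interval above is simply a pair of quantiles. A minimal sketch, assuming for illustration that F is Gaussian with known parameters:

```python
from scipy import stats

alpha = 0.05
# Illustrative assumption: the predictive cdf F is Gaussian with known parameters.
F = stats.norm(loc=2.0, scale=0.5)

# [y_lower, y_upper] = [F^{-1}(alpha/2), F^{-1}(1 - alpha/2)]
y_lower = F.ppf(alpha / 2)
y_upper = F.ppf(1 - alpha / 2)
print(f"{(1 - alpha):.0%} confidence interval for y: [{y_lower:.3f}, {y_upper:.3f}]")
```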

8. Confidence Interval (cont-d)

• We can choose y̲_α so that F(y̲_α) ≤ α/2 for all possible cdfs F; this can be achieved for y̲_α = (F̄)⁻¹(α/2).
• Similarly, we can take ȳ_α = (F̲)⁻¹(1 − α/2).
• Plausibility-based forecasting: we start by forming a likelihood function.
• We assume that the probability density function corresponding to each observation x_i has the form f_{i,θ}(x_i).
• We assume that the x_i are independent, so
  L_x(θ) = ∏_{i=1}^{n} f_{i,θ}(x_i).
• L_x(θ) is normally used to find the maximum likelihood estimate θ̂: L_x(θ̂) = max_θ L_x(θ) (a small sketch follows below).
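A small sketch of the likelihood function L_x(θ) and of the maximum likelihood estimate θ̂, again under the illustrative i.i.d. Gaussian assumption; in practice one maximizes the log-likelihood, which is numerically more stable:

```python
import numpy as np
from scipy import stats, optimize

x = np.array([2.1, 1.8, 2.5, 2.0, 2.3, 1.9, 2.2])

def log_likelihood(theta):
    """log L_x(theta) = sum_i log f_{i,theta}(x_i) for an i.i.d. Gaussian model."""
    mu, log_sigma = theta  # parametrize sigma via its logarithm to keep it positive
    return np.sum(stats.norm.logpdf(x, loc=mu, scale=np.exp(log_sigma)))

# Maximum likelihood estimate theta_hat: L_x(theta_hat) = max_theta L_x(theta).
result = optimize.minimize(lambda th: -log_likelihood(th), x0=[0.0, 0.0])
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print("theta_hat =", (mu_hat, sigma_hat))
```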

9. Plausibility Approach (cont-d)

• Instead, we use L_x(θ) to define the plausibility function
  pl_x(θ) = L_x(θ) / sup_{θ′} L_x(θ′) = L_x(θ) / L_x(θ̂).
• Based on pl_x(θ), we define, for each ω ∈ [0, 1], a plausibility region Γ_x(ω) = {θ : pl_x(θ) ≥ ω}.
• We represent y as g(θ, z), where z is uniform on [0, 1] and g(θ, z) = F_θ⁻¹(z).
• We compute the belief and plausibility of each set A of possible values of y:
  Bel(A) = Prob(g(Γ_x(ω), z) ⊆ A),
  Pl(A) = Prob(g(Γ_x(ω), z) ∩ A ≠ ∅).
• Here, ω and z are uniformly distributed on [0, 1].
• Then, we compute F̲(y) = Bel((−∞, y]) and F̄(y) = Pl((−∞, y]), and the corresponding confidence intervals (see the sketch below).
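The steps above can be combined into a short Monte Carlo sketch of the plausibility-based construction: sample ω and z uniformly on [0, 1], push the plausibility region Γ_x(ω) through g(θ, z) = F_θ⁻¹(z), estimate F̲(y) = Bel((−∞, y]) and F̄(y) = Pl((−∞, y]) from the resulting sets, and read off the interval [(F̄)⁻¹(α/2), (F̲)⁻¹(1 − α/2)]. The Gaussian model with known σ (so that θ is just the mean), the grid over θ, and the sample data are simplifying assumptions made only to keep the example short.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative data and model: x_i ~ N(theta, sigma) with sigma known,
# so the only unknown parameter is theta (the mean).
x = np.array([2.1, 1.8, 2.5, 2.0, 2.3, 1.9, 2.2])
sigma = 0.3
alpha = 0.05

# Plausibility function pl_x(theta) = L_x(theta) / L_x(theta_hat) on a theta grid.
theta_grid = np.linspace(x.mean() - 2.0, x.mean() + 2.0, 2001)
log_lik = np.array([stats.norm.logpdf(x, loc=th, scale=sigma).sum() for th in theta_grid])
pl = np.exp(log_lik - log_lik.max())

# Monte Carlo over omega, z ~ Uniform[0, 1]: propagate the plausibility region
# Gamma_x(omega) = {theta : pl_x(theta) >= omega} through g(theta, z) = F_theta^{-1}(z).
n_mc = 20_000
omega = rng.uniform(size=n_mc)
z = rng.uniform(size=n_mc)
img_min = np.empty(n_mc)
img_max = np.empty(n_mc)
for k in range(n_mc):
    region = theta_grid[pl >= omega[k]]   # Gamma_x(omega_k), never empty (pl peaks at 1)
    shift = sigma * stats.norm.ppf(z[k])  # F_theta^{-1}(z) = theta + sigma * Phi^{-1}(z)
    img_min[k] = region.min() + shift     # leftmost point of g(Gamma_x(omega), z)
    img_max[k] = region.max() + shift     # rightmost point of g(Gamma_x(omega), z)

# F_lower(y) = Bel((-inf, y]) is the empirical cdf of img_max;
# F_upper(y) = Pl((-inf, y]) is the empirical cdf of img_min.
# Interval: y_lower = F_upper^{-1}(alpha/2), y_upper = F_lower^{-1}(1 - alpha/2).
y_lower = np.quantile(img_min, alpha / 2)
y_upper = np.quantile(img_max, 1 - alpha / 2)
print(f"{(1 - alpha):.0%} plausibility-based prediction interval: [{y_lower:.3f}, {y_upper:.3f}]")
```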
