Multiple Regression Analysis

Multiple Regression Analysis - PowerPoint PPT Presentation



1. Multiple Regression Analysis
Caio Vigo
The University of Kansas, Department of Economics
Spring 2019
These slides were based on Introductory Econometrics by Jeffrey M. Wooldridge (2015)

2. Topics
1 Motivation for Multiple Regression
   The Model with k Independent Variables
2 Mechanics and Interpretation of OLS
   Interpreting the OLS Regression Line
3 The Expected Value of the OLS Estimators
4 The Variance of the OLS Estimators
   Estimating the Error Variance
5 Efficiency of OLS: The Gauss-Markov Theorem

3. Motivation for Multiple Regression
Motivation:
• With the simple linear regression model, we worked with a model in which a single independent variable x explains (or affects) a dependent variable y.
• If we add more factors to our model that are useful for explaining y, then more of the variation in y can be explained. We can build better models for predicting the dependent variable.

4. Motivation for Multiple Regression
• Recall the log(wage) example.
Example: log(wage)
log(wage) = β0 + β1 educ + u
• It might be the case that there are factors in u affecting y.
• For instance, intelligence could help to explain wage.

5. Motivation for Multiple Regression
• Let's use a proxy for intelligence: IQ.
• By explicitly including IQ in the equation, we can take it out of the error term.
• Consider the following extension of the log(wage) example:
Example: log(wage) (extension)
log(wage) = β0 + β1 educ + β2 IQ + u
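Estimation is discussed later in the deck, but as a quick preview of what "explicitly including IQ" looks like in practice, here is a minimal sketch that fits log(wage) = β0 + β1 educ + β2 IQ + u by OLS. The data are simulated (the slides do not name a dataset), and the statsmodels formula interface is just one convenient way to run the regression.

```python
# Minimal sketch: OLS estimation of log(wage) = b0 + b1*educ + b2*IQ + u.
# The data below are simulated purely for illustration; the slides do not
# specify a dataset, and the coefficient values are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
educ = rng.integers(8, 21, size=n).astype(float)   # years of schooling (simulated)
IQ = rng.normal(100, 15, size=n)                    # IQ score (simulated)
u = rng.normal(0, 0.3, size=n)                      # unobserved factors
lwage = 0.5 + 0.07 * educ + 0.005 * IQ + u          # simulated "population" model

df = pd.DataFrame({"lwage": lwage, "educ": educ, "IQ": IQ})
fit = smf.ols("lwage ~ educ + IQ", data=df).fit()   # intercept added automatically
print(fit.params)                                   # estimates of b0, b1, b2
```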

6. The Model with 2 Independent Variables
Generally, we can write a model with two independent variables as:
y = β0 + β1 x1 + β2 x2 + u,
where
• β0 is the intercept,
• β1 measures the change in y with respect to x1, holding other factors fixed,
• β2 measures the change in y with respect to x2, holding other factors fixed.

7. The Model with 2 Independent Variables
• In the model with two explanatory variables, the key assumption about how u is related to x1 and x2 is:
E(u | x1, x2) = 0.
• For any values of x1 and x2 in the population, the average unobservable is equal to zero.
• The particular value zero is not important, because we have an intercept, β0, in the equation.

8. The Model with 2 Independent Variables (no text content on this slide beyond the title)

9. The Model with 2 Independent Variables
• In the wage equation, the assumption is E(u | educ, IQ) = 0.
• Now u no longer contains intelligence, and so this condition has a better chance of being true.
• Recall that in the simple regression, we had to assume IQ and educ are unrelated to justify leaving IQ in the error term.
• Other factors, such as workforce experience and "motivation," are part of u. Motivation is very difficult to measure. Experience is easier:
log(wage) = β0 + β1 educ + β2 IQ + β3 exper + u.

10. The Model with k Independent Variables
• The multiple linear regression model can be written in the population as
y = β0 + β1 x1 + β2 x2 + ... + βk xk + u,
where
• β0 is the intercept,
• β1 is the parameter associated with x1,
• β2 is the parameter associated with x2, and so on.
• The model contains k + 1 (unknown) population parameters.
• We call β1, ..., βk the slope parameters.

11. The Model with k Independent Variables
• Now we have multiple explanatory or independent variables, the x's.
• We still have one explained or dependent variable, y.
• We still have an error term, u.

12. The Model with k Independent Variables
• Advantage of multiple regression: it can incorporate fairly general functional form relationships.
• Let lwage = log(wage):
lwage = β0 + β1 educ + β2 IQ + β3 exper + β4 exper² + u,
so that exper is allowed to have a quadratic effect on lwage.
• Thus, x1 = educ, x2 = IQ, x3 = exper, and x4 = exper². Note that x4 is a nonlinear function of x3.
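As a small illustration of how such a quadratic term can be handled in practice (not part of the slides), the extra regressor is just a constructed column x4 = exper², and the implied partial effect of exper on lwage is β3 + 2β4 exper. The data, the column name expersq, and the coefficient values below are simulated and chosen for illustration only.

```python
# Sketch: a model that is nonlinear in the variables but linear in the
# parameters, with exper entering as a quadratic:
#   lwage = b0 + b1*educ + b2*IQ + b3*exper + b4*exper^2 + u
# All data are simulated; "expersq" is just an illustrative column name.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "educ": rng.integers(8, 21, size=n).astype(float),
    "IQ": rng.normal(100, 15, size=n),
    "exper": rng.uniform(0, 30, size=n),
})
df["lwage"] = (0.4 + 0.06 * df["educ"] + 0.004 * df["IQ"]
               + 0.03 * df["exper"] - 0.0005 * df["exper"] ** 2
               + rng.normal(0, 0.3, size=n))

df["expersq"] = df["exper"] ** 2          # x4 = exper^2, a nonlinear function of x3
fit = smf.ols("lwage ~ educ + IQ + exper + expersq", data=df).fit()
b3, b4 = fit.params["exper"], fit.params["expersq"]
# Partial effect of exper on lwage is b3 + 2*b4*exper, so it varies with exper.
print("estimated effect of one more year at exper = 10:", b3 + 2 * b4 * 10)
```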

13. The Model with k Independent Variables
• The key assumption for the general multiple regression model is:
E(u | x1, ..., xk) = 0.
• We can make this condition closer to being true by "controlling for" more variables.

14. Topics
1 Motivation for Multiple Regression
   The Model with k Independent Variables
2 Mechanics and Interpretation of OLS
   Interpreting the OLS Regression Line
3 The Expected Value of the OLS Estimators
4 The Variance of the OLS Estimators
   Estimating the Error Variance
5 Efficiency of OLS: The Gauss-Markov Theorem

15. Mechanics and Interpretation of OLS
• Suppose we have x1 and x2 (k = 2) along with y.
• We want to fit an equation of the form:
ŷ = β̂0 + β̂1 x1 + β̂2 x2,
given data {(xi1, xi2, yi) : i = 1, ..., n}.
• Sample size = n.
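Anticipating the fitted equation above, here is a minimal sketch (simulated data, not from the slides) of computing β̂0, β̂1, β̂2 from a sample {(xi1, xi2, yi) : i = 1, ..., n}: stack a column of ones together with x1 and x2 and solve the least-squares problem for that design matrix. numpy.linalg.lstsq is used here simply as one way to obtain the OLS solution.

```python
# Sketch: fitting y_hat = b0_hat + b1_hat*x1 + b2_hat*x2 from a sample of
# size n by ordinary least squares on the design matrix [1, x1, x2].
# The data are simulated purely for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])         # add the intercept column
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # minimizes the sum of squared residuals
print(beta_hat)                                   # [b0_hat, b1_hat, b2_hat]
```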
