
Introduction to Business Statistics, QM 220, Chapter 15, Dr. Mohammad Zainal - PowerPoint PPT Presentation



  1. Department of Quantitative Methods & Information Systems. Introduction to Business Statistics, QM 220, Chapter 15. Dr. Mohammad Zainal

  2. Chapter 15: Multiple linear regression
     15.1 Multiple Regression Analysis
     - The model used in simple regression includes one independent variable, which is denoted by x, and one dependent variable, which is denoted by y.
     - Usually a dependent variable is affected by more than one independent variable.
     - When we include two or more independent variables in a regression model, it is called a multiple regression model.
     - Remember, whether it is a simple or a multiple regression model, it always includes one and only one dependent variable.

  3. Chapter 15: Multiple linear regression
     15.1 Multiple Regression Analysis
     - A multiple regression model with y as the dependent variable and x1, x2, x3, ..., xk as independent variables is written as
         y = A + B1x1 + B2x2 + ... + Bkxk + ε    (1)
       where A represents the constant term, B1, B2, B3, ..., Bk are the regression coefficients of the independent variables x1, x2, x3, ..., xk, respectively, and ε represents the random error term.
     - This model contains k independent variables x1, x2, x3, ..., and xk.
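
To make model (1) concrete, here is a minimal Python sketch that simulates data from a two-variable version of the model. The values chosen for A, B1, B2, and the error standard deviation are arbitrary assumptions used only to show the structure of the equation, not values from the chapter.

```python
# Minimal sketch: simulate y = A + B1*x1 + B2*x2 + eps for made-up coefficients.
import numpy as np

rng = np.random.default_rng(0)

n = 100                      # number of observations
A, B1, B2 = 10.0, 2.5, -1.2  # hypothetical constant term and partial coefficients

x1 = rng.uniform(0, 10, n)   # first independent variable
x2 = rng.uniform(0, 5, n)    # second independent variable
eps = rng.normal(0, 1.0, n)  # random error term: mean 0, constant standard deviation

# Model (1) with k = 2 independent variables:
y = A + B1 * x1 + B2 * x2 + eps
```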

  4. Chapter 15: Multiple linear regression
     15.1 Multiple Regression Analysis
     - This model contains k independent variables x1, x2, x3, ..., and xk.
     - A multiple regression model can only be used when the relationship between the dependent variable and each independent variable is linear.
     - There can be no interaction between two or more of the independent variables.
     - In regression model (1), A represents the constant term, which gives the value of y when all independent variables assume zero values. The coefficients B1, B2, B3, ..., and Bk are called the partial regression coefficients.

  5. Chapter 15: Multiple linear regression
     15.1 Multiple Regression Analysis
     - For example, B1 is the partial regression coefficient of x1. It gives the change in y due to a one-unit change in x1 when all other independent variables included in the model are held constant.
     - In other words, if we change x1 by one unit but keep x2, x3, ..., and xk unchanged, then the resulting change in y is measured by B1.
     - In model (1) above, A, B1, B2, B3, ..., and Bk are called the true regression coefficients or population parameters.
     - A positive value for a particular Bi in model (1) indicates a positive relationship between y and the corresponding xi variable, and vice versa.
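
The "held constant" interpretation can be checked directly from the equation. The short sketch below reuses the same hypothetical coefficients as the simulation sketch above and shows that increasing x1 by one unit while x2 stays fixed changes the mean of y by exactly B1.

```python
# Hypothetical coefficients, matching the earlier simulation sketch.
def expected_y(x1, x2, A=10.0, B1=2.5, B2=-1.2):
    """Mean of y under model (1) with k = 2 independent variables."""
    return A + B1 * x1 + B2 * x2

# Raise x1 by one unit while holding x2 constant at 3.0:
change = expected_y(x1=4.0, x2=3.0) - expected_y(x1=3.0, x2=3.0)
print(change)  # 2.5, i.e. the partial regression coefficient B1
```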

  7. Chapter 15: Multiple linear regression
     15.1 Multiple Regression Analysis
     - If model (1) is estimated using sample data, which is usually the case, the estimated regression equation is written as
         ŷ = a + b1x1 + b2x2 + ... + bkxk    (2)
     - In equation (2), a, b1, b2, b3, ..., and bk are the sample statistics, which are the point estimators of the population parameters A, B1, B2, B3, ..., and Bk, respectively.
     - The degrees of freedom for the model are df = n - k - 1.
     - The estimated equation (2), obtained by minimizing the sum of squared errors, is called the least squares regression equation.
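
In practice this minimization is done by statistical software. As a rough sketch on simulated data (the x values, coefficients, and sample size below are assumptions), the NumPy code computes the least squares estimates a, b1, ..., bk of equation (2), the fitted values, and the model degrees of freedom.

```python
# Least squares sketch with NumPy on made-up data.
import numpy as np

rng = np.random.default_rng(1)

n, k = 30, 2                                   # sample size and number of x variables
X = rng.uniform(0, 10, size=(n, k))            # columns play the role of x1 and x2
y = 10 + 2.5 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(0, 1, n)

design = np.column_stack([np.ones(n), X])      # prepend a column of 1s for the constant a
coef, *_ = np.linalg.lstsq(design, y, rcond=None)   # minimizes the sum of squared errors
a, b = coef[0], coef[1:]                       # a estimates A; b[j] estimates B(j+1)

y_hat = design @ coef                          # fitted values from equation (2)
df = n - k - 1                                 # degrees of freedom for the model
print(a, b, df)
```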

  8. Chapter 15: Multiple linear regression
     15.2 Assumptions of the Multiple Regression Model
     - Assumption 1: The mean of the probability distribution of ε is zero.
     - Assumption 2: The errors associated with different sets of values of the independent variables are independent. Furthermore, these errors are normally distributed and have a constant standard deviation, which is denoted by σε.
     - Assumption 3: The independent variables are not linearly related. However, they can have a nonlinear relationship. When independent variables are highly linearly correlated, it is referred to as multicollinearity.
     - Assumption 4: There is no linear association between the random error term ε and each independent variable xi.
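
Assumption 3 is often checked with a variance inflation factor (VIF) for each independent variable. The VIF is not covered on this slide, so treat the sketch below, which runs on made-up data, as an illustrative aside: it regresses each x on the remaining x's and reports 1 / (1 - R^2), where large values signal multicollinearity.

```python
# Rough multicollinearity check: variance inflation factor for each column of X.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, size=(30, 3))   # hypothetical x1, x2, x3

def vif(X, j):
    """VIF = 1 / (1 - R^2) from regressing column j of X on the other columns."""
    others = np.delete(X, j, axis=1)
    design = np.column_stack([np.ones(len(X)), others])
    coef, *_ = np.linalg.lstsq(design, X[:, j], rcond=None)
    fitted = design @ coef
    ss_res = np.sum((X[:, j] - fitted) ** 2)
    ss_tot = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 / (1.0 - r2)

# A common rule of thumb treats values above roughly 5 or 10 as a warning sign.
print([round(vif(X, j), 2) for j in range(X.shape[1])])
```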

  9. Chapter 15: Multiple linear regression
     15.5 Computer Solution of Multiple Regression
     Example: A researcher wanted to find the effect of driving experience and the number of driving violations on auto insurance premiums. A random sample of 12 drivers insured with the same company and having similar auto insurance policies was selected from a large city. The table below lists the monthly auto insurance premiums (in dollars) paid by these drivers, their driving experiences (in years), and the numbers of driving violations committed by them during the past three years. Using MINITAB, find the regression equation of monthly premiums paid by drivers on their driving experiences and the numbers of driving violations.

     Monthly Premium ($) | Driving Experience (years) | Number of Driving Violations
     148                 |  5                         | 2
      76                 | 14                         | 0
     100                 |  6                         | 1
     126                 | 10                         | 3
     194                 |  4                         | 6
     110                 |  8                         | 2
     114                 | 11                         | 3
      86                 | 16                         | 1
     198                 |  3                         | 5
      92                 |  9                         | 1
      70                 | 19                         | 0
     120                 | 13                         | 3
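
The slides use MINITAB for this example. As an alternative sketch, the Python code below fits the same least squares regression of monthly premium on driving experience and number of violations, using the 12 observations listed in the table; the numerical regression equation is left to the program output rather than stated here.

```python
# Fit the insurance example with NumPy instead of MINITAB.
import numpy as np

premium    = np.array([148,  76, 100, 126, 194, 110, 114,  86, 198,  92,  70, 120])
experience = np.array([  5,  14,   6,  10,   4,   8,  11,  16,   3,   9,  19,  13])
violations = np.array([  2,   0,   1,   3,   6,   2,   3,   1,   5,   1,   0,   3])

design = np.column_stack([np.ones(len(premium)), experience, violations])
coef, *_ = np.linalg.lstsq(design, premium, rcond=None)
a, b1, b2 = coef

# Estimated regression equation (2): premium-hat = a + b1*experience + b2*violations
print(f"premium-hat = {a:.2f} + ({b1:.2f}) experience + ({b2:.2f}) violations")
```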
