Many of the concepts from simple regression carry over, but watch out when determining your degrees of freedom for different analyses: two models with the same number of observations but different numbers of slope coefficients will have slightly different degrees of freedom.
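To make the degrees-of-freedom point concrete, here is a minimal sketch (the observation and coefficient counts are made up for illustration) using the standard residual degrees-of-freedom formula, n − k − 1:

```python
# Hypothetical example: two models fit to the same 50 observations,
# one with a single slope coefficient, one with three.
n = 50            # observations (identical for both models)
k_simple = 1      # simple regression: one slope coefficient
k_multi = 3       # multiple regression: three slope coefficients

# Residual degrees of freedom = n - k - 1 (the -1 accounts for the intercept)
df_simple = n - k_simple - 1   # 48
df_multi = n - k_multi - 1     # 46
```

Same data, different critical t-values: the multiple regression gives up two extra degrees of freedom, one per additional slope coefficient.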
Six Assumptions of Multiple Regression (very similar to simple regression)
- Y and X must have a linear relationship.
- X is not random and there is no multicollinearity among two or more of the independent variables.
- The expected value of e is 0 (zero).
- No heteroskedasticity, i.e., the error term’s variance is the same for all observations and does not exhibit a relationship with the independent variables.
- No serial correlation, i.e., error terms are uncorrelated with one another across all observations.
- The error term has a normal distribution.
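Two of these assumptions are easy to eyeball from fitted residuals. The sketch below (simulated data, coefficients chosen arbitrarily for illustration) fits a two-variable model by OLS and checks that the residuals have mean zero and show no obvious serial correlation via the Durbin-Watson statistic:

```python
import numpy as np

# Simulate data satisfying the assumptions: Y = 1 + 2*X1 - 3*X2 + e
rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 3.0 * X2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), X1, X2])   # design matrix with intercept
b, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS estimates b0, b1, b2
resid = Y - X @ b

mean_resid = resid.mean()                   # essentially zero when an intercept is included

# Durbin-Watson statistic: values near 2 suggest no serial correlation
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
```

This only screens for violations; formal tests (e.g., Breusch-Pagan for heteroskedasticity) go beyond these notes.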
Be sure to review and get comfortable with the standard form of a statistical software program’s output for a multiple variable regression analysis.
- Example multiple regression equation: Yi = b0 + b1X1i + b2X2i + ei
- You will need to be able to:
- Use this equation to estimate a dependent variable (this becomes simple plug and chug once you are comfortable with the material)
- Test the overall validity of a multiple regression model (see Fcalc below)
- Perform tcalcs to test the significance of the y-intercept and individual slope coefficients.
- Determine confidence intervals for individual slope coefficients.
- Perform hypothesis tests to determine whether a slope coefficient is statistically different from some specified value (e.g., a colleague builds a similar model but derives a different slope coefficient; this test determines whether the difference is statistically significant).
- Determine the Standard Error of the Estimate (SEE)
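The tasks above can be sketched end-to-end with NumPy alone. This is a hedged illustration on simulated data (true coefficients 0.5, 1.5, 2.0 are assumptions of the example), using 1.96 as an approximate 95% critical value rather than a table lookup:

```python
import numpy as np

# Simulate: Y = 0.5 + 1.5*X1 + 2.0*X2 + e
rng = np.random.default_rng(42)
n = 100
X1, X2 = rng.normal(size=n), rng.normal(size=n)
Y = 0.5 + 1.5 * X1 + 2.0 * X2 + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), X1, X2])
k = X.shape[1] - 1                        # number of slope coefficients
b, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Estimate the dependent variable for X1=1, X2=2 ("plug and chug")
y_hat = b @ np.array([1.0, 1.0, 2.0])

resid = Y - X @ b
sse = resid @ resid                       # sum of squared errors
sst = np.sum((Y - Y.mean()) ** 2)         # total sum of squares
rss = sst - sse                           # regression (explained) sum of squares

# Standard Error of the Estimate: SEE = sqrt(SSE / (n - k - 1))
see = np.sqrt(sse / (n - k - 1))

# Overall model validity: Fcalc = (RSS / k) / (SSE / (n - k - 1))
f_calc = (rss / k) / (sse / (n - k - 1))

# tcalcs and approximate 95% confidence intervals for each coefficient
cov_b = see ** 2 * np.linalg.inv(X.T @ X)
se_b = np.sqrt(np.diag(cov_b))
t_calc = b / se_b                         # H0: coefficient equals zero
ci_low, ci_high = b - 1.96 * se_b, b + 1.96 * se_b
```

To test a slope against a colleague's value c instead of zero, replace the numerator: `t = (b[j] - c) / se_b[j]`.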