- CFA L2: Quantitative Methods - Introduction
- Quants: Correlation Analysis
- Quants: Single Variable Linear Regression Analysis
- Standard Error of the Estimate or SEE
- Confidence Intervals (CI) for Dependent Variable Prediction
- Coefficient of Determination (R-Squared)
- Analysis of Variance or ANOVA
- Multiple Regression Analysis
- Multiple Regression and Coefficient of Determination (R-Squared)
- Fcalc – the Global Test for Regression Significance
- Regression Analysis and Assumption Violations
- Qualitative and Dummy Variables in Regression Modeling
- Time Series Analysis: Simple and Log-linear Trend Models
- Auto-Regressive (AR) Time Series Models
- Auto-Regressive Models - Random Walks and Unit Roots
- ARMA Models and ARCH Testing
- How to Select the Most Appropriate Time Series Model?
Time Series Analysis: Simple and Log-linear Trend Models
Simple Time Series Models
- This is basic trend modeling.
A simple trend model can be expressed as follows:
yt = b0 + b1t + εt
b0 = the y-intercept; the value of the trend at t = 0.
b1 = the slope coefficient of the time trend.
t = the time period.
ŷt = the estimated (fitted) value for time t based on the model.
εt = the random error term of the time trend.
The big validity pitfall for simple trend models is serial correlation; if it is present, R2 will be artificially high and the slope coefficient may falsely appear to be significant.
Serial correlation can be detected visually by plotting the residuals (not shown), or formally with a Durbin-Watson test.
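A minimal sketch of fitting a simple trend model and computing the Durbin-Watson statistic, assuming NumPy; the series, seed, and parameter values below are invented for illustration, not from the curriculum:

```python
import numpy as np

# Hypothetical series: a linear trend plus iid noise (illustrative only).
rng = np.random.default_rng(42)
t = np.arange(1, 101)
y = 5.0 + 0.3 * t + rng.normal(0, 2, size=t.size)

# Fit yt = b0 + b1*t by ordinary least squares.
b1, b0 = np.polyfit(t, y, deg=1)
residuals = y - (b0 + b1 * t)

# Durbin-Watson statistic: sum of squared successive residual differences
# divided by the sum of squared residuals. Values near 2 suggest no
# first-order serial correlation; values near 0 suggest positive
# serial correlation.
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)
print(f"b0={b0:.2f}, b1={b1:.2f}, DW={dw:.2f}")
```

Because the noise here is iid by construction, the statistic should come out near 2; on a genuinely serially correlated series it would drift toward 0.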
Log-linear Trend Models
- This applies to non-linear time series trends.
The structure is:
ln(yt) = b0 + b1t + εt; or equivalently
yt = e^(b0 + b1t + εt)
Again, as with the simple trend model, use a residual plot or a Durbin-Watson test to check for serial correlation, as it is a major threat to validity.
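The log-linear fit can be sketched the same way: regress ln(yt) on t, so that b1 approximates a constant per-period growth rate. As before, the data and parameters below are made up for illustration:

```python
import numpy as np

# Hypothetical exponentially growing series (illustrative only):
# roughly 2% growth per period with small multiplicative noise.
rng = np.random.default_rng(7)
t = np.arange(1, 101)
y = np.exp(1.0 + 0.02 * t + rng.normal(0, 0.05, size=t.size))

# Fit ln(yt) = b0 + b1*t by OLS on the log-transformed series.
b1, b0 = np.polyfit(t, np.log(y), deg=1)
residuals = np.log(y) - (b0 + b1 * t)

# Durbin-Watson check on the residuals of the log-linear fit.
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)
print(f"growth rate b1={b1:.4f}, DW={dw:.2f}")
```

Note that the residuals tested for serial correlation are those of the log-transformed regression, since that is the equation actually estimated.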