Quantitative Techniques in Analysis
Author: abyo • June 13, 2019 • Course Note
SPSS Output • Missing Values • Formulas
Introduction to Linear Regression
- Straight line
- Difference between y and y hat
- Beta and Constant
- Residual or Error
- Difference between Simple Linear Regression and Multiple Linear Regression
- Sample Size in Linear Regression. Note: the expected R² for random data is k/(N − 1)
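The ideas above (straight line, y vs. y hat, beta and constant, residual) can be sketched in a few lines of pure Python. The data here are hypothetical and the function name `fit_line` is my own; this is a minimal illustration, not SPSS output.

```python
# Fit a straight line y-hat = a + b*x by ordinary least squares,
# then compute residuals (the difference between y and y hat).
def fit_line(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # slope (Beta) and intercept (Constant) from the normal equations
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_line(x, y)
y_hat = [a + b * xi for xi in x]                    # predicted Y (y hat)
residuals = [yi - yh for yi, yh in zip(y, y_hat)]   # error = y - y hat
```

With one predictor this is simple linear regression; multiple linear regression extends the same idea to several X columns.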
Assumptions of Multiple Linear Regression
- Normality
- Homoscedasticity
- Independent errors (no autocorrelation across different points in time; no spatial autocorrelation across different spatial locations)
- No endogeneity (predictors uncorrelated with the error term)
- No specification error (omitted-variable bias, irrelevant variables included, measurement error, simultaneity bias)
- Lack of multicollinearity
- Consistent coefficients (parameter stability across the sample)
Assumption Testing
- Normality (1. Kolmogorov–Smirnov test, 2. Shapiro–Wilk test, 3. Lilliefors test, 4. Jarque–Bera test)
- Homoscedasticity (1. White test, 2. Breusch–Pagan test)
- Independent errors (1. Durbin–Watson test, 2. Breusch–Godfrey serial correlation LM test)
- Endogeneity (Durbin-Wu-Hausman Test)
- Specification Error (Ramsey RESET test)
- Lack of multicollinearity (Tolerance and Variance Inflation Factor (VIF))
- Consistent coefficients (Chow test for structural break, used in time series)
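As a concrete example of one of the tests above, the Durbin–Watson statistic for the independent-errors assumption can be computed by hand from the residuals. The residuals below are hypothetical; values near 2 suggest no first-order autocorrelation, while values near 0 or 4 suggest positive or negative autocorrelation.

```python
# Durbin-Watson statistic: ratio of the summed squared differences of
# successive residuals to the summed squared residuals. Range is 0 to 4.
def durbin_watson(e):
    num = sum((e[i] - e[i - 1]) ** 2 for i in range(1, len(e)))
    den = sum(ei ** 2 for ei in e)
    return num / den

resid = [0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2]
dw = durbin_watson(resid)
```

SPSS reports this statistic in the Model Summary table when the Durbin–Watson option is selected.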
Regression Methods
- Enter
- Stepwise (Backward)
Formulas
- R
- R²
- Adjusted R²
- Standard error of the estimate
- SS.R
- SS.E
- SS.T
- df.R
- df.E
- df.T
- MSR
- MSE
- F
- t
- ŷ (y hat)
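All of the listed quantities can be computed from first principles for a simple regression with k = 1 predictor. The data below are hypothetical; this is a sketch of the formulas, not SPSS output.

```python
# ANOVA quantities for a simple linear regression (k = 1 predictor).
n, k = 5, 1
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.8]

# Fit y-hat = a + b*x by least squares.
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
y_hat = [a + b * xi for xi in x]

# Sums of squares: total, regression, error.
sst = sum((yi - my) ** 2 for yi in y)                   # SS.T
ssr = sum((yh - my) ** 2 for yh in y_hat)               # SS.R
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # SS.E

# Degrees of freedom, mean squares, F, R-squared.
df_r, df_e, df_t = k, n - k - 1, n - 1
msr, mse = ssr / df_r, sse / df_e
f = msr / mse
r2 = ssr / sst
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
see = mse ** 0.5        # standard error of the estimate
```

Note that SS.T = SS.R + SS.E and df.T = df.R + df.E, which is how the ANOVA table below decomposes the variation in Y.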
Interpretation
Model Summary: Assessing the goodness of fit (R and R square)
ANOVA Table: F Value
Coefficient Table: Assessing the individual predictors
Notation
Y = Dependent Variable
X = Independent Variable
Ŷ = Predicted Y
SS = Sum of Squares
MS = Mean Square
Df = Degrees of Freedom
Regression = R
Residual = E
Total = T
N = Sample Size
Formulas
SS.T = SS.R + SS.E
...