Consider the regression equation 𝑌i = 𝛼 + 𝛽𝑋i + 𝑢i where 𝑢i is a stochastic error term. a) Explain how estimators of 𝛼 and 𝛽 can be obtained. b) What algebraic properties do the estimators fulfil?

Introduction

Regression analysis is a powerful statistical technique used to determine the relationship between a dependent variable (Y) and an independent variable (X). One of the simplest forms of regression is the simple linear regression, represented as:

Yi = α + βXi + ui

Where:

  • Yi: Dependent variable for observation i
  • Xi: Independent variable for observation i
  • α: Intercept of the regression line
  • β: Slope coefficient, representing the effect of X on Y
  • ui: Error term (disturbance term)

Let’s now explore how the estimators for α and β are obtained and their algebraic properties.

(a) How Estimators of α and β Can Be Obtained

To estimate the parameters α and β, the method of Ordinary Least Squares (OLS) is commonly used. The goal of OLS is to minimize the sum of the squared residuals (errors).

Objective: Minimize Σui² = Σ(Yi − α − βXi)²

Step-by-Step Estimation

1. OLS chooses α̂ and β̂ to minimize the sum of squared differences between the actual and fitted values of Y.
2. Taking partial derivatives of this sum with respect to α and β and setting them to zero gives the Normal Equations:

  • ΣYi = nα + βΣXi
  • ΣXiYi = αΣXi + βΣXi²
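The two normal equations form a 2×2 linear system in α̂ and β̂, so they can be solved directly. A minimal sketch in Python, using hypothetical sample data chosen purely for illustration:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

n = len(X)
# Coefficient matrix and right-hand side of the two normal equations:
#   ΣYi   = n·α + β·ΣXi
#   ΣXiYi = α·ΣXi + β·ΣXi²
A = np.array([[n, X.sum()],
              [X.sum(), (X ** 2).sum()]])
b = np.array([Y.sum(), (X * Y).sum()])

alpha_hat, beta_hat = np.linalg.solve(A, b)
```

Solving the system this way is algebraically equivalent to the closed-form OLS formulas given below.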

OLS Estimators:

β̂ = [Σ(Xi − X̄)(Yi − Ȳ)] / Σ(Xi − X̄)²
α̂ = Ȳ − β̂X̄

Where:
X̄ = Mean of X
Ȳ = Mean of Y

Thus, α̂ and β̂ are the estimators for α and β, calculated from the sample data.
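The closed-form estimators above can be computed in a few lines. The sketch below uses hypothetical sample data (illustrative only) and cross-checks the result against NumPy's built-in least-squares fit:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = X.mean(), Y.mean()

# β̂ = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)²
beta_hat = ((X - x_bar) * (Y - y_bar)).sum() / ((X - x_bar) ** 2).sum()
# α̂ = Ȳ − β̂·X̄
alpha_hat = y_bar - beta_hat * x_bar

# Cross-check against NumPy's degree-1 least-squares polynomial fit
slope, intercept = np.polyfit(X, Y, 1)
```

Both approaches agree to floating-point precision, since `np.polyfit` with degree 1 solves the same least-squares problem.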

(b) Algebraic Properties of OLS Estimators

The OLS estimators possess several algebraic and statistical properties. These are essential to understand the behavior and reliability of the estimators.

1. Linearity

OLS estimators are linear functions of the dependent variable Y: for example, β̂ can be written as ΣwiYi, where the weights wi = (Xi − X̄)/Σ(Xj − X̄)² depend only on the X values. This makes the estimators computationally simple and easy to analyze.

2. Unbiasedness

Under the classical linear regression assumptions (including E(ui) = 0, and ui being uncorrelated with Xi), the OLS estimators α̂ and β̂ are unbiased:

E(α̂) = α and E(β̂) = β
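Unbiasedness can be illustrated with a small Monte Carlo experiment: generate many samples from a model with known parameters and check that β̂ averages out to the true β. The true values, sample size, and error distribution below are all assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, beta_true = 1.0, 2.0      # assumed "true" parameters for the simulation
X = np.linspace(0.0, 10.0, 50)        # fixed regressor values

beta_hats = []
for _ in range(2000):
    u = rng.normal(0.0, 1.0, size=X.size)   # errors with E(u) = 0, independent of X
    Y = alpha_true + beta_true * X + u
    b = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()
    beta_hats.append(b)

mean_beta_hat = np.mean(beta_hats)    # should be close to beta_true = 2.0
```

Individual estimates vary from sample to sample, but their average converges to the true slope, which is what E(β̂) = β means in practice.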

3. Minimum Variance (Efficiency)

Among all linear and unbiased estimators, the OLS estimators have the smallest variance. This property makes them BLUE (Best Linear Unbiased Estimators), as stated in the Gauss-Markov theorem.

4. Zero Sum of Residuals

The sum of the residuals from the OLS estimation is always zero:

Σûi = 0

Together with α̂ = Ȳ − β̂X̄, this implies that the fitted regression line passes through the point of sample means (X̄, Ȳ).

5. Residuals Are Uncorrelated with Independent Variable

The residuals are uncorrelated with the values of the independent variable:

ΣXiûi = 0

This orthogonality condition follows directly from the second normal equation and means the residuals contain no remaining linear information about X.

6. Mean of Residuals is Zero

Mean(û) = 0
This follows directly from property 4: dividing Σûi = 0 by n gives a zero mean residual. It holds in any OLS fit that includes an intercept.
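Properties 4, 5, and 6 can all be verified numerically on any fitted line. A minimal check, again using hypothetical sample data chosen only for illustration:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS estimators via the closed-form formulas
beta_hat = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()
alpha_hat = Y.mean() - beta_hat * X.mean()

residuals = Y - (alpha_hat + beta_hat * X)

sum_res = residuals.sum()          # property 4: Σûi = 0
sum_x_res = (X * residuals).sum()  # property 5: ΣXiûi = 0
mean_res = residuals.mean()        # property 6: mean(û) = 0
```

All three quantities are zero up to floating-point rounding, regardless of the data used, because they are algebraic consequences of the normal equations rather than statistical assumptions.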

Conclusion

In the simple linear regression model Yi = α + βXi + ui, the parameters α and β are estimated using the method of Ordinary Least Squares. These estimators are calculated by minimizing the sum of squared residuals, and they possess several important algebraic properties such as unbiasedness, efficiency, and linearity. Understanding these properties helps us trust the regression output and use it for valid inference and prediction.
