Simple linear regression
A simple linear regression is a linear regression in which there is only one covariate (predictor variable).
Simple linear regression is used to evaluate the linear relationship between two variables. One example could be the relationship between muscle strength and lean body mass. Another way to put it is that simple linear regression is used to develop an equation by which we can predict or estimate a dependent variable given an independent variable.
Given a sample (X_i, Y_i), \; i = 1, \ldots, n, the regression model is given by

Y_i = a + bX_i + \varepsilon_i,

where Y_i is the dependent variable, a is the y-intercept, b is the gradient or slope of the line, X_i is the independent variable, and \varepsilon_i is a random error term associated with each observation.
The linear relationship between the two variables (i.e. dependent and independent) can be measured using a correlation coefficient e.g. the Pearson product moment correlation coefficient.
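As an illustration, the Pearson product moment correlation coefficient can be computed directly from its definition. The following Python sketch (using the three-point sample from the numerical example later in this article) is one minimal way to do so:

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation: covariance of x and y
    # divided by the product of their standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

r = pearson_r([1, 2, 6], [-1, 4, 3])
print(r)  # 0.5
```

For this sample the cross-product sum and both centered sums of squares all equal 7 and 14 respectively, so r = 7 / 14 = 0.5.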
Estimating the regression line
The parameters of the linear regression model, a and b, can be estimated using the method of ordinary least squares. This method finds the line that minimizes the sum of squared errors,

Q(a, b) = \sum_{i=1}^{n} (Y_i - a - bX_i)^2.
The minimization problem can be solved using calculus, producing the following formulas for the estimates of the regression parameters:

\hat{b} = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}, \qquad \hat{a} = \bar{Y} - \hat{b}\bar{X}.
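These closed-form formulas translate directly into code. The following Python sketch fits a simple regression line to the sample used in the numerical example below:

```python
def ols_fit(x, y):
    # Closed-form ordinary least squares estimates for
    # the simple linear regression Y = a + b*X + error.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

a, b = ols_fit([1, 2, 6], [-1, 4, 3])
print(a, b)  # 0.5 0.5
```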
Ordinary least squares produces the following features:
1. The line goes through the point (\bar{X}, \bar{Y}). This is easily seen by rearranging the expression \hat{a} = \bar{Y} - \hat{b}\bar{X} as \bar{Y} = \hat{a} + \hat{b}\bar{X}, which shows that the point (\bar{X}, \bar{Y}) satisfies the fitted regression equation.
2. The sum of the residuals is equal to zero if the model includes a constant. To see why, minimize Q with respect to a by taking the partial derivative

\frac{\partial Q}{\partial a} = -2 \sum_{i=1}^{n} (Y_i - a - bX_i).

Setting this partial derivative to zero and noting that the residuals are e_i = Y_i - \hat{a} - \hat{b}X_i yields

\sum_{i=1}^{n} e_i = 0,

as desired.
3. The linear combination of the residuals in which the coefficients are the x-values is equal to zero: \sum_{i=1}^{n} X_i e_i = 0.
4. The estimates \hat{a} and \hat{b} are unbiased, provided the error terms have zero mean.
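Properties 2 and 3 can be checked numerically. The sketch below uses the sample and fitted coefficients from the numerical example later in this article (both estimates equal 0.5 for that sample):

```python
x, y = [1, 2, 6], [-1, 4, 3]
a_hat, b_hat = 0.5, 0.5  # OLS estimates for this particular sample

# Residuals e_i = Y_i - (a_hat + b_hat * X_i)
residuals = [yi - (a_hat + b_hat * xi) for xi, yi in zip(x, y)]

print(sum(residuals))                                # 0.0  (property 2)
print(sum(xi * ei for xi, ei in zip(x, residuals)))  # 0.0  (property 3)
```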
Alternative formulas for the slope coefficient
There are alternative (and simpler) formulas for calculating \hat{b}:

\hat{b} = r \, \frac{s_y}{s_x}.

Here, r is the correlation coefficient of X and Y, s_x is the sample standard deviation of X, and s_y is the sample standard deviation of Y.
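The equivalence of this formula with the covariance-based one can be verified on the article's sample, for which r = 0.5 and the two sample standard deviations happen to be equal:

```python
import statistics

x, y = [1, 2, 6], [-1, 4, 3]
r = 0.5  # Pearson correlation for this sample

# Sample standard deviations (n - 1 in the denominator)
sx, sy = statistics.stdev(x), statistics.stdev(y)

slope = r * sy / sx
print(slope)  # 0.5, matching the least-squares estimate
```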
Inference
Under the assumption that the error term is normally distributed, the estimate of the slope coefficient has a normal distribution with mean equal to b and standard error given by:

s_{\hat{b}} = \sqrt{ \frac{ \frac{1}{n-2} \sum_{i=1}^{n} e_i^2 }{ \sum_{i=1}^{n} (X_i - \bar{X})^2 } }.
A confidence interval for b can be created using a t-distribution with n − 2 degrees of freedom:

[\, \hat{b} - s_{\hat{b}} \, t^*_{n-2}, \; \hat{b} + s_{\hat{b}} \, t^*_{n-2} \,],

where t^*_{n-2} is the critical value of the t-distribution with n − 2 degrees of freedom.
Numerical example
Suppose we have the sample of points {(1, −1), (2, 4), (6, 3)}. The mean of X is 3 and the mean of Y is 2. The slope coefficient estimate is given by:

\hat{b} = \frac{(1-3)(-1-2) + (2-3)(4-2) + (6-3)(3-2)}{(1-3)^2 + (2-3)^2 + (6-3)^2} = \frac{7}{14} = 0.5,

and the intercept estimate is \hat{a} = 2 − 0.5 × 3 = 0.5. The standard error of the slope coefficient is 0.866. A 95% confidence interval is given by
[0.5 − 0.866 × 12.7062, 0.5 + 0.866 × 12.7062] = [−10.504, 11.504].
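The whole example can be reproduced end to end in Python; the critical value 12.7062 (two-sided 95% point of the t-distribution with 1 degree of freedom) is taken from the example above:

```python
import math

x, y = [1, 2, 6], [-1, 4, 3]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Slope and intercept via the closed-form OLS formulas
b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
     / sum((xi - mx) ** 2 for xi in x))
a = my - b * mx

# Standard error of the slope: residual variance over Sxx
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
se = math.sqrt((sum(e ** 2 for e in resid) / (n - 2))
               / sum((xi - mx) ** 2 for xi in x))

t_crit = 12.7062  # 95% two-sided critical value, n - 2 = 1 d.o.f.
ci = (b - t_crit * se, b + t_crit * se)

print(b, round(se, 3))  # 0.5 0.866
print(ci)               # approximately (-10.504, 11.504)
```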
Mathematical derivation of the least squares estimates
Assume that Y_i = \alpha + \beta X_i + \varepsilon_i is a stochastic simple regression model and let

(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)

be a sample of size n. Here the sample is seen as observable non-random values, but the calculations do not change when the sample is represented by random variables (X_1, Y_1), \ldots, (X_n, Y_n).
Let Q be the sum of squared errors:

Q(\alpha, \beta) = \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2.
Then taking partial derivatives with respect to \alpha and \beta:

\frac{\partial Q}{\partial \alpha} = -2 \sum_{i=1}^{n} (y_i - \alpha - \beta x_i), \qquad \frac{\partial Q}{\partial \beta} = -2 \sum_{i=1}^{n} x_i (y_i - \alpha - \beta x_i).

Setting \frac{\partial Q}{\partial \alpha} and \frac{\partial Q}{\partial \beta} to zero yields

n\alpha + \beta \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i, \qquad \alpha \sum_{i=1}^{n} x_i + \beta \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i,
which are known as the normal equations and can be written in matrix notation as

\begin{pmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{pmatrix} \begin{pmatrix} \hat{\alpha} \\ \hat{\beta} \end{pmatrix} = \begin{pmatrix} \sum y_i \\ \sum x_i y_i \end{pmatrix}.

Using Cramer's rule we get

\hat{\beta} = \frac{ n \sum x_i y_i - \sum x_i \sum y_i }{ n \sum x_i^2 - \left( \sum x_i \right)^2 }.

Dividing the numerator and denominator of the last expression by n:

\hat{\beta} = \frac{ \sum x_i y_i - n \bar{x} \bar{y} }{ \sum x_i^2 - n \bar{x}^2 }.
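The Cramer's-rule solution of the normal equations can be checked numerically on the sample from the earlier example; it recovers the same estimates as the centered formulas:

```python
x, y = [1, 2, 6], [-1, 4, 3]
n = len(x)

# Raw sums appearing in the normal equations
Sx, Sy = sum(x), sum(y)
Sxx = sum(xi ** 2 for xi in x)
Sxy = sum(xi * yi for xi, yi in zip(x, y))

# Cramer's rule on the 2x2 normal-equation system
det = n * Sxx - Sx ** 2
beta = (n * Sxy - Sx * Sy) / det
alpha = (Sy * Sxx - Sx * Sxy) / det

print(alpha, beta)  # 0.5 0.5
```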
Isolating \hat{\alpha} from the first normal equation yields

\hat{\alpha} = \bar{y} - \hat{\beta} \bar{x},

which is a common formula for \hat{\alpha} in terms of \hat{\beta} and the sample means.
\hat{\beta} may also be written as

\hat{\beta} = \frac{ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) }{ \sum_{i=1}^{n} (x_i - \bar{x})^2 }

using the following equalities:

\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n} x_i y_i - n \bar{x} \bar{y}, \qquad \sum_{i=1}^{n} (x_i - \bar{x})^2 = \sum_{i=1}^{n} x_i^2 - n \bar{x}^2.
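Both equalities are easy to confirm numerically; on the article's sample each side of the first equality evaluates to 7 and each side of the second to 14:

```python
x, y = [1, 2, 6], [-1, 4, 3]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# First equality: centered cross-product vs. raw-sum form
lhs1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
rhs1 = sum(xi * yi for xi, yi in zip(x, y)) - n * mx * my

# Second equality: centered sum of squares vs. raw-sum form
lhs2 = sum((xi - mx) ** 2 for xi in x)
rhs2 = sum(xi ** 2 for xi in x) - n * mx ** 2

print(lhs1, rhs1)  # 7.0 7.0
print(lhs2, rhs2)  # 14.0 14.0
```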
The following calculation shows that (\hat{\alpha}, \hat{\beta}) is a minimum. The second-order partial derivatives of Q are

\frac{\partial^2 Q}{\partial \alpha^2} = 2n, \qquad \frac{\partial^2 Q}{\partial \beta^2} = 2 \sum x_i^2, \qquad \frac{\partial^2 Q}{\partial \alpha \, \partial \beta} = 2 \sum x_i.

Hence the Hessian matrix of Q is given by

D^2 Q(\alpha, \beta) = \begin{pmatrix} 2n & 2 \sum x_i \\ 2 \sum x_i & 2 \sum x_i^2 \end{pmatrix},

with determinant | D^2 Q(\alpha, \beta) | = 4n \sum x_i^2 - 4 \left( \sum x_i \right)^2 = 4n \sum (x_i - \bar{x})^2, which is positive whenever the x_i are not all equal. Since | D^2 Q(\alpha, \beta) | > 0 and 2n > 0, D^2 Q(\alpha, \beta) is positive definite for all (\alpha, \beta) and (\hat{\alpha}, \hat{\beta}) is a minimum.