Linear regression models are used to show or predict the relationship between two variables or factors. The factor that is being predicted (the factor that the equation solves for) is called the dependent variable. The factors that are used to predict the value of the dependent variable are called the independent variables.
In simple linear regression, each observation consists of two values. One value is for the dependent variable and one value is for the independent variable.
Simple Linear Regression Model
The simple linear regression model is represented like this: y = β0 + β1x + ε
By mathematical convention, the two factors involved in a simple linear regression analysis are designated x and y. The equation that describes how y is related to x is known as the regression model. The linear regression model also contains an error term, represented by ε, the Greek letter epsilon. The error term accounts for the variability in y that cannot be explained by the linear relationship between x and y. There are also parameters that represent the population being studied: these are the β0 and β1 in the term β0 + β1x.
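The model above can be illustrated by generating sample data from it. This is a minimal sketch with assumed parameter values (β0 = 2.0, β1 = 0.5) and a normally distributed error term; none of these numbers come from the text.

```python
import random

# Assumed population parameters, chosen only for illustration.
beta0, beta1 = 2.0, 0.5
random.seed(0)  # make the sketch reproducible

x_values = [1.0, 2.0, 3.0, 4.0, 5.0]
y_values = []
for x in x_values:
    # epsilon is the error term: variability in y not explained by x.
    epsilon = random.gauss(0, 0.3)
    y_values.append(beta0 + beta1 * x + epsilon)

for x, y in zip(x_values, y_values):
    print(f"x = {x:.1f}, y = {y:.2f}")
```

Each generated y differs from the line β0 + β1x only by its random error term, which is exactly what the model equation expresses.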
Simple Linear Regression Equation
The simple linear regression equation is represented like this: E(y) = β0 + β1x.
The simple linear regression equation is graphed as a straight line.
β0 is the y intercept of the regression line.
β1 is the slope.
E(y) is the mean or expected value of y for a given value of x.
A regression line can show a positive linear relationship, a negative linear relationship, or no relationship. If the graphed line is flat (a slope of zero), there is no relationship between the two variables. If the line slopes upward from left to right, so that y increases as x increases, a positive linear relationship exists. If the line slopes downward from left to right, so that y decreases as x increases, a negative linear relationship exists.
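The three cases above depend only on the sign of the slope, which a small hypothetical helper can make explicit (the function name and slope values are illustrative, not from the text):

```python
def describe_relationship(slope, tol=1e-9):
    """Classify the relationship a regression line suggests from its slope."""
    if abs(slope) < tol:
        return "no relationship"            # flat line
    if slope > 0:
        return "positive linear relationship"  # line rises left to right
    return "negative linear relationship"      # line falls left to right

print(describe_relationship(0.75))   # prints "positive linear relationship"
print(describe_relationship(-1.2))   # prints "negative linear relationship"
print(describe_relationship(0.0))    # prints "no relationship"
```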
Estimated Linear Regression Equation
If the parameters of the population were known, the simple linear regression equation (shown below) could be used to compute the mean value of y for a known value of x.
E(y) = β0 + β1x
However, in practice, the parameter values are not known, so they must be estimated by using data from a sample of the population. The population parameters are estimated by using sample statistics, which are represented by b0 and b1. When the sample statistics are substituted for the population parameters, the estimated regression equation is formed.
The estimated regression equation is shown below.
ŷ = b0 + b1x
ŷ is pronounced y hat.
The graph of the estimated simple regression equation is called the estimated regression line.
b0 is the y intercept.
b1 is the slope.
ŷ is the estimated value of y for a given value of x.
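Once b0 and b1 have been estimated, the equation is used by plugging in a value of x. A brief sketch, with assumed estimates b0 = 1.5 and b1 = 0.8 (illustrative values, not taken from the text):

```python
# Assumed sample statistics from a fitted model (for illustration only).
b0, b1 = 1.5, 0.8

def predict(x):
    """Estimated value of y (y hat) for a given value of x."""
    return b0 + b1 * x

print(predict(10))   # -> 9.5
```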
Important Note: Regression analysis is not used to interpret cause-and-effect relationships between variables. Regression analysis can, however, indicate how variables are related or to what extent variables are associated with each other. In so doing, regression analysis tends to make salient relationships that warrant a knowledgeable researcher taking a closer look.
The Least Squares Method is a statistical procedure for using sample data to find the values of b0 and b1 in the estimated regression equation. The Least Squares Method was proposed by Carl Friedrich Gauss (1777–1855) and is still widely used.
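The least squares estimates can be computed directly from sample data using the standard formulas b1 = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and b0 = ȳ − b1·x̄. A from-scratch sketch, using invented sample values:

```python
# Invented sample data for illustration.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 2.9, 3.6, 4.4, 5.1]

x_bar = sum(xs) / len(xs)   # sample mean of x
y_bar = sum(ys) / len(ys)   # sample mean of y

# Slope: covariance of x and y divided by variance of x (least squares formula).
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
# Intercept: the line passes through the point (x_bar, y_bar).
b0 = y_bar - b1 * x_bar

print(f"b1 (slope)     = {b1:.3f}")   # -> 0.750
print(f"b0 (intercept) = {b0:.3f}")   # -> 1.370
```

The resulting line ŷ = b0 + b1x minimizes the sum of squared differences between the observed y values and the values predicted by the line.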