The simple linear regression model describes how y is related to x and an error term. The coefficients in the regression model are called parameters, and the error term is a random variable. The regression equation describes how the expected value, or mean, of y is related to the independent variable, x. If the parameters of the regression equation were known, we could compute the mean of y for a given value of x. However, the parameters are unknown, and we have to estimate their values using our sample. Substituting the estimated parameters into the regression equation gives us the estimated regression equation.
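These relationships can be written compactly; the following is a sketch in standard notation, with β₀ and β₁ as the unknown parameters, ε the error term, and b₀ and b₁ the estimates computed from the sample:

```latex
% Simple linear regression model (parameters beta_0, beta_1; error term epsilon)
y = \beta_0 + \beta_1 x + \varepsilon

% Regression equation: the expected value (mean) of y for a given x
E(y) = \beta_0 + \beta_1 x

% Estimated regression equation, with b_0 and b_1 estimated from the sample
\hat{y} = b_0 + b_1 x
```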

The coefficients of the estimated regression equation are found using the least squares method. In the least squares method, the coefficients are chosen to minimize the sum of the squared differences between the actual and predicted values of the dependent variable, y. The slope coefficient can be interpreted as the predicted change in y corresponding to a one-unit increase in x. The intercept can be interpreted as the predicted value of y when x is equal to zero. The value of y can be predicted by plugging the given value of x into the estimated regression equation.
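The least squares calculation above can be sketched in a few lines of Python; the data values here are hypothetical, invented only to illustrate the computation:

```python
# Hypothetical sample of (x, y) observations, for illustration only
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
x_bar = sum(x) / n  # sample mean of the independent variable
y_bar = sum(y) / n  # sample mean of the dependent variable

# Least squares estimates: the slope b1 and intercept b0 that minimize
# the sum of squared differences between actual and predicted y values
b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
      / sum((xi - x_bar) ** 2 for xi in x))
b0 = y_bar - b1 * x_bar

def predict(x_new):
    """Estimated regression equation: y-hat = b0 + b1 * x."""
    return b0 + b1 * x_new

print(b0, b1, predict(6))
```

For this sample, the slope b1 works out to 0.6 (a one-unit increase in x predicts a 0.6 increase in y) and the intercept b0 to 2.2 (the predicted y when x is zero); plugging x = 6 into the estimated equation gives a prediction of 5.8.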