17.1 Model Parameters

  • The least squares estimates of the \(\beta\) coefficients are the values that minimize the sum of squared errors (SSE) for the sample.
  • With \(n\) = sample size, \(k+1\) = number of \(\beta\) coefficients in the model (including the intercept), and \(SSE\) = sum of squared errors, \[\textrm{MSE}=\frac{\textrm{SSE}}{n-(k+1)}\] estimates \(\sigma^2,\) the variance of the errors. \(S=\sqrt{MSE}\) estimates \(\sigma\) and is known as the regression standard error or the residual standard error (see the sketch after this list).
  • Each \(\beta\) coefficient represents the change in the mean response, \(E(y)\), per unit increase in the associated predictor variable when all the other predictors are held constant. For example, \(\beta_1\) represents the change in the mean response, \(E(y)\), per unit increase in \(x_1\) when \(x_2, x_3, \ldots, x_k\) are held constant.
  • The intercept term, \(\beta_0,\) represents the mean response, \(E(y)\), when the predictors \(x_1, x_2, \ldots, x_k\) are all zero (which may or may not have any practical meaning).
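
As an illustration of these quantities, here is a minimal sketch in Python with NumPy; the simulated data, sample size, and variable names are assumptions for illustration only, not part of the original text. It fits a two-predictor model by least squares, computes \(SSE\), \(\textrm{MSE}\), and \(S\), and checks that increasing \(x_1\) by one unit while holding \(x_2\) fixed changes the fitted mean response by exactly the estimated coefficient \(b_1\).

```python
import numpy as np

# Simulated data (assumed for illustration): n = 20 observations, k = 2 predictors.
rng = np.random.default_rng(0)
n, k = 20, 2
X = rng.normal(size=(n, k))                                   # predictor values x1, x2
y = 5 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=n)

# Design matrix with a leading column of ones for the intercept beta_0.
X_design = np.column_stack([np.ones(n), X])

# Least squares estimates b = (b0, b1, b2): the values minimizing the sum of squared errors.
b, *_ = np.linalg.lstsq(X_design, y, rcond=None)

residuals = y - X_design @ b
sse = np.sum(residuals ** 2)                                   # SSE
mse = sse / (n - (k + 1))                                      # MSE estimates sigma^2
s = np.sqrt(mse)                                               # regression (residual) standard error

print("coefficients:", b)
print("SSE:", sse, "MSE:", mse, "S:", s)

# Interpretation of b1: raising x1 by one unit while holding x2 constant
# changes the fitted mean response by b1.
x_new = np.array([1.0, 0.3, -0.2])                             # hypothetical point (1, x1, x2)
x_plus = x_new + np.array([0.0, 1.0, 0.0])                     # same point, x1 increased by 1
print("change in fitted mean:", x_plus @ b - x_new @ b)        # equals b[1]
```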