Sum of squared errors [SSE]


A measure of the accuracy of a simple linear regression model. It is calculated by summing the squared differences between the observed values and the predicted values of the dependent variable.
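As a quick illustration, here is a minimal Python sketch of the calculation; the observed and predicted values below are made-up numbers used purely for demonstration:

```python
# Hypothetical data: observed Y values and the regression's fitted values.
observed  = [2.0, 4.0, 5.0, 4.0, 5.0]
predicted = [2.8, 3.4, 4.0, 4.6, 5.2]

# SSE = sum of squared differences between observed and predicted values.
sse = sum((y - y_hat) ** 2 for y, y_hat in zip(observed, predicted))
print(sse)  # approximately 2.4 for these made-up numbers
```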

In simple terms, the SSE is a measure of how well the regression model fits the data. A lower SSE indicates a better-fitting model, while a higher SSE indicates a poorer-fitting model. The SSE represents the unexplained variation in the ANOVA table.
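The role of SSE in the ANOVA table follows from the usual decomposition of total variation (labels vary across texts; SST here denotes the total sum of squares):

```latex
\underbrace{\sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2}_{\text{total variation (SST)}}
=
\underbrace{\sum_{i=1}^{n} \left( \hat{Y}_i - \bar{Y} \right)^2}_{\text{explained variation}}
+
\underbrace{\sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2}_{\text{unexplained variation (SSE)}}
```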

The main difference between SSE and SEE is that SSE is the total of the squared residuals, whereas SEE (the standard error of estimate) is obtained from SSE by averaging over the degrees of freedom and taking the square root. Because SSE is expressed in squared units of the dependent variable, it can be awkward to interpret on its own; SEE is expressed in the same units as the dependent variable, which is why it is often the easier measure to interpret and the one more commonly quoted in practice.
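In symbols, for a simple linear regression estimated on n observations (the n − 2 in the denominator reflects the two estimated coefficients, intercept and slope):

```latex
\mathrm{SSE} = \sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2,
\qquad
\mathrm{SEE} = \sqrt{\frac{\mathrm{SSE}}{n - 2}}
```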

See also: Least squares method, ANOVA table

Compare: SEE