Intro to Statistics


Sum of Squared Errors (SSE)


Definition

Sum of Squared Errors (SSE) measures the total deviation of observed values from the values predicted by a regression model. It is calculated by squaring the difference between each observed value and its predicted value, then summing those squared differences across all data points.

congrats on reading the definition of Sum of Squared Errors (SSE). now let's actually learn it.


5 Must Know Facts For Your Next Test

  1. SSE is used to assess the fit of a regression model; lower SSE indicates a better fit.
  2. The formula for SSE is $$ \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 $$, where $y_i$ are the observed values and $\hat{y}_i$ are the predicted values.
  3. SSE is always non-negative because it sums squared terms.
  4. It plays a crucial role in calculating other key statistics, such as Mean Squared Error (MSE) and R-squared.
  5. SSE can be compared across different models to determine which one better fits the data.
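The formula in fact 2 and its connection to MSE and R-squared (fact 4) can be sketched in a few lines of Python. The data points and the fitted line (`y_hat = 2x`) below are made up purely for illustration; note also that MSE is computed here as SSE divided by `n`, while some texts divide by the degrees of freedom instead.

```python
# Illustrative computation of SSE and related fit statistics.
# The observed values and the fitted line (y_hat = 2x) are invented for this sketch.
observed  = [2.0, 4.1, 6.2, 7.9, 10.1]          # y_i
predicted = [2.0 * x for x in [1, 2, 3, 4, 5]]  # y_hat_i from the hypothetical fit

n = len(observed)

# SSE = sum over i of (y_i - y_hat_i)^2  -- always non-negative (fact 3)
sse = sum((y - yh) ** 2 for y, yh in zip(observed, predicted))

# SSE feeds into other key statistics (fact 4):
mse = sse / n                                   # Mean Squared Error (here: SSE / n)
y_bar = sum(observed) / n
sst = sum((y - y_bar) ** 2 for y in observed)   # total sum of squares
r_squared = 1 - sse / sst                       # closer to 1 means a better fit

print(f"SSE = {sse:.2f}, MSE = {mse:.3f}, R^2 = {r_squared:.4f}")
```

A lower SSE for the same data means the predictions sit closer to the observations, which is exactly why SSE can be compared across competing models (fact 5).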

Review Questions

  • What does SSE stand for and what does it measure?
  • How is SSE calculated in a regression model?
  • Why is SSE always non-negative?

"Sum of Squared Errors (SSE)" also found in:

© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.