About

The residual sum of squares (RSS) is defined below and is used in the least squares method to estimate the regression coefficients.

The smallest residual sum of squares is equivalent to the largest R-squared (<math>R^2</math>).
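This follows from the definition of <math>R^2</math>: since the total sum of squares (TSS) is fixed for a given data set, minimizing RSS maximizes <math>R^2</math>.

<MATH> R^2 = 1 - \frac{RSS}{TSS} </MATH>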

The deviance is a generalization of the residual sum of squares.

The squared loss for a single observation is <math>(y-\hat{y})^2</math>; the RSS sums this loss over all observations.

Equation

<MATH> \begin{array}{rrl} \text{Residual Sum of Squares (RSS)} & = & \sum_{i=1}^{\href{sample_size}{N}}(\href{residual}{residual})^2 \\ RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(\href{residual}{e_i})^2 \\ RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{Y}_i)^2 \\ \end{array} </MATH>

where:

  * <math>e_i = Y_i - \hat{Y}_i</math> is the residual of observation <math>i</math> (the difference between the observed and the predicted value)
  * <math>N</math> is the sample size

Each residual is squared so that positive and negative residuals do not cancel out in the sum.
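As a minimal sketch, the RSS can be computed directly from the observed and predicted values. The sketch below assumes numpy; the data values are invented for illustration.

<code python>
import numpy as np

# Observed values and model predictions (illustrative numbers)
y = np.array([3.0, 5.0, 7.5, 9.0])
y_hat = np.array([2.8, 5.3, 7.1, 9.4])

residuals = y - y_hat          # e_i = Y_i - Y_hat_i
rss = np.sum(residuals ** 2)   # RSS = sum of squared residuals
print(rss)                     # ~0.45
</code>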

Example

Simple Regression

<MATH> \begin{array}{rrl} RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\hat{B}_1 X_i)^2 \\ \end{array} </MATH>
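As a short illustration (assuming numpy; the data are invented), the coefficients <math>\hat{B}_0</math> and <math>\hat{B}_1</math> can be estimated by least squares and the RSS of the fitted line evaluated:

<code python>
import numpy as np

# Illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least squares fit of y_hat = B0 + B1 * x
# (np.polyfit returns the highest-degree coefficient first)
b1, b0 = np.polyfit(x, y, deg=1)

y_hat = b0 + b1 * x
rss = np.sum((y - y_hat) ** 2)   # RSS of the fitted line
print(b0, b1, rss)
</code>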

Multiple Regression

<MATH> \begin{array}{rrl} RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\hat{B}_1 X_{i1}-\dots-\hat{B}_P X_{iP})^2 \\ & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\sum_{j=1}^{\href{dimension}{P}}\hat{B}_j X_{ij})^2 \\ \end{array} </MATH>
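The same computation extends to several predictors. Below is a sketch assuming numpy, with a column of ones added to the design matrix for the intercept; the data are invented for illustration.

<code python>
import numpy as np

# Illustrative data: N = 5 observations, P = 2 predictors
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([5.0, 6.0, 12.0, 13.0, 18.0])

# Prepend a column of ones for the intercept B0,
# then estimate all coefficients by least squares
X1 = np.column_stack([np.ones(len(X)), X])
b_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

y_hat = X1 @ b_hat
rss = np.sum((y - y_hat) ** 2)   # residual sum of squares
print(b_hat, rss)
</code>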