 # Regression Sum of Squares Formula

Also known as the explained sum of squares, the model sum of squares, or the sum of squares due to regression (SSR). It measures how well a model fits the data: of the total variation in the observed values (the total sum of squares), the portion captured by the fitted regression model is the regression sum of squares, and the remainder is the residual (error) sum of squares.

\begin{array}{l}\begin{aligned} S S T &=S S E+S S R \\ S S T &=S S_{y y} \text { total sum of squares } \\ S S R &=b_{1} S S_{x y} \text { regression sum of squares } \\ S S E &=S S T-S S R=\sum_{i=1}^{n} e_{i}^{2} \text { error (residual) sum of squares } \end{aligned}\end{array}

\begin{array}{l}\begin{aligned} S S_{y y} &=\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^{2} \text { variation in direction of } y \\ S S_{x x} &=\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2} \text { variation in direction of } x \\ S S_{x y} &=\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)\left(y_{i}-\bar{y}\right) \text { covariation of } x \text { and } y \end{aligned}\end{array}
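The three definitional sums above translate directly into code. A minimal Python sketch (the function name and variable names are illustrative):

```python
def sums_of_squares(x, y):
    """Compute SS_xx, SS_yy and SS_xy from paired observations."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    ss_xx = sum((xi - x_bar) ** 2 for xi in x)                    # variation in x
    ss_yy = sum((yi - y_bar) ** 2 for yi in y)                    # variation in y
    ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))  # covariation
    return ss_xx, ss_yy, ss_xy
```

For example, `sums_of_squares([1, 2, 4, 5], [2, 1, 6, 6])` returns the values used in the solved example below.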

### Solved example

Question: Find the regression sum of squares for the data set {(1, 2), (2, 1), (4, 6), (5, 6)}.

Solution:

\begin{array}{l}\begin{aligned} S S_{X X} &=\sum_{i=1}^{n} X_{i}^{2}-\frac{1}{n}\left(\sum_{i=1}^{n} X_{i}\right)^{2} \\ &=46-\frac{1}{4}(12)^{2} \\ &=10 \end{aligned}\end{array}

\begin{array}{l}\begin{aligned} S S_{Y Y} &=\sum_{i=1}^{n} Y_{i}^{2}-\frac{1}{n}\left(\sum_{i=1}^{n} Y_{i}\right)^{2} \\ &=77-\frac{1}{4}(15)^{2} \\ &=20.75 \end{aligned}\end{array}

\begin{array}{l}\begin{aligned} S S_{X Y} &=\sum_{i=1}^{n} X_{i} Y_{i}-\frac{1}{n}\left(\sum_{i=1}^{n} X_{i}\right)\left(\sum_{i=1}^{n} Y_{i}\right) \\ &=58-\frac{1}{4}(12)(15) \\ &=13 \end{aligned}\end{array}

\begin{array}{l}\begin{aligned} \hat{\beta}_{1} &=\frac{S S_{X Y}}{S S_{X X}} \\ &=\frac{13}{10} \\ &=1.3 \\ S S_{R} &=\hat{\beta}_{1} \times S S_{X Y} \\ &=1.3 \times 13 \\ &=16.9 \end{aligned}\end{array}
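The whole worked example can be checked with a short Python script using the same shortcut (computational) formulas; the inline numbers are the ones derived above:

```python
x = [1, 2, 4, 5]
y = [2, 1, 6, 6]
n = len(x)

# Shortcut formulas: raw sum of squares minus the correction term.
ss_xx = sum(xi**2 for xi in x) - sum(x)**2 / n                 # 46 - 36    = 10
ss_yy = sum(yi**2 for yi in y) - sum(y)**2 / n                 # 77 - 56.25 = 20.75
ss_xy = sum(xi*yi for xi, yi in zip(x, y)) - sum(x)*sum(y)/n   # 58 - 45    = 13

beta_1 = ss_xy / ss_xx   # slope estimate: 13 / 10 = 1.3
ssr = beta_1 * ss_xy     # regression sum of squares: 1.3 * 13 = 16.9
sse = ss_yy - ssr        # residual sum of squares: 20.75 - 16.9 = 3.85

print(ss_xx, ss_yy, ss_xy, beta_1, ssr, sse)
```

The last line closes the decomposition from the first formula block: SST = SSR + SSE, i.e. 20.75 = 16.9 + 3.85.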