In least-squares calculations, a residual (δ) is the arithmetic difference between the mean of the data (y_bar) and an individual data value (y_i). The sum of the residuals is zero (apart from rounding error).
That is, Σ(y_bar - y_i) = 0, or Σ(δ) = 0.
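A minimal Python sketch of this (the data values are made up for illustration):

    values = [2.0, 3.5, 5.0, 7.5]            # observed y_i (illustrative)
    y_bar = sum(values) / len(values)        # the mean
    residuals = [y_bar - y for y in values]  # delta_i = y_bar - y_i
    print(residuals)                         # [2.5, 1.0, -0.5, -3.0]
    print(sum(residuals))                    # 0.0, up to floating-point rounding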
In calculating regression values (least-squares analysis), a residual (δ) is the arithmetic difference between the calculated, or expected, value (y_hat) and the observed value (y_i), or: δ = y_hat - y_i.
The "zero residual line" is the line that results from the regression analysis, written y_hat = a + bx, where a is the y-intercept and b is the slope.
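For example, a short Python sketch of computing regression residuals against a given line (the coefficients and data points here are assumed values for illustration, not from any particular fit):

    a, b = 1.0, 2.0                     # assumed intercept and slope
    xs = [0.0, 1.0, 2.0, 3.0]
    ys = [1.2, 2.8, 5.1, 6.9]           # observed y_i (illustrative)
    y_hat = [a + b * x for x in xs]     # expected values on the line
    residuals = [yh - y for yh, y in zip(y_hat, ys)]  # delta_i = y_hat_i - y_i
    print(residuals)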
To determine the "zero residual line" for three points (a code sketch follows the steps below):
(1) draw a line between the first and last points
(2) shift the line, parallel to itself and toward the middle point, until the sum of the arithmetic differences between the observed y values and the y values on the line (at the corresponding x values) is zero.
That is, the sum of the vertical distances from the line to the two end points in one direction (δ1 and δ3) equals the vertical distance from the line to the middle point (δ2) in the opposite direction.
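Here is a minimal Python sketch of that construction, using three made-up points (equally spaced in x, the case where this line coincides with the least-squares fit):

    # three made-up points, equally spaced in x
    pts = [(1.0, 2.0), (2.0, 5.0), (3.0, 4.0)]
    (x1, y1), (x2, y2), (x3, y3) = pts

    # step (1): slope of the line through the first and last points
    b = (y3 - y1) / (x3 - x1)

    # step (2): shift the line parallel to itself so the residuals sum to zero,
    # i.e. pick the intercept a so that sum(y_i - (a + b*x_i)) = 0
    x_bar = (x1 + x2 + x3) / 3.0
    y_bar = (y1 + y2 + y3) / 3.0
    a = y_bar - b * x_bar

    residuals = [y - (a + b * x) for x, y in pts]
    print("y_hat = %.3f + %.3f x" % (a, b))   # the "zero residual line"
    print(residuals)                          # the three residuals sum to zero

With these example points, the end-point residuals (δ1 and δ3) fall on one side of the line and together match the middle residual (δ2) on the other side, as described above.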
The process described above is effectively a graphical linear regression. This may be more information than you want, but ...
Wiki link for least squares: http://en.wikipedia.org/wiki/Least_squares
Wiki link for Linear regression: http://en.wikipedia.org/wiki/Linear_regression