Least Squares

Least squares regression is one of the best-known curve fitting methods and is widely used to fit experimental data. The regression curve does not necessarily pass through each data point; instead, it follows the overall trend of the points with a single smooth curve.

Linear regression is the simplest kind of least squares approximation: it represents a group of points by a single straight line. Suppose we have the data set {(x1, y1), (x2, y2), ..., (xn, yn)}. The linear regression of these data is defined as follows:


y = a·x + b


The residual error for each data point can be written as follows:


ei = a·xi + b − yi


The least squares regression finds the best fit by minimizing the sum of the squares of the residuals, S = Σ ei². Setting the partial derivatives of S with respect to a and b equal to zero gives the normal equations:

a·Σxi² + b·Σxi = Σ(xi·yi)

a·Σxi + b·n = Σyi
Solving these two equations, a and b can be determined as follows:

a = (n·Σ(xi·yi) − Σxi·Σyi) / (n·Σxi² − (Σxi)²)

b = (Σyi − a·Σxi) / n