Linear regression with polynomials
In my previous two stories, “Linear regression with straight lines” and “Linear regression with quadratic equations”, I explained how to find the optimal parameters of specific functions that minimize the sum squared error between a fit function y(x) and a data set S(x), using the method of least squares analysis (LSA). Now, I will extend these methods to a polynomial of any degree.
Mathematical Derivation
Suppose we have a polynomial of order n, meaning that the highest power of x is n, as in the equation
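y(x) = b_0 + b_1 x + b_2 x^2 + \cdots + b_n x^n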
Since the coefficients b0, b1, and so on are numbered starting from index 0, we will require a total of n+1 equations in order to solve for all of them. As before, we define the sum squared error function as
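E = \sum_{i=1}^{N} \left[ y(x_i) - S(x_i) \right]^2
where the sum runs over the N points x_i of the data set S(x).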
Taking the partial derivative of E with respect to each coefficient b_k and applying the chain rule, we end up with the general formula
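\frac{\partial E}{\partial b_k} = 2 \sum_{i=1}^{N} \left[ y(x_i) - S(x_i) \right] x_i^k, \qquad k = 0, 1, \ldots, n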
Now, equating the derivatives to zero, we end up with the system
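\sum_{j=0}^{n} b_j \sum_{i=1}^{N} x_i^{\,j+k} = \sum_{i=1}^{N} S(x_i)\, x_i^{k}, \qquad k = 0, 1, \ldots, n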
and now we can write this system in matrix form (where A and c are names chosen here for the matrix of power sums and the right-hand-side vector)
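\mathbf{A}\,\mathbf{b} = \mathbf{c}, \qquad A_{kj} = \sum_{i=1}^{N} x_i^{\,j+k}, \qquad c_k = \sum_{i=1}^{N} S(x_i)\, x_i^{k}, \qquad k, j = 0, 1, \ldots, n
so that the vector of coefficients follows as \mathbf{b} = \mathbf{A}^{-1}\mathbf{c}.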
Implementation in MATLAB
We will once again implement these equations in MATLAB. To do this, we will generate a random data set S(x) consisting of a degree-4 polynomial with random noise added at each location x. For our polynomial fit, we will use a guess of a…
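A minimal sketch of this procedure follows; the sample range, true coefficients, noise level, and variable names are assumptions made here for illustration, not values from the original story.

% Generate a noisy degree-4 polynomial data set S(x)
rng(0);                               % fix the random seed for reproducibility
x = linspace(-2, 2, 50)';             % sample locations (assumed range)
bTrue = [1; -2; 0.5; 3; -1];          % true coefficients b0..b4 (assumed values)
S = polyval(flip(bTrue), x) + randn(size(x));   % polyval wants descending powers

% Build and solve the normal equations A*b = c for a degree-n fit
n = 4;                                % degree of the fitting polynomial
A = zeros(n + 1);                     % A(k+1, j+1) = sum_i x_i^(j+k)
c = zeros(n + 1, 1);                  % c(k+1) = sum_i S(x_i) * x_i^k
for k = 0:n
    for j = 0:n
        A(k + 1, j + 1) = sum(x.^(j + k));
    end
    c(k + 1) = sum(S .* x.^k);
end
b = A \ c;                            % fitted coefficients b0..bn

As a sanity check, b should agree, up to numerical round-off, with MATLAB's built-in polyfit(x, S, n), which solves the same least-squares problem and returns the coefficients in descending order.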