A function $f(x)$, not necessarily periodic, can be approximated over an interval $[a,b]$ as a finite linear combination of $N$ independent functions $\varphi_k(x)$:
$$f(x) \approx \sum_{k=1}^{N} c_k\,\varphi_k(x).$$
The condition for the optimality of this approximation, in the least-squares sense,
$$\frac{\partial}{\partial c_j}\int_a^b \Bigl[f(x)-\sum_{k=1}^{N} c_k\,\varphi_k(x)\Bigr]^2\,dx = 0,\qquad j=1,\dots,N,$$
is equivalent to the set of equations
$$\sum_{k=1}^{N} c_k \int_a^b \varphi_j(x)\,\varphi_k(x)\,dx = \int_a^b f(x)\,\varphi_j(x)\,dx,\qquad j=1,\dots,N.$$
This is a system of $N$ linear equations in the $N$ unknowns $c_k$, and it is nonsingular as long as the basis functions are independent; if the basis functions are orthogonal the system is completely uncoupled and its solution is immediately available as
$$c_k = \frac{\int_a^b f(x)\,\varphi_k(x)\,dx}{\int_a^b \varphi_k^2(x)\,dx}.$$
If, moreover, the basis functions are orthonormal, this reduces to
$$c_k = \int_a^b f(x)\,\varphi_k(x)\,dx.$$
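A minimal numerical sketch of the orthonormal case, assuming $f(x)=e^x$ on $[-1,1]$ and the first four orthonormal Legendre polynomials as basis (both are illustrative choices, not taken from the text):

```python
import numpy as np

def integrate(y, x):
    # composite trapezoidal rule
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

x = np.linspace(-1.0, 1.0, 20001)   # fine grid for quadrature
f = np.exp(x)

# Legendre polynomials P_0..P_3, scaled so that the integral of phi_k^2
# over [-1, 1] equals 1:  phi_k = sqrt((2k+1)/2) * P_k.
P = [np.ones_like(x), x, (3 * x**2 - 1) / 2, (5 * x**3 - 3 * x) / 2]
phi = [np.sqrt((2 * k + 1) / 2) * P[k] for k in range(4)]

# For an orthonormal basis, c_k = integral of f * phi_k -- no system to solve.
c = [integrate(f * p, x) for p in phi]
approx = sum(ck * p for ck, p in zip(c, phi))
print("coefficients:", np.round(c, 4))
print("max error on [-1, 1]:", np.max(np.abs(f - approx)))
```

Each coefficient is a single integral, computed independently of the others; adding a fifth basis function would leave the first four coefficients unchanged.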
But, even if the $\varphi_k$ are not orthogonal, the full system still determines a set of optimal $c_k$.
An important difference is that, while the $c_k$ computed for orthogonal basis functions are independent of the order $N$ of the approximation (a consequence of the system being uncoupled), those computed for nonorthogonal basis functions do change with $N$, even if the difference is relatively small when $N$ is large.
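This dependence on the order can be checked numerically. The sketch below, assuming $f(x)=e^x$ on $[0,1]$ and the monomial basis $x^k$ (illustrative choices, not taken from the text), builds and solves the normal equations for two different orders and compares the resulting first coefficient:

```python
import numpy as np

def lsq_coeffs(n):
    # Normal equations for the monomial basis on [0, 1]:
    #   matrix entries G[j, k] = integral of x^(j+k) dx = 1/(j+k+1)
    #   right-hand side b[j] = integral of exp(x) * x^j dx,
    #   computed exactly via the recurrence b[j] = e - j * b[j-1].
    G = np.array([[1.0 / (j + k + 1) for k in range(n)] for j in range(n)])
    b = np.empty(n)
    b[0] = np.e - 1.0
    for j in range(1, n):
        b[j] = np.e - j * b[j - 1]
    return np.linalg.solve(G, b)

c3 = lsq_coeffs(3)
c5 = lsq_coeffs(5)
# For a nonorthogonal basis the optimal coefficients depend on the order:
print("c_0 with 3 terms:", c3[0])
print("c_0 with 5 terms:", c5[0])
```

The two values of $c_0$ differ, because enlarging a coupled system redistributes the optimum over all the coefficients; with an orthogonal basis they would coincide exactly.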
As an example, determine the optimal approximation of a given function $f(x)$ as a truncated power series over a finite interval $[a,b]$:
$$f(x) \approx \sum_{k=0}^{N} c_k x^k.$$
To put numbers into the symbols, a specific $f(x)$ and a specific interval must be chosen; then the integrals $\int_a^b x^{j+k}\,dx$ and $\int_a^b f(x)\,x^j\,dx$ can be computed and introduced into the system. Truncating to five terms, solving for the $c_k$ gives an approximation whose error remains small inside the interval; of course the error gets rapidly worse outside.
Note that the $c_k$ have nothing to do with the Taylor coefficients of $f$; a Taylor series is valid in the neighborhood of a point, not over a finite interval.
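The contrast can be made concrete with a sketch, again assuming $f(x)=e^x$ on $[0,1]$ (an illustrative choice, not taken from the text). Here `np.polyfit` performs a discrete least-squares fit on the sample points, which for a dense grid is close to the continuous least-squares solution:

```python
import math
import numpy as np

x = np.linspace(0.0, 1.0, 2001)
f = np.exp(x)

deg = 4
# np.polyfit returns the highest power first; reverse so c_lsq[k] multiplies x^k.
c_lsq = np.polyfit(x, f, deg)[::-1]
# Taylor coefficients of exp about x = 0: 1/k!
c_taylor = np.array([1.0 / math.factorial(k) for k in range(deg + 1)])

err_lsq = np.max(np.abs(f - np.polyval(c_lsq[::-1], x)))
err_taylor = np.max(np.abs(f - np.polyval(c_taylor[::-1], x)))
print("least-squares coefficients:", np.round(c_lsq, 5))
print("Taylor coefficients:       ", np.round(c_taylor, 5))
print("max error (least squares):", err_lsq)
print("max error (Taylor):       ", err_taylor)
```

The coefficient sets are visibly different, and the least-squares polynomial has a much smaller worst-case error over the whole interval, while the Taylor polynomial is accurate only near the expansion point.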