There is an obvious alternative to using Lagrange multipliers for constrained optimization: reformulate the problem in the subspace defined by the constraints, and it automatically becomes an unconstrained problem.

It is not always obvious, though, *how* one can do so in specific cases.

Sometimes a singular value decomposition (SVD) can do the trick. As an example, think of finding the plane $\boldsymbol{\pi} = (a, b, c, d)^T$ which minimizes the sum of squared distances from a set of points $\mathbf{p}_1, \dots, \mathbf{p}_m$, but exactly contains one given point $\mathbf{q}$, everything being expressed in homogeneous coordinates (so $\mathbf{p}_i = (x_i, y_i, z_i, 1)^T$, and $\boldsymbol{\pi}^T \mathbf{p}_i = 0$ exactly when $\mathbf{p}_i$ lies on the plane).

Of course a Lagrangian formulation is possible: minimize

$$\sum_{i=1}^{m} \left( \boldsymbol{\pi}^T \mathbf{p}_i \right)^2$$

subject to

$$\boldsymbol{\pi}^T \boldsymbol{\pi} = 1$$

to avoid the trivial solution $\boldsymbol{\pi} = \mathbf{0}$, and to

$$\boldsymbol{\pi}^T \mathbf{q} = 0,$$

leading to the Lagrangian function

$$L(\boldsymbol{\pi}, \lambda, \mu) = \sum_{i=1}^{m} \left( \boldsymbol{\pi}^T \mathbf{p}_i \right)^2 + \lambda \left( 1 - \boldsymbol{\pi}^T \boldsymbol{\pi} \right) + \mu \, \boldsymbol{\pi}^T \mathbf{q}.$$

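For the record (this post does not pursue the route further), the stationarity condition of such a Lagrangian in $\boldsymbol{\pi}$ would read, writing $M = \sum_{i=1}^{m} \mathbf{p}_i \mathbf{p}_i^T$ and calling $\lambda$, $\mu$ the multipliers of the two constraints:

$$2 M \boldsymbol{\pi} - 2 \lambda \boldsymbol{\pi} + \mu \, \mathbf{q} = \mathbf{0},$$

a system coupling $\boldsymbol{\pi}$ to both multipliers, which already suggests why reparametrizing the constraint away can be more convenient.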
But one can also take a different route.

The constraint $\boldsymbol{\pi}^T \mathbf{q} = 0$ defines a three-dimensional subspace of $\mathbb{R}^4$ where the problem becomes unconstrained. By applying the singular value decomposition to the single-row matrix $\mathbf{q}^T$, one finds one singular value different from zero and the remaining three very close to zero (within machine precision).

SVD decomposes the matrix as

$$\mathbf{q}^T = U \Sigma V^T,$$

where the three columns of the matrix $V$ corresponding to null singular values form a basis for the nullspace of $\mathbf{q}^T$, say $\mathbf{n}_1, \mathbf{n}_2, \mathbf{n}_3$.

Of course the solution sought for, $\boldsymbol{\pi}$, lies in the nullspace of $\mathbf{q}^T$, so it must be a linear combination of the basis vectors we have just found:

$$\boldsymbol{\pi} = \alpha_1 \mathbf{n}_1 + \alpha_2 \mathbf{n}_2 + \alpha_3 \mathbf{n}_3.$$
To determine the $\alpha_j$'s, remember that one wants the system

$$\boldsymbol{\pi}^T \mathbf{p}_i = 0, \qquad i = 1, \dots, m,$$

that is:

$$\left( \alpha_1 \mathbf{n}_1 + \alpha_2 \mathbf{n}_2 + \alpha_3 \mathbf{n}_3 \right)^T \mathbf{p}_i = 0, \qquad i = 1, \dots, m,$$

satisfied in the least squares sense. Call $A$ the matrix of this system:

$$A = P \left[ \mathbf{n}_1 \;\; \mathbf{n}_2 \;\; \mathbf{n}_3 \right],$$

where $P$ is the $m \times 4$ matrix whose rows are the points $\mathbf{p}_i^T$.
Now one can draw again from the almighty SVD, obtaining $A = U' \Sigma' V'^T$; one should find all three singular values different from zero (unless the points $\mathbf{p}_i$ are all precisely coplanar with $\mathbf{q}$) and the solution sought for is, at last, the vector $\boldsymbol{\alpha} = (\alpha_1, \alpha_2, \alpha_3)^T$ given by the column of the right orthogonal matrix $V'$ corresponding to the smallest singular value. Since the $\mathbf{n}_j$ are orthonormal and $\|\boldsymbol{\alpha}\| = 1$, the resulting $\boldsymbol{\pi}$ automatically satisfies the normalization constraint as well.
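Putting the two SVDs together, the whole procedure is a handful of numpy lines (a minimal sketch under my own naming; the function name and the noisy test points are invented for illustration):

```python
import numpy as np

def plane_through_point(P, q):
    """Unit-norm plane coefficients pi minimizing ||P pi|| subject to
    pi . q = 0 exactly; P is m x 4, rows are homogeneous points."""
    # First SVD: basis N (4x3) of the nullspace of the single-row matrix q^T.
    _, _, Vt = np.linalg.svd(q.reshape(1, -1), full_matrices=True)
    N = Vt[1:].T
    # Second SVD: minimize ||(P N) alpha|| over unit-norm alpha.
    A = P @ N
    _, _, Vt2 = np.linalg.svd(A)
    alpha = Vt2[-1]        # right singular vector of the smallest sigma
    return N @ alpha       # unit norm, since N has orthonormal columns

# Noisy points scattered around the plane z = 0 ...
rng = np.random.default_rng(0)
m = 50
xy = rng.standard_normal((m, 2))
z = 0.01 * rng.standard_normal(m)
P = np.column_stack([xy, z, np.ones(m)])
# ... and the point the plane must contain exactly: the origin.
q = np.array([0.0, 0.0, 0.0, 1.0])

pi = plane_through_point(P, q)
print(np.isclose(pi @ q, 0.0))   # the exact constraint holds
```

With these synthetic points the recovered plane is, as expected, very close to $z = 0$, while the constraint $\boldsymbol{\pi}^T \mathbf{q} = 0$ is satisfied to machine precision rather than only in the least-squares sense.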
