## Lagrangian rejection and fitting/2

Summarizing the previous article: we have a set $S$ of $m$ samples, mostly affected by errors with a Gaussian distribution. Zero or more of the samples may be outliers, that is, samples affected by exceptional errors which do not fit the Gaussian distribution. We want to find the parameters $p$ of a model fitting the samples, excluding the outliers, and we try with

$\min_{p, w} \sum_i (1+w_i)^2 d (p, S_i)$   subject to   $\sum_i w_i^2 = q$

Compared with existing statistical methods like RANSAC and LMedS, this new method has the advantage that the weights $w_i$ are not binary but continuous real variables, so the optimality conditions can be written in closed form and solved with standard numerical methods instead of combinatorial search.
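To build some intuition for the $(1+w_i)^2$ factor: $w_i = 0$ keeps sample $i$ at full strength, $w_i = -1$ annihilates it, and intermediate real values give a soft, gradual rejection. A tiny sketch (the $d$ values below are made up for illustration):

```python
import numpy as np

# Per-sample distances d(p, S_i); the last one plays the outlier.
d = np.array([0.1, 0.2, 150.0])

# Sweep the outlier's weight from 0 (keep) to -1 (annihilate).
for w_out in (0.0, -0.5, -1.0):
    w = np.array([0.0, 0.0, w_out])
    obj = np.sum((1 + w) ** 2 * d)   # the objective being minimized
    print(w_out, obj)                # the outlier's contribution shrinks to 0
```

At $w_{\text{out}} = -1$ only the two inliers contribute, and the constraint $\sum_i w_i^2 = q$ budgets how much total rejection is allowed.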

This is a problem of constrained minimum which can be solved with the method of Lagrange multipliers.

The Lagrangian function is

$\Lambda (p, w) = \sum_i (1+w_i)^2 d (p, S_i) + \lambda (\sum_i w_i^2 - q)$

As usual, the system of equations to be solved is

$\nabla_{p,w} \Lambda = 0$

$\sum_i w_i^2 - q = 0$

that is:

$\sum_i (1+w_i)^2 \dfrac{\partial d (p, S_i)}{\partial p_1} = 0$

$\qquad\vdots$

$\sum_i (1+w_i)^2 \dfrac{\partial d (p, S_i)}{\partial p_n} = 0$

$(1+w_1) d (p, S_1) + \lambda w_1 = 0$

$\qquad\vdots$

$(1+w_m) d (p, S_m) + \lambda w_m = 0$

$\sum_i w_i^2 - q = 0$
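The stationarity equations can be checked numerically against a finite-difference gradient of $\Lambda$. The sketch below assumes a concrete model not specified in the article: a line $y = a x + b$ with $d(p, S_i) = (y_i - a x_i - b)^2$, the squared vertical distance.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(6.0)
y = 3.0 * x - 2.0 + rng.standard_normal(6)   # toy samples
m, q = len(x), 0.5

def lagrangian(z):
    # z packs (a, b, w_1..w_m, lambda)
    a, b = z[0], z[1]
    w, lam = z[2:2 + m], z[2 + m]
    d = (y - a * x - b) ** 2                  # d(p, S_i)
    return np.sum((1 + w) ** 2 * d) + lam * (np.sum(w ** 2) - q)

def grad_analytic(z):
    a, b = z[0], z[1]
    w, lam = z[2:2 + m], z[2 + m]
    r = y - a * x - b
    d = r ** 2
    g = np.empty(m + 3)
    g[0] = np.sum((1 + w) ** 2 * (-2 * x * r))   # dL/da
    g[1] = np.sum((1 + w) ** 2 * (-2 * r))       # dL/db
    g[2:2 + m] = 2 * (1 + w) * d + 2 * lam * w   # dL/dw_i
    g[2 + m] = np.sum(w ** 2) - q                # dL/dlambda: the constraint
    return g

z = rng.standard_normal(m + 3)   # gradients must agree at any point
g = grad_analytic(z)
h = 1e-6
g_num = np.array([(lagrangian(z + h * e) - lagrangian(z - h * e)) / (2 * h)
                  for e in np.eye(m + 3)])
print(np.max(np.abs(g - g_num)))   # small: analytic and numeric gradients agree
```

Note that the $w_i$ equations in the article are the analytic gradient components divided by the common factor $2$.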

This system of nonlinear equations can be solved numerically, e.g. with some variant of Newton's method.

The initial approximation for the parameters $p$ can be obtained by first performing an ordinary least squares fit on all the samples, outliers included; a reasonable initial approximation for the weights is $w_i = \sqrt{\dfrac{q}{m}}$   $\forall i$, which satisfies the constraint exactly, since $\sum_i w_i^2 = m \cdot \dfrac{q}{m} = q$.

For $\lambda$, a null initial approximation should be avoided, as it can lead to very slow convergence.
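Putting the pieces together, here is a sketch of the whole procedure under the same illustrative assumptions as before: a line model $y = a x + b$, $d(p, S_i)$ the squared vertical distance, and a damped Newton iteration with a finite-difference Jacobian. The initial weights are taken with a negative sign here, a choice of this sketch so that the start point lies on the branch where the weights reduce, rather than amplify, the outliers' influence.

```python
import numpy as np

def make_system(x, y, q):
    """Residual vector F(z) = 0 for the stationarity system;
    z packs (a, b, w_1..w_m, lambda)."""
    m = len(x)
    def F(z):
        a, b = z[0], z[1]
        w, lam = z[2:2 + m], z[2 + m]
        r = y - a * x - b
        d = r ** 2                                     # d(p, S_i)
        out = np.empty(m + 3)
        out[0] = np.sum((1 + w) ** 2 * (-2 * x * r))   # d/da
        out[1] = np.sum((1 + w) ** 2 * (-2 * r))       # d/db
        out[2:2 + m] = (1 + w) * d + lam * w           # d/dw_i (halved)
        out[2 + m] = np.sum(w ** 2) - q                # the constraint
        return out
    return F

def newton(F, z0, tol=1e-9, maxit=200):
    """Damped Newton: forward-difference Jacobian plus step halving."""
    z = z0.astype(float).copy()
    for _ in range(maxit):
        f = F(z)
        if np.linalg.norm(f) < tol:
            break
        n = z.size
        J = np.empty((n, n))
        h = 1e-7
        for j in range(n):                 # Jacobian, column by column
            zp = z.copy()
            zp[j] += h
            J[:, j] = (F(zp) - f) / h
        step = np.linalg.lstsq(J, -f, rcond=None)[0]
        t = 1.0                            # backtrack until ||F|| decreases
        while t > 1e-6 and np.linalg.norm(F(z + t * step)) >= np.linalg.norm(f):
            t *= 0.5
        z = z + t * step
    return z

# Toy data: a line with small Gaussian noise plus one gross outlier.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(10)
y[5] += 20.0                              # the outlier

m, q = len(x), 1.0
a0, b0 = np.polyfit(x, y, 1)              # plain least squares on ALL samples
w0 = -np.sqrt(q / m) * np.ones(m)         # negative sign: see lead-in above
z0 = np.concatenate([[a0, b0], w0, [1.0]])  # lambda = 1, not 0

z = newton(make_system(x, y, q), z0)
a, b, w = z[0], z[1], z[2:2 + m]
print(a, b)   # should be close to the true (2, 1)
print(w[5])   # near -1: the outlier is almost annihilated
```

The recovered weight of the outlier approaches $-1$, so its factor $(1+w_5)^2$ nearly vanishes and the fitted line is driven by the inliers alone.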