Iteratively reweighted least squares
The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems with objective functions of the form

$$ \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n \big| y_i - f_i(\boldsymbol\beta) \big|^p $$

by an iterative method in which each step involves solving a weighted least squares problem of the form

$$ \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n w_i\big(\boldsymbol\beta^{(t)}\big) \, \big| y_i - f_i(\boldsymbol\beta) \big|^2 . $$
IRLS is used to find the maximum likelihood estimates of a generalized linear model, and in robust regression to find an M-estimator, as a way of mitigating the influence of outliers in an otherwise normally distributed data set, for example by minimizing the least absolute errors rather than the least squared errors. A sketch of the generalized linear model case follows below.
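As a concrete illustration of the generalized linear model case, the following is a minimal NumPy sketch of IRLS (Fisher scoring) for logistic regression; the function name, iteration count, and numerical guard are illustrative choices, not taken from any cited source:

```python
import numpy as np

def logistic_irls(X, y, n_iter=25):
    """IRLS (Fisher scoring) sketch for logistic regression.

    X : (n, k) design matrix; y : (n,) array of 0/1 responses.
    """
    n, k = X.shape
    beta = np.zeros(k)
    for _ in range(n_iter):
        eta = X @ beta                              # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))             # mean under the logit link
        w = np.maximum(mu * (1.0 - mu), 1e-12)      # IRLS weights (variances)
        z = eta + (y - mu) / w                      # working response
        XW = X * w[:, None]                         # rows scaled by weights
        beta = np.linalg.solve(XW.T @ X, XW.T @ z)  # weighted least squares step
    return beta
```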
Although not a linear regression problem, Weiszfeld's algorithm for approximating the geometric median can also be viewed as a special case of iteratively reweighted least squares, in which the objective function is the sum of distances of the estimator from the samples.
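A minimal sketch of Weiszfeld's algorithm in the same style, assuming Euclidean distances and a small guard against division by zero (the tolerance and guard values are illustrative):

```python
import numpy as np

def geometric_median(P, tol=1e-7, max_iter=100):
    """Weiszfeld's algorithm: IRLS for the geometric median.

    P : (n, d) array of sample points. Returns the point minimizing
    the sum of Euclidean distances to the rows of P.
    """
    y = P.mean(axis=0)                           # start from the centroid
    for _ in range(max_iter):
        d = np.maximum(np.linalg.norm(P - y, axis=1), 1e-12)
        w = 1.0 / d                              # weights: inverse distances
        y_new = (w[:, None] * P).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            break
        y = y_new
    return y_new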
One of the advantages of IRLS over linear and convex programming is that it can be used with Gauss–Newton and Levenberg–Marquardt numerical algorithms.
Examples
L1 minimization for sparse recovery
IRLS can be used for $\ell_1$ minimization and smoothed $\ell_p$ minimization, $p < 1$, in compressed sensing problems. It has been proved that the algorithm has a linear rate of convergence for the $\ell_1$ norm and superlinear for $\ell_t$ with $t < 1$, under the restricted isometry property, which is generally a sufficient condition for sparse solutions.[1][2] However, in most practical situations, the restricted isometry property is not satisfied.
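The following sketch illustrates one common variant of this scheme: each step solves an equality-constrained weighted least squares problem in closed form, with a decaying smoothing parameter $\varepsilon$, in the spirit of the algorithms cited above. The $\varepsilon$ schedule and defaults are simplifications, not the exact procedures of [1] or [2]:

```python
import numpy as np

def irls_sparse(A, b, p=1.0, n_iter=50, eps=1.0):
    """IRLS sketch for min ||x||_p subject to Ax = b, A under-determined.

    Each step solves min sum_i x_i^2 / q_i subject to Ax = b in closed
    form, with q_i = (x_i^2 + eps^2)^(1 - p/2).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]     # minimum-ell2 starting point
    for _ in range(n_iter):
        q = (x**2 + eps**2) ** (1.0 - p / 2.0)   # inverse IRLS weights
        AQ = A * q                               # A @ diag(q)
        lam = np.linalg.solve(AQ @ A.T, b)       # dual variables
        x = AQ.T @ lam                           # x = Q A^T lam satisfies Ax = b
        eps = max(eps / 10.0, 1e-8)              # anneal the smoothing parameter
    return x
```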
Lp norm linear regression
To find the parameters $\boldsymbol\beta = (\beta_1, \ldots, \beta_k)^{\mathrm T}$ which minimize the $L_p$ norm for the linear regression problem,

$$ \hat{\boldsymbol\beta} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \, \big\| \mathbf y - X \boldsymbol\beta \big\|_p = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n \big| y_i - X_i \boldsymbol\beta \big|^p , $$

the IRLS algorithm at step $t+1$ involves solving the weighted linear least squares problem:[3]

$$ \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n w_i^{(t)} \big| y_i - X_i \boldsymbol\beta \big|^2 = \big( X^{\mathrm T} W^{(t)} X \big)^{-1} X^{\mathrm T} W^{(t)} \mathbf y , $$

where $W^{(t)}$ is the diagonal matrix of weights, usually with all elements set initially to

$$ w_i^{(0)} = 1 , $$

and updated after each iteration to

$$ w_i^{(t)} = \big| y_i - X_i \boldsymbol\beta^{(t)} \big|^{p-2} . $$
In the case $p = 1$, this corresponds to least absolute deviation regression (in this case, the problem would be better approached by use of linear programming methods,[4] so the result would be exact), and the formula is

$$ w_i^{(t)} = \frac{1}{\big| y_i - X_i \boldsymbol\beta^{(t)} \big|} . $$

To avoid dividing by zero, regularization must be done, so in practice the formula is

$$ w_i^{(t)} = \frac{1}{\max\big\{ \delta, \, \big| y_i - X_i \boldsymbol\beta^{(t)} \big| \big\}} , $$

where $\delta$ is some small value, like 0.0001.[4]
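Putting the above update rules together gives the following minimal NumPy sketch (names are illustrative; the $\max\{\delta, |r_i|\}$ guard is applied for every $p$ here, though the text above introduces it for $p = 1$):

```python
import numpy as np

def irls_lp(X, y, p=1.0, delta=1e-4, n_iter=50):
    """IRLS for L_p linear regression, following the updates above.

    X : (n, k) design matrix; y : (n,) responses; p : norm exponent.
    """
    n, k = X.shape
    w = np.ones(n)                                 # w_i^(0) = 1
    for _ in range(n_iter):
        XW = X * w[:, None]                        # rows scaled by weights
        beta = np.linalg.solve(XW.T @ X, XW.T @ y) # (X^T W X)^{-1} X^T W y
        r = np.abs(y - X @ beta)                   # absolute residuals
        w = np.maximum(r, delta) ** (p - 2.0)      # w_i^(t) = |r_i|^(p-2), guarded
    return beta
```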
Notes
- ^ Chartrand, R.; Yin, W. (March 31 – April 4, 2008). "Iteratively reweighted algorithms for compressive sensing". IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2008. pp. 3869–3872.
- ^ Daubechies, I.; DeVore, R.; Fornasier, M.; Güntürk, C. S. (2010). "Iteratively reweighted least squares minimization for sparse recovery". Communications on Pure and Applied Mathematics 63 (1): 1–38. doi:10.1002/cpa.20303.
- ^ Gentle, James (2007). "6.8.1 Solutions that Minimize Other Norms of the Residuals". Matrix Algebra. New York: Springer. doi:10.1007/978-0-387-70873-7. ISBN 978-0-387-70872-0.
- ^ a b William A. Pfeil, Statistical Teaching Aids, Bachelor of Science thesis, Worcester Polytechnic Institute, 2006.
References
- University of Colorado Applied Regression lecture slides
- Stanford Lecture Notes on the IRLS algorithm by Antoine Guitton
- Numerical Methods for Least Squares Problems by Åke Björck (Chapter 4: Generalized Least Squares Problems), http://www.mai.liu.se/~akbjo/LSPbook.html
- Practical Least-Squares for Computer Graphics (SIGGRAPH Course 11), http://graphics.stanford.edu/~jplewis/lscourse/SLIDES.pdf
External links

- Solve under-determined linear systems iteratively: http://stemblab.github.io/irls/