Iteratively reweighted least squares

From Wikipedia, the free encyclopedia

The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems. It solves objective functions of the form

    \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n \big| y_i - f_i(\boldsymbol\beta) \big|^p

by an iterative method in which each step involves solving a weighted least squares problem of the form

    \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n w_i(\boldsymbol\beta^{(t)}) \, \big| y_i - f_i(\boldsymbol\beta) \big|^2 .

IRLS is used to find the maximum likelihood estimates of a generalized linear model, and in robust regression to find an M-estimator, as a way of mitigating the influence of outliers in an otherwise normally distributed data set, for example by minimizing the least absolute errors rather than the least squared errors.
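
In the robust regression setting, each IRLS step refits a weighted least squares problem whose weights shrink the influence of observations with large residuals. The sketch below uses the Huber weight function; the weight function, the tuning constant k, the fixed iteration count, and the small floor on the residuals are illustrative choices rather than anything prescribed by this article.

import numpy as np

def irls_huber(X, y, k=1.345, iterations=50):
    # Robust linear fit by IRLS with Huber weights (illustrative sketch).
    beta = np.linalg.lstsq(X, y, rcond=None)[0]                # ordinary least squares start
    for _ in range(iterations):
        r = y - X @ beta                                       # current residuals
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights: downweight large residuals
        XtW = X.T * w                                          # X^T W with W = diag(w)
        beta = np.linalg.solve(XtW @ X, XtW @ y)               # weighted least squares step
    return beta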

Weiszfeld's algorithm for approximating the geometric median, although not a linear regression problem, can also be viewed as a special case of iteratively reweighted least squares, in which the objective function is the sum of distances of the estimator from the samples.
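
Read as IRLS, each Weiszfeld step solves a weighted least squares problem whose weights are the reciprocals of the current distances, and whose solution is simply a weighted mean of the samples. A minimal sketch, in which the eps floor and the fixed iteration count are added safeguards rather than part of the classical statement:

import numpy as np

def geometric_median(points, iterations=100, eps=1e-10):
    # Approximate the geometric median by Weiszfeld's algorithm (an IRLS iteration).
    x = points.mean(axis=0)                                # start from the centroid
    for _ in range(iterations):
        d = np.linalg.norm(points - x, axis=1)             # distances from the current estimate
        w = 1.0 / np.maximum(d, eps)                       # IRLS weights: reciprocal distances
        x = (w[:, None] * points).sum(axis=0) / w.sum()    # weighted mean = weighted least squares solution
    return x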

One of the advantages of IRLS over linear and convex programming is that it can be used with Gauss–Newton and Levenberg–Marquardt numerical algorithms.

Examples

L1 minimization for sparse recovery

IRLS can be used for ℓ1 minimization and smoothed ℓp minimization, p < 1, in compressed sensing problems.[1][2] It has been proven that the algorithm has a linear rate of convergence for the ℓ1 norm and superlinear convergence for ℓt with t < 1, under the restricted isometry property, which is generally a sufficient condition for sparse solutions.
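
In this setting it is the unknown vector itself that is reweighted: each iteration minimizes a weighted sum of squares of the entries of x subject to the measurement constraint Ax = y, with weights computed from the current iterate. The following sketch is illustrative only; the smoothing constant eps and the fixed iteration count are implementation choices, and the published algorithms (e.g. [2]) instead update such a smoothing parameter adaptively.

import numpy as np

def irls_sparse_recovery(A, y, p=1.0, iterations=50, eps=1e-8):
    # Recover a sparse x with A x = y by IRLS on a smoothed l_p objective (illustrative sketch).
    x = A.T @ np.linalg.solve(A @ A.T, y)                  # minimum l_2-norm starting point
    for _ in range(iterations):
        w = (x**2 + eps) ** (p / 2 - 1)                    # smoothed weights, w_i ~ |x_i|^(p-2)
        D = np.diag(1.0 / w)                               # D = W^{-1}
        x = D @ A.T @ np.linalg.solve(A @ D @ A.T, y)      # minimizer of sum_i w_i x_i^2 subject to A x = y
    return x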

Lp norm linear regression

To find the parameters β = (β1, …, βk)^T which minimize the Lp norm for the linear regression problem

    \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \; \big\| \mathbf{y} - X\boldsymbol\beta \big\|_p^p = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \; \sum_{i=1}^n \big| y_i - X_i\boldsymbol\beta \big|^p ,

the IRLS algorithm at step t + 1 involves solving the weighted linear least squares problem

    \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \; \sum_{i=1}^n w_i^{(t)} \big| y_i - X_i\boldsymbol\beta \big|^2 = \left( X^{\mathrm T} W^{(t)} X \right)^{-1} X^{\mathrm T} W^{(t)} \mathbf{y} ,

where W^(t) is the diagonal matrix of weights with elements

    w_i^{(t)} = \big| y_i - X_i\boldsymbol\beta^{(t)} \big|^{p-2} .[3]

In the case p = 1, this corresponds to least absolute deviation regression (though in practice the problem would be better approached by linear programming methods).
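
A direct transcription of the update above might look as follows. The small floor delta on the absolute residuals, which keeps the weights finite when a residual is exactly zero, and the fixed iteration count are implementation choices not specified by the formulas. With p = 1 this computes the least absolute deviation fit just mentioned.

import numpy as np

def irls_lp_regression(X, y, p=1.0, iterations=50, delta=1e-8):
    # Minimize the L_p norm of y - X beta by iteratively reweighted least squares.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]            # ordinary least squares start (the p = 2 solution)
    for _ in range(iterations):
        r = np.abs(y - X @ beta)                           # absolute residuals
        w = np.maximum(r, delta) ** (p - 2)                # w_i = |y_i - X_i beta|^(p-2)
        XtW = X.T * w                                      # X^T W with W = diag(w)
        beta = np.linalg.solve(XtW @ X, XtW @ y)           # beta = (X^T W X)^{-1} X^T W y
    return beta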

Notes

  1. ^ Chartrand, R.; Yin, W. (March 31 – April 4, 2008). "Iteratively reweighted algorithms for compressive sensing". IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2008. pp. 3869–3872.
  2. ^ Daubechies, I.; et al. (2008). "Iteratively reweighted least squares minimization for sparse recovery" (PDF). http://www.ricam.oeaw.ac.at/people/page/fornasier/DDFG14.pdf. Retrieved 2010-11-02.
  3. ^ Gentle, James (2007). "6.8.1 Solutions that Minimize Other Norms of the Residuals". Matrix Algebra. New York: Springer. doi:10.1007/978-0-387-70873-7. ISBN 978-0-387-70872-0.

References

  * Robust Estimation in Numerical Recipes in C by Press et al: http://www.nrbook.com/a/bookcpdf/c15-7.pdf (requires the FileOpen plugin, http://www.nr.com/plugin/plugin_faq.html, to view)