
Iteratively reweighted least squares

From Wikipedia, the free encyclopedia

The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems. It minimizes objective functions of the form

    \sum_{i=1}^{n} \big| y_i - f_i(\boldsymbol\beta) \big|^{p}

by an iterative method in which each step involves solving a weighted least squares problem of the form

    \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^{n} w_i\big(\boldsymbol\beta^{(t)}\big) \big| y_i - f_i(\boldsymbol\beta) \big|^{2}.

IRLS is used to find the maximum likelihood estimates of a generalized linear model, and in robust regression to find an M-estimator, as a way of mitigating the influence of outliers in an otherwise normally distributed data set, for example by minimizing the least absolute errors rather than the least squared errors.

Although not a linear regression problem, Weiszfeld's algorithm for approximating the geometric median can also be viewed as a special case of iteratively reweighted least squares, in which the objective function is the sum of distances of the estimator from the samples.
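For illustration, here is a minimal NumPy sketch of Weiszfeld's algorithm under some assumptions of this example: the function name, the iteration cap, and the small floor used to avoid division by zero at a sample point are choices made here, not part of the algorithm itself. Each update reweights the samples by the reciprocal of their distance to the current estimate, which is a weighted least squares step for the sum-of-distances objective.

    import numpy as np

    def geometric_median(points, n_iter=100, eps=1e-10):
        # points: (n, d) array of samples; returns an approximate geometric median.
        y = points.mean(axis=0)                      # start from the centroid
        for _ in range(n_iter):
            d = np.linalg.norm(points - y, axis=1)   # distances to the current estimate
            w = 1.0 / np.maximum(d, eps)             # IRLS weights 1/distance (floored)
            y_new = (w[:, None] * points).sum(axis=0) / w.sum()
            if np.linalg.norm(y_new - y) < eps:      # stop once the estimate is stable
                break
            y = y_new
        return y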

Examples

Lp norm linear regression

To find the parameters β = (β1, …, βk)^T which minimise the Lp norm for the linear regression problem

    \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \big\| \mathbf y - X\boldsymbol\beta \big\|_p = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^{n} \big| y_i - X_i\boldsymbol\beta \big|^{p},

the IRLS algorithm at step t + 1 involves solving the weighted linear least squares problem

    \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^{n} w_i^{(t)} \big| y_i - X_i\boldsymbol\beta \big|^{2} = \big( X^{\mathrm T} W^{(t)} X \big)^{-1} X^{\mathrm T} W^{(t)} \mathbf y,

where W^{(t)} is the diagonal matrix of weights, with elements

    w_i^{(t)} = \big| y_i - X_i \boldsymbol\beta^{(t)} \big|^{p-2}.[1]

In the case p = 1, this corresponds to least absolute deviation regression.
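A minimal NumPy sketch of this iteration follows, assuming a design matrix X of shape (n, k) and a response vector y of length n; the residual floor delta, the fixed iteration count, and the ordinary least squares starting point are choices made for this example, not part of the definition above (the floor keeps the weights |y_i − X_i β|^(p−2) finite when p < 2 and a residual is near zero).

    import numpy as np

    def irls_lp(X, y, p=1.0, n_iter=50, delta=1e-8):
        # Iteratively reweighted least squares for L_p-norm linear regression.
        beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start from the OLS solution
        for _ in range(n_iter):
            r = np.abs(y - X @ beta)                  # absolute residuals
            w = np.maximum(r, delta) ** (p - 2)       # diagonal entries of W^(t)
            XtW = X.T * w                             # X^T W^(t)
            beta = np.linalg.solve(XtW @ X, XtW @ y)  # weighted normal equations
        return beta

With p = 1 this gives a least absolute deviation fit; with p = 2 the weights are constant and each step simply repeats the ordinary least squares solution.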

Convergence

Convergence of the method is not guaranteed.[citation needed]

Notes

  1. ^ Gentle, James (2007). "6.8.1 Solutions that Minimize Other Norms of the Residuals". Matrix algebra. New York: Springer. doi:10.1007/978-0-387-70873-7. ISBN 978-0-387-70872-0.
