Line search
In (unconstrained) [[optimization (mathematics)|optimization]], the '''line search''' strategy is one of two basic [[iteration|iterative]] approaches to finding a [[maxima and minima|local minimum]] <math>\mathbf{x}^*</math> of an [[objective function]] <math>f:\mathbb R^n\to\mathbb R</math>. The other method is that of [[trust region]]s.
==Algorithm==
:i) Set iteration counter <math>k=0</math>, and make an initial guess, <math>\mathbf{x}_0</math>, for the minimum.
:ii) Compute a [[descent direction]] <math>\mathbf{p}_k</math>.
:iii) Choose <math>\alpha_k</math> to 'loosely' minimize <math>\phi(\alpha)=f(\mathbf{x}_k+\alpha\mathbf{p}_k)</math> over <math>\alpha\in\mathbb R_+</math>.
:iv) Update <math>\mathbf{x}_{k+1}=\mathbf{x}_k+\alpha_k\mathbf{p}_k</math> and set <math>k=k+1</math>.
::If <math>\|\nabla f(\mathbf{x}_k)\|\leq</math> tolerance, STOP.
::Else, goto ii).
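
To make the loop concrete, here is a minimal sketch in Python (with NumPy). The quadratic objective, the steepest-descent choice <math>\mathbf{p}_k=-\nabla f(\mathbf{x}_k)</math>, and the function names (<code>line_search_minimize</code>, <code>exact_step</code>) are illustrative assumptions, not part of the algorithm statement above.

<syntaxhighlight lang="python">
import numpy as np

def line_search_minimize(f, grad, x0, choose_step, tol=1e-8, max_iter=1000):
    """Generic line-search loop: steps i)-iv) of the algorithm above."""
    x = np.asarray(x0, dtype=float)          # i) initial guess, k = 0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:         # stopping test on ||grad f(x_k)||
            break
        p = -g                               # ii) steepest-descent direction
        alpha = choose_step(f, grad, x, p)   # iii) (loosely) minimize phi
        x = x + alpha * p                    # iv) update the iterate
    return x

# Illustrative objective: f(x) = (1/2) x^T A x - b^T x with A symmetric
# positive definite, so grad f(x) = A x - b and the unique minimizer solves
# A x = b. For this f, phi'(alpha) = 0 has a closed form, giving the
# "exact" option for step iii).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ (A @ x) - b @ x
grad = lambda x: A @ x - b

def exact_step(f, grad, x, p):
    # Solves phi'(alpha) = grad f(x + alpha p) . p = 0 for the quadratic above.
    return -(grad(x) @ p) / (p @ (A @ p))

x_star = line_search_minimize(f, grad, np.zeros(2), exact_step)
print(x_star, np.linalg.solve(A, b))  # the two answers should agree
</syntaxhighlight>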
In step iii) we can either ''exactly'' minimize <math>\phi</math>, by solving <math>\phi'(\alpha_k)=0</math>, or ''loosely'', by asking for a sufficient decrease in <math>\phi</math>. The latter may be performed in a number of ways, perhaps by doing a [[backtracking linesearch]] or using the [[Wolfe conditions]].
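
The 'loose' option can be sketched as a backtracking linesearch enforcing the Armijo sufficient-decrease condition <math>\phi(\alpha)\leq\phi(0)+c\,\alpha\,\phi'(0)</math>; the constants <code>rho</code> and <code>c</code> below are conventional illustrative defaults, not values fixed by the method.

<syntaxhighlight lang="python">
def backtracking_step(f, grad, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Backtracking linesearch: shrink alpha until the Armijo
    sufficient-decrease condition phi(alpha) <= phi(0) + c*alpha*phi'(0)
    holds. Assumes p is a descent direction (phi'(0) < 0), which
    guarantees termination for differentiable f."""
    phi0 = f(x)
    dphi0 = grad(x) @ p      # phi'(0), negative for a descent direction
    alpha = alpha0
    while f(x + alpha * p) > phi0 + c * alpha * dphi0:
        alpha *= rho         # backtrack: try a geometrically smaller step
    return alpha
</syntaxhighlight>

Passed as the <code>choose_step</code> argument of the loop sketched above (e.g. <code>line_search_minimize(f, grad, np.zeros(2), backtracking_step)</code>), this implements the sufficient-decrease variant of step iii).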
Like other optimization methods, line search may be combined with [[simulated annealing]] to allow it to jump over some local [[minima]].
==References==
* N. I. M. Gould and S. Leyffer, "An introduction to algorithms for nonlinear optimization". In J. F. Blowey, A. W. Craig, and T. Shardlow, ''Frontiers in Numerical Analysis'', pages 109–197. Springer Verlag, Berlin, 2003.