'''Locally Optimal Block Preconditioned Conjugate Gradient Method''' ('''LOBPCG''') is an algorithm, proposed in (Knyazev, 2001), for finding the largest (or smallest) [[eigenvalues]] and the corresponding [[eigenvectors]] of a symmetric positive definite [[generalized eigenvalue problem]]
:<math>A x = \lambda B x,</math>
for a given pair <math>(A, B)</math> of complex [[Hermitian matrix|Hermitian]] or real [[symmetric matrix|symmetric]] matrices, where the matrix <math>B</math> is also assumed positive-definite.

The method performs an iterative maximization (or minimization) of the generalized [[Rayleigh quotient]]
:<math>\rho(x) := \rho(A,B; x) := \frac{x^T A x}{x^T B x},</math>
which results in finding the largest (or smallest) eigenpairs of <math>A x = \lambda B x.</math>

The direction of the steepest ascent, which is the gradient, of the generalized Rayleigh quotient is positively proportional to the vector
:<math>r := Ax - \rho(x) Bx,</math>
called the eigenvector residual. If a [[preconditioner]] <math>T</math> is available, it is applied to the residual, giving the vector
:<math>w := Tr,</math>
called the preconditioned residual. Without preconditioning, we set <math>T := I</math> and so <math>w := r</math>. An iterative method
:<math>x^{i+1} := x^i + \alpha^i T (A x^i - \rho(x^i) B x^i),</math>
or, in short,
:<math>x^{i+1} := x^i + \alpha^i w^i,\,</math> |
:<math>w^i := Tr^i,\,</math> |
:<math>r^i := Ax^i - \rho(x^i) Bx^i,</math> |
is known as preconditioned steepest ascent (or descent), where the scalar
<math>\alpha^i</math> is called the step size. The optimal step size can be determined by maximizing the Rayleigh quotient, i.e., |
:<math>x^{i+1} := \arg\max_{y\in span\{x^i,w^i\}} \rho(y)</math> |
(or <math>\arg\min</math> in case of minimizing), |
in which case the method is called locally optimal. To further accelerate the convergence of the locally optimal preconditioned steepest ascent (or descent), one can add one extra vector to the two-term [[recurrence relation]] to make it three-term:
:<math>x^{i+1} := \arg\max_{y\in span\{x^i,w^i,x^{i-1}\}} \rho(y)</math> |
(use <math>\arg\min</math> in case of minimizing). The maximization/minimization of the Rayleigh quotient in a 3-dimensional subspace can be performed numerically by the [[Rayleigh-Ritz]] method. |
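The locally optimal iteration above can be illustrated by a short Python sketch. This is not part of the original method description and not a robust implementation: it assumes dense NumPy arrays <code>A</code> and <code>B</code>, an optional preconditioner supplied as a callable <code>T</code> (all names chosen here only for illustration), maximizes the Rayleigh quotient, and performs the Rayleigh–Ritz step on the basis <math>\{x^i, w^i, x^{i-1}\}</math> without the orthogonalization safeguards a production code would need.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh  # dense solver for the small projected problem

def rayleigh_quotient(A, B, x):
    # rho(x) = (x^T A x) / (x^T B x)
    return (x @ (A @ x)) / (x @ (B @ x))

def locally_optimal_iteration(A, B, x0, T=lambda r: r, maxiter=100, tol=1e-8):
    """Single-vector locally optimal preconditioned iteration for the largest
    eigenpair of A x = lambda B x (illustrative sketch only)."""
    x = x0 / np.linalg.norm(x0)
    x_prev = None
    for _ in range(maxiter):
        rho = rayleigh_quotient(A, B, x)
        r = A @ x - rho * (B @ x)                    # eigenvector residual
        if np.linalg.norm(r) < tol:
            break
        w = T(r)                                     # preconditioned residual
        # Rayleigh-Ritz on span{x, w} at the first step, span{x, w, x_prev} afterwards
        S = np.column_stack([x, w] if x_prev is None else [x, w, x_prev])
        vals, vecs = eigh(S.T @ A @ S, S.T @ B @ S)  # small generalized eigenproblem
        c = vecs[:, -1]                              # coefficients of the largest Ritz pair
        x_prev, x = x, (S @ c) / np.linalg.norm(S @ c)
    return rayleigh_quotient(A, B, x), x
</syntaxhighlight>

For the minimizing variant one would instead select the smallest Ritz pair, i.e. <code>vecs[:, 0]</code>.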
This is a single-vector version of the LOBPCG method. It is one of the possible generalizations of the [[Preconditioner|preconditioned]] [[Conjugate gradient method|conjugate gradient]] linear solvers to the case of symmetric [[eigenvalue]] problems. Even in the trivial case <math>T=I</math> and <math>B=I</math> the resulting approximation with <math>i>3</math> will be different from that obtained by the [[Lanczos algorithm]], although both approximations will belong to the same [[Krylov subspace]].
Iterating several approximate [[eigenvectors]] together in a block in a similar locally optimal fashion gives the full block version of the LOBPCG. It allows robust computation of eigenvectors corresponding to nearly multiple eigenvalues.
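A correspondingly naive sketch of the block idea (again only an illustration, not the actual LOBPCG algorithm, which adds orthogonalization, deflation of converged vectors, and other stability safeguards) applies the Rayleigh–Ritz procedure to the combined basis formed by the current block, the block of preconditioned residuals, and the previous block:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh

def block_iteration_sketch(A, B, X0, T=lambda R: R, maxiter=100, tol=1e-8):
    """Naive block analogue of the three-term recurrence above, computing the
    k largest eigenpairs of A x = lambda B x; X0 holds k starting vectors."""
    X = np.linalg.qr(X0)[0]                # orthonormalize the starting block
    k = X.shape[1]
    X_prev = None
    for _ in range(maxiter):
        # Ritz values and vectors of the current block alone
        theta, C = eigh(X.T @ A @ X, X.T @ B @ X)
        X = X @ C
        R = A @ X - (B @ X) * theta        # block of eigenvector residuals
        if np.linalg.norm(R) < tol:
            break
        W = T(R)                           # block of preconditioned residuals
        # Rayleigh-Ritz on the combined basis [X, W, X_prev]
        S = np.hstack([X, W] if X_prev is None else [X, W, X_prev])
        vals, vecs = eigh(S.T @ A @ S, S.T @ B @ S)
        X_prev = X
        X = S @ vecs[:, -k:]               # keep the k largest Ritz pairs
    theta, C = eigh(X.T @ A @ X, X.T @ B @ X)
    return theta, X @ C
</syntaxhighlight>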
An implementation of LOBPCG is available in the public software package BLOPEX, maintained by Andrew Knyazev. The LOBPCG algorithm is also implemented in many other libraries, e.g., ABINIT, PESCAN, Anasazi (Trilinos), SciPy, NGSolve, and PYFEMax.
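For example, SciPy's implementation can be called as follows; the test matrix and parameter values here are chosen purely for illustration.

<syntaxhighlight lang="python">
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lobpcg

n = 100
# 1-D discrete Laplacian as a simple symmetric positive definite test matrix;
# omitting B means the standard eigenvalue problem (B = I) is solved.
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))   # block of 4 random starting vectors

# largest=False requests the smallest eigenvalues instead of the largest
eigenvalues, eigenvectors = lobpcg(A, X, largest=False, tol=1e-6, maxiter=200)
print(eigenvalues)
</syntaxhighlight>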
==References==
* Knyazev, A. V. (2001), "Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method", ''SIAM Journal on Scientific Computing'', 23 (2): 517–541, doi:10.1137/S1064827500366124.