RMM-DIIS

From VASP Wiki
Revision as of 13:02, 18 March 2019 by Karsai (talk | contribs)

Schemes like the Davidson iteration scheme and Conjugate gradient optimization try to optimize the expectation value of the Hamiltonian for each wavefunction using an increasing trial basis set. Instead of minimizing the expectation value, it is also possible to minimize the norm of the residual vector. This leads to an iteration scheme similar to the one described in Efficient single band eigenvalue-minimization, but a different eigenvalue problem has to be solved (see Refs. [1][2]).
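To make the two objectives concrete, here is a small numerical sketch (illustrative only, not VASP code): a dense symmetric matrix stands in for the Hamiltonian, the overlap matrix is taken as the identity, and the two quantities compared are the expectation value and the norm of the residual vector for a trial wavefunction.

```python
import numpy as np

# Illustrative stand-in for the Hamiltonian (overlap S = I assumed).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
H = (A + A.T) / 2

def expectation_value(H, phi):
    """<phi|H|phi> for a normalized trial wavefunction."""
    phi = phi / np.linalg.norm(phi)
    return phi @ H @ phi

def residual_norm(H, phi):
    """Norm of the residual R = (H - eps)|phi>, eps the Rayleigh quotient."""
    phi = phi / np.linalg.norm(phi)
    eps = phi @ H @ phi
    return np.linalg.norm(H @ phi - eps * phi)

# The expectation value is minimal only at the lowest eigenvector,
# while the residual norm vanishes at *every* eigenvector:
w, V = np.linalg.eigh(H)
print([round(expectation_value(H, V[:, i]), 3) for i in range(6)])
print([residual_norm(H, V[:, i]) < 1e-10 for i in range(6)])
```

The second printed list is all `True`: each eigenvector is a minimum of the residual norm, which is the property the next paragraph exploits.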

There is a significant difference between optimizing the eigenvalue and optimizing the norm of the residual vector. The norm of the residual vector is given by

:<math>\langle R[\phi_n] | R[\phi_n] \rangle, \quad \text{with} \quad |R[\phi_n]\rangle = (\mathbf{H} - \epsilon_{\mathrm{app}} \mathbf{S}) |\phi_n\rangle, \quad \epsilon_{\mathrm{app}} = \frac{\langle \phi_n | \mathbf{H} | \phi_n \rangle}{\langle \phi_n | \mathbf{S} | \phi_n \rangle},</math>

and possesses a quadratic unrestricted minimum at each eigenfunction <math>\phi_n</math>. If you have a good starting guess for the eigenfunction, it is possible to use this algorithm without knowledge of the other wavefunctions, and therefore without the explicit orthogonalization of the preconditioned residual vector (the corresponding equation in the Single band steepest descent scheme). In this case, after a sweep over all bands a Gram-Schmidt orthogonalization is necessary to obtain a new orthogonal trial basis set. Without the explicit orthogonalization to the current set of trial wavefunctions, all other algorithms tend to converge to the lowest band, no matter from which band they start.
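A minimal sketch of this single-band behaviour (plain steepest descent on the residual norm stands in for the actual DIIS update; the overlap matrix is again taken as the identity, and the step size is an illustrative choice): starting from a good guess for an *interior* eigenfunction, the iteration converges to that band without any orthogonalization against the other bands.

```python
import numpy as np

# Construct a stand-in Hamiltonian with known eigenvalues 0..5.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
H = Q @ np.diag(np.arange(6.0)) @ Q.T

target = Q[:, 3]                           # an interior eigenfunction (band 3)
x = target + 0.1 * rng.standard_normal(6)  # good starting guess

for _ in range(400):
    x = x / np.linalg.norm(x)
    eps = x @ H @ x                        # Rayleigh quotient
    R = H @ x - eps * x                    # residual vector
    # Gradient of <R|R> w.r.t. x is ~ 2 (H - eps) R; take a small step.
    # Note: no orthogonalization to other bands anywhere in this loop.
    x = x - 0.02 * (H @ R - eps * R)

x = x / np.linalg.norm(x)
print(abs(x @ target))                     # close to 1: converged to band 3
```

The overlap with the target band approaches 1: the iteration stays on band 3 rather than falling to the lowest band, precisely because every eigenfunction is an unrestricted minimum of the residual norm. In a full sweep, each band would be updated this way in turn, followed by a single Gram-Schmidt pass (e.g. a QR factorization of the matrix of trial vectors) to restore an orthogonal trial basis set.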

References