Conjugate gradient squared method
Algorithm for solving matrix-vector equations
In numerical linear algebra, the conjugate gradient squared method (CGS) is an iterative algorithm for solving systems of linear equations of the form $Ax = b$, particularly in cases where computing the transpose $A^T$ is impractical.[1] The CGS method was developed as an improvement to the biconjugate gradient method.[2][3][4]
Background
A system of linear equations $Ax = b$ consists of a known matrix $A$ and a known vector $b$. To solve the system is to find the value of the unknown vector $x$.[3][5] A direct method for solving a system of linear equations is to take the inverse of the matrix $A$, then calculate $x = A^{-1}b$. However, computing the inverse is computationally expensive. Hence, iterative methods are commonly used. Iterative methods begin with a guess $x^{(0)}$, and on each iteration the guess is improved. Once the difference between successive guesses is sufficiently small, the method has converged to a solution.[6][7]
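As a concrete sketch (in Python with NumPy; the matrix, step size, and tolerance below are illustrative choices, not taken from the cited sources), the following contrasts the direct approach of forming $A^{-1}$ with a simple iterative scheme, here a basic Richardson iteration rather than CGS itself, that stops once successive guesses agree:

```python
import numpy as np

# A small symmetric positive-definite system A x = b (illustrative).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

# Direct method: form the inverse explicitly (expensive for large A).
x_direct = np.linalg.inv(A) @ b

# Iterative method: start from a guess and refine it.
# Richardson iteration x <- x + omega * (b - A x); CGS and related
# Krylov methods typically converge far faster in practice.
x = np.zeros_like(b)
omega = 0.2  # step size, chosen small enough for convergence
for _ in range(1000):
    x_new = x + omega * (b - A @ x)
    if np.linalg.norm(x_new - x) < 1e-10:  # successive guesses agree
        x = x_new
        break
    x = x_new

print(x_direct, x)  # both approach the exact solution [1/11, 7/11]
```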
As with the conjugate gradient method, biconjugate gradient method, and similar iterative methods for solving systems of linear equations, the CGS method can be used to find solutions to multi-variable optimisation problems, such as power-flow analysis, hyperparameter optimisation, and facial recognition.[8]
Algorithm
The algorithm is as follows:[9]

- Choose an initial guess $x^{(0)}$
- Compute the residual $r^{(0)} = b - Ax^{(0)}$
- Choose $\tilde{r}^{(0)} = r^{(0)}$
- For $i = 1, 2, 3, \ldots$ do:
  - $\rho_{i-1} = \tilde{r}^{T} r^{(i-1)}$
  - If $\rho_{i-1} = 0$, the method fails.
  - If $i = 1$:
    - $u^{(1)} = r^{(0)}$
    - $p^{(1)} = u^{(1)}$
  - Else:
    - $\beta_{i-1} = \rho_{i-1} / \rho_{i-2}$
    - $u^{(i)} = r^{(i-1)} + \beta_{i-1} q^{(i-1)}$
    - $p^{(i)} = u^{(i)} + \beta_{i-1} \left( q^{(i-1)} + \beta_{i-1} p^{(i-1)} \right)$
  - Solve $M\hat{p} = p^{(i)}$, where $M$ is a pre-conditioner.
  - $\hat{v} = A\hat{p}$
  - $\alpha_i = \rho_{i-1} / \tilde{r}^{T} \hat{v}$
  - $q^{(i)} = u^{(i)} - \alpha_i \hat{v}$
  - Solve $M\hat{u} = u^{(i)} + q^{(i)}$
  - $x^{(i)} = x^{(i-1)} + \alpha_i \hat{u}$
  - $\hat{q} = A\hat{u}$
  - $r^{(i)} = r^{(i-1)} - \alpha_i \hat{q}$
  - Check for convergence: if there is convergence, end the loop and return the result
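A minimal Python sketch of these steps is given below; it is an illustration under simplifying assumptions, not code from the cited references. The pre-conditioner $M$ is taken to be the identity, so the two solve steps reduce to assignments, and the convergence check compares the residual norm against a relative tolerance. The function name cgs and the parameters tol and max_iter are choices made here for the example.

```python
import numpy as np

def cgs(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Conjugate gradient squared, following the steps listed above.

    The pre-conditioner M is taken to be the identity, so the
    'solve M p_hat = p' steps reduce to p_hat = p.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x                 # initial residual r^(0)
    r_tilde = r.copy()            # shadow residual, r~^(0) = r^(0)
    rho_prev = 1.0
    u = np.zeros(n)
    p = np.zeros(n)
    q = np.zeros(n)

    for i in range(1, max_iter + 1):
        rho = r_tilde @ r         # rho_{i-1} = r~^T r^(i-1)
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:
            u = r.copy()
            p = u.copy()
        else:
            beta = rho / rho_prev
            u = r + beta * q
            p = u + beta * (q + beta * p)
        p_hat = p                 # solve M p_hat = p  (M = identity)
        v_hat = A @ p_hat
        alpha = rho / (r_tilde @ v_hat)
        q = u - alpha * v_hat
        u_hat = u + q             # solve M u_hat = u + q  (M = identity)
        x = x + alpha * u_hat
        q_hat = A @ u_hat
        r = r - alpha * q_hat
        rho_prev = rho
        if np.linalg.norm(r) < tol * np.linalg.norm(b):  # converged
            return x, i
    return x, max_iter

# Example: a small nonsymmetric system, where CGS (unlike plain CG) applies.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = cgs(A, b)
print(x, np.allclose(A @ x, b))
```

In practice one would reach for a library routine such as scipy.sparse.linalg.cgs, which supports sparse operators and genuine pre-conditioners, rather than a dense hand-rolled loop like this.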
References