Gradient method
From Wikipedia, the free encyclopedia
In optimization, a gradient method is an algorithm to solve problems of the form

    min_{x ∈ ℝⁿ} f(x)

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
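The iteration described above can be sketched in a minimal way with gradient descent, the simplest gradient method. The step size, tolerance, and the test function below are illustrative choices, not part of the original article:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x_{k+1} = x_k - step * grad(x_k) until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2*(y + 2)^2,
# whose gradient is (2(x - 1), 4(y + 2)) and whose minimizer is (1, -2).
minimum = gradient_descent(lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 2)]),
                           x0=[0.0, 0.0])
```

Other gradient methods, such as the conjugate gradient method, differ in how the search direction is built from current and past gradients, but share this overall structure.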
See also
- Gradient descent
- Stochastic gradient descent
- Coordinate descent
- Frank–Wolfe algorithm
- Landweber iteration
- Random coordinate descent
- Conjugate gradient method
- Derivation of the conjugate gradient method
- Nonlinear conjugate gradient method
- Biconjugate gradient method
- Biconjugate gradient stabilized method