
Descent direction

From Wikipedia, the free encyclopedia


In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.

Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
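As a concrete illustration, the following Python sketch tests the inner-product condition and verifies that a small step along a descent direction reduces $f$. The quadratic objective $f(x) = x_1^2 + 4x_2^2$ and the candidate direction are assumptions chosen for this example, not taken from the article:

```python
import numpy as np

# Illustrative objective (an assumption for this sketch): f(x) = x1^2 + 4*x2^2.
def f(x):
    return x[0] ** 2 + 4.0 * x[1] ** 2

def grad_f(x):
    return np.array([2.0 * x[0], 8.0 * x[1]])

def is_descent_direction(p, x):
    # p is a descent direction at x exactly when <p, grad f(x)> < 0.
    return np.dot(p, grad_f(x)) < 0.0

x_k = np.array([1.0, 1.0])
p_k = np.array([-1.0, -2.0])            # a candidate direction
print(is_descent_direction(p_k, x_k))   # True: <p_k, grad f(x_k)> = -18 < 0

# Taylor's-theorem motivation: a sufficiently small step along p_k reduces f.
t = 1e-3
print(f(x_k + t * p_k) < f(x_k))        # True
```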

Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\|\nabla f(\mathbf{x}_k)\|^2 < 0$.
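Continuing the same illustrative setup, this can be confirmed numerically:

```python
import numpy as np

# Same illustrative gradient as above: grad f(x) = [2*x1, 8*x2].
grad_f = lambda x: np.array([2.0 * x[0], 8.0 * x[1]])

x_k = np.array([1.0, 1.0])
g = grad_f(x_k)
# <-g, g> = -||g||^2, which is strictly negative whenever g is nonzero.
print(np.dot(-g, g))  # -68.0 < 0, so -g is a descent direction at x_k
```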

Numerous methods, such as gradient descent or the conjugate gradient method, exist to compute descent directions, each with differing merits.
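For instance, gradient descent repeatedly steps along the descent direction $-\nabla f(\mathbf{x}_k)$. A minimal sketch, again assuming the toy objective above and a fixed step size chosen purely for illustration:

```python
import numpy as np

# Toy objective from the sketches above; the step size 0.1 is an assumption.
f = lambda x: x[0] ** 2 + 4.0 * x[1] ** 2
grad_f = lambda x: np.array([2.0 * x[0], 8.0 * x[1]])

x = np.array([1.0, 1.0])
for _ in range(100):
    x = x - 0.1 * grad_f(x)  # step along the descent direction -grad f(x)
print(f(x))  # approaches the minimum value f(0, 0) = 0
```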

More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P \nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$.[1] This generality is used in preconditioned gradient descent methods.
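In the same illustrative setting, any symmetric positive definite $P$ gives $\langle -P \nabla f, \nabla f \rangle = -\nabla f^{\mathsf{T}} P \, \nabla f < 0$ for a nonzero gradient. The sketch below uses a diagonal $P$, here a hypothetical preconditioner equal to the inverse Hessian of the toy objective:

```python
import numpy as np

# Same illustrative gradient: grad f(x) = [2*x1, 8*x2].
grad_f = lambda x: np.array([2.0 * x[0], 8.0 * x[1]])

x_k = np.array([1.0, 1.0])
g = grad_f(x_k)

# A hypothetical positive definite preconditioner (the inverse Hessian of
# the toy objective); any symmetric positive definite P would do.
P = np.array([[0.5, 0.0],
              [0.0, 0.125]])
p_k = -P @ g
print(np.dot(p_k, g))  # -10.0 < 0, so p_k is a descent direction
```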


See also

References
