Gradient method

From Wikipedia, the free encyclopedia

In optimization, a gradient method is an algorithm to solve problems of the form

$\min_{x \in \mathbb{R}^n} f(x)$

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
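The iteration underlying such methods can be sketched as follows. This is a minimal illustration of plain gradient descent with a fixed step size; the step size, tolerance, and example objective are assumptions for the sketch, not part of the article.

```python
# Minimal gradient descent sketch (fixed step size and stopping
# tolerance are illustrative assumptions).

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Iterate x_{k+1} = x_k - step * grad(x_k), i.e. move along the
    negative gradient (the search direction), until the gradient norm
    falls below tol or max_iter is reached."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Example objective: f(x) = (x1 - 1)^2 + (x2 + 2)^2,
# whose gradient is (2(x1 - 1), 2(x2 + 2)) and whose minimizer is (1, -2).
minimum = gradient_descent(lambda x: [2 * (x[0] - 1), 2 * (x[1] + 2)],
                           [0.0, 0.0])
```

For this strongly convex quadratic the fixed step size is small enough that the iterates contract toward the minimizer; methods such as conjugate gradient instead adapt the search direction using information from previous steps.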


