Rosenbrock methods
Rosenbrock methods refers to either of two distinct ideas in numerical computation, both named for Howard H. Rosenbrock.
Numerical solution of differential equations
Rosenbrock methods for stiff differential equations are a family of single-step methods for solving ordinary differential equations.[1][2] They are related to the implicit Runge–Kutta methods[3] and are also known as Kaps–Rentrop methods.[4]
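The defining feature of this family is that each step requires only the solution of linear systems involving the Jacobian, rather than the nonlinear systems of a fully implicit Runge–Kutta method. The following is a minimal sketch of the simplest one-stage Rosenbrock-type step (with gamma = 1 it reduces to the linearly implicit Euler method), applied to an illustrative stiff test equation; the function name `rosenbrock_euler` and the chosen step size are assumptions for the example, not part of the article.

```python
import numpy as np

def rosenbrock_euler(f, jac, y0, t0, t1, h, gamma=1.0):
    """One-stage Rosenbrock (linearly implicit Euler) integrator sketch.

    Each step solves the linear system
        (I - h*gamma*J) k = f(t, y)
    and advances y <- y + h*k, so no nonlinear iteration is needed.
    """
    y = np.array(y0, dtype=float)
    t = t0
    ts, ys = [t], [y.copy()]
    I = np.eye(y.size)
    while t < t1 - 1e-12:
        h_step = min(h, t1 - t)
        J = jac(t, y)                                    # Jacobian of f at (t, y)
        k = np.linalg.solve(I - h_step * gamma * J, f(t, y))
        y = y + h_step * k
        t += h_step
        ts.append(t)
        ys.append(y.copy())
    return np.array(ts), np.array(ys)

# Stiff test problem: y' = -50*(y - cos(t)); the solution is quickly
# attracted to a slow manifold near cos(t).
f = lambda t, y: np.array([-50.0 * (y[0] - np.cos(t))])
jac = lambda t, y: np.array([[-50.0]])
ts, ys = rosenbrock_euler(f, jac, [0.0], 0.0, 2.0, 0.05)
print(ys[-1, 0], np.cos(ts[-1]))
```

Practical Rosenbrock (Kaps–Rentrop) codes use several stages with tuned coefficients to reach higher order and L-stability, but they share this structure of one Jacobian evaluation and a few linear solves per step.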
Search method
Rosenbrock search is a numerical optimization algorithm applicable to optimization problems in which the objective function is inexpensive to compute and the derivative either does not exist or cannot be computed efficiently.[5] The idea of Rosenbrock search is also used to initialize some root-finding routines, such as fzero (based on Brent's method) in MATLAB. Rosenbrock search is a form of derivative-free search, but it may perform better than other derivative-free methods on objective functions with sharp ridges.[6] The method often identifies such a ridge and follows it which, in many applications, leads to a solution.[7] A sketch of the search procedure is given below.
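The sketch below illustrates the rotating-directions idea behind Rosenbrock search: trial steps along an orthonormal set of directions are expanded on success and reversed and shrunk on failure, and once every direction has seen both a success and a failure the direction set is re-oriented (via Gram–Schmidt) toward the accumulated progress, which is what lets the method track a ridge. The function name `rosenbrock_search`, the parameter values (expansion factor 3, contraction factor -0.5), and the stopping rule are illustrative assumptions, not a definitive statement of Rosenbrock's original algorithm.

```python
import numpy as np

def rosenbrock_search(f, x0, step=0.1, alpha=3.0, beta=-0.5,
                      tol=1e-8, max_iter=500):
    """Simplified sketch of Rosenbrock's rotating-directions search."""
    n = len(x0)
    x = np.array(x0, dtype=float)
    fx = f(x)
    D = np.eye(n)                      # current search directions (rows)
    s = np.full(n, step)               # per-direction step lengths
    lam = np.zeros(n)                  # accumulated displacement per direction
    success = np.zeros(n, dtype=bool)
    failure = np.zeros(n, dtype=bool)

    for _ in range(max_iter):
        for i in range(n):
            trial = x + s[i] * D[i]
            ft = f(trial)
            if ft < fx:                # success: accept step and expand
                x, fx = trial, ft
                lam[i] += s[i]
                s[i] *= alpha
                success[i] = True
            else:                      # failure: reverse direction and shrink
                s[i] *= beta
                failure[i] = True

        # Re-orient the directions once each one has both succeeded and failed.
        if success.all() and failure.all():
            A = np.array([sum(lam[j] * D[j] for j in range(i, n))
                          for i in range(n)])
            Q = []                     # Gram-Schmidt on the accumulated moves
            for a in A:
                for q in Q:
                    a = a - np.dot(a, q) * q
                norm = np.linalg.norm(a)
                if norm > 1e-12:
                    Q.append(a / norm)
            if len(Q) == n:
                D = np.array(Q)
            lam[:] = 0.0
            success[:] = False
            failure[:] = False
            s[:] = step

        if np.max(np.abs(s)) < tol:    # all step lengths have collapsed
            break
    return x, fx

# Example on the Rosenbrock "banana" function, whose minimum lies in a
# narrow curved valley (a sharp ridge of the negated function).
banana = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
xmin, fmin = rosenbrock_search(banana, [-1.2, 1.0])
print(xmin, fmin)
```

Because the direction set is repeatedly rebuilt from the displacements that actually reduced the objective, one direction tends to align with the ridge or valley floor, which is the behaviour the article attributes to the method.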