Berndt–Hall–Hall–Hausman algorithm
The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Newton–Raphson algorithm, but it replaces the observed negative Hessian matrix with the outer product of the gradient. This approximation is based on the information matrix equality and is therefore valid only when maximizing a likelihood function.[1] The BHHH algorithm is named after its four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.[2]
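The substitution can be motivated by the information matrix equality. The following display is a minimal sketch of that identity for a single log-likelihood contribution ln f(y | β); the regularity conditions and the notation β₀ for the true parameter are assumptions of this sketch, not spelled out in the article:

```latex
% Information matrix equality: at the true parameter \beta_0 of a correctly
% specified model, the expected outer product of the score equals the expected
% negative Hessian of the log-likelihood contribution, so the outer product of
% gradients can stand in for the negative Hessian.
\operatorname{E}\!\left[
    \frac{\partial \ln f(y \mid \beta)}{\partial \beta}\,
    \frac{\partial \ln f(y \mid \beta)}{\partial \beta'}
\right]
= -\operatorname{E}\!\left[
    \frac{\partial^{2} \ln f(y \mid \beta)}{\partial \beta \, \partial \beta'}
\right],
\qquad \beta = \beta_{0}.
```

Because the equality holds only for a correctly specified likelihood, the outer-product approximation is tied to maximum likelihood estimation, as noted above.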
Usage
If a nonlinear model is fitted to the data, one often needs to estimate its coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is Q(β). Then the algorithms are iterative, defining a sequence of approximations, β_k, given by

\beta_{k+1} = \beta_k + \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k),

where \beta_k is the parameter estimate at step k, and \lambda_k is a parameter (called step size) which partly determines the particular algorithm. For the BHHH algorithm, \lambda_k is determined by calculations within a given iterative step, involving a line search until a point \beta_{k+1} is found satisfying certain criteria. In addition, for the BHHH algorithm, Q is a log-likelihood of the form

Q = \sum_{i=1}^{N} Q_i,

where Q_i is the log-likelihood contribution of the i-th observation, and A_k is calculated using

A_k = \left[ \sum_{i=1}^{N} \frac{\partial Q_i}{\partial \beta}(\beta_k) \, \frac{\partial Q_i}{\partial \beta}(\beta_k)' \right]^{-1}.

In other cases, e.g. Newton–Raphson, A_k can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.[citation needed]
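The iteration above is straightforward to state in code. The following is a minimal sketch in Python with NumPy, not a reference implementation: the interface (loglik_contribs, score_contribs), the naive backtracking line search, and the toy exponential example below are all illustrative assumptions.

```python
import numpy as np

def bhhh_maximize(loglik_contribs, score_contribs, beta0,
                  max_iter=200, tol=1e-8):
    """Maximize Q(beta) = sum_i Q_i(beta) with the BHHH iteration (sketch).

    loglik_contribs(beta): length-N vector of contributions Q_i(beta).
    score_contribs(beta):  (N, p) matrix whose i-th row is dQ_i/dbeta.
    Both callables are assumed interfaces, not a standard API.
    """
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        s = score_contribs(beta)            # (N, p) per-observation scores
        g = s.sum(axis=0)                   # total gradient dQ/dbeta
        if np.linalg.norm(g) < tol:
            break                           # first-order condition met
        A = np.linalg.inv(s.T @ s)          # OPG replaces the negative Hessian
        direction = A @ g                   # ascent direction A_k * gradient
        # Naive backtracking line search for the step size lambda_k:
        # halve lambda until the log-likelihood improves.
        q_old = loglik_contribs(beta).sum()
        lam = 1.0
        while lam > 1e-12:
            candidate = beta + lam * direction
            # NaN log-likelihoods from invalid candidates simply fail the test.
            if loglik_contribs(candidate).sum() > q_old:
                beta = candidate
                break
            lam /= 2.0
        else:
            break                           # no improving step found
    return beta
```

A usage sketch on a toy problem, maximum likelihood estimation of an exponential rate, where Q_i = ln(lam) − lam·y_i and dQ_i/dlam = 1/lam − y_i:

```python
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=500)           # true rate = 0.5

loglik = lambda b: np.log(b[0]) - b[0] * y         # Q_i contributions
score = lambda b: (1.0 / b[0] - y).reshape(-1, 1)  # (N, 1) scores

print(bhhh_maximize(loglik, score, beta0=[1.0]))   # close to 1 / y.mean()
```

The practical appeal visible in the sketch is that only first derivatives are needed: the outer product s.T @ s stands in for the negative Hessian that Newton–Raphson would have to compute.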
References

1. Henningsen, Arne; Toomet, Ott (2011). "maxLik: A package for maximum likelihood estimation in R". Computational Statistics. 26 (3): 443–458.
2. Berndt, Ernst R.; Hall, Bronwyn H.; Hall, Robert E.; Hausman, Jerry A. (1974). "Estimation and Inference in Nonlinear Structural Models". Annals of Economic and Social Measurement. 3 (4): 653–665.