Hannan–Quinn information criterion
From Wikipedia, the free encyclopedia
In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection.[1] It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

HQC = −2 L_max + 2k ln(ln n)

where:
- L_max is the log-likelihood,
- k is the number of parameters, and
- n is the number of observations.
According to Burnham and Anderson, HQIC, "while often cited, seems to have seen little use in practice" (p. 287).[2] They also note that HQIC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence.
Claeskens and Hjort note that HQC, like BIC, but unlike AIC, is not asymptotically efficient; however, it misses the optimal estimation rate by a very small ln ln n factor (ch. 4).[3] They further point out that whatever method is being used for fine-tuning the criterion will be more important in practice than the term 2k ln(ln n), since this latter number is small even for very large n; however, the ln ln n term ensures that, unlike AIC, HQC is strongly consistent. It follows from the law of the iterated logarithm that any strongly consistent method must miss efficiency by at least a factor of ln ln n, so in this sense HQC is asymptotically very well-behaved.
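The claim that 2k ln(ln n) "is small even for very large n" is easy to verify numerically; the short illustration below (an assumption-free arithmetic check, not from the cited sources) tabulates ln(ln n) against BIC's ln n penalty factor:

```python
import math

# ln(ln(n)) grows extremely slowly: even for a billion observations
# it stays barely above 3, while BIC's ln(n) factor exceeds 20.
for n in (100, 10_000, 1_000_000, 10**9):
    print(f"n = {n:>13,}   ln(ln n) = {math.log(math.log(n)):.3f}   "
          f"ln n = {math.log(n):.3f}")
```

This slow growth is exactly why the ln ln n penalty is strong enough for consistency yet small enough to stay near the efficient rate.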
Van der Pas and Grünwald prove that model selection based on a modified Bayesian estimator, the so-called switch distribution, in many cases behaves asymptotically like HQC, while retaining the advantages of Bayesian methods such as the use of priors.[4]
See also
- Akaike information criterion
- Bayesian information criterion
- Deviance information criterion
- Focused information criterion
- Shibata information criterion