Ziv–Zakai bound
A theoretical bound used in estimation theory
The Ziv–Zakai bound (named after Jacob Ziv and Moshe Zakai[1]) is used in estimation theory to provide a lower bound on the mean squared error achievable when estimating a random parameter from a noisy observation. The bound works by relating the probability of an excess estimation error to a binary hypothesis testing problem. It is considered to be tighter than the Cramér–Rao bound, albeit more involved to evaluate. Several modern versions of the bound have been introduced[2] subsequent to the first version, which was published in 1969.[1]
Simple Form of the Bound
Suppose we want to estimate a random variable $X$ with probability density function $f_X$ from a noisy observation $Y$. Then, for any estimator $\hat{X}(Y)$, a simple form of the Ziv–Zakai bound is given by[1]

$$
\mathbb{E}\!\left[\big(X - \hat{X}(Y)\big)^2\right] \;\geq\; \frac{1}{2}\int_0^\infty t \left( \int_{-\infty}^{\infty} \big(f_X(x) + f_X(x+t)\big)\, P_{\min}(x, x+t)\, dx \right) dt,
$$

where $P_{\min}(x, x+t)$ is the minimum (Bayes) error probability for the binary hypothesis testing problem between

$$
\mathcal{H}_0: Y \sim p_{Y\mid X}(\cdot \mid x) \qquad \text{and} \qquad \mathcal{H}_1: Y \sim p_{Y\mid X}(\cdot \mid x+t),
$$

with prior probabilities $\Pr(\mathcal{H}_0) = \frac{f_X(x)}{f_X(x)+f_X(x+t)}$ and $\Pr(\mathcal{H}_1) = \frac{f_X(x+t)}{f_X(x)+f_X(x+t)}$.
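As a numerical illustration (not part of the original article), the following sketch evaluates this simple bound for a hypothetical scalar Gaussian model $X \sim \mathcal{N}(0,\sigma_x^2)$, $Y = X + N$ with $N \sim \mathcal{N}(0,\sigma_n^2)$, for which the Bayes error of the binary likelihood-ratio test has a closed form; the parameter values and integration grids are arbitrary choices.

```python
# Numerical sketch of the simple Ziv-Zakai bound for an assumed Gaussian model:
#   prior X ~ N(0, sigma_x^2),  observation Y = X + N,  noise N ~ N(0, sigma_n^2).
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

def zzb_gaussian(sigma_x, sigma_n, n_t=2000, n_x=801):
    """Evaluate (1/2) * int t * [int (f(x)+f(x+t)) * P_min(x, x+t) dx] dt numerically."""
    ts = np.linspace(1e-4, 8.0 * sigma_x, n_t)
    xs = np.linspace(-6.0 * sigma_x, 6.0 * sigma_x, n_x)
    inner = np.empty_like(ts)
    for i, t in enumerate(ts):
        f0, f1 = norm.pdf(xs, 0.0, sigma_x), norm.pdf(xs + t, 0.0, sigma_x)
        pi0, pi1 = f0 / (f0 + f1), f1 / (f0 + f1)      # priors of H0 and H1
        # Bayes error of the optimal threshold test between N(x, sigma_n^2)
        # and N(x+t, sigma_n^2) with unequal priors pi0, pi1.
        shift = (sigma_n / t) * np.log(pi0 / pi1)
        p_min = (pi0 * norm.sf(t / (2 * sigma_n) + shift)
                 + pi1 * norm.sf(t / (2 * sigma_n) - shift))
        inner[i] = trapezoid((f0 + f1) * p_min, xs)
    return 0.5 * trapezoid(ts * inner, ts)

sigma_x, sigma_n = 1.0, 0.5
mmse = sigma_x**2 * sigma_n**2 / (sigma_x**2 + sigma_n**2)   # exact MMSE for this jointly Gaussian model
print(f"ZZB ~ {zzb_gaussian(sigma_x, sigma_n):.4f}  vs  MMSE = {mmse:.4f}")
```

For this jointly Gaussian model the minimum mean squared error is known in closed form, so the computed bound can be sanity-checked against it; as a lower bound, the printed ZZB value should never exceed the MMSE.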
Generalization
The original lower bound can be tightened by introducing the valley-filling function, which for a function $f$ is defined as

$$
\mathcal{V}\{f\}(t) = \max_{s \geq 0} f(t+s),
$$

with the bound given by

$$
\mathbb{E}\!\left[\big(X - \hat{X}(Y)\big)^2\right] \;\geq\; \frac{1}{2}\int_0^\infty t \, \mathcal{V}\!\left\{ \int_{-\infty}^{\infty} \big(f_X(x) + f_X(x+t)\big)\, P_{\min}(x, x+t)\, dx \right\} dt.
$$
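Numerically, on a discretized grid of offsets $t$, the valley-filling operation amounts to a cumulative maximum taken from the right, which can only raise the integrand and hence tighten the bound. A minimal sketch of this step, with an arbitrary toy input:

```python
import numpy as np

def valley_fill(values):
    """Valley-filling on a grid: V{f}(t_i) = max over j >= i of f(t_j),
    i.e. a cumulative maximum taken from the right."""
    return np.maximum.accumulate(values[::-1])[::-1]

# Toy demonstration on a non-monotone sequence of sampled values.
f = np.array([0.20, 0.50, 0.10, 0.40, 0.05])
print(valley_fill(f))          # -> [0.5  0.5  0.4  0.4  0.05]
```

In the numerical sketch above, the tightened bound would be obtained by applying valley_fill to the sampled inner integral (the array inner inside zzb_gaussian) before the final integration over t.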
The most general version of the bound, which holds for both continuous and discrete random vectors, is also available.[3]
Tightness
The Ziv–Zakai bound has some general tightness guarantees, such as the following:[3]
- For continuous random variables:
  - The bound is tight in the high signal-to-noise ratio regime for continuous random vectors (an illustrative calculation follows this list).
  - In the low signal-to-noise ratio regime, the bound is tight if the prior density $f_X$ is unimodal and symmetric with respect to its mode.
- For discrete random variables:
  - The bound requires a valley-filling function; otherwise, the bound is equal to zero.
  - The bound is typically not tight for discrete random variables.
- A version of the bound known as the single point Ziv–Zakai bound is generally tighter than other versions of the Ziv–Zakai bound.
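As an illustration of the high-SNR claim in the list above, consider again the hypothetical scalar Gaussian model from the earlier sketch ($X \sim \mathcal{N}(0,\sigma_x^2)$ with additive Gaussian noise of variance $\sigma_n^2$), and let $Q$ denote the Gaussian tail probability. The following rough calculation is not taken from the article: as $\sigma_n \to 0$ the dominant offsets are $t = O(\sigma_n)$, over which $f_X(x+t) \approx f_X(x)$, the two priors are nearly equal, and $P_{\min}(x, x+t) \approx Q\big(t/(2\sigma_n)\big)$, so that

$$
\mathbb{E}\!\left[\big(X - \hat{X}(Y)\big)^2\right] \;\gtrsim\; \frac{1}{2}\int_0^\infty t \cdot 2\, Q\!\left(\frac{t}{2\sigma_n}\right) dt \;=\; 4\sigma_n^2 \int_0^\infty u\, Q(u)\, du \;=\; \sigma_n^2,
$$

using $\int_0^\infty u\, Q(u)\, du = 1/4$. This matches the limiting minimum mean squared error $\sigma_x^2\sigma_n^2/(\sigma_x^2+\sigma_n^2) \to \sigma_n^2$ of this model, consistent with the high-SNR tightness claim.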
Applications
The Ziv–Zakai bound has several appealing properties. One key advantage is that, unlike many other bounds, it requires only a single regularity condition: the parameter under estimation must have a probability density function. The Ziv–Zakai bound therefore has broader applicability than, for instance, the Cramér–Rao bound, which requires several smoothness assumptions on the probability density function of the estimand.
See also
References