**Bayesian probability** is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation[1] representing a state of knowledge[2] or as quantification of a personal belief.[3]


The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses;[4][5] that is, with propositions whose truth or falsity is unknown. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability.

Bayesian probability belongs to the category of evidential probabilities; to evaluate the probability of a hypothesis, the Bayesian probabilist specifies a prior probability. This, in turn, is then updated to a posterior probability in the light of new, relevant data (evidence).[6] The Bayesian interpretation provides a standard set of procedures and formulae to perform this calculation.
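The prior-to-posterior update described above follows Bayes' theorem: P(H|E) = P(E|H)·P(H) / P(E), where the evidence term P(E) is the total probability of the data under all hypotheses. A minimal sketch, with illustrative numbers chosen here for the example (not from the source):

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return the posterior P(H|E) via Bayes' theorem for a single
    hypothesis H versus its complement ~H."""
    # Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    # Posterior: P(H|E) = P(E|H)P(H) / P(E)
    return likelihood_h * prior / evidence

# Hypothetical example: prior belief P(H) = 0.5, and the evidence is
# three times as likely under H (0.9) as under ~H (0.3).
posterior = bayes_update(0.5, 0.9, 0.3)
print(posterior)  # 0.75
```

The evidence shifts the probability from 0.5 to 0.75; a second observation would use 0.75 as the new prior, which is the iterative updating the paragraph describes.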

The term *Bayesian* derives from the 18th-century mathematician and theologian Thomas Bayes, who provided the first mathematical treatment of a non-trivial problem of statistical data analysis using what is now known as Bayesian inference.[7]: 131  Mathematician Pierre-Simon Laplace pioneered and popularized what is now called Bayesian probability.[7]: 97–98 
