# Bernoulli distribution

## Probability distribution modeling a coin toss which need not be fair / From Wikipedia, the free encyclopedia


In probability theory and statistics, the **Bernoulli distribution**, named after Swiss mathematician Jacob Bernoulli,^{[1]} is the discrete probability distribution of a random variable which takes the value 1 with probability $p$ and the value 0 with probability $q=1-p$. Less formally, it can be thought of as a model for the set of possible outcomes of any single experiment that asks a yes–no question. Such questions lead to outcomes that are Boolean-valued: a single bit whose value is success/yes/true/one with probability *p* and failure/no/false/zero with probability *q*. It can be used to represent a (possibly biased) coin toss where 1 and 0 would represent "heads" and "tails", respectively, and *p* would be the probability of the coin landing on heads (or vice versa where 1 would represent tails and *p* would be the probability of tails). In particular, unfair coins would have $p\neq 1/2.$
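A single Bernoulli trial can be sketched in a few lines of plain Python: draw a uniform random number and report success when it falls below $p$. This is an illustrative sketch (the choice $p=0.3$ and the function name `bernoulli_trial` are arbitrary), and averaging many trials recovers an estimate of $p$, since the mean of the distribution is $p$:

```python
import random

def bernoulli_trial(p: float) -> int:
    """Return 1 ("heads") with probability p, else 0 ("tails")."""
    return 1 if random.random() < p else 0

random.seed(0)          # fixed seed so the run is reproducible
p = 0.3                 # arbitrary bias of the coin
n = 100_000

# The sample mean of 0/1 outcomes estimates p.
estimate = sum(bernoulli_trial(p) for _ in range(n)) / n
print(f"estimated p ~ {estimate:.3f}")
```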

Probability mass function: three examples of the Bernoulli distribution, with $P(x=0)=0.2$ and $P(x=1)=0.8$; $P(x=0)=0.8$ and $P(x=1)=0.2$; $P(x=0)=0.5$ and $P(x=1)=0.5$.

| Property | Value |
| --- | --- |
| Parameters | $0\leq p\leq 1$ |
| Support | $k\in \{0,1\}$ |
| PMF | ${\begin{cases}q=1-p&{\text{if }}k=0\\p&{\text{if }}k=1\end{cases}}$ |
| CDF | ${\begin{cases}0&{\text{if }}k<0\\1-p&{\text{if }}0\leq k<1\\1&{\text{if }}k\geq 1\end{cases}}$ |
| Mean | $p$ |
| Median | ${\begin{cases}0&{\text{if }}p<1/2\\\left[0,1\right]&{\text{if }}p=1/2\\1&{\text{if }}p>1/2\end{cases}}$ |
| Mode | ${\begin{cases}0&{\text{if }}p<1/2\\0,1&{\text{if }}p=1/2\\1&{\text{if }}p>1/2\end{cases}}$ |
| Variance | $p(1-p)=pq$ |
| MAD | $2p(1-p)=2pq$ |
| Skewness | ${\frac {q-p}{\sqrt {pq}}}$ |
| Excess kurtosis | ${\frac {1-6pq}{pq}}$ |
| Entropy | $-q\ln q-p\ln p$ |
| MGF | $q+pe^{t}$ |
| CF | $q+pe^{it}$ |
| PGF | $q+pz$ |
| Fisher information | ${\frac {1}{pq}}$ |
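Because the support has only two points, every moment can be computed directly from the PMF $\{0\mapsto q,\ 1\mapsto p\}$ and checked against the closed forms in the table. The sketch below does this in plain Python for an arbitrary example value $p=0.3$:

```python
import math

p = 0.3          # arbitrary example parameter
q = 1 - p

# Moments computed directly from the two-point PMF {0: q, 1: p}.
mean = 0 * q + 1 * p
variance = (0 - mean) ** 2 * q + (1 - mean) ** 2 * p
skewness = ((0 - mean) ** 3 * q + (1 - mean) ** 3 * p) / variance ** 1.5
excess_kurtosis = ((0 - mean) ** 4 * q + (1 - mean) ** 4 * p) / variance ** 2 - 3

# Compare with the closed forms from the table.
assert math.isclose(mean, p)
assert math.isclose(variance, p * q)
assert math.isclose(skewness, (q - p) / math.sqrt(p * q))
assert math.isclose(excess_kurtosis, (1 - 6 * p * q) / (p * q))
```

The same direct-summation pattern verifies, for example, the mean absolute deviation: $E|X-p| = |0-p|\,q + |1-p|\,p = 2pq$.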

The Bernoulli distribution is a special case of the binomial distribution where a single trial is conducted (so *n* would be 1 for such a binomial distribution). It is also a special case of the **two-point distribution**, for which the possible outcomes need not be 0 and 1.^{[2]}
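The binomial special case is easy to see computationally: a binomial draw with $n=1$ produces only the outcomes 0 and 1, with mean $p$. A minimal sketch using NumPy's binomial sampler (assuming NumPy is available; the parameter $p=0.25$ and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
p = 0.25

# A binomial draw with n = 1 trial is exactly a Bernoulli draw.
samples = rng.binomial(n=1, p=p, size=50_000)

print(sorted(set(samples.tolist())))   # only the outcomes 0 and 1 occur
print(samples.mean())                  # sample mean is close to p
```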