# Cross-correlation


In probability and statistics, the term *cross-correlations* refers to the correlations between the entries of two random vectors ${\displaystyle \mathbf {X} }$ and ${\displaystyle \mathbf {Y} }$, while the correlations of a random vector ${\displaystyle \mathbf {X} }$ are the correlations between the entries of ${\displaystyle \mathbf {X} }$ itself; these form the correlation matrix of ${\displaystyle \mathbf {X} }$. If each of ${\displaystyle \mathbf {X} }$ and ${\displaystyle \mathbf {Y} }$ is a scalar random variable that is realized repeatedly in a time series, then the correlations of the various temporal instances of ${\displaystyle \mathbf {X} }$ are known as autocorrelations of ${\displaystyle \mathbf {X} }$, and the cross-correlations of ${\displaystyle \mathbf {X} }$ with ${\displaystyle \mathbf {Y} }$ across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor, so that correlations take values between −1 and +1.
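The effect of the standardising factor can be illustrated with a small numerical sketch (the specific sample values below are illustrative, not from the text): for any pair of samples, the entries of the correlation matrix are confined to [−1, +1], with the diagonal (each variable correlated with itself) equal to 1.

```python
import numpy as np

# Two illustrative samples; y is an exact affine function of x
# (y = -2x + 10), so their correlation should be exactly -1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([8.0, 6.0, 4.0, 2.0])

# Correlation matrix of the vector (x, y): diagonal entries are each
# variable's correlation with itself (always 1); off-diagonal entries
# are the cross-correlations, standardised into [-1, +1].
corr = np.corrcoef(x, y)
```

Because `np.corrcoef` divides each covariance by the product of the standard deviations, no entry of `corr` can fall outside [−1, +1], regardless of the scale of the inputs.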
If ${\displaystyle X}$ and ${\displaystyle Y}$ are two independent random variables with probability density functions ${\displaystyle f}$ and ${\displaystyle g}$, respectively, then the probability density of the difference ${\displaystyle Y-X}$ is formally given by the cross-correlation (in the signal-processing sense) ${\displaystyle f\star g}$; however, this terminology is not used in probability and statistics. In contrast, the convolution ${\displaystyle f*g}$ (equivalent to the cross-correlation of ${\displaystyle f(t)}$ and ${\displaystyle g(-t)}$) gives the probability density function of the sum ${\displaystyle X+Y}$.
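A discrete analogue makes the two identities above concrete. Below is a minimal sketch, assuming two independent fair six-sided dice as stand-ins for ${\displaystyle X}$ and ${\displaystyle Y}$: the convolution of the two mass functions gives the distribution of the sum ${\displaystyle X+Y}$, while the cross-correlation (equivalently, convolution with one mass function reversed) gives the distribution of the difference ${\displaystyle Y-X}$.

```python
import numpy as np

# Probability mass functions of two independent fair dice X and Y
# (values 1..6); discrete stand-ins for the densities f and g.
f = np.full(6, 1 / 6)
g = np.full(6, 1 / 6)

# Distribution of the sum X + Y: the convolution f * g (values 2..12).
pmf_sum = np.convolve(f, g)

# Distribution of the difference Y - X: the cross-correlation f ⋆ g,
# which for real sequences equals convolving g with f reversed
# (values -5..5, with lag 0 at the middle index).
pmf_diff = np.correlate(g, f, mode="full")
```

Here `pmf_diff[5]` is P(Y − X = 0) = 6/36 = 1/6, and both arrays sum to 1, as a probability distribution must; the only change between the two computations is the time reversal that turns a convolution into a cross-correlation.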