Cross-correlation matrix
The cross-correlation matrix of two random vectors is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.
Definition
For two random vectors $\mathbf{X} = (X_1,\ldots,X_m)^{\rm T}$ and $\mathbf{Y} = (Y_1,\ldots,Y_n)^{\rm T}$, each containing random elements whose expected value and variance exist, the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: p.337

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}]$$

and has dimensions $m \times n$. Written component-wise:

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix}
\operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\
\operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\
\vdots & \vdots & \ddots & \vdots \\
\operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n]
\end{bmatrix}$$

The random vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
Example
For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\rm T}$ and $\mathbf{Y} = (Y_1, Y_2)^{\rm T}$ are random vectors, then $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ is a $3 \times 2$ matrix whose $(i,j)$-th entry is $\operatorname{E}[X_i Y_j]$.
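To make the definition concrete, the sketch below estimates $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ from i.i.d. samples with NumPy. The joint distribution, the sample count, and the 3- and 2-dimensional shapes mirror the example above and are illustrative assumptions, not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                      # number of i.i.d. samples (illustrative choice)

# Draw jointly distributed samples: X is 3-dimensional, Y is 2-dimensional.
# Here Y is a linear function of X plus noise, purely for illustration.
X = rng.normal(size=(N, 3))
Y = X @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(N, 2))

# Sample estimate of R_XY = E[X Y^T]; the result is a 3 x 2 matrix
# whose (i, j) entry approximates E[X_i Y_j].
R_XY = X.T @ Y / N
print(R_XY.shape)                # (3, 2)
```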
Complex random vectors
If $\mathbf{Z} = (Z_1,\ldots,Z_m)^{\rm T}$ and $\mathbf{W} = (W_1,\ldots,W_n)^{\rm T}$ are complex random vectors, each containing random variables whose expected value and variance exist, the cross-correlation matrix of $\mathbf{Z}$ and $\mathbf{W}$ is defined by

$$\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq \operatorname{E}[\mathbf{Z}\mathbf{W}^{\rm H}]$$

where ${}^{\rm H}$ denotes Hermitian transposition.
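A similar sample estimate works in the complex case, with the ordinary transpose replaced by the conjugate (Hermitian) transpose. The distributions and dimensions below are again illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Illustrative complex random vectors: Z is 3-dimensional, W is 2-dimensional.
Z = rng.normal(size=(N, 3)) + 1j * rng.normal(size=(N, 3))
W = Z[:, :2] + 0.1 * (rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2)))

# Sample estimate of R_ZW = E[Z W^H]; note the complex conjugate on W.
R_ZW = Z.T @ W.conj() / N
print(R_ZW.shape)                # (3, 2)
```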
Uncorrelatedness
Two random vectors $\mathbf{X} = (X_1,\ldots,X_m)^{\rm T}$ and $\mathbf{Y} = (Y_1,\ldots,Y_n)^{\rm T}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}.$$

They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.

In the case of two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$, they are called uncorrelated if

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\rm H}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}$$

and

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\rm T}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm T}.$$
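Uncorrelatedness can be checked numerically by comparing a sample estimate of $\operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}]$ with $\operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}$. The sketch below uses independent vectors, which are in particular uncorrelated; the distributions and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000

# Independent (hence uncorrelated) random vectors of dimension 3 and 2.
X = rng.normal(size=(N, 3))
Y = rng.normal(size=(N, 2))

R_XY = X.T @ Y / N                                       # estimate of E[X Y^T]
outer_means = np.outer(X.mean(axis=0), Y.mean(axis=0))   # E[X] E[Y]^T

# For uncorrelated vectors the difference (the cross-covariance)
# is approximately zero, up to sampling error.
print(np.max(np.abs(R_XY - outer_means)))
```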
Properties
Relation to the cross-covariance matrix
The cross-correlation is related to the cross-covariance matrix as follows:

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\rm T}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}$$

Respectively for complex random vectors:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm H}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}$$
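The real-vector identity can be verified numerically: computing the cross-covariance directly from mean-centered samples gives the same matrix as subtracting $\operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}$ from the cross-correlation. The nonzero-mean distributions below are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Correlated vectors with nonzero means, purely for illustration.
X = rng.normal(loc=1.0, size=(N, 3))
Y = X[:, :2] + rng.normal(loc=-0.5, size=(N, 2))

R_XY = X.T @ Y / N                                         # E[X Y^T]
K_XY = (X - X.mean(axis=0)).T @ (Y - Y.mean(axis=0)) / N   # E[(X-EX)(Y-EY)^T]

# K_XY equals R_XY - E[X] E[Y]^T, here up to floating-point error.
diff = K_XY - (R_XY - np.outer(X.mean(axis=0), Y.mean(axis=0)))
print(np.max(np.abs(diff)))
```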
See also
- Autocorrelation
- Correlation does not imply causation
- Covariance function
- Pearson product-moment correlation coefficient
- Correlation function (astronomy)
- Correlation function (statistical mechanics)
- Correlation function (quantum field theory)
- Mutual information
- Rate distortion theory
- Radial distribution function
References