In the following properties, the operator $\bullet$ denotes the row-wise Kronecker product (face-splitting product) and the operator $\ast$ denotes the column-wise Kronecker product.
- Transpose (V. Slyusar, 1996[9][11][12]):
$\left(A \bullet B\right)^{\textsf{T}} = A^{\textsf{T}} \ast B^{\textsf{T}}$,
- Bilinearity and associativity:[9][11][12]
$A \bullet (B + C) = A \bullet B + A \bullet C$,
$(B + C) \bullet A = B \bullet A + C \bullet A$,
$(kA) \bullet B = A \bullet (kB) = k(A \bullet B)$,
$(A \bullet B) \bullet C = A \bullet (B \bullet C)$,
where $A$, $B$ and $C$ are matrices, and $k$ is a scalar,
$a \bullet B = a \otimes B$,[12]
where $a$ is a vector,
- The mixed-product property (V. Slyusar, 1997[12]):
$\left(A \bullet B\right)\left(C \otimes D\right) = \left(AC\right) \bullet \left(BD\right)$,
$\left(A \bullet B\right)\left(C \ast D\right) = \left(AC\right) \circ \left(BD\right)$,[13]
$\left(A \bullet B \bullet C \bullet D\right)\left(E \ast F \ast G \ast H\right) = \left(AE\right) \circ \left(BF\right) \circ \left(CG\right) \circ \left(DH\right)$,[15]
$\left(A \otimes B\right)\left(C \ast D\right) = \left(AC\right) \ast \left(BD\right)$,[16]
where $\circ$ denotes the Hadamard product,
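The mixed-product identities above are easy to check numerically. A minimal NumPy sketch (the helper names `face_split` and `col_kron` are ad hoc, not library functions):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker (face-splitting) product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

def col_kron(a, b):
    # Column-wise Kronecker product: column j is kron(a[:, j], b[:, j]).
    return np.einsum('ji,ki->jki', a, b).reshape(-1, a.shape[1])

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3)); C = rng.standard_normal((3, 6))
B = rng.standard_normal((4, 5)); D = rng.standard_normal((5, 6))

# (A . B)(C (x) D) = (AC) . (BD)
lhs1 = face_split(A, B) @ np.kron(C, D)
rhs1 = face_split(A @ C, B @ D)

# (A . B)(C * D) = (AC) o (BD), where o is the Hadamard product
lhs2 = face_split(A, B) @ col_kron(C, D)
rhs2 = (A @ C) * (B @ D)
```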
- $\left(A \circ B\right) \bullet \left(C \circ D\right) = \left(A \bullet C\right) \circ \left(B \bullet D\right)$,[12]
- $a^{\textsf{T}} \bullet b^{\textsf{T}} = a^{\textsf{T}} \otimes b^{\textsf{T}}$,[9]
where $a^{\textsf{T}}$ is a row vector,
- $B \ast A = P\left(A \ast B\right)$,[16]
$B \bullet A = \left(A \bullet B\right)P^{\textsf{T}}$,
where $P$ is a permutation matrix.[7]
- $\left(A \bullet B\right) \circ \left(C \bullet D\right) = \left(A \circ C\right) \bullet \left(B \circ D\right)$,[13][15]
Similarly:
$\left(A \ast B\right) \circ \left(C \ast D\right) = \left(A \circ C\right) \ast \left(B \circ D\right)$,
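The compatibility of the Hadamard product with the face-splitting and column-wise products — $(A \bullet B) \circ (C \bullet D) = (A \circ C) \bullet (B \circ D)$ and its column-wise analogue — can be verified numerically (the `face_split` and `col_kron` helpers below are ad hoc names):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

def col_kron(a, b):
    # Column-wise Kronecker product: column j is kron(a[:, j], b[:, j]).
    return np.einsum('ji,ki->jki', a, b).reshape(-1, a.shape[1])

rng = np.random.default_rng(7)
A, C = rng.standard_normal((2, 4, 3))   # same shape, same number of rows
B, D = rng.standard_normal((2, 4, 5))

# (A . B) o (C . D) = (A o C) . (B o D)
lhs1 = face_split(A, B) * face_split(C, D)
rhs1 = face_split(A * C, B * D)

E, G = rng.standard_normal((2, 4, 3))   # same shape, same number of columns
F, H = rng.standard_normal((2, 6, 3))

# (E * F) o (G * H) = (E o G) * (F o H)
lhs2 = col_kron(E, F) * col_kron(G, H)
rhs2 = col_kron(E * G, F * H)
```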
- $a \bullet b = a \circ b$,[12]
$a \ast b = a \otimes b$,
where $a$ and $b$ are vectors,
$a \otimes b^{\textsf{T}} = a b^{\textsf{T}}$,[17]
- $\left(A \bullet B\right)\left(a \ast b\right) = \left(Aa\right) \circ \left(Bb\right)$,[18]
where $a$ and $b$ are vectors (this combines properties 3 and 8),
Similarly:
$\left(A \bullet B \bullet C\right)\left(a \ast b \ast c\right) = \left(Aa\right) \circ \left(Bb\right) \circ \left(Cc\right)$,
- $F\left(C^{(1)}x \ast C^{(2)}y\right) = \left(FC^{(1)} \bullet FC^{(2)}\right)\left(x \otimes y\right)$,
where $\ast$ here denotes vector convolution; $C^{(1)}$ and $C^{(2)}$ are "count sketch" matrices; and $F$ is the Fourier transform matrix (this result builds on the properties of count sketch[19]).
This can be generalized for appropriate matrices $A$ and $B$:
$F\left(Ax \ast By\right) = \left(FA \bullet FB\right)\left(x \otimes y\right)$
because property 11 above gives us
$\left(FA \bullet FB\right)\left(x \otimes y\right) = \left(FAx\right) \circ \left(FBy\right)$
And the convolution theorem gives us
$F\left(Ax \ast By\right) = \left(FAx\right) \circ \left(FBy\right)$
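The FFT identity $F(Ax \ast By) = (FA \bullet FB)(x \otimes y)$, with $\ast$ taken as circular convolution, can be checked with NumPy's FFT (the `face_split` helper is an ad hoc name):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

rng = np.random.default_rng(2)
n = 8                                   # sketch length (arbitrary choice)
A = rng.standard_normal((n, 5))
B = rng.standard_normal((n, 7))
x = rng.standard_normal(5)
y = rng.standard_normal(7)

F = np.fft.fft(np.eye(n))               # DFT matrix: F @ v == np.fft.fft(v)

# (FA . FB)(x (x) y) -- left-hand side via the face-splitting product
lhs = face_split(F @ A, F @ B) @ np.kron(x, y)

# F(Ax * By): by the convolution theorem, the DFT of a circular
# convolution is the element-wise product of the DFTs.
rhs = np.fft.fft(A @ x) * np.fft.fft(B @ y)
```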
- $M \bullet N = \left(M \otimes \mathbf{1}_k^{\textsf{T}}\right) \circ \left(\mathbf{1}_\ell^{\textsf{T}} \otimes N\right)$,[20]
where $M$ is an $n \times \ell$ matrix, $N$ is an $n \times k$ matrix, $\mathbf{1}_k$ is a vector of 1's of length $k$, and $\mathbf{1}_\ell$ is a vector of 1's of length $\ell$,
or
$M \bullet v = M \circ \left(v \otimes \mathbf{1}_\ell^{\textsf{T}}\right)$,[21]
where $M$ is an $n \times \ell$ matrix, $\circ$ means element by element multiplication, $v$ is a vector, and $\mathbf{1}_\ell$ is a vector of 1's of length $\ell$.
$M \bullet N = \left(M \otimes \mathbf{1}_k^{\textsf{T}}\right) \left[\circ\right] N$,
where $\left[\circ\right]$ denotes the penetrating face product of matrices.[13]
Similarly:
$M \ast N = \left(M \otimes \mathbf{1}_k\right) \circ \left(\mathbf{1}_\ell \otimes N\right)$,
where $M$ is an $\ell \times n$ matrix, $N$ is a $k \times n$ matrix.
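The representation of the face-splitting product through Kronecker products with all-ones vectors, $M \bullet N = (M \otimes \mathbf{1}_k^{\textsf{T}}) \circ (\mathbf{1}_\ell^{\textsf{T}} \otimes N)$, can be verified directly (the `face_split` helper is an ad hoc name):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 3))         # n x l
N = rng.standard_normal((4, 5))         # n x k

ones_k = np.ones((1, 5))                # row of 1's, length k
ones_l = np.ones((1, 3))                # row of 1's, length l

# M . N = (M (x) 1_k^T) o (1_l^T (x) N)
lhs = face_split(M, N)
rhs = np.kron(M, ones_k) * np.kron(ones_l, N)
```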
- $\left(A \bullet B\right)\left(A \bullet B\right)^{\textsf{T}}$[12] $= \left(A \bullet B\right)\left(A^{\textsf{T}} \ast B^{\textsf{T}}\right)$[13] $= \left(AA^{\textsf{T}}\right) \circ \left(BB^{\textsf{T}}\right)$,
$\left(A \bullet B\right)\operatorname{vec}(M) = \operatorname{diag}\left(B M A^{\textsf{T}}\right)$,[21]
where $\operatorname{diag}\left(B M A^{\textsf{T}}\right)$ is the vector consisting of the diagonal elements of $B M A^{\textsf{T}}$, and $\operatorname{vec}(M)$ means stack the columns of the matrix $M$ on top of each other to give a vector.
- $\left(A \ast B\right)^{\textsf{T}}\left(A \ast B\right) = \left(A^{\textsf{T}}A\right) \circ \left(B^{\textsf{T}}B\right)$.[13][15]
Similarly:
$\left(a \ast b\right)^{\textsf{T}}\left(a \ast b\right) = \left(a^{\textsf{T}}a\right)\left(b^{\textsf{T}}b\right)$,
where $a$ and $b$ are vectors,
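The Gram identity $(A \bullet B)(A \bullet B)^{\textsf{T}} = (AA^{\textsf{T}}) \circ (BB^{\textsf{T}})$ and the vectorization identity $(A \bullet B)\operatorname{vec}(M) = \operatorname{diag}(BMA^{\textsf{T}})$ can both be checked numerically (the `face_split` helper is an ad hoc name; `vec` is column stacking, i.e. `flatten(order='F')`):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((4, 6))
M = rng.standard_normal((6, 3))         # rows match B's columns, cols match A's

# (A . B) vec(M) = vector of diagonal elements of B M A^T
lhs1 = face_split(A, B) @ M.flatten(order='F')   # vec(M): stack columns
rhs1 = np.diag(B @ M @ A.T)

# (A . B)(A . B)^T = (A A^T) o (B B^T)
lhs2 = face_split(A, B) @ face_split(A, B).T
rhs2 = (A @ A.T) * (B @ B.T)
```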
- If $D$ is a diagonal matrix and $d$ is its main diagonal:
$\operatorname{vec}(D) = \left(I \ast I\right) d$
Here, $\operatorname{vec}$ is the column-wise vectorization operator and $I$ is the identity matrix.
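A quick numerical check of $\operatorname{vec}(D) = (I \ast I)\,d$ for a diagonal matrix $D$ (the `col_kron` helper is an ad hoc name):

```python
import numpy as np

def col_kron(a, b):
    # Column-wise Kronecker product: column j is kron(a[:, j], b[:, j]).
    return np.einsum('ji,ki->jki', a, b).reshape(-1, a.shape[1])

d = np.array([2.0, 3.0, 5.0])           # main diagonal
D = np.diag(d)
I = np.eye(3)

lhs = D.flatten(order='F')              # vec(D): stack the columns of D
rhs = col_kron(I, I) @ d                # (I * I) d picks out e_j (x) e_j slots
```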
Examples
Source:[18]
$$\begin{aligned}
&\left({\begin{bmatrix}1&0\\0&1\\1&0\end{bmatrix}} \bullet {\begin{bmatrix}1&0\\1&0\\0&1\end{bmatrix}}\right)\left({\begin{bmatrix}1&1\\1&-1\end{bmatrix}} \otimes {\begin{bmatrix}1&1\\1&-1\end{bmatrix}}\right)\left({\begin{bmatrix}\sigma_{1}&0\\0&\sigma_{2}\end{bmatrix}} \otimes {\begin{bmatrix}\rho_{1}&0\\0&\rho_{2}\end{bmatrix}}\right)\left({\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}} \ast {\begin{bmatrix}y_{1}\\y_{2}\end{bmatrix}}\right)\\[5pt]
={}&\left({\begin{bmatrix}1&0\\0&1\\1&0\end{bmatrix}} \bullet {\begin{bmatrix}1&0\\1&0\\0&1\end{bmatrix}}\right)\left({\begin{bmatrix}1&1\\1&-1\end{bmatrix}}{\begin{bmatrix}\sigma_{1}&0\\0&\sigma_{2}\end{bmatrix}}{\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}\,\otimes\,{\begin{bmatrix}1&1\\1&-1\end{bmatrix}}{\begin{bmatrix}\rho_{1}&0\\0&\rho_{2}\end{bmatrix}}{\begin{bmatrix}y_{1}\\y_{2}\end{bmatrix}}\right)\\[5pt]
={}&{\begin{bmatrix}1&0\\0&1\\1&0\end{bmatrix}}{\begin{bmatrix}1&1\\1&-1\end{bmatrix}}{\begin{bmatrix}\sigma_{1}&0\\0&\sigma_{2}\end{bmatrix}}{\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}\,\circ\,{\begin{bmatrix}1&0\\1&0\\0&1\end{bmatrix}}{\begin{bmatrix}1&1\\1&-1\end{bmatrix}}{\begin{bmatrix}\rho_{1}&0\\0&\rho_{2}\end{bmatrix}}{\begin{bmatrix}y_{1}\\y_{2}\end{bmatrix}}.
\end{aligned}$$
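This worked example can be reproduced numerically for concrete values of $\sigma_i$, $\rho_i$, $x$, $y$ (the values below are arbitrary; the `face_split` helper is an ad hoc name):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

P1 = np.array([[1., 0.], [0., 1.], [1., 0.]])   # first selection matrix
P2 = np.array([[1., 0.], [1., 0.], [0., 1.]])   # second selection matrix
H = np.array([[1., 1.], [1., -1.]])             # 2x2 Hadamard/Walsh matrix
S = np.diag([2.0, 3.0])                         # sigma_1, sigma_2 (arbitrary)
R = np.diag([5.0, 7.0])                         # rho_1, rho_2 (arbitrary)
x = np.array([1.0, -2.0])
y = np.array([0.5, 4.0])

# (P1 . P2)(H (x) H)(S (x) R)(x * y); for vectors, x * y = kron(x, y)
lhs = face_split(P1, P2) @ np.kron(H, H) @ np.kron(S, R) @ np.kron(x, y)

# final line of the example: (P1 H S x) o (P2 H R y)
rhs = (P1 @ H @ S @ x) * (P2 @ H @ R @ y)
```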
Theorem
Source:[18]
If $M = T^{(1)} \bullet T^{(2)} \bullet \dots \bullet T^{(c)}$, where $T^{(1)}, T^{(2)}, \dots, T^{(c)}$ are independent components of a random matrix $T$ with independent identically distributed rows $T_1, T_2, \dots, T_m \in \mathbb{R}^{d^c}$, such that
$\operatorname{E}\left[\left(T_1 x\right)^2\right] = \left\|x\right\|^2$ and $\operatorname{E}\left[\left(T_1 x\right)^p\right]^{1/p} \leq \sqrt{ap}\,\left\|x\right\|$,
then for any vector $x$
$\left|\left\|Mx\right\| - \left\|x\right\|\right| < \varepsilon \left\|x\right\|$
with probability $1 - \delta$ if the number of rows
$m = (4a)^{2c} \varepsilon^{-2} \log(1/\delta) + (2ae) \varepsilon^{-1} \left(\log(1/\delta)\right)^{c}.$
In particular, if the entries of $T$ are $\pm 1$, one can get
$m = O\left(\varepsilon^{-2} \log(1/\delta) + \varepsilon^{-1} \left(\tfrac{1}{c} \log(1/\delta)\right)^{c}\right),$
which matches the Johnson–Lindenstrauss lemma of $m = O\left(\varepsilon^{-2} \log(1/\delta)\right)$ when $\varepsilon$ is small.
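The norm-preservation statement can be illustrated with a small simulation, assuming $c = 2$ and $\pm 1$ entries (the `face_split` helper and the chosen sizes are ad hoc; this is a sanity check, not a proof of the stated bound):

```python
import numpy as np

def face_split(a, b):
    # Row-wise Kronecker product: row i is kron(a[i], b[i]).
    return np.einsum('ij,ik->ijk', a, b).reshape(a.shape[0], -1)

rng = np.random.default_rng(3)
d, m = 8, 4000                          # input dimension per component, rows

# Independent +-1 components; each row of M is the Kronecker product
# of the corresponding rows of T1 and T2, scaled so E||Mx||^2 = ||x||^2.
T1 = rng.choice([-1.0, 1.0], size=(m, d))
T2 = rng.choice([-1.0, 1.0], size=(m, d))
M = face_split(T1, T2) / np.sqrt(m)

x = rng.standard_normal(d * d)
rel_err = abs(np.linalg.norm(M @ x) - np.linalg.norm(x)) / np.linalg.norm(x)
```

With this many rows the relative distortion of the norm is small, in line with the theorem's guarantee.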