Tensor

A multilinear map on some combination of scalars, vectors, covectors, and tensors

A tensor is a mathematical object that describes a multilinear relationship between scalars, vectors, and other tensors. Tensors provide a mathematical framework for solving physics problems in areas such as elasticity, fluid mechanics and general relativity.[1] The word tensor comes from the Latin word tendere, meaning "to stretch".

A tensor of order zero (zeroth-order tensor) is a scalar (simple number).[2] A tensor of order one (first-order tensor) is a linear map that maps every vector into a scalar.[2] A vector is a tensor of order one.[2] A tensor of order two (second-order tensor) is a linear map that maps every vector into a vector (e.g. a matrix).[2]
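
As a minimal sketch of these orders (using Python with NumPy; the particular values are illustrative, not from the article), the order of a tensor corresponds to the number of axes of the array that represents it:

```python
import numpy as np

scalar = np.float64(3.0)          # order-0 tensor: a plain number
vector = np.array([1.0, 2.0])     # order-1 tensor
matrix = np.array([[1.0, 0.0],    # order-2 tensor: maps vectors to vectors
                   [0.0, 2.0]])

print(np.ndim(scalar), np.ndim(vector), np.ndim(matrix))  # 0 1 2
print(matrix @ vector)  # an order-2 tensor acting on a vector gives a vector
```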

In linear algebra, the tensor product of two vector spaces $V$ and $W$, written $V \otimes W$,[3] is itself a vector space. It is a way of creating a new vector space analogous to the multiplication of integers.[4]

The tensor product can also be applied between two vectors, thus producing a tensor; for two vectors $v$ and $w$ in the vector spaces $V$ and $W$ respectively, the object $v \otimes w$ is a tensor, and is an element of the space $V \otimes W$. However, not all elements of $V \otimes W$ are simple tensor products of vectors in $V$ and $W$ - rather, every element in $V \otimes W$ is some linear combination (weighted sum) of these so-called "simple tensors". This is a powerful abstract way to think about tensors - as weighted sums of different vector pairs from $V$ and $W$.[5]
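
One concrete model of this (a sketch in Python with NumPy, where the outer product plays the role of $v \otimes w$; the vectors chosen are arbitrary examples, not from the article):

```python
import numpy as np

v, w = np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0])

# A concrete model of the simple tensor v ⊗ w: the outer product,
# an element of the 2*3 = 6-dimensional space V ⊗ W.
simple = np.outer(v, w)

# A general element of V ⊗ W: a weighted sum of simple tensors.
v2, w2 = np.array([0.0, 1.0]), np.array([1.0, 0.0, 0.0])
general = 0.5 * np.outer(v, w) + 2.0 * np.outer(v2, w2)
print(general)
```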

The tensor product (on vectors) is a linear operation on both of its vector inputs; thus, $(cv) \otimes w = v \otimes (cw) = c\,(v \otimes w)$. (This makes it inherently different from the direct sum pairing of vectors, where scaling the direct sum scales both of the input vectors.) Additionally, the tensor product is distributive on both inputs: $(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w$ and $v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2$. This law also helps illustrate why not all tensors can be expressed as the tensor products of individual vectors. Consider the tensor $v_1 \otimes w_1 + v_2 \otimes w_2$. One might think of using $(v_1 + v_2) \otimes (w_1 + w_2)$, but because of distributivity, that would instead produce the tensor $v_1 \otimes w_1 + v_1 \otimes w_2 + v_2 \otimes w_1 + v_2 \otimes w_2$.
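
Both properties can be checked numerically in the same outer-product model (a sketch; the specific vectors and scalar are arbitrary):

```python
import numpy as np

v, w = np.array([1.0, 2.0]), np.array([3.0, 4.0])
v2, w2 = np.array([5.0, 6.0]), np.array([7.0, 8.0])
c = 3.0

# Scaling either input scales the whole tensor product (unlike a direct sum).
assert np.allclose(np.outer(c * v, w), c * np.outer(v, w))
assert np.allclose(np.outer(v, c * w), c * np.outer(v, w))

# Distributivity in each input.
assert np.allclose(np.outer(v + v2, w), np.outer(v, w) + np.outer(v2, w))

# (v1 + v2) ⊗ (w1 + w2) expands into four terms, not just two:
lhs = np.outer(v + v2, w + w2)
rhs = np.outer(v, w) + np.outer(v, w2) + np.outer(v2, w) + np.outer(v2, w2)
assert np.allclose(lhs, rhs)
```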

Since all elements of a tensor space are linear combinations of simple tensors, we can naturally think about a basis for the tensor space (which, remember, is also a vector space) in terms of simple tensors. In fact, distributivity tells us that given a basis of $V$ and a basis of $W$, the pairwise tensor products of those basis vectors form a basis for the tensor space. For example, if the vectors $e_1$ and $e_2$ form a basis for $V$, and $f_1$ and $f_2$ are a basis for $W$, then for vectors $v = a e_1 + b e_2$ and $w = c f_1 + d f_2$, $v \otimes w = ac\,(e_1 \otimes f_1) + ad\,(e_1 \otimes f_2) + bc\,(e_2 \otimes f_1) + bd\,(e_2 \otimes f_2)$. (This is not enough to prove that all tensors can be expressed with this tensor product basis, but they can nonetheless.)
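
The basis expansion can likewise be verified numerically (a sketch assuming the standard bases; the coefficients are arbitrary):

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # basis of V
f1, f2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # basis of W

a, b, c, d = 2.0, 3.0, 5.0, 7.0
v = a * e1 + b * e2
w = c * f1 + d * f2

# v ⊗ w expands over the tensor basis with coefficients ac, ad, bc, bd.
expansion = (a*c * np.outer(e1, f1) + a*d * np.outer(e1, f2)
             + b*c * np.outer(e2, f1) + b*d * np.outer(e2, f2))
assert np.allclose(np.outer(v, w), expansion)
```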

The tensor product can be used cleanly on more than two vector spaces (and vectors) at a time. All the properties discussed so far extend fairly naturally - simple tensors are now of the form $u \otimes v \otimes w \otimes \cdots$, and each position in the tensor product distributes and scales independently. These form higher-order tensors, with an order-4 tensor being made up of tensor products between 4 vectors. Because of their mathematical properties and scalability, tensors have a huge variety of applications in both math and physics.
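
For higher orders, NumPy's einsum gives one concrete model of a multi-way tensor product (a sketch; the vectors are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])
w = np.array([6.0, 7.0])
x = np.array([8.0, 9.0, 10.0])

# Simple order-4 tensor u ⊗ v ⊗ w ⊗ x; one axis per input vector.
t = np.einsum('i,j,k,l->ijkl', u, v, w, x)
print(t.shape)  # (2, 3, 2, 3)

# Scaling any one factor scales the whole tensor:
assert np.allclose(np.einsum('i,j,k,l->ijkl', 2.0 * u, v, w, x), 2.0 * t)
```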

Examples

Matrices as Order-2 Tensors

A classic example is that the tensor product of the space of column vectors of length N with the space of row vectors of length M is equivalent to the vector space of N x M matrices. In fact, the tensor product of a column vector and a row vector is equivalent to their matrix-multiplication product, which in the case of column-row multiplication is known as an outer product. The equivalence between the tensor space and the matrix space is made abundantly clear by examining the tensor basis generated by the standard basis vectors for column vectors and for row vectors. We'll call the column basis vector with a 1 in position $i$ by the name $e_i$, and the row basis vector with a 1 in position $j$ by the name $f_j$. Each element of the tensor basis is then some $e_i \otimes f_j$, which equals the product $e_i f_j$ by matrix multiplication. Notice that by the rules of matrix multiplication, a column vector of length N times a row vector of length M produces an N by M matrix, and $e_i f_j$ always produces a matrix with a 1 at position $(i, j)$ and zeros everywhere else. That should make the equivalence obvious: each simple tensor in this basis corresponds precisely to one of the positions in the N x M matrix!
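
This correspondence between basis simple tensors and matrix positions can be demonstrated directly (a NumPy sketch; N and M are arbitrary choices):

```python
import numpy as np

N, M = 2, 3
for i in range(N):
    for j in range(M):
        e_i = np.zeros(N); e_i[i] = 1.0   # column basis vector e_i
        f_j = np.zeros(M); f_j[j] = 1.0   # row basis vector f_j
        E_ij = np.outer(e_i, f_j)         # e_i ⊗ f_j as an N x M matrix
        # E_ij has a single 1 at position (i, j) and zeros elsewhere:
        assert E_ij[i, j] == 1.0 and E_ij.sum() == 1.0
```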

For example, consider the following 2 x 2 matrix:

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

In tensor form, this is the same as

$$a\,(e_1 \otimes f_1) + b\,(e_1 \otimes f_2) + c\,(e_2 \otimes f_1) + d\,(e_2 \otimes f_2)$$

which, if you evaluate those tensor products as matrix multiplications, becomes

$$a\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + b\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + c\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + d\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$$

which naturally simplifies to our original matrix, as desired.
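
The same decomposition can be checked in code (a sketch; the entries a, b, c, d are placeholders for any 2 x 2 matrix):

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0   # arbitrary entries for illustration
M = np.array([[a, b], [c, d]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
f1, f2 = e1, e2

# Rebuild M as a weighted sum of the four basis simple tensors.
rebuilt = (a * np.outer(e1, f1) + b * np.outer(e1, f2)
           + c * np.outer(e2, f1) + d * np.outer(e2, f2))
assert np.allclose(M, rebuilt)
```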

Interestingly, this equivalence (or isomorphism) shows us something else about matrices: just as not every element of a tensor space can be expressed as a single tensor product of vectors, not every matrix can be expressed as the outer product of two vectors. This is a fundamental concept in linear/matrix algebra, and corresponds to the rank of a matrix - simple tensors (i.e. vector outer products) only produce rank-1 matrices, but rank-N matrices (the maximum rank possible for an N x N matrix) can be expressed as linear combinations of these rank-1 matrices. This is precisely equivalent to expressing a generic tensor as a sum of simple tensors.
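
One standard way to produce such a sum of rank-1 (simple-tensor) terms is the singular value decomposition; the sketch below uses it for illustration (the matrix chosen is arbitrary):

```python
import numpy as np

M = np.array([[1.0, 0.0],
              [0.0, 1.0]])  # the identity: rank 2, so not a single outer product

print(np.linalg.matrix_rank(M))  # 2

# The SVD writes M as a sum of rank-1 (simple-tensor) terms:
U, s, Vt = np.linalg.svd(M)
rank1_sum = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
assert np.allclose(M, rank1_sum)
```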

Tensors as Representing Vector-Vector Dependence/Correlation

Tensors are a natural way to represent the dependence between vectors from different vector spaces. For example, if we consider the magnitude of a simple tensor between vectors $a$ and $x$ to be the probability of event $a$ happening and event $x$ happening, we can express probabilities for more complex scenarios as a general tensor, with each component simple tensor representing the joint probability of its pair of events. For example, the tensor $0.5\,(a \otimes x) + 0.5\,(b \otimes y)$ represents there being a 50% chance of witnessing $a$ and $x$ together and a 50% chance of witnessing $b$ and $y$ together, but no chance of witnessing $a$ and $y$ together nor of witnessing $b$ and $x$ together. In other words, the events $a$ and $x$ are dependent upon each other, as are the events $b$ and $y$. Since matrices and tensors between two vector spaces are equivalent, this is the same principle as behind correlation matrices in statistics.
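
A small sketch of this probability tensor (the one-hot encoding of the events is an illustrative assumption, not from the article):

```python
import numpy as np

# One-hot vectors standing in for the events a, b and x, y.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Joint probability table: a occurs with x, b occurs with y.
joint = 0.5 * np.outer(a, x) + 0.5 * np.outer(b, y)
print(joint)  # [[0.5 0. ] [0.  0.5]]

# Independence would make the table a single outer product of its marginals;
# here it is not, so the events are dependent:
pa, px = joint.sum(axis=1), joint.sum(axis=0)   # marginal probabilities
print(np.allclose(joint, np.outer(pa, px)))     # False
```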

This same concept is also part of the mathematical underpinning of quantum mechanics, where states are said to be entangled if they are dependent upon each other - that is, if the joint state cannot be expressed as a single simple tensor.
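
As an illustrative sketch (the Bell state is a standard example, though not given in the article), whether a two-qubit state is a simple tensor can be tested by reshaping it into a matrix and checking its rank:

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) on two qubits, as a length-4 vector.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# A product (simple-tensor) state reshapes to a rank-1 matrix;
# an entangled state does not.
print(np.linalg.matrix_rank(bell.reshape(2, 2)))  # 2 => entangled

product = np.kron(np.array([1.0, 0.0]), np.array([0.6, 0.8]))  # |0> ⊗ ψ
print(np.linalg.matrix_rank(product.reshape(2, 2)))  # 1 => not entangled
```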
