 Appendix B

A Brief Introduction to Tensors and Their Properties

B.1. BASIC PROPERTIES OF TENSORS

B.1.1 Examples of Tensors

The gradient of a vector field is a good example of a tensor.  Visualize a vector field: at every point in space, the field has a vector value $u\left({x}_{1},{x}_{2},{x}_{3}\right)$.  Let $G=u\otimes \nabla$ represent the gradient of u.  By definition, G enables you to calculate the change in u when you move from a point x in space to a nearby point at $x+dx$:

$du=G\cdot dx$

G is a second order tensor.  From this example, we see that when you multiply a vector by a tensor, the result is another vector.
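
As a concrete illustration, here is a minimal numerical sketch (using NumPy; the particular field `u` is invented purely for the example) that assembles the matrix of G by finite differences and checks that $du=G\cdot dx$ for a small step:

```python
import numpy as np

# A sample vector field u(x); this particular choice is just for illustration.
def u(x):
    return np.array([x[0] * x[1], np.sin(x[2]), x[0] ** 2 + x[2]])

def grad(u, x, h=1e-6):
    """Matrix of G = u (x) grad: G[i, j] = du_i / dx_j, by central differences."""
    G = np.zeros((3, 3))
    for j in range(3):
        dx = np.zeros(3)
        dx[j] = h
        G[:, j] = (u(x + dx) - u(x - dx)) / (2 * h)
    return G

x = np.array([1.0, 2.0, 0.5])
dx = 1e-4 * np.array([0.3, -0.2, 0.1])   # a small step away from x
G = grad(u, x)

# du predicted by the tensor G agrees with the actual change in u
print(G @ dx)             # ~ u(x + dx) - u(x)
print(u(x + dx) - u(x))
```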

This is a general property of all second order tensors.  A tensor is a linear mapping of a vector onto another vector.  Two examples, together with the vectors they operate on, are: The stress tensor

$t=\text{\hspace{0.17em}}n\cdot \sigma$

where n is a unit vector normal to a surface, $\sigma$ is the stress tensor and t is the traction vector acting on the surface. The deformation gradient tensor

$dw=F\cdot dx$

where dx is an infinitesimal line element in an undeformed solid, and dw is the vector representing the deformed line element.

B.1.2 Matrix representation of a tensor

To evaluate and manipulate tensors, we express them as components in a basis, just as for vectors.  We can use the displacement gradient to illustrate how this is done.  Let $u\left({x}_{1},{x}_{2},{x}_{3}\right)$ be a vector field, and let $G=u\otimes \nabla$ represent the gradient of u.  Recall the definition of G

$du=G\cdot dx$

Now, let $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ be a Cartesian basis, and express both du and dx as components.  Then, calculate the components of du in terms of dx using the usual rules of calculus

$\begin{array}{l}d{u}_{1}=\frac{\partial {u}_{1}}{\partial {x}_{1}}d{x}_{1}+\frac{\partial {u}_{1}}{\partial {x}_{2}}d{x}_{2}+\frac{\partial {u}_{1}}{\partial {x}_{3}}d{x}_{3}\\ d{u}_{2}=\frac{\partial {u}_{2}}{\partial {x}_{1}}d{x}_{1}+\frac{\partial {u}_{2}}{\partial {x}_{2}}d{x}_{2}+\frac{\partial {u}_{2}}{\partial {x}_{3}}d{x}_{3}\\ d{u}_{3}=\frac{\partial {u}_{3}}{\partial {x}_{1}}d{x}_{1}+\frac{\partial {u}_{3}}{\partial {x}_{2}}d{x}_{2}+\frac{\partial {u}_{3}}{\partial {x}_{3}}d{x}_{3}\end{array}$

We could represent this as a matrix product

$\left[\begin{array}{c}d{u}_{1}\\ d{u}_{2}\\ d{u}_{3}\end{array}\right]=\left[\begin{array}{ccc}\frac{\partial {u}_{1}}{\partial {x}_{1}}& \frac{\partial {u}_{1}}{\partial {x}_{2}}& \frac{\partial {u}_{1}}{\partial {x}_{3}}\\ \frac{\partial {u}_{2}}{\partial {x}_{1}}& \frac{\partial {u}_{2}}{\partial {x}_{2}}& \frac{\partial {u}_{2}}{\partial {x}_{3}}\\ \frac{\partial {u}_{3}}{\partial {x}_{1}}& \frac{\partial {u}_{3}}{\partial {x}_{2}}& \frac{\partial {u}_{3}}{\partial {x}_{3}}\end{array}\right]\left[\begin{array}{c}d{x}_{1}\\ d{x}_{2}\\ d{x}_{3}\end{array}\right]$

From this we see that G can be represented as a $3×3$ matrix.  The elements of the matrix are known as the components of G in the basis $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$.  All second order tensors can be represented in this form.  For example, a general second order tensor S could be written as

$S\equiv \left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]$

You have probably already seen the matrix representation of stress and strain components in introductory courses.

Since S can be represented as a matrix, all operations that can be performed on a $3×3$ matrix can also be performed on S.  Examples include sums and products, the transpose, inverse, and determinant.  One can also compute eigenvalues and eigenvectors for tensors, and thus define the log of a tensor, the square root of a tensor, etc.  These tensor operations are summarized below.

Note that the numbers ${S}_{11}$, ${S}_{12}$, … ${S}_{33}$ depend on the basis $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$, just as the components of a vector depend on the basis used to represent the vector.  However, just as the magnitude and direction of a vector are independent of the basis, so the properties of a tensor are independent of the basis.  That is to say, if S is a tensor and u is a vector, then the vector

$v=S\cdot u$

has the same magnitude and direction, irrespective of the basis used to represent u, v, and S.

B.1.3 The difference between a matrix and a tensor

If a tensor is a matrix, why is a matrix not the same thing as a tensor?  Well, although you can multiply the three components of a vector u by any $3×3$ matrix,

$\left[\begin{array}{c}{b}_{1}\\ {b}_{2}\\ {b}_{3}\end{array}\right]=\left[\begin{array}{ccc}{a}_{11}& {a}_{12}& {a}_{13}\\ {a}_{21}& {a}_{22}& {a}_{23}\\ {a}_{31}& {a}_{32}& {a}_{33}\end{array}\right]\left[\begin{array}{c}{u}_{1}\\ {u}_{2}\\ {u}_{3}\end{array}\right]$

the resulting three numbers $\left({b}_{1},{b}_{2},{b}_{3}\right)$ may or may not represent the components of a vector.  If they are the components of a vector, then the matrix represents the components of a tensor A; if not, then the matrix is just an ordinary old matrix.

To check whether $\left({b}_{1},{b}_{2},{b}_{3}\right)$ are the components of a vector, you need to check how $\left({b}_{1},{b}_{2},{b}_{3}\right)$ change due to a change of basis.  That is to say, choose a new basis, calculate the new components of u in this basis, and calculate the new matrix in this basis (the new elements of the matrix will depend on how the matrix was defined; the elements may or may not change, and if they don't, the matrix cannot be the components of a tensor).  Then, evaluate the matrix product to find a new left hand side, say $\left({\beta }_{1},{\beta }_{2},{\beta }_{3}\right)$.  If $\left({\beta }_{1},{\beta }_{2},{\beta }_{3}\right)$ are related to $\left({b}_{1},{b}_{2},{b}_{3}\right)$ by the same transformation that was used to calculate the new components of u, then $\left({b}_{1},{b}_{2},{b}_{3}\right)$ are the components of a vector, and, therefore, the matrix represents the components of a tensor.
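
This check is easy to carry out numerically.  The sketch below (NumPy; the random data are purely illustrative) rotates the basis with a randomly chosen orthogonal matrix and compares both sides: a matrix defined as the components of a genuine tensor passes the test, while a matrix whose entries are declared to be the same in every basis generally fails it.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random rotation [Q] relating the two bases.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

u = rng.standard_normal(3)
A = rng.standard_normal((3, 3))    # matrix of a tensor in the old basis

b_old = A @ u                      # product evaluated in the old basis

# If A represents a tensor, its new components are Q A Q^T, and the
# product evaluated in the new basis must equal the transformed b.
u_new = Q @ u
b_new = (Q @ A @ Q.T) @ u_new
print(np.allclose(b_new, Q @ b_old))      # True: b transforms as a vector

# If instead the matrix keeps the same entries in every basis,
# the two sides generally disagree, so it is not a tensor.
b_new_bad = A @ u_new
print(np.allclose(b_new_bad, Q @ b_old))  # generally False
```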

B.1.4 Creating a tensor using a dyadic product of two vectors

Let a and b be two vectors.  The dyadic product of a and b  is a second order tensor S denoted by

$S=a\otimes b$

with the property

$S\cdot u=\left(a\otimes b\right)\cdot u=a\left(b\cdot u\right)$

for all vectors u.  (Clearly, this maps u onto a vector parallel to a with magnitude $|a|\,|b\cdot u|$.)

The components of $a\otimes b$ in a basis $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ are

$\left[\begin{array}{ccc}{a}_{1}{b}_{1}& {a}_{1}{b}_{2}& {a}_{1}{b}_{3}\\ {a}_{2}{b}_{1}& {a}_{2}{b}_{2}& {a}_{2}{b}_{3}\\ {a}_{3}{b}_{1}& {a}_{3}{b}_{2}& {a}_{3}{b}_{3}\end{array}\right]$

Note that not all tensors can be constructed using a dyadic product of only two vectors (this is because $\left(a\otimes b\right)\cdot u$ always has to be parallel to a, and therefore the representation cannot map a vector onto an arbitrary vector).  However, if a, b, and c are three linearly independent vectors (i.e. they do not all lie in the same plane), then every tensor can be constructed as a sum of scalar multiples of the nine possible dyadic products of these vectors.
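
A minimal sketch of these properties (NumPy, with arbitrary random vectors): the matrix of $a\otimes b$ is the outer product of the component arrays, and $\left(a\otimes b\right)\cdot u$ is parallel to a.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, u = rng.standard_normal(3), rng.standard_normal(3), rng.standard_normal(3)

S = np.outer(a, b)            # components of a (x) b: S[i, j] = a[i] * b[j]

# (a (x) b) . u = a (b . u): the result is parallel to a
print(np.allclose(S @ u, a * np.dot(b, u)))   # True

# A single dyad has rank one, so by itself it cannot represent a general tensor.
print(np.linalg.matrix_rank(S))               # 1
```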

B.2. OPERATIONS ON SECOND ORDER TENSORS

#### Tensor components

Let $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ be a Cartesian basis, and let S be a second order tensor.  The components of S in $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ may be represented as a matrix

$\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]$

where

$\begin{array}{lll}{S}_{11}={e}_{1}\cdot \left(S\cdot {e}_{1}\right), & {S}_{12}={e}_{1}\cdot \left(S\cdot {e}_{2}\right), & {S}_{13}={e}_{1}\cdot \left(S\cdot {e}_{3}\right),\\ {S}_{21}={e}_{2}\cdot \left(S\cdot {e}_{1}\right), & {S}_{22}={e}_{2}\cdot \left(S\cdot {e}_{2}\right), & {S}_{23}={e}_{2}\cdot \left(S\cdot {e}_{3}\right),\\ {S}_{31}={e}_{3}\cdot \left(S\cdot {e}_{1}\right), & {S}_{32}={e}_{3}\cdot \left(S\cdot {e}_{2}\right), & {S}_{33}={e}_{3}\cdot \left(S\cdot {e}_{3}\right)\end{array}$

The representation of a tensor in terms of its components can also be expressed in dyadic form as

$S=\sum _{j=1}^{3}\sum _{i=1}^{3}{S}_{ij}{e}_{i}\otimes {e}_{j}$

This representation is particularly convenient when using polar coordinates, as described in Appendix E.

#### Addition

Let S and T be two tensors.  Then $U=S+T$ is also a tensor.

Denote the Cartesian components of U, S and T by matrices as defined above.  The components of U are then related to the components of S and T by

$\left[\begin{array}{ccc}{U}_{11}& {U}_{12}& {U}_{13}\\ {U}_{21}& {U}_{22}& {U}_{23}\\ {U}_{31}& {U}_{32}& {U}_{33}\end{array}\right]=\left[\begin{array}{ccc}{S}_{11}+{T}_{11}& {S}_{12}+{T}_{12}& {S}_{13}+{T}_{13}\\ {S}_{21}+{T}_{21}& {S}_{22}+{T}_{22}& {S}_{23}+{T}_{23}\\ {S}_{31}+{T}_{31}& {S}_{32}+{T}_{32}& {S}_{33}+{T}_{33}\end{array}\right]$

#### Product of a tensor and a vector

Let u be a vector and S a second order tensor.  Then

$v=S\cdot u$

is a vector.

Let $\left({u}_{1},{u}_{2},{u}_{3}\right)$ and $\left({v}_{1},{v}_{2},{v}_{3}\right)$ denote the components of vectors u and v in a Cartesian basis $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$, and denote the Cartesian components of S as described above.  Then

$\left[\begin{array}{c}{v}_{1}\\ {v}_{2}\\ {v}_{3}\end{array}\right]=\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]\left[\begin{array}{c}{u}_{1}\\ {u}_{2}\\ {u}_{3}\end{array}\right]=\left[\begin{array}{c}{S}_{11}{u}_{1}+{S}_{12}{u}_{2}+{S}_{13}{u}_{3}\\ {S}_{21}{u}_{1}+{S}_{22}{u}_{2}+{S}_{23}{u}_{3}\\ {S}_{31}{u}_{1}+{S}_{32}{u}_{2}+{S}_{33}{u}_{3}\end{array}\right]$

The product

$v=u\cdot S$

is also a vector.  In component form

$\left[\begin{array}{ccc}{v}_{1}& {v}_{2}& {v}_{3}\end{array}\right]=\left[\begin{array}{ccc}{u}_{1}& {u}_{2}& {u}_{3}\end{array}\right]\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]$

i.e. ${v}_{j}={u}_{1}{S}_{1j}+{u}_{2}{S}_{2j}+{u}_{3}{S}_{3j}$, so that $u\cdot S={S}^{T}\cdot u$.
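
In NumPy the two products correspond to multiplying by the matrix on the right or on the left; a quick sketch with random components:

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.standard_normal((3, 3))
u = rng.standard_normal(3)

v_right = S @ u   # v = S . u : v_i = S_ij u_j
v_left = u @ S    # v = u . S : v_j = u_i S_ij, i.e. S^T . u

print(np.allclose(v_left, S.T @ u))   # True
print(np.allclose(v_left, v_right))   # False in general (S is not symmetric)
```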

#### Product of two tensors

Let T and S be two second order tensors.  Then $U=T\cdot S$ is also a tensor.

Denote the components of U, S and T by $3×3$ matrices.  Then,

$\begin{array}{l}\left[\begin{array}{ccc}{U}_{11}& {U}_{12}& {U}_{13}\\ {U}_{21}& {U}_{22}& {U}_{23}\\ {U}_{31}& {U}_{32}& {U}_{33}\end{array}\right]=\left[\begin{array}{ccc}{T}_{11}& {T}_{12}& {T}_{13}\\ {T}_{21}& {T}_{22}& {T}_{23}\\ {T}_{31}& {T}_{32}& {T}_{33}\end{array}\right]\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]\\ \quad =\left[\begin{array}{ccc}{T}_{11}{S}_{11}+{T}_{12}{S}_{21}+{T}_{13}{S}_{31}& {T}_{11}{S}_{12}+{T}_{12}{S}_{22}+{T}_{13}{S}_{32}& {T}_{11}{S}_{13}+{T}_{12}{S}_{23}+{T}_{13}{S}_{33}\\ {T}_{21}{S}_{11}+{T}_{22}{S}_{21}+{T}_{23}{S}_{31}& {T}_{21}{S}_{12}+{T}_{22}{S}_{22}+{T}_{23}{S}_{32}& {T}_{21}{S}_{13}+{T}_{22}{S}_{23}+{T}_{23}{S}_{33}\\ {T}_{31}{S}_{11}+{T}_{32}{S}_{21}+{T}_{33}{S}_{31}& {T}_{31}{S}_{12}+{T}_{32}{S}_{22}+{T}_{33}{S}_{32}& {T}_{31}{S}_{13}+{T}_{32}{S}_{23}+{T}_{33}{S}_{33}\end{array}\right]\end{array}$

Note that tensor products, like matrix products, are not commutative; i.e. $T\cdot S\ne S\cdot T$ in general.
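
A one-line numerical illustration (NumPy, random components):

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))

U = T @ S                      # components of U = T . S
print(np.allclose(U, S @ T))   # False: the product does not commute
```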

#### Transpose

Let S be a tensor.  The transpose of S is denoted by ${S}^{T}$ and is defined so that

$u\cdot {S}^{T}=S\cdot u$

Denote the components of S by a $3×3$ matrix.  The components of ${S}^{T}$ are then

${S}^{T}\equiv \left[\begin{array}{ccc}{S}_{11}& {S}_{21}& {S}_{31}\\ {S}_{12}& {S}_{22}& {S}_{32}\\ {S}_{13}& {S}_{23}& {S}_{33}\end{array}\right]$

i.e. the rows and columns of the matrix are switched.

Note that, if A and B are two tensors, then

${\left(A\cdot B\right)}^{T}={B}^{T}\cdot {A}^{T}$
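
A quick numerical check of this identity (NumPy, random components):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.allclose((A @ B).T, B.T @ A.T))   # True: (A . B)^T = B^T . A^T
```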

#### Trace

Let S be a tensor, and denote the components of S by a $3×3$ matrix.  The trace of S is denoted by tr(S) or trace(S), and can be computed by summing the diagonal elements of the matrix of components

$\text{trace}\left(S\right)={S}_{11}+{S}_{22}+{S}_{33}$

More formally, let $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ be any Cartesian basis.  Then

$\text{trace}\left(S\right)={e}_{1}\cdot S\cdot {e}_{1}+{e}_{2}\cdot S\cdot {e}_{2}+{e}_{3}\cdot S\cdot {e}_{3}$

The trace of a tensor is an example of an invariant of the tensor: you get the same value for trace(S) whatever basis you use to define the matrix of components of S.

#### Contraction

Inner product: Let S and T be two second order tensors.  The inner product of S and T is a scalar, denoted by $S:T$.  Represent S and T by their components in a basis.  Then

$\begin{array}{rl}S:T=&{S}_{11}{T}_{11}+{S}_{12}{T}_{12}+{S}_{13}{T}_{13}\\ &+{S}_{21}{T}_{21}+{S}_{22}{T}_{22}+{S}_{23}{T}_{23}\\ &+{S}_{31}{T}_{31}+{S}_{32}{T}_{32}+{S}_{33}{T}_{33}\end{array}$

Observe that $S:T=T:S$, and also that $S:I=\text{trace}\left(S\right)$, where I is the identity tensor.

Outer product: Let S and T be two second order tensors.  The outer product of S and T is a scalar, denoted by $S\cdot \cdot T$.  Represent S and T by their components in a basis.  Then

$\begin{array}{rl}S\cdot \cdot T=&{S}_{11}{T}_{11}+{S}_{21}{T}_{12}+{S}_{31}{T}_{13}\\ &+{S}_{12}{T}_{21}+{S}_{22}{T}_{22}+{S}_{32}{T}_{23}\\ &+{S}_{13}{T}_{31}+{S}_{23}{T}_{32}+{S}_{33}{T}_{33}\end{array}$

Observe that $S\cdot \cdot T={S}^{T}:T$.
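
These identities are easy to verify numerically (a minimal NumPy sketch with random components):

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.standard_normal((3, 3))
T = rng.standard_normal((3, 3))
I = np.eye(3)

inner = np.sum(S * T)     # S : T   = sum of S_ij T_ij
outer = np.trace(S @ T)   # S .. T  = sum of S_ij T_ji

print(np.allclose(inner, np.sum(T * S)))         # S : T = T : S
print(np.allclose(np.sum(S * I), np.trace(S)))   # S : I = trace(S)
print(np.allclose(outer, np.sum(S.T * T)))       # S .. T = S^T : T
```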

#### Determinant

The determinant of a tensor is defined as the determinant of the matrix of its components in a basis.  For a second order tensor

$\mathrm{det}S=\mathrm{det}\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{21}& {S}_{22}& {S}_{23}\\ {S}_{31}& {S}_{32}& {S}_{33}\end{array}\right]={S}_{11}\left({S}_{22}{S}_{33}-{S}_{23}{S}_{32}\right)+{S}_{12}\left({S}_{23}{S}_{31}-{S}_{21}{S}_{33}\right)+{S}_{13}\left({S}_{21}{S}_{32}-{S}_{31}{S}_{22}\right)$

Note that if S and T are two tensors, then

$\mathrm{det}\left(S\right)=\mathrm{det}\left({S}^{T}\right)\qquad \mathrm{det}\left(S\cdot T\right)=\mathrm{det}\left(S\right)\mathrm{det}\left(T\right)$

#### Inverse

Let S be a second order tensor.  The inverse of S exists if and only if $\mathrm{det}\left(S\right)\ne 0$, and is defined by

${S}^{-1}\cdot S=I$

where ${S}^{-1}$ denotes the inverse of S and I is the identity tensor.

The inverse of a tensor may be computed by calculating the inverse of the matrix of its components.  The result cannot be expressed in compact form for a general three dimensional second order tensor, and is best computed by methods such as Gaussian elimination.
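
A minimal sketch (NumPy; `np.linalg.inv` factorizes the matrix rather than using a closed-form expression, in line with the remark above):

```python
import numpy as np

rng = np.random.default_rng(6)
S = rng.standard_normal((3, 3))

if abs(np.linalg.det(S)) > 1e-12:       # the inverse exists only if det(S) != 0
    S_inv = np.linalg.inv(S)            # computed via an LU factorization
    print(np.allclose(S_inv @ S, np.eye(3)))   # True: S^-1 . S = I
```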

#### Eigenvalues and eigenvectors (principal values and directions)

Let S be a second order tensor.  The scalars $\lambda$ and unit vectors m which satisfy

$S\cdot m=\lambda m$

are known as the eigenvalues and eigenvectors of S, or the principal values and principal directions of S.  Note that $\lambda$ may be complex.  For a second order tensor in three dimensions, there are generally three values of $\lambda$ and three unique unit vectors m which satisfy this equation.  Occasionally, there may be only two, or even one, distinct value of $\lambda$.  If this is the case, there are infinitely many possible vectors m that satisfy the equation.  The eigenvalues of a tensor, and the components of its eigenvectors, may be computed by finding the eigenvalues and eigenvectors of the matrix of components (see Section A.3.2).

The eigenvalues of a symmetric tensor are always real.  The eigenvalues of a skew tensor are always pure imaginary or zero.
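
These two facts can be checked numerically (a NumPy sketch; the tensor `A` is an arbitrary random example):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 3))

S_sym = A + A.T    # symmetric: eigenvalues are real
S_skew = A - A.T   # skew:      eigenvalues are pure imaginary or zero

print(np.linalg.eigvals(S_sym))    # three real values
print(np.linalg.eigvals(S_skew))   # zero plus a conjugate pair +/- i*omega
```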

#### Change of basis

Let S be a tensor, and let $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ be a Cartesian basis.  Suppose that the components of S in the basis $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ are known to be

$\left[{S}^{\left(e\right)}\right]=\left[\begin{array}{ccc}{S}_{11}^{\left(e\right)}& {S}_{12}^{\left(e\right)}& {S}_{13}^{\left(e\right)}\\ {S}_{21}^{\left(e\right)}& {S}_{22}^{\left(e\right)}& {S}_{23}^{\left(e\right)}\\ {S}_{31}^{\left(e\right)}& {S}_{32}^{\left(e\right)}& {S}_{33}^{\left(e\right)}\end{array}\right]$

Now, suppose that we wish to compute the components of  S in a second Cartesian basis, $\left\{{m}_{1},{m}_{2},{m}_{3}\right\}$.  Denote these components by

$\left[{S}^{\left(m\right)}\right]=\left[\begin{array}{ccc}{S}_{11}^{\left(m\right)}& {S}_{12}^{\left(m\right)}& {S}_{13}^{\left(m\right)}\\ {S}_{21}^{\left(m\right)}& {S}_{22}^{\left(m\right)}& {S}_{23}^{\left(m\right)}\\ {S}_{31}^{\left(m\right)}& {S}_{32}^{\left(m\right)}& {S}_{33}^{\left(m\right)}\end{array}\right]$

To do so, first compute the components of the transformation matrix [Q]

$\left[Q\right]=\left[\begin{array}{ccc}{m}_{1}\cdot {e}_{1}& {m}_{1}\cdot {e}_{2}& {m}_{1}\cdot {e}_{3}\\ {m}_{2}\cdot {e}_{1}& {m}_{2}\cdot {e}_{2}& {m}_{2}\cdot {e}_{3}\\ {m}_{3}\cdot {e}_{1}& {m}_{3}\cdot {e}_{2}& {m}_{3}\cdot {e}_{3}\end{array}\right]$

(this is the same matrix you would use to transform vector components from $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ to $\left\{{m}_{1},{m}_{2},{m}_{3}\right\}$ ).  Then,

$\left[{S}^{\left(m\right)}\right]=\left[Q\right]\left[{S}^{\left(e\right)}\right]{\left[Q\right]}^{T}$

or, written out in full

$\left[\begin{array}{ccc}{S}_{11}^{\left(m\right)}& {S}_{12}^{\left(m\right)}& {S}_{13}^{\left(m\right)}\\ {S}_{21}^{\left(m\right)}& {S}_{22}^{\left(m\right)}& {S}_{23}^{\left(m\right)}\\ {S}_{31}^{\left(m\right)}& {S}_{32}^{\left(m\right)}& {S}_{33}^{\left(m\right)}\end{array}\right]=\left[\begin{array}{ccc}{m}_{1}\cdot {e}_{1}& {m}_{1}\cdot {e}_{2}& {m}_{1}\cdot {e}_{3}\\ {m}_{2}\cdot {e}_{1}& {m}_{2}\cdot {e}_{2}& {m}_{2}\cdot {e}_{3}\\ {m}_{3}\cdot {e}_{1}& {m}_{3}\cdot {e}_{2}& {m}_{3}\cdot {e}_{3}\end{array}\right]\left[\begin{array}{ccc}{S}_{11}^{\left(e\right)}& {S}_{12}^{\left(e\right)}& {S}_{13}^{\left(e\right)}\\ {S}_{21}^{\left(e\right)}& {S}_{22}^{\left(e\right)}& {S}_{23}^{\left(e\right)}\\ {S}_{31}^{\left(e\right)}& {S}_{32}^{\left(e\right)}& {S}_{33}^{\left(e\right)}\end{array}\right]\left[\begin{array}{ccc}{m}_{1}\cdot {e}_{1}& {m}_{2}\cdot {e}_{1}& {m}_{3}\cdot {e}_{1}\\ {m}_{1}\cdot {e}_{2}& {m}_{2}\cdot {e}_{2}& {m}_{3}\cdot {e}_{2}\\ {m}_{1}\cdot {e}_{3}& {m}_{2}\cdot {e}_{3}& {m}_{3}\cdot {e}_{3}\end{array}\right]$
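
Before turning to the proof, here is a numerical sketch of the transformation rule (NumPy; the rotation and components are random illustrations):

```python
import numpy as np

rng = np.random.default_rng(8)

# A random rotation defining the new basis: row i of [Q] holds m_i . e_j.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

S_e = rng.standard_normal((3, 3))   # components of S in {e1, e2, e3}
S_m = Q @ S_e @ Q.T                 # components of S in {m1, m2, m3}

# The tensor itself is unchanged: v = S . u gives the same vector either way.
u_e = rng.standard_normal(3)
v_e = S_e @ u_e
u_m, v_m = Q @ u_e, Q @ v_e
print(np.allclose(S_m @ u_m, v_m))  # True
```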

To prove this result, let u and v be vectors satisfying

$v=S\cdot u$

Denote the components of u and v in the two bases by $\underline{u}^{\left(e\right)},\ \underline{u}^{\left(m\right)}$ and $\underline{v}^{\left(e\right)},\ \underline{v}^{\left(m\right)}$, respectively.  Recall that the vector components are related by

$\begin{array}{ll}\underline{u}^{\left(m\right)}=\left[Q\right]\underline{u}^{\left(e\right)}\qquad & \underline{u}^{\left(e\right)}={\left[Q\right]}^{T}\underline{u}^{\left(m\right)}\\ \underline{v}^{\left(m\right)}=\left[Q\right]\underline{v}^{\left(e\right)}\qquad & \underline{v}^{\left(e\right)}={\left[Q\right]}^{T}\underline{v}^{\left(m\right)}\end{array}$

Now, we could express the tensor-vector product in either basis

$\underline{v}^{\left(m\right)}=\left[{S}^{\left(m\right)}\right]\underline{u}^{\left(m\right)}\qquad\qquad \underline{v}^{\left(e\right)}=\left[{S}^{\left(e\right)}\right]\underline{u}^{\left(e\right)}$

Substituting for $\underline{u}^{\left(e\right)},\ \underline{v}^{\left(e\right)}$ from above into the second of these two relations, we see that

${\left[Q\right]}^{T}\underline{v}^{\left(m\right)}=\left[{S}^{\left(e\right)}\right]{\left[Q\right]}^{T}\underline{u}^{\left(m\right)}$

Recall that

$\left[Q\right]{\left[Q\right]}^{T}=\left[I\right]\qquad\qquad \left[I\right]\underline{v}^{\left(m\right)}=\underline{v}^{\left(m\right)}$

so multiplying both sides by [Q] shows that

$\underline{v}^{\left(m\right)}=\left[Q\right]\left[{S}^{\left(e\right)}\right]{\left[Q\right]}^{T}\underline{u}^{\left(m\right)}$

so, comparing with the first of the two relations above,

$\left[{S}^{\left(m\right)}\right]=\left[Q\right]\left[{S}^{\left(e\right)}\right]{\left[Q\right]}^{T}$

as stated.

#### Invariants

Invariants of a tensor are functions of the tensor components which remain constant under a basis change.  That is to say, the invariant has the same value when computed in two arbitrary bases $\left\{{e}_{1},{e}_{2},{e}_{3}\right\}$ and $\left\{{m}_{1},{m}_{2},{m}_{3}\right\}$.  A symmetric second order tensor always has three independent invariants.

Examples of invariants are

1. The three eigenvalues
2. The determinant
3. The trace
4. The inner and outer products

These are not all independent; for example, any of 2-4 can be calculated in terms of 1.
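
A numerical sketch of these relations for a symmetric tensor (NumPy, random components):

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((3, 3))
S = A + A.T                               # symmetric, so the eigenvalues are real

lam = np.linalg.eigvalsh(S)               # the three eigenvalues
print(np.allclose(lam.sum(), np.trace(S)))         # trace = l1 + l2 + l3
print(np.allclose(lam.prod(), np.linalg.det(S)))   # det   = l1 * l2 * l3
print(np.allclose(np.sum(S * S), np.sum(lam**2)))  # S : S = l1^2 + l2^2 + l3^2
```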

B.3. SPECIAL TENSORS

#### Identity tensor

The identity tensor I is the tensor such that, for any tensor S or vector v

$\begin{array}{l}I\cdot v=v\cdot I=v\\ S\cdot I=I\cdot S=S\end{array}$

In any basis, the identity tensor has components

$\left[\begin{array}{ccc}1& 0& 0\\ 0& 1& 0\\ 0& 0& 1\end{array}\right]$

#### Symmetric tensor

A symmetric tensor S has the property

$S={S}^{T}$

The components of a symmetric tensor have the form

$\left[\begin{array}{ccc}{S}_{11}& {S}_{12}& {S}_{13}\\ {S}_{12}& {S}_{22}& {S}_{23}\\ {S}_{13}& {S}_{23}& {S}_{33}\end{array}\right]$

so that a symmetric tensor has only six independent components, instead of nine.

#### Skew tensor

A skew tensor S has the property

${S}^{T}=-S$

The components of a skew tensor have the form

$\left[\begin{array}{ccc}0& {S}_{12}& {S}_{13}\\ -{S}_{12}& 0& {S}_{23}\\ -{S}_{13}& -{S}_{23}& 0\end{array}\right]$

so that a skew tensor has only three independent components.

#### Orthogonal tensors

An orthogonal tensor S has the property

$\begin{array}{l}S\cdot {S}^{T}={S}^{T}\cdot S=I\\ {S}^{-1}={S}^{T}\end{array}$

An orthogonal tensor must have $\mathrm{det}\left(S\right)=±1$; a tensor with $\mathrm{det}\left(S\right)=+1$ is known as a proper orthogonal tensor.
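
As an illustration, a rotation about the ${e}_{3}$ axis is proper orthogonal (a NumPy sketch; the angle is arbitrary):

```python
import numpy as np

# A proper orthogonal tensor: rotation by angle theta about the e3 axis.
theta = 0.3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

print(np.allclose(R @ R.T, np.eye(3)))     # R . R^T = I
print(np.isclose(np.linalg.det(R), 1.0))   # det = +1: proper orthogonal
```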