
2.2.2 SECOND-ORDER TENSORS

A second-order tensor S is a linear mapping that associates a given vector

u with a second vector v as,

v = Su (2.15)

Italic boldface capitals will be used throughout this chapter to denote second-order tensors. Later on, however, the distinction between lower- and uppercase quantities will be needed for more important purposes and no explicit differentiation will be made between vectors and second-order tensors. The term linear in the above definition is used to imply that given two arbitrary vectors u1 and u2 and arbitrary scalars α and β, then,

S(αu1 + βu2) = αSu1 + βSu2    (2.16)

Recognizing in Equation (2.15) that u and v are vectors and thus coordinate-independent, the tensor S that operates on u to give v must also, in a similar sense, have a coordinate-independent nature. If the vectors u and v can be expressed in terms of components in a specific basis, namely, Equation (2.1), then it is to be expected that the tensor S can somehow be expressed in terms of components in the same basis; this is shown below. Consequently, a tensor will have different components in different bases, but the tensor

itself will remain unchanged.
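The linearity property (2.16) and the idea of acting through components can be checked numerically. The following minimal Python sketch (the matrix S, vectors, and scalars are arbitrary illustrative choices, not from the text) applies a tensor through its 3 × 3 component matrix:

```python
# Minimal sketch: a second-order tensor acts through its 3x3 component
# matrix in a Cartesian basis. S, u1, u2, alpha, beta are arbitrary choices.

def matvec(S, u):
    """Apply tensor S (3x3 nested list) to vector u (list of 3): v = S u."""
    return [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]

S = [[2.0, 1.0, 0.0],
     [0.0, 3.0, 1.0],
     [1.0, 0.0, 1.0]]
u1 = [1.0, 2.0, 3.0]
u2 = [-1.0, 0.5, 2.0]
alpha, beta = 2.0, -3.0

# Linearity (2.16): S(alpha*u1 + beta*u2) = alpha*S(u1) + beta*S(u2)
lhs = matvec(S, [alpha * a + beta * b for a, b in zip(u1, u2)])
rhs = [alpha * a + beta * b for a, b in zip(matvec(S, u1), matvec(S, u2))]
print(all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs)))  # prints True
```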

The simple example of a second-order tensor that satisfies the above definition is the identity tensor I, which maps any given vector u onto

itself as,

u = Iu (2.17)

Another example is the transformation tensor Q shown in Figure 2.2, which

rotates vectors in space in such a way that the standard Cartesian base vectors e1, e2, and e3 become e′1, e′2, and e′3, that is,

e′i = Qei;    i = 1, 2, 3    (2.18)

The relationship between this important tensor and the angle cosines Qij

introduced in the previous section will be elaborated below.

Simple operations such as the sum, product, and inverse of second-order

tensors are defined in an obvious manner so that for any vector u,

(S1 + S2)u = S1u + S2u    (2.19a)
(S1S2)u = S1(S2u)    (2.19b)
S⁻¹S = I    (2.19c)

Additionally, the transpose of a tensor S is defined as the tensor Sᵀ, which for any two vectors u and v satisfies,

u · Sv = v · Sᵀu    (2.20)

For example, the transpose of the identity tensor is again the identity I,

because use of the above definition shows that for any pair of vectors u and v, Iᵀ satisfies,

v · Iᵀu = u · Iv
        = u · v
        = v · u
        = v · Iu    (2.21)

A tensor S that, like the identity, satisfies Sᵀ = S is said to be symmetric; whereas a tensor for which Sᵀ = −S is said to be skew. As an example of a skew tensor consider the tensor W_w, associated with an arbitrary vector w, defined in such a way that for any vector u,

W_w u = w × u    (2.22)

where × denotes the standard vector or cross product. Proof that W_w is skew follows from the cyclic commutative property of the mixed vector product, which, for any u and v, gives,

v · W_wᵀu = u · W_w v
          = u · (w × v)
          = −v · (w × u)
          = −v · W_w u    (2.23)
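This construction is easy to verify numerically. In the Python sketch below (the helper names and the particular w and u are our own illustrative choices), the component matrix of W_w, which appears later in Equation (2.38), reproduces the cross product and is skew:

```python
# Sketch of the skew tensor W_w of Equation (2.22): its component matrix
# (Equation (2.38)) reproduces the cross product w x u. Helper names and
# the particular vectors are illustrative choices, not from the text.

def skew(w):
    """Component matrix of W_w, so that W_w u = w x u."""
    w1, w2, w3 = w
    return [[0.0, -w3,  w2],
            [ w3, 0.0, -w1],
            [-w2,  w1, 0.0]]

def cross(w, u):
    return [w[1] * u[2] - w[2] * u[1],
            w[2] * u[0] - w[0] * u[2],
            w[0] * u[1] - w[1] * u[0]]

def matvec(S, u):
    return [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]

w = [1.0, -2.0, 0.5]
u = [3.0, 1.0, 2.0]

# W_w u = w x u
assert all(abs(a - b) < 1e-12 for a, b in zip(matvec(skew(w), u), cross(w, u)))
# W_w is skew: its matrix equals minus its transpose
W = skew(w)
assert all(W[i][j] == -W[j][i] for i in range(3) for j in range(3))
```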

A final example of general interest is the transpose of the transformation tensor Q. Applying the definition of the transpose tensor to the new and old base vectors and recalling the definition of Q given in Equation (2.18) gives,

ej · Qᵀe′i = e′i · Qej
           = e′i · e′j
           = ej · ei;    i, j = 1, 2, 3    (2.24)

Comparing the first and last expressions in this equation leads to,

Qᵀe′i = ei;    i = 1, 2, 3    (2.25)

which implies that the transpose tensor Qᵀ rotates the new base vectors e′i back to their original positions ei. Moreover, combining Equations (2.25) and (2.18) gives,

QᵀQ = I    (2.26)

Tensors that satisfy this important property are said to be orthogonal. In

fact, any arbitrary second-order tensor A can be expressed as the sum of

a symmetric plus a skew tensor or as the product of an orthogonal times a

symmetric tensor as,

A = S + W;    Sᵀ = S,  Wᵀ = −W    (2.27a)
A = QS;       Sᵀ = S,  QᵀQ = I    (2.27b)

The first expression is rather trivial and follows from taking S = (A + Aᵀ)/2 and W = (A − Aᵀ)/2; whereas the second, less obvious equation, is known

as the polar decomposition and plays a crucial role in continuum mechanics.

Many examples of symmetric and skew tensors will occur throughout the

remaining continuum mechanics sections of this text.
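The symmetric/skew split (2.27a) can be sketched directly in a few lines; the matrix A below is an arbitrary illustrative choice. The polar decomposition (2.27b) needs more machinery (for example, a singular value decomposition) and is not reproduced here:

```python
# Sketch of the additive split (2.27a): A = S + W with S symmetric and W
# skew. The matrix A is an arbitrary illustrative choice.

A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 10.0]]

S = [[(A[i][j] + A[j][i]) / 2 for j in range(3)] for i in range(3)]  # (A + A^T)/2
W = [[(A[i][j] - A[j][i]) / 2 for j in range(3)] for i in range(3)]  # (A - A^T)/2

assert all(S[i][j] == S[j][i] for i in range(3) for j in range(3))    # S^T = S
assert all(W[i][j] == -W[j][i] for i in range(3) for j in range(3))   # W^T = -W
assert all(abs(S[i][j] + W[i][j] - A[i][j]) < 1e-12
           for i in range(3) for j in range(3))                       # A = S + W
```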

Second-order tensors can often be derived from the dyadic or tensor

product of two vectors u and v to give a tensor, denoted u ⊗ v, which to any arbitrary third vector w assigns the following vector,

(u ⊗ v)w = (w · v)u    (2.28)

where (w · v)u is obviously the projection in the u direction of the scalar component of w in the v direction. This seemingly bizarre definition of the tensor u ⊗ v transpires to make physical sense, particularly in the case of

the stress tensor, from which, incidentally, the word tensor originates, from

the association with a tensile stress.

The tensor product satisfies the following properties:

(u ⊗ v)ᵀ = (v ⊗ u)    (2.29a)
S(u ⊗ v) = (Su ⊗ v)    (2.29b)
(u ⊗ v)S = (u ⊗ Sᵀv)    (2.29c)
u ⊗ (v1 + v2) = u ⊗ v1 + u ⊗ v2    (2.29d)
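Definition (2.28) and property (2.29b) can be checked with a short Python sketch; the helper functions and the particular u, v, w, and S are our own illustrative choices:

```python
# Sketch checking Definition (2.28) and property (2.29b) numerically:
# (u (x) v)w = (w . v)u, and S(u (x) v) = (Su) (x) v. Values are arbitrary.

def outer(u, v):
    """Component matrix of the dyadic product u (x) v."""
    return [[u[i] * v[j] for j in range(3)] for i in range(3)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(S, u):
    return [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

u, v, w = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]
S = [[1.0, 0.0, 2.0], [0.0, 3.0, 0.0], [1.0, 1.0, 1.0]]

# Definition (2.28): (u (x) v)w = (w . v)u
lhsw = matvec(outer(u, v), w)
rhsw = [dot(w, v) * ui for ui in u]
assert all(abs(a - b) < 1e-9 for a, b in zip(lhsw, rhsw))

# Property (2.29b): S(u (x) v) = (Su) (x) v
lhsS = matmul(S, outer(u, v))
rhsS = outer(matvec(S, u), v)
assert all(abs(lhsS[i][j] - rhsS[i][j]) < 1e-9
           for i in range(3) for j in range(3))
```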


EXAMPLE 2.2: Proof of (2.29b)

Any of the properties of the dyadic product can be easily demonstrated using its Definition (2.28). For example we can prove (2.29b) by showing that for any vector w we have:

S(u ⊗ v)w = S[(u ⊗ v)w]
          = S[(v · w)u]
          = (v · w)Su
          = (Su ⊗ v)w

Now recall that a vector can be expressed in terms of a linear combination of the base vectors e1, e2, and e3 as shown in Equation (2.1). Hence

it is not unreasonable to suggest that in a similar fashion a tensor could be

expressed in terms of a linear combination of dyadic products of these base

vectors. In particular, the nine tensors ei ⊗ ej for i, j = 1, 2, 3 obtained

by the dyadic product of the three Cartesian base vectors form a basis on

which any second-order tensor can be expressed. For instance, the identity

tensor I can be written as,

I = Σ_{i=1}^{3} ei ⊗ ei    or    I = Σ_{i,j=1}^{3} δij ei ⊗ ej    (2.30a,b)

In order to verify these expressions simply note that when either equation is

applied to a vector u, use of Definition (2.28) together with Equations (2.6)

and (2.19a) gives,

Iu = (Σ_{i=1}^{3} ei ⊗ ei)u
   = Σ_{i=1}^{3} (u · ei)ei
   = Σ_{i=1}^{3} ui ei
   = u    (2.31)

In general, we can express any given tensor S as a linear combination of

ei ⊗ ej in terms of a set of nine components Sij as,

S = Σ_{i,j=1}^{3} Sij ei ⊗ ej    (2.32)

where the components Sij can be obtained in a manner similar to that used


in Equation (2.6) for vectors as,

Sij = ei · Sej (2.33)

Proof of this expression follows from Equations (2.32) and (2.28) as,

ei · Sej = ei · (Σ_{k,l=1}^{3} Skl ek ⊗ el)ej
         = ei · Σ_{k,l=1}^{3} Skl (ej · el)ek
         = ei · Σ_{k,l=1}^{3} Skl δlj ek
         = Σ_{k,l=1}^{3} Skl δlj δik
         = Sij    (2.34)
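Equation (2.33) can also be verified numerically: extracting ei · (Sej) for the Cartesian basis recovers exactly the stored components. The matrix S below is an arbitrary illustrative choice:

```python
# Sketch of Equation (2.33): the components of S in the Cartesian basis
# are recovered as S_ij = e_i . (S e_j). S is an arbitrary choice.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(S, u):
    return [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]

S = [[1.0, 4.0, 7.0],
     [2.0, 5.0, 8.0],
     [3.0, 6.0, 9.0]]
e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # Cartesian basis

components = [[dot(e[i], matvec(S, e[j])) for j in range(3)]
              for i in range(3)]
assert components == S  # S_ij = e_i . S e_j recovers every matrix entry
```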

For example, the transformation tensor Q can be expressed in terms of its

components Qij as,

Q = Σ_{i,j=1}^{3} Qij ei ⊗ ej    (2.35)

where, using Equation (2.33) together with Equation (2.18), the components

Qij can now be seen to coincide with the angle cosines introduced in the

previous section as,

Qij = ei · Qej = ei · e′j    (2.36)
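As a concrete sketch (our illustrative choice, not from the text), take the new basis produced by a rotation about e3 through an angle θ; the matrix of components Qij = ei · e′j then satisfies QᵀQ = I as in Equation (2.26):

```python
# Sketch: direction cosines Q_ij = e_i . e'_j for a rotation about e3 by
# an angle theta (an illustrative choice); Q^T Q = I as in Equation (2.26).
import math

theta = 0.3
c, s = math.cos(theta), math.sin(theta)

# New base vectors e'_j expressed in the old basis.
new_basis = [[c, s, 0.0],      # e'_1
             [-s, c, 0.0],     # e'_2
             [0.0, 0.0, 1.0]]  # e'_3
e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # old basis

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Q_ij = e_i . e'_j  (Equation (2.36))
Q = [[dot(e[i], new_basis[j]) for j in range(3)] for i in range(3)]

# Q^T Q = I  (Equation (2.26))
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(3)) for j in range(3)]
       for i in range(3)]
assert all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(3) for j in range(3))
```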


EXAMPLE 2.3: Components of u ⊗ v

We can evaluate the components of the tensor product u ⊗ v in two different ways. Firstly, direct use of Equation (2.33) and Definition (2.28) gives,

(u ⊗ v)ij = ei · (u ⊗ v)ej
          = (ei · u)(v · ej)
          = ui vj

Alternatively, use of Equation (2.1) and Property (2.29d) gives,

u ⊗ v = (Σ_{i=1}^{3} ui ei) ⊗ (Σ_{j=1}^{3} vj ej)
      = Σ_{i,j=1}^{3} ui vj ei ⊗ ej

From which the same components of the dyadic product are immediately identified.

The components of any second-order tensor S can be arranged in the

form of a 3 × 3 matrix as,

        ⎡ S11  S12  S13 ⎤
[S] =   ⎢ S21  S22  S23 ⎥    (2.37)
        ⎣ S31  S32  S33 ⎦

For instance, use of Equation (2.33) shows that the skew tensor W_w defined

in Equation (2.22) can be written as,

          ⎡  0   −w3   w2 ⎤
[W_w] =   ⎢  w3   0   −w1 ⎥    (2.38)
          ⎣ −w2   w1   0  ⎦

where w1 , w2 , and w3 are the components of w.

Tensorial expressions can now be duplicated in terms of matrix operations with the tensor components. For instance, it is simple to show that

the basic Equation (2.15) becomes,

[v] = [S][u] (2.39)
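A minimal sketch of this correspondence (the components of S and u below are arbitrary illustrative choices):

```python
# Sketch of Equation (2.39): once components are arranged in matrices,
# the tensor action v = S u becomes the matrix product [v] = [S][u].
# The particular S and u are arbitrary illustrative choices.

S = [[2.0, 0.0, 1.0],
     [1.0, 3.0, 0.0],
     [0.0, 1.0, 1.0]]
u = [1.0, 2.0, 3.0]

# [v]_i = sum_j S_ij u_j
v = [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]
print(v)  # prints [5.0, 7.0, 5.0]
```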