General Relativity For Tellytubbys

The Tensor

Sir Kevin Aylward B.Sc., Warden of the Kings Ale



Vector Refresher

A vector is a tensor of rank one. What a tensor is will shortly become clearer, so bear with us for a bit please, but in short, a tensor may be thought of as a product of vectors with "transformation law" restrictions. There are many other descriptions, but we'll leave that for now.

A vector can be described as numbers or functions pointing in certain directions, i.e.

$$\mathbf{A} = a^1\,\mathbf{e}_1 + a^2\,\mathbf{e}_2 + a^3\,\mathbf{e}_3$$

the a's are the numbers or functions and are called the "components", and the e's in the above are the "basis vectors". Note the position of the indexes on the components and basis vectors. This form of a vector is called the contravariant form. Why? Beats me.

The other form, below, is called the covariant form, for the same reason as above.

$$\mathbf{A} = a_1\,\mathbf{e}^1 + a_2\,\mathbf{e}^2 + a_3\,\mathbf{e}^3$$

These two forms of the same vector are called reciprocal to each other. Once again, always pay attention to which indexes are upstairs and which are downstairs; it will greatly simplify things if this is recognized at the outset as being of deep significance.

First Important Notes:

1) There is no limit, mathematically, to the number of terms; in G.R. there are 4.

2) The basis vectors are not constant or unit vectors in general.

Summation Convention

Instead of writing

$$\mathbf{A} = a^1\,\mathbf{e}_1 + a^2\,\mathbf{e}_2 + a^3\,\mathbf{e}_3$$

We could write

$$\mathbf{A} = \sum_{i=1}^{3} a^i\,\mathbf{e}_i$$

 But this is shortened to

$$\mathbf{A} = a^i\,\mathbf{e}_i$$

That is, the sum sign is dropped, but it is understood that whenever an index appears twice in any product, a summation over that index is implied. Unless otherwise specified, every such repeated index stands for a sum over all of its values.
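As a quick numerical sanity check, a minimal numpy sketch (the components and basis vectors here are made up purely for illustration): numpy's einsum implements exactly this convention, summing over any repeated index letter.

```python
import numpy as np

# Made-up components a^i and (non-orthonormal) basis vectors e_i.
a = np.array([2.0, -1.0, 3.0])            # components a^i
e = np.array([[1.0, 0.0, 0.0],            # basis vector e_1
              [1.0, 1.0, 0.0],            # basis vector e_2
              [0.0, 0.0, 2.0]])           # basis vector e_3

# Explicit sum: A = sum_i a^i e_i
A_explicit = sum(a[i] * e[i] for i in range(3))

# Summation convention: the repeated index i is summed automatically.
A_einsum = np.einsum('i,ij->j', a, e)
```

Both forms give the same vector; the einsum subscript string 'i,ij->j' is just the index bookkeeping written out.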

The Metric

An element of arc length (draw a little diagram, my little Tellytubbys) in the direction of a basis vector can be expressed as

$$d\mathbf{s}_{(i)} = \mathbf{e}_i\,dx^i$$, not summed here.

$$d\mathbf{s} = \mathbf{e}_i\,dx^i$$, note that the implied summation is used here.

or

$$ds^2 = d\mathbf{s}\cdot d\mathbf{s} = (\mathbf{e}_i\cdot\mathbf{e}_j)\,dx^i\,dx^j$$

Where the definition is now made for the metric tensor components, i.e.

$$g_{ij} = \mathbf{e}_i\cdot\mathbf{e}_j \qquad\Rightarrow\qquad ds^2 = g_{ij}\,dx^i\,dx^j$$

Thus we have our first real 2nd order tensor, the metric tensor. Not to be confused with the inches tensor, and as a side note, it's symmetric to boot.
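To make the metric concrete, a short numpy sketch (the basis vectors are invented for illustration): build $g_{ij} = \mathbf{e}_i\cdot\mathbf{e}_j$ from a set of basis vectors and check that $ds^2 = g_{ij}\,dx^i\,dx^j$ agrees with the squared length of the displacement vector $\mathbf{e}_i\,dx^i$.

```python
import numpy as np

# Illustrative (made-up) basis vectors e_i, given in Cartesian components.
e = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Metric components: g_ij = e_i . e_j -- symmetric by construction.
g = np.einsum('ik,jk->ij', e, e)

# ds^2 = g_ij dx^i dx^j for a small coordinate displacement dx.
dx = np.array([0.1, 0.2, -0.1])
ds2 = np.einsum('ij,i,j->', g, dx, dx)

# The same length, computed directly from the displacement vector e_i dx^i.
ds_vec = np.einsum('i,ij->j', dx, e)
```

The two routes to the squared length agree, which is the whole point of the definition.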

Conversion between Covariant and Contravariant Components

Recalling from the above, somewhere, the definition of reciprocal vectors

$$\mathbf{e}^i\cdot\mathbf{e}_j = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$$

which is conveniently expressed as

$$\mathbf{e}^i\cdot\mathbf{e}_j = \delta^i_j$$, and noting what is clearly an obvious definition for the new delta symbol introduced here.

So, given that a vector can have components expressed in contravariant form, the components in covariant form can now be obtained:

$$\mathbf{A}\cdot\mathbf{e}_i = a_j\,\mathbf{e}^j\cdot\mathbf{e}_i = a_j\,\delta^j_i = a_i$$

$$\mathbf{A}\cdot\mathbf{e}_i = a^j\,\mathbf{e}_j\cdot\mathbf{e}_i = g_{ij}\,a^j$$

$$a_i = g_{ij}\,a^j$$

And obviously

$$a^i = g^{ij}\,a_j$$

where the $g^{ij} = \mathbf{e}^i\cdot\mathbf{e}^j$ are the components of the reciprocal metric.

So one can raise and lower indexes by multiplying by the appropriate metric tensor.

And just to make sure we know what the above means, it means a system of equations thus

$$a_i = g_{ij}\,a^j$$, in expanded form, means

$$a_1 = g_{11}\,a^1 + g_{12}\,a^2 + g_{13}\,a^3$$

$$a_2 = g_{21}\,a^1 + g_{22}\,a^2 + g_{23}\,a^3$$

$$a_3 = g_{31}\,a^1 + g_{32}\,a^2 + g_{33}\,a^3$$
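That system of equations is just a matrix product, which suggests a quick numpy sketch (the metric and components are made up for illustration): lower an index with $g_{ij}$, then raise it again with the inverse metric.

```python
import numpy as np

# A made-up symmetric metric and contravariant components, for illustration.
g = np.array([[1.0, 0.5, 0.0],
              [0.5, 2.0, 0.0],
              [0.0, 0.0, 4.0]])
a_up = np.array([1.0, 2.0, 3.0])     # components a^j

# a_i = g_ij a^j -- the "system of equations" is one matrix product.
a_down = g @ a_up

# Raising uses the reciprocal (inverse) metric: a^i = g^{ij} a_j.
g_inv = np.linalg.inv(g)
a_up_again = g_inv @ a_down
```

Raising after lowering must return the original components, since $g^{ij}$ is the matrix inverse of $g_{ij}$.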

Tensor Sums

Tensor expressions usually result in sums of products of terms, such as

$$T^i = A^i{}_j\,B^j + C^i{}_k\,D^k$$

Because a summing index takes on all of its values, you can relabel any index that is repeated in a single product. E.g. the above can also be written as:

$$T^i = A^i{}_m\,B^m + C^i{}_k\,D^k$$

$$T^i = A^i{}_k\,B^k + C^i{}_m\,D^m$$

without changing anything. Write it out to check for yourself.

Notes:

The second term indexes of the equation above do not need to be changed, although if desired feel free to do so.

You cannot swap indexes that are not being summed, unless you swap them everywhere.

A "units" check must make like match like: repeated indexes in a product reduce the order of the tensor by two, and the free indexes on both sides of the equation must match.
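A quick numerical check of the relabelling rule (all the tensors here are random, made-up arrays): compute a sum like $T^i = A^i{}_j\,B^j + C^i{}_k\,D^k$ with explicit loops, then relabel the dummy index in the first product only and confirm nothing changes.

```python
import numpy as np

rng = np.random.default_rng(1)
A, C = rng.standard_normal((2, 3, 3))   # A^i_j and C^i_k (made-up)
B, D = rng.standard_normal((2, 3))      # B^j and D^k (made-up)

# T^i = A^i_j B^j + C^i_k D^k, written with explicit sums.
T1 = np.array([sum(A[i][j] * B[j] for j in range(3)) +
               sum(C[i][k] * D[k] for k in range(3)) for i in range(3)])

# Relabel the dummy index in the first product only (j -> m).
# The dummy letter is pure bookkeeping, so the result is identical.
T2 = np.array([sum(A[i][m] * B[m] for m in range(3)) +
               sum(C[i][k] * D[k] for k in range(3)) for i in range(3)])
```

The free index i, on the other hand, could not be renamed in one term without renaming it everywhere, exactly as the note above says.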

Tensor Transformation Law

The job here is to find out how to calculate components in one coordinate system when one knows the components in another coordinate system.

Consider, as an example, a 3 variable position vector

$$\mathbf{r} = \mathbf{r}(x^1, x^2, x^3)$$

Since the x's are independent by construction, the covariant basis vectors can be seen to be given by

$$\mathbf{e}_i = \frac{\partial \mathbf{r}}{\partial x^i}$$, these vectors are tangent to the coordinate lines.

$$d\mathbf{r} = \frac{\partial \mathbf{r}}{\partial x^i}\,dx^i = \mathbf{e}_i\,dx^i$$

Now consider the same vector represented in two different coordinate systems by

$$\mathbf{A} = a^i\,\mathbf{e}_i = \bar{a}^i\,\bar{\mathbf{e}}_i$$

then,

$$\mathbf{e}_i = \frac{\partial \mathbf{r}}{\partial x^i}$$ and where $$\bar{\mathbf{e}}_i = \frac{\partial \mathbf{r}}{\partial \bar{x}^i}$$

But we also have

$$\bar{\mathbf{e}}_i = \frac{\partial \mathbf{r}}{\partial \bar{x}^i} = \frac{\partial \mathbf{r}}{\partial x^j}\,\frac{\partial x^j}{\partial \bar{x}^i} = \mathbf{e}_j\,\frac{\partial x^j}{\partial \bar{x}^i}$$

$$\mathbf{A} = \bar{a}^i\,\bar{\mathbf{e}}_i = \bar{a}^i\,\frac{\partial x^j}{\partial \bar{x}^i}\,\mathbf{e}_j = a^j\,\mathbf{e}_j$$

hence,

$$a^j = \frac{\partial x^j}{\partial \bar{x}^i}\,\bar{a}^i \qquad\text{or, inverting,}\qquad \bar{a}^i = \frac{\partial \bar{x}^i}{\partial x^j}\,a^j$$

is the transformation law from one coordinate system to another coordinate system.

And with a bit of pissing around you can find out for yourself, for example

$$\bar{T}^{ij} = \frac{\partial \bar{x}^i}{\partial x^k}\,\frac{\partial \bar{x}^j}{\partial x^l}\,T^{kl}$$

is the transformation law for a 2nd order tensor.

And no surprise here, a covariant tensor (vector) transforms as

$$\bar{a}_i = \frac{\partial x^j}{\partial \bar{x}^i}\,a_j$$

Hence, it is now clear why the upstairs and downstairs indexes are where they are.
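As a check on the vector transformation law, a sketch using plane polar coordinates (the sample point and components are arbitrary choices): build the Jacobian $\partial\bar{x}^i/\partial x^j$ for $\bar{x} = (r, \theta)$, transform a set of contravariant components, and transform back with the inverse Jacobian.

```python
import numpy as np

# A point away from the origin, in Cartesian coordinates (x, y).
x, y = 3.0, 4.0
r = np.hypot(x, y)

# Jacobian d(xbar)/d(x) for xbar = (r, theta):
#   dr/dx = x/r,  dr/dy = y/r,  dtheta/dx = -y/r^2,  dtheta/dy = x/r^2
J = np.array([[x / r,      y / r],
              [-y / r**2,  x / r**2]])

# Contravariant components transform as abar^i = (d xbar^i / d x^j) a^j.
a = np.array([1.0, 2.0])      # arbitrary Cartesian components a^j
a_bar = J @ a                 # polar components abar^i

# The inverse Jacobian, d(x)/d(xbar), takes them back again.
a_back = np.linalg.inv(J) @ a_bar
```

Going over and back reproduces the original components, as it must: the two Jacobians are matrix inverses of each other.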

Rounding off this section, consider the vectors normal to the surfaces defined by

$$x^i = \text{constant}$$

These are given by,

$$\nabla x^i$$

Why are they not queer vectors then? Well consider

$$\nabla x^i \cdot \mathbf{e}_j$$

Then

$$\nabla x^i \cdot \mathbf{e}_j = \frac{\partial x^i}{\partial y^m}\,\frac{\partial y^m}{\partial x^j}$$

where the $y^m$ are Cartesian coordinates, so by the chain rule

$$\nabla x^i \cdot \mathbf{e}_j = \frac{\partial x^i}{\partial x^j} = \delta^i_j$$

That is, the $\nabla x^i$ are just the reciprocal basis vectors $\mathbf{e}^i$.

and considering

$$\bar{\mathbf{e}}^i = \nabla \bar{x}^i = \frac{\partial \bar{x}^i}{\partial x^j}\,\nabla x^j = \frac{\partial \bar{x}^i}{\partial x^j}\,\mathbf{e}^j$$

they are seen to transform contravariantly, as the upstairs index demands.
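Finally, a numerical check that the coordinate gradients really are the reciprocal basis, again in plane polar coordinates (the sample point is arbitrary): the tangent vectors $\mathbf{e}_i$ and the gradients $\nabla x^i$ should satisfy $\nabla x^i \cdot \mathbf{e}_j = \delta^i_j$.

```python
import numpy as np

# Polar coordinates (r, t) on the plane: x = r cos(t), y = r sin(t).
r, t = 2.0, 0.6

# Covariant basis vectors e_i = dr_vec/d(coordinate): tangent to coord lines.
e_r = np.array([np.cos(t), np.sin(t)])
e_t = np.array([-r * np.sin(t), r * np.cos(t)])

# Gradients of the coordinate functions r(x,y) and theta(x,y):
#   grad r = (x/r, y/r),  grad theta = (-y/r^2, x/r^2)
x, y = r * np.cos(t), r * np.sin(t)
grad_r = np.array([x / r, y / r])
grad_t = np.array([-y / r**2, x / r**2])

# The gradients are the reciprocal basis: grad(x^i) . e_j = delta^i_j.
delta = np.array([[grad_r @ e_r, grad_r @ e_t],
                  [grad_t @ e_r, grad_t @ e_t]])
```

The resulting matrix of dot products is the identity, i.e. the Kronecker delta.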

So, it's down with a pint of Guinness to let this all sink in.


© Kevin Aylward 2000 - 2015

All rights reserved

The information on the page may be reproduced providing that this source is acknowledged.

Website last modified 31st May 2015

http://www.kevinaylward.co.uk/gr/index.html

www.kevinaylward.co.uk