By now you will have noticed a theme in all the prior sections. A transformation can either map everything in an n-dimensional space to somewhere else in that same space, or squish that n-dimensional space into a lower-dimensional space like a line or a plane.

You will probably also appreciate that some transformations squash or expand space more than others. For simple changes to the unit vectors, it is pretty obvious how much space gets squashed or expanded.

For other transformations, it's not quite as obvious how much space is being squashed or expanded.

In every case though, there is a computation we can do to work out how much space is being squished or expanded overall regardless of the other effects of the transformation. The result of that computation is called the Determinant.

To interpret the Determinant, we can think about what happens to a 1x1 square in the 2D case or a 1x1x1 cube in the 3D case. The factor by which that square's area or that cube's volume changes is the Determinant. Because the transformation is linear, every other area or volume is going to change by that same factor.

Now, as for computing this thing - let's start from the 2D case.

When we are just dealing with squares this is quite easy - it's just basic high school geometry: width times height. The determinant is literally the area defined by the scale of each of the two basis vectors.

You might think that doing more complicated things like shears or rotations would complicate matters, but as it turns out, they do not.

Even though this shape changes quite substantially, the determinant of the transformation is still 1. The reason for that is that we can imagine that the unit square stretches out to a rectangle twice its size, but then we need to subtract away the empty right-triangles on either end, which grow as fast as that rectangle does.

Things start to get a bit more interesting when we have shears in two directions.

Under this transformation, the x-shear and y-shear are equal, such that the transformed basis vectors converge on a point. Notice how as that happens, the area itself gets smaller and smaller even though the magnitude of our basis vectors increases. However, if we scale one of the basis vectors, notice that they will not converge on the same point anymore:

With these intuitions, we should be pretty comfortable saying that the determinant in two dimensions has something to do with the relationship between the left and right diagonals. As the product of the left diagonal grows, the overall scale factor for area affected by the transformation increases, because the basis vectors are getting further and further away from each other. As the product of the right diagonal grows, the scale factor actually shrinks, because the basis vectors are getting closer to each other.

Thus, overall, the determinant is the difference between the growth factors, the left diagonal, and the shrinking factors, the right diagonal.

So it should be no surprise then, that we can represent the determinant for a 2x2 matrix like this:

$$M = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

with the following formula:

$$\det M = ad - bc$$
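As a quick sketch of this formula in code (the function name here is my own, not from the text):

```python
def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: the product of
    the left diagonal minus the product of the right diagonal."""
    a, b = m[0]
    c, d = m[1]
    return a * d - b * c

# The identity does nothing to space: determinant 1.
print(det2([[1, 0], [0, 1]]))  # 1
# A shear changes the shape of the unit square but not its area.
print(det2([[1, 1], [0, 1]]))  # 1
# Scaling x by 2 and y by 3 scales every area by a factor of 6.
print(det2([[2, 0], [0, 3]]))  # 6
```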

How about generalizing to a higher dimensional space? For a three dimensional space we could imagine a volume made up of three two-dimensional faces, one along the xy plane, one along the xz plane and one along the yz plane.

Do you notice something that seems off about this visualization though? One of the planes appears to be facing in the negative direction. Let's have a look at the matrix at this point.

Recall that this 3x3 matrix can be described as a set of planes, each with two-component basis vectors. If we go along the first row, observe that there are three possible 2x2 matrices in the second and third rows.

For each of these, observe that the value in the first row is a scale factor, and the corresponding 2x2 determinant is also a scale factor. So we have three different scale factors along with three different areas in play, each of which gives us a 3D volume. Of course, notice in this case that two of those volumes are zero.

Now what will happen if we start to scale on a different axis? Well, the determinant (and hence the area) of the corresponding 2x2 plane will increase and so the volume will increase, even though the scale factor in the first row never changes.


Okay, let's step up the complexity a little. What happens if we shear along one of the planes?


Like the 2D case, the volume of the unit cube being transformed here does not change. As it grows more towards the x axis, we just take away an equivalent amount of volume on each side. That can be visualized below.


Similarly to the 2D case, if we shear on two of the components at once, the volume will actually start to collapse into something that looks an awful lot like a plane. Can you see why? Look at what happens to the basis vectors.

We are not done yet however. What happens if you add some shear on another plane?

You will notice that even though we added that extra shear, the basis vectors still ended up being coplanar, as opposed to collinear. So we still ended up with a plane.

This time, we shall add some additional shear on yet another plane.

Now the volume kind of ends up becoming an inverted and skewed version of itself, such that it has a negative volume, as though it had been turned inside out.

With those intuitions, we should find the algorithm for a 3x3 determinant relatively acceptable. We go across the first row, and for each entry take the sub-determinant of the 2x2 matrix formed from the second and third rows, excluding that entry's column. Recall that the scalar value in the first row sort of gives volume to the area in another dimension. Then, we subtract the second scaled sub-determinant, since that was just collapsing space on to a plane and makes the overall volume smaller. Then, the third scaled sub-determinant expands the overall volume again, so we add that.

Or more generally, for a 3x3 matrix of the form:

$$A = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}$$

the determinant is:

$$\det A = a(ei - fh) - b(di - fg) + c(dh - eg)$$
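The cofactor expansion along the first row can be sketched directly in code (again, the function name is my own):

```python
def det3(m):
    """Determinant of a 3x3 matrix (a list of rows) by cofactor
    expansion along the first row: plus a, minus b, plus c, each
    times the 2x2 sub-determinant from the remaining two rows."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# The identity maps the unit cube to itself: volume 1.
print(det3([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # 1
# Scaling each axis by 2, 3 and 4 scales volume by 24.
print(det3([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))  # 24
```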

As it turns out, this same algorithm can be followed along any row or column, applied to the corresponding sub-determinants, though special attention should be paid to whether each sub-determinant is negated in the overall equation. The general rule for a 3x3 looks something like this:

$$\det A = \sum_{j=1}^{3} (-1)^{i+j} a_{ij} M_{ij}$$

where $M_{ij}$ is the determinant of the 2x2 matrix left over when you delete row $i$ and column $j$.

The rule for a 4x4 or even an n-by-n matrix is very similar - it is just recursively defined in terms of scalars times sub-determinants. Which makes sense if you remember the way that dimensionality itself is defined. A plane is just a line of every possible vertical line. A volume is just a line of every possible plane, and so on.

Now, computing the determinant in this way for a 3x3 matrix is already tedious enough, and it gets exponentially more tedious as you increase the number of dimensions. Thankfully, there is an easier way of doing it by exploiting a few invariants.

Observe what happens when you try to compute the determinant of an upper triangular matrix like the following when you go down the first column:

$$U = \begin{pmatrix} a & b & c \\ 0 & e & f \\ 0 & 0 & i \end{pmatrix}$$

Notice that because most of the scaled sub-determinants are zero, the overall recursive formula expands quite nicely:

$$\det U = a(ei - f \cdot 0) - 0 + 0 = aei$$
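In other words, for a triangular matrix the determinant is just the product of the diagonal entries. A minimal sketch of that:

```python
def det_triangular(m):
    """For an upper (or lower) triangular matrix, the cofactor
    expansion collapses and the determinant is simply the
    product of the entries on the main diagonal."""
    det = 1
    for i, row in enumerate(m):
        det *= row[i]
    return det

print(det_triangular([[2, 5, 7],
                      [0, 3, 1],
                      [0, 0, 4]]))  # 24, i.e. 2 * 3 * 4
```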

Now the question is how we can get our matrix into this form so we can do the computation in a far simpler way. The answer, of course, is Elementary Row Operations. However, while Elementary Row Operations preserve the row space, they do not preserve the overall characteristics of the transformation, including the degree to which it scales space or flips space over on itself.

The fact that they do not should be relatively easy to see when you consider the end result of a perfect row elimination. You might turn something messy into what is essentially the identity matrix, which does nothing to the space. Both matrices have the same row space, but the transformation is no longer there.

So when we are performing row operations we need to keep track of some extra information which allows us to undo the changes that affect the determinant.

Type 1 Row Exchanging: when we exchange two rows, we are actually flipping space over on itself, which means that the determinant will negate. So we need to keep track of how many times we negate the determinant so that we can undo it later.

Type 2 Row Scaling: recall that when we multiply an entire row by a scalar, the determinant is going to scale by exactly that amount. So we need to keep track of that scalar for later, since that is one factor by which the row-reduced determinant would have been scaled.

Type 3 Right Linear Combinations: paradoxically, adding scalar multiples of any row to any other row (so long as the destination row itself is not scaled as a part of the combination, hence the term right linear combination), will not change the determinant at all. Consider a shear where we add a scalar multiple of the second row to the first, to see why this is so:
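A sketch of why, in the 2x2 case: add $k$ times the second row to the first and expand using the 2x2 formula. The extra terms cancel:

$$\begin{vmatrix} a + kc & b + kd \\ c & d \end{vmatrix} = (a + kc)d - (b + kd)c = ad - bc + kcd - kdc = ad - bc$$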

With all that in mind, we can compute the determinant of a 3x3 matrix in a far more straightforward fashion. Observe how the nature of the transformation changes, and the factor by which the volume changes, as we apply row reduction operations.

Before we begin, observe a few things about this transformation. First, note that it inverts space along one of the axes. Also notice that it appears to expand space as well.

First, subtract 2 times the first row from the third. This is a Type 3 operation and will not affect the determinant.

Now subtract the second row from the first. This is also a Type 3 operation and does not affect the determinant.

Then add half of the first row to the second. This is Type 3 and also does not affect the determinant.

Add three times the second row to the third, which is also Type 3.

Now, from here we can compute the determinant directly by just multiplying along the diagonal (recall that the determinant has not yet changed):

For the sake of completeness, clean up the matrix by multiplying the first row by , the second row by -1 and the third row by . These operations do change the determinant by an overall factor of .

Computing the determinant of this newly row-reduced matrix is straightforward:

And to recover our original determinant, we need to do the inverse of the change to the determinant, that is, divide by
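The whole procedure can be sketched as an algorithm: row-reduce to triangular form using only Type 1 and Type 3 operations, track the sign flips, then multiply along the diagonal. This is a minimal illustration, not a production routine (a real one would pick pivots for numerical stability):

```python
from fractions import Fraction

def det(matrix):
    """Determinant via row reduction. Type 1 exchanges negate the
    determinant, so we track a sign; Type 3 right linear combinations
    leave it unchanged, so they need no bookkeeping. The reduced
    matrix is triangular, so we finish by multiplying the diagonal."""
    m = [[Fraction(x) for x in row] for row in matrix]
    n = len(m)
    sign = 1
    for col in range(n):
        # Find a row at or below the diagonal with a nonzero pivot.
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)  # space collapses: determinant is zero
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]  # Type 1: flips space
            sign = -sign
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            # Type 3: subtract a multiple of the pivot row (det unchanged).
            m[r] = [x - factor * p for x, p in zip(m[r], m[col])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

print(det([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3
```

Using exact `Fraction` arithmetic sidesteps floating-point error, which keeps the focus on the row operations themselves.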

A few things to observe about the determinant based on what we have seen here. First, it should be pretty intuitive to see that if you scale an entire matrix by some scalar, you will also be scaling the determinant by that scalar, raised to the power of the number of rows in that matrix. For instance, scaling the 3x3 identity matrix by 3 will change the determinant to 27, since it scales by 3 in the x direction, 3 in the y direction and 3 in the z direction.

Secondly, if you think about two matrices as transformations, the determinant of their product is the product of their determinants: applying one transformation after another scales space by each factor in turn.