On the Subject of Matrices

Go take a linear algebra course, we'll wait... but the bomb won't.

To defuse the module all you have to do is answer several questions about a specific matrix. No tricks involved, very simple... until you have to diagonalize matrices.

NOTE: Each question has a new matrix so be sure to ask your defuser about it at each stage. Number of stages varies.

Warning: I suck at teaching and writing manuals, proceed to read the appendix at your own risk or find relevant resources on the internet.

This manual covers only the very basics (even that, poorly) needed to defuse the module. It is highly recommended to learn from other sources such as lectures, Wikipedia, and/or other online resources.

Appendix content:

  • Introduction to Matrices
  • Shapes of Matrices
    • Diagonal
    • Triangular
    • Symmetric
    • Skew-Symmetric
    • Centro-Symmetric
  • Matrix Related Numbers
    • Small determinants
    • Minors & Cofactors
    • Determinants
    • Permanents
    • Traces
    • Invertible Matrix
  • Matrix Bonanza Squared
    • Involutory Matrix
    • Idempotent Matrix
  • Solving Equations
    • Row Echelon form
    • Matrix Nullity
    • Matrix Rank
  • Final Boss
    • Eigenvalues
    • Eigenvectors
    • Diagonalizable Matrix

On the Super-subject of Linear Algebra

Linear algebra is a very deeply studied field concerning linear transformations and the objects on which such transformations can be carried out (vectors).

While vectors are very deeply studied and cover a wide range of applications... we have no damn time for that, so we have to simplify!

Vectors

For our purposes a vector is a list of real numbers, and all vectors here have exactly 3 numbers.

Vectors can be indexed. If we have a vector V then V_i is the i-th number from the top in the vector.
Example:

W = ( 2 7 6 ) , W_1 = 2 , W_2 = 7 , W_3 = 6

Matrices

For our purposes matrices are a 3x3 grid of real numbers with which we can do various operations.

Matrices can be indexed too, first number is row and second is column:

A = ( 1 2 3 ; 4 5 6 ; 7 8 9 )

(Rows are written left to right and separated by semicolons, so the rows of A are 1 2 3, then 4 5 6, then 7 8 9.)

A_1,1 = 1 , A_1,2 = 2 , A_1,3 = 3 , A_2,1 = 4 , A_2,2 = 5 , A_2,3 = 6 , A_3,1 = 7 , A_3,2 = 8 , A_3,3 = 9

Some equations will have letters as indices (usually i and j). This means that to get a specific entry in the matrix or vector, you first plug the indices into the places of the letters and then calculate normally.

Example:

A_i,j = 10 * i + j   =>   A = ( 11 12 13 ; 21 22 23 ; 31 32 33 )

Operations

Matrices and vectors can both be added to another matrix/vector of the same size, and both can be multiplied by a number.

Vectors

Vectors can be added together if they have the same size, and they can be multiplied by a number (so-called scalar multiplication).

Formulae:

C = A + B  where  C_i = A_i + B_i
B = a A  where  B_i = a * A_i

Example:

( 2 0 1 ) + 2 ( 3 3 0 ) = ( 8 6 1 )
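The two operations above can be sketched in a few lines of plain Python (3-entry lists as vectors; the function names are my own, not part of the module):

```python
def vec_add(a, b):
    # element-wise sum of two equal-length vectors
    return [x + y for x, y in zip(a, b)]

def vec_scale(c, a):
    # multiply every entry by the scalar c
    return [c * x for x in a]

# The worked example above: (2 0 1) + 2 * (3 3 0) = (8 6 1)
print(vec_add([2, 0, 1], vec_scale(2, [3, 3, 0])))  # [8, 6, 1]
```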
Matrices

Just like vectors, matrices of the same shape can be added together and can be multiplied by a scalar number.

Formulae:

C = A + B  where  C_i,j = A_i,j + B_i,j
B = a A  where  B_i,j = a * A_i,j

Example:

( 1 2 3 ; 4 5 6 ; 7 8 9 ) + 3 ( 1 0 0 ; 0 1 0 ; 0 0 1 ) = ( 4 2 3 ; 4 8 6 ; 7 8 12 )
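The same idea works entry by entry for matrices. A minimal sketch in plain Python (nested lists as matrices; names are my own):

```python
def mat_add(A, B):
    # element-wise sum of two same-shape matrices
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_scale(c, A):
    # multiply every entry by the scalar c
    return [[c * x for x in row] for row in A]

A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# The worked example above: A + 3*I
print(mat_add(A, mat_scale(3, I)))  # [[4, 2, 3], [4, 8, 6], [7, 8, 12]]
```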
Matrix-Matrix multiplication
C = A B  where  C_i,j = Σ (k = 1 to n) A_i,k * B_k,j

n is the size of the matrix. Since we are calculating with only 3x3 matrices we can replace n with 3 and write out the sum as:

C = A B  where  C_i,j = A_i,1 * B_1,j + A_i,2 * B_2,j + A_i,3 * B_3,j

Example:

( 6 1 -4 ; 3 -1 2 ; 3 5 1 ) ( -4 2 1 ; 1 -1 0 ; -2 7 4 ) =

( 6*(-4)+1*1+(-4)*(-2)   6*2+1*(-1)+(-4)*7   6*1+1*0+(-4)*4 ;
  3*(-4)+(-1)*1+2*(-2)   3*2+(-1)*(-1)+2*7   3*1+(-1)*0+2*4 ;
  3*(-4)+5*1+1*(-2)      3*2+5*(-1)+1*7      3*1+5*0+1*4 )
= ( -15 -17 -10 ; -17 21 11 ; -9 8 7 )
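The expanded sum above is exactly a triple loop over i, j, and k. A minimal sketch in plain Python (names are my own):

```python
def mat_mul(A, B):
    # C[i][j] = sum over k of A[i][k] * B[k][j]
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[6, 1, -4], [3, -1, 2], [3, 5, 1]]
B = [[-4, 2, 1], [1, -1, 0], [-2, 7, 4]]
# The worked example above
print(mat_mul(A, B))  # [[-15, -17, -10], [-17, 21, 11], [-9, 8, 7]]
```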
Matrix-Vector multiplication
w = A v  where  w_i = Σ (k = 1 to n) A_i,k * v_k

We can expand the sum for n=3 like in the matrix multiplication operation:

w = A v  where  w_i = A_i,1 * v_1 + A_i,2 * v_2 + A_i,3 * v_3

Example:

( 6 1 -4 ; 3 -1 2 ; 3 5 1 ) ( 4 1 3 ) = ( 6*4+1*1+(-4)*3 ; 3*4+(-1)*1+2*3 ; 3*4+5*1+1*3 ) = ( 13 17 20 )
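This is the same sum with one index fewer. A plain-Python sketch (names are my own):

```python
def mat_vec(A, v):
    # w[i] = sum over k of A[i][k] * v[k]
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

# The worked example above
print(mat_vec([[6, 1, -4], [3, -1, 2], [3, 5, 1]], [4, 1, 3]))  # [13, 17, 20]
```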

On the Sub-subject of Shapes of Matrices

Diagonal: A matrix is diagonal if all the values NOT on the main diagonal (going from top-left to bottom-right) are 0.
    Example: ( 5 0 0 ; 0 0 0 ; 0 0 1 )

Triangular: A matrix is triangular if all the values below or above the main diagonal are 0.
    Example: ( 4 0 0 ; 9 3 0 ; 0 7 1 )

Symmetric: A matrix is symmetric if every value is equal to its mirror reflection across the main diagonal.
    Example: ( 4 9 8 ; 9 3 7 ; 8 7 1 )

Skew-Symmetric: A matrix is skew-symmetric if every value is equal to negative its mirror reflection across the main diagonal. NOTE: This means that the diagonal must be full of 0s.
    Example: ( 0 9 -8 ; -9 0 7 ; 8 -7 0 )

Centro-Symmetric: A matrix is centro-symmetric if every value is equal to its counterpart mirrored through the center of the matrix.
    Example: ( 3 0 8 ; 7 6 7 ; 8 0 3 )

Identity: The identity matrix is a diagonal matrix with only ones on the diagonal.
    Example: ( 1 0 0 ; 0 1 0 ; 0 0 1 )
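Each shape condition above is a one-line check over all entries. A minimal sketch in plain Python (3x3 matrices as nested lists with 0-based indices; function names are my own):

```python
def is_diagonal(A):
    # every off-diagonal entry is 0
    return all(A[i][j] == 0 for i in range(3) for j in range(3) if i != j)

def is_symmetric(A):
    # every entry equals its mirror across the main diagonal
    return all(A[i][j] == A[j][i] for i in range(3) for j in range(3))

def is_skew_symmetric(A):
    # every entry equals the negative of its mirror (forces a zero diagonal)
    return all(A[i][j] == -A[j][i] for i in range(3) for j in range(3))

def is_centro_symmetric(A):
    # entry (i, j) equals entry (2-i, 2-j): the matrix equals its 180-degree rotation
    return all(A[i][j] == A[2 - i][2 - j] for i in range(3) for j in range(3))

print(is_symmetric([[4, 9, 8], [9, 3, 7], [8, 7, 1]]))          # True
print(is_skew_symmetric([[0, 9, -8], [-9, 0, 7], [8, -7, 0]]))  # True
```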

On the Sub-subjects of Minors & Cofactors & Determinants & Permanents & Traces

Small Determinants and Permanents

To understand determinants on 3x3 matrices we first need to understand determinants on 2x2 matrices.
Luckily this is very easy:

det ( a b ; c d ) = a*d - b*c

Calculating the permanent is very similar but we use a plus sign instead of the minus:

perm ( a b ; c d ) = a*d + b*c
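Both 2x2 formulas fit in one line each. A plain-Python sketch (names are my own):

```python
def det2(a, b, c, d):
    # determinant of ( a b ; c d )
    return a * d - b * c

def perm2(a, b, c, d):
    # permanent of ( a b ; c d ): same formula with a plus sign
    return a * d + b * c

print(det2(3, 2, 3, 1))   # 3*1 - 2*3 = -3
print(perm2(3, 2, 3, 1))  # 3*1 + 2*3 = 9
```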

Minors

A minor of a matrix is the determinant of a matrix created by removing one row and one column from the original.

Example:

M_1,2 of ( 6 1 -4 ; 3 -1 2 ; 3 5 1 ) = det ( 3 2 ; 3 1 ) = 3*1 - 2*3 = -3

Cofactors

Cofactors are very similar to minors, but the sign of the number is flipped when the sum of the indices (i + j) is odd.

Formula:

C_i,j = (-1)^(i+j) * M_i,j

Example:

M_1,1 = 1  =>  C_1,1 = 1
M_3,2 = 2  =>  C_3,2 = -2
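Minors and cofactors can be computed directly from their definitions. A plain-Python sketch (1-based indices to match the manual; names are my own):

```python
def minor(A, i, j):
    # determinant of the 2x2 matrix left after deleting row i and column j
    # (i and j are 1-based, matching the manual's indexing)
    rows = [r for k, r in enumerate(A) if k != i - 1]
    sub = [[x for k, x in enumerate(r) if k != j - 1] for r in rows]
    return sub[0][0] * sub[1][1] - sub[0][1] * sub[1][0]

def cofactor(A, i, j):
    # flip the sign when i + j is odd
    return (-1) ** (i + j) * minor(A, i, j)

A = [[6, 1, -4], [3, -1, 2], [3, 5, 1]]
print(minor(A, 1, 2))     # -3, as in the worked example above
print(cofactor(A, 1, 2))  # 3
```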

3x3 Determinants and Permanents

Knowledge of how Minors and Cofactors work is very useful for calculating determinants because they follow this equation:

det ( a b c ; d e f ; g h i ) = a * det ( e f ; h i ) - b * det ( d f ; g i ) + c * det ( d e ; g h )

Permanents are again similar but with addition:

perm ( a b c ; d e f ; g h i ) = a * perm ( e f ; h i ) + b * perm ( d f ; g i ) + c * perm ( d e ; g h )
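The two expansions above translate directly to code. A plain-Python sketch (names are my own):

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def perm2(m):
    return m[0][0] * m[1][1] + m[0][1] * m[1][0]

def drop(A, j):
    # rows 2 and 3 of A with column j (0-based) removed
    return [[x for k, x in enumerate(r) if k != j] for r in A[1:]]

def det3(A):
    # cofactor expansion along the first row
    return (A[0][0] * det2(drop(A, 0))
            - A[0][1] * det2(drop(A, 1))
            + A[0][2] * det2(drop(A, 2)))

def perm3(A):
    # same expansion, all plus signs
    return (A[0][0] * perm2(drop(A, 0))
            + A[0][1] * perm2(drop(A, 1))
            + A[0][2] * perm2(drop(A, 2)))

print(det3([[3, -6, 8], [4, -17, 25], [4, -14, 20]]))  # 6
```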

Some interesting properties of determinants:

The determinant of a product is the product of the determinants (assuming both matrices are square):

det(A*B) = det(A) * det(B) = det(B*A)

For an n by n matrix A and a number c the following equation is true:

det(c*A) = c^n * det(A)

Traces

The trace of a matrix is simply the sum of the elements along the main diagonal.

tr ( a b c ; d e f ; g h i ) = a + e + i

On the Sub-subject of Invertible and Inverse matrices

Matrix A is invertible if there exists a matrix B such that A*B = I (I is the identity matrix, i.e. a matrix with ones on the diagonal and zeroes everywhere else).
In that case B is the inverse matrix of A, usually denoted A^(-1).

A very useful fact to know: a square matrix A is invertible if and only if det(A) ≠ 0.

A non-invertible matrix is called a singular matrix.

Remember cofactors? If not, please go a section or two back.
Each cofactor has two indices; if we use these indices as matrix indices we can define a cofactor matrix C.

If we transpose that matrix, which means that we flip it along the main diagonal, we get something called the adjugate matrix.

Finally, if we divide the adjugate by the determinant of A we get the inverse.

Final formula is:

A^(-1)_i,j = (1 / det(A)) * adj(A)_i,j = (1 / det(A)) * C_j,i
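The adjugate recipe can be sketched in plain Python with exact fractions (names are my own; note the swapped indices j, i from the formula above):

```python
from fractions import Fraction

def cofactor(A, i, j):
    # 0-based indices here: minor times the sign (-1)^(i+j)
    rows = [r for k, r in enumerate(A) if k != i]
    m = [[x for k, x in enumerate(r) if k != j] for r in rows]
    return (-1) ** (i + j) * (m[0][0] * m[1][1] - m[0][1] * m[1][0])

def det3(A):
    # expansion along the first row, reusing the cofactors
    return sum(A[0][j] * cofactor(A, 0, j) for j in range(3))

def inverse(A):
    d = Fraction(det3(A))  # must be nonzero, i.e. A must be invertible
    # adjugate = transposed cofactor matrix, hence cofactor(A, j, i)
    return [[cofactor(A, j, i) / d for j in range(3)] for i in range(3)]

# Diagonal example: the inverse just inverts each diagonal entry
print(inverse([[2, 0, 0], [0, 4, 0], [0, 0, 1]]))
```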

On the Sub-subject of Matrix Bonanza Squared

If you can define multiplication, then powers are just repeated multiplication:

A^2 = A*A

Some matrices have special properties when they are squared for example:

A is involutory  <=>  A^2 = I
A is idempotent  <=>  A^2 = A

Using the properties of determinants one can show that the determinant of an involutory matrix is 1 or -1, and for idempotent matrices it's 1 or 0. BUT not all matrices with such determinants are involutory or idempotent; it's a one-way relationship.
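Both properties are easy to test by squaring the matrix and comparing. A plain-Python sketch (names and example matrices are my own):

```python
def mat_mul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def is_involutory(A):
    return mat_mul(A, A) == I

def is_idempotent(A):
    return mat_mul(A, A) == A

# A reflection undoes itself, so it is involutory:
print(is_involutory([[1, 0, 0], [0, -1, 0], [0, 0, 1]]))  # True
# A projection applied twice changes nothing more, so it is idempotent:
print(is_idempotent([[1, 0, 0], [0, 1, 0], [0, 0, 0]]))   # True
```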

On the Sub-subject of Solving Equations

If we take a matrix we can interpret it as a set of linear equations where the matrix entries are the coefficients of the variables.
From school you might remember that we can multiply equations by a non-zero number and that we can subtract equations from each other.
Here we'll do the same, but each equation is a row in the matrix: we multiply rows by nonzero numbers and we subtract or add rows together; swapping rows is also allowed:

( 2 -2 2 ; 8 1 -2 ; -2 11 10 ) ~ ( 2 -2 2 ; 0 9 -10 ; -2 11 10 ) ~ ( 2 -2 2 ; 0 9 -10 ; 0 9 12 ) ~ ( 2 -2 2 ; 0 9 -10 ; 0 0 22 )

Notice how each row starts with zeroes and each row's first nonzero entry appears later than in the previous row.

This is what we call a Row Echelon form of the matrix.

Another term good to know is Reduced Row Echelon form, where the first nonzero entry in each row is always 1 (achieved by simply dividing each row by its first nonzero value) and that leading 1 is the only nonzero entry in its column.

Rank and Nullity

Row echelon form is very useful for determining the nullity and rank of a matrix; first we reduce the matrix to row echelon form.

( 2 -2 2 ; 8 1 -2 ; -2 11 -12 ) ~ ( 2 -2 2 ; 8 1 -2 ; 0 9 -10 ) ~ ( 2 -2 2 ; 0 9 -10 ; 0 9 -10 ) ~ ( 2 -2 2 ; 0 9 -10 ; 0 0 0 )

Now that we know the row echelon form, the nullity is simply the count of rows that are entirely 0 and the rank is the count of the remaining rows.

Hence here the nullity is 1 and the rank is 2.

A possibly useful fact to know is that if the nullity is greater than 0 then the determinant must be equal to 0.
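The whole procedure (reduce to row echelon form, then count the zero rows) can be sketched in plain Python with exact fractions, so no rounding issues creep in (names are my own):

```python
from fractions import Fraction

def row_echelon(A):
    # Gaussian elimination with exact fractions; rows are swapped only
    # to move a nonzero pivot into place
    M = [[Fraction(x) for x in row] for row in A]
    pivot = 0
    for col in range(3):
        for r in range(pivot, 3):
            if M[r][col] != 0:
                M[pivot], M[r] = M[r], M[pivot]
                break
        else:
            continue  # this column is all zero from the pivot row down
        for r in range(pivot + 1, 3):
            f = M[r][col] / M[pivot][col]
            M[r] = [x - f * p for x, p in zip(M[r], M[pivot])]
        pivot += 1
    return M

def rank_and_nullity(A):
    M = row_echelon(A)
    zero_rows = sum(1 for row in M if all(x == 0 for x in row))
    return 3 - zero_rows, zero_rows

# The worked example above: rank 2, nullity 1
print(rank_and_nullity([[2, -2, 2], [8, 1, -2], [-2, 11, -12]]))  # (2, 1)
```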

On the Sub-subject of The Final Boss

Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors come in pairs (λ, v) and they satisfy the equation:

Av=λv

This can be rewritten as:

(A-λI)v=0

To find the eigenvalues we solve this equation, which is called the characteristic equation:

det(A-λI)=0

Example:

det ( ( 3 -6 8 ; 4 -17 25 ; 4 -14 20 ) - ( λ 0 0 ; 0 λ 0 ; 0 0 λ ) ) = 0
det ( 3-λ -6 8 ; 4 -17-λ 25 ; 4 -14 20-λ ) = 0
-λ^3 + 6λ^2 - 11λ + 6 = 0

Here we have acquired something called the characteristic polynomial, whose solutions are the eigenvalues.

-(λ - 3)(λ - 2)(λ - 1) = 0
λ = 3 , 2 , 1

So we've found the three eigenvalues of the matrix.

After finding the eigenvalues we need to find out whether there are 3 linearly independent eigenvectors, using the eigenvector equation:

(A-λI)v=0

Remember the last chapter about nullity? The concept comes back here, because each eigenvalue has something called a geometric multiplicity. To find it without much linalg background, take the matrix (A-λI) for a given eigenvalue λ and determine its nullity. The matrix is diagonalizable exactly when the geometric multiplicities summed over all eigenvalues add up to the dimension of the vectors the matrix acts on; since we have 3x3 matrices, that dimension is 3.

Example (continuation):

λ = 1:  null ( 3-λ -6 8 ; 4 -17-λ 25 ; 4 -14 20-λ )  =  null ( 3-1 -6 8 ; 4 -17-1 25 ; 4 -14 20-1 )
( 2 -6 8 ; 4 -18 25 ; 4 -14 19 ) ~ ( 2 -6 8 ; 4 -18 25 ; 0 -2 3 ) ~ ( 2 -6 8 ; 0 -6 9 ; 0 -2 3 ) ~ ( 2 -6 8 ; 0 -6 9 ; 0 0 0 )   nullity = 1
λ = 2:  null ( 3-λ -6 8 ; 4 -17-λ 25 ; 4 -14 20-λ )  =  null ( 3-2 -6 8 ; 4 -17-2 25 ; 4 -14 20-2 )
( 1 -6 8 ; 4 -19 25 ; 4 -14 18 ) ~ ( 1 -6 8 ; 0 5 -7 ; 4 -14 18 ) ~ ( 1 -6 8 ; 0 5 -7 ; 0 10 -14 ) ~ ( 1 -6 8 ; 0 5 -7 ; 0 0 0 )   nullity = 1
λ = 3:  null ( 3-λ -6 8 ; 4 -17-λ 25 ; 4 -14 20-λ )  =  null ( 3-3 -6 8 ; 4 -17-3 25 ; 4 -14 20-3 )
( 0 -6 8 ; 4 -20 25 ; 4 -14 17 ) ~ ( 4 -20 25 ; 4 -14 17 ; 0 -6 8 ) ~ ( 4 -20 25 ; 0 6 -8 ; 0 -6 8 ) ~ ( 4 -20 25 ; 0 6 -8 ; 0 0 0 )   nullity = 1

Now we sum the nullities from every eigenvalue: 1 + 1 + 1 = 3, and because we're working with 3x3 matrices we can conclude that our original matrix is diagonalizable.

Notes: In this example we had 3 eigenvalues, but it's entirely possible you'll have only 1 or 2 eigenvalues. Also, all the nullities here were 1, but if we had a single eigenvalue we might end up with a nullity between 1 and 3. Important note: here I've calculated all 3 nullities, but when you have 3 different eigenvalues you know the matrix is always diagonalizable.
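The whole diagonalizability check from this chapter can be sketched in plain Python, assuming the eigenvalues are already known (as they are once you've solved the characteristic equation); names are my own:

```python
from fractions import Fraction

def nullity(A):
    # Gaussian elimination with exact fractions, then count the zero rows
    M = [[Fraction(x) for x in row] for row in A]
    pivot = 0
    for col in range(3):
        for r in range(pivot, 3):
            if M[r][col] != 0:
                M[pivot], M[r] = M[r], M[pivot]
                break
        else:
            continue
        for r in range(pivot + 1, 3):
            f = M[r][col] / M[pivot][col]
            M[r] = [x - f * p for x, p in zip(M[r], M[pivot])]
        pivot += 1
    return sum(1 for row in M if all(x == 0 for x in row))

def is_diagonalizable(A, eigenvalues):
    # sum the geometric multiplicities null(A - λI) over the distinct eigenvalues
    total = 0
    for lam in set(eigenvalues):
        shifted = [[A[i][j] - (lam if i == j else 0) for j in range(3)]
                   for i in range(3)]
        total += nullity(shifted)
    return total == 3

A = [[3, -6, 8], [4, -17, 25], [4, -14, 20]]
# The worked example above: nullities 1 + 1 + 1 = 3
print(is_diagonalizable(A, [1, 2, 3]))  # True
```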