Chapter 2 – Contents and Take-Aways

Here you can find a (non-exhaustive!) list of the contents and take-aways of Chapter 2: Matrix Algebra. This list serves two purposes: to help you gauge how thoroughly you need to read the chapter before opening it for the first time, and to check how well you have managed to follow along once you have read it.

Chapter 2: Matrix Algebra discusses

    • the basics of the matrix concept, including addition and multiplication of matrices, and important matrix properties (symmetry, lower/upper triangular form, diagonality, etc.)
    • matrices and linear equation systems, focusing on the connection between invertibility and the existence of a unique solution in the context of “square” systems (as many unknowns as equations)
    • criteria for matrix inversion and related concepts: determinants, rank, eigenvalues and eigenvectors, definiteness
    • computation of inverse matrices: the 2×2-matrix formula and the Gauß-Jordan algorithm
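
As a small warm-up for the first two bullet points, the conformability conditions for addition and multiplication can be sketched in a few lines of Python. This is my own toy illustration (the function names `mat_add` and `mat_mul` are not from the chapter), using plain nested lists:

```python
# Toy sketch: dimension conformability checks for matrix addition
# and multiplication, with matrices as nested lists.

def mat_add(A, B):
    # Addition requires identical dimensions (m x n plus m x n).
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("addition needs equal dimensions")
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    # A (m x n) times B (n x k): the inner dimensions must match.
    if len(A[0]) != len(B):
        raise ValueError("columns of A must equal rows of B")
    return [[sum(A[i][p] * B[p][j] for p in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_add(A, B))  # [[1, 3], [4, 4]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
```

Note that multiplying in the other order, `mat_mul(B, A)`, gives a different result, which previews the non-commutativity question further below.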


Someone with profound knowledge of the contents of this chapter should

    • know the dimension conformability conditions for matrix addition and multiplication, and how to compute the sum and product of conformable matrices
    • be able to represent a system of n equations in k unknowns in matrix form
    • know how invertibility of a matrix A relates to the existence of a unique solution of an associated equation system Ax = b
    • be thoroughly familiar with the equivalent and sufficient conditions for matrix invertibility as summarized at the end of the chapter (determinant, rank, eigenvalues, unique solution in associated system; definiteness)
    • know the “sum-of-squares” property of the scalar product of a vector with itself, i.e. v'v = \sum_{j=1}^n v_j^2 for v\in\mathbb R^n
    • be aware of some inversion rules for “derived matrices”, i.e. if A, B and C are invertible, how to invert e.g. A' and ABC
    • be familiar with the Gauß-Jordan method for matrix inversion, and have a rough idea of why it works
    • know the definition of a column space of a matrix, and how it is useful when thinking about the existence of solutions in linear equation systems
    • be familiar with the procedure for finding the eigenvalues of a matrix, and have a rough idea of why it works
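
To make the Gauß-Jordan bullet point above concrete, here is a rough sketch of the idea in Python. This is my own illustration, not the chapter's presentation: augment A with the identity matrix, then apply elementary row operations until the left half becomes the identity; the right half then holds A⁻¹.

```python
# Sketch of Gauss-Jordan inversion: row-reduce [A | I] to [I | A^{-1}].

def gauss_jordan_inverse(A):
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [row[:] + [float(i == j) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Pivot: swap in a row with a nonzero entry in this column.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            raise ValueError("matrix is singular")
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot entry equals 1.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Eliminate this column's entry in every other row.
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    # Return the right half, which now contains the inverse.
    return [row[n:] for row in M]

print(gauss_jordan_inverse([[2.0, 0.0], [0.0, 4.0]]))
# [[0.5, 0.0], [0.0, 0.25]]
```

The three steps inside the loop are exactly the three elementary row operations asked about in the questions below: swapping rows, scaling a row, and adding a multiple of one row to another.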


and be able to answer a number of related questions, including

    • How are row and column rank defined? Can a matrix have a strictly greater column rank than row rank?
    • What are the three elementary matrix operations? How do they affect the rank and the determinant?
    • Is the matrix product associative? Is it commutative?
    • Provided that it exists for a given matrix A, is the inverse matrix A^{-1} always unique?
    • Can any square matrix be brought to an upper triangular form using only elementary matrix operations? Which condition ensures that we can bring it to identity form?
    • What characterizes an indefinite matrix? Is any positive definite matrix also positive semi-definite? Can it also be negative semi-definite?
    • Consider the matrix

          \[\begin{pmatrix} 1 & 4 & 0\\ 2 & 3 & 1\\ 0 & 0 & -2\end{pmatrix}.\]

      What is its determinant? Is it invertible? If so, what is its inverse matrix?
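
If you want to check your answer to the determinant part of the last question numerically, a short script like the following can help. It is only an illustration (the recursive cofactor expansion along the first row is one standard way to compute a determinant, not necessarily the chapter's):

```python
# Check the determinant of the matrix from the last question by
# recursive cofactor expansion along the first row.

def det(M):
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j in range(len(M)):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

A = [[1, 4, 0], [2, 3, 1], [0, 0, -2]]
print(det(A))  # 10 -- nonzero, so A is invertible
```

Since the determinant is nonzero, the matrix is invertible, and its inverse can then be found, e.g., with the Gauß-Jordan method from the chapter.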