Matrices & Quadratic Forms
Linear Algebra
Linear algebra is the formalisation/generalisation of linear equations involving vectors and matrices. A linear algebraic equation looks like
$$Ax = b$$
where $A$ is a matrix, and $x$, $b$ are vectors. In an equation like this, we're interested in the existence of $x$ and the number of solutions. Linear ODEs are also of interest, looking like
$$\dot{x}(t) = Ax(t)$$
where $A$ is a matrix, $x$ is a vector, and $x(t)$ is a vector-valued function.
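As a quick numerical illustration (a minimal numpy sketch; the system here is made up for demonstration), the existence and uniqueness of $x$ can be probed directly:

```python
import numpy as np

# A hypothetical square system Ax = b with a unique solution
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)     # solves Ax = b when A is invertible
print(x)                      # -> [0.8 1.4]
print(np.allclose(A @ x, b))  # verify the solution: True
```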
- I'm really not about to go into what a matrix or its transpose is
- $A^T$ denotes the transpose of $A$
- $x$ is a column vector, indexed $x_i$
- $x^T$ is a row vector
- You can index matrices using the notation $a_{ij}$, which is the element in row $i$ and column $j$, indexed from 1
Matrices can be partitioned into sub-matrices:
$$A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}$$
Column and row partitions give column and row vectors, respectively.
- A square matrix of order $n$ has dimensions $n \times n$
- The leading diagonal is entries $a_{11}, a_{22}, \dots, a_{nn}$
- The trace of a square matrix is the sum of the leading diagonal, $\operatorname{tr}(A) = \sum_{i=1}^{n} a_{ii}$
- A diagonal matrix has non-zero entries only on the leading diagonal
- The identity matrix $I$ is a diagonal matrix of ones
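A minimal numpy sketch of these basics (the example matrix is arbitrary):

```python
import numpy as np

# An arbitrary order-3 square matrix for illustration
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(A.T)           # transpose
print(A[0, 2])       # element a_13 (numpy indexes from 0, so row 1, column 3 is [0, 2])
print(np.trace(A))   # trace: 1 + 5 + 9 = 15
print(np.diag(A))    # the leading diagonal: [1 5 9]
print(np.eye(3))     # the 3x3 identity matrix
```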
The Inner Product
The inner product of two vectors $x^T$, a row vector, and $y$, a column vector:
$$\langle x, y \rangle = x^T y = \sum_{i=1}^{n} x_i y_i$$
- A ($1 \times n$) matrix times an ($n \times 1$) matrix, yielding a scalar
- If the inner product is zero, then $x$ and $y$ are orthogonal
- In Euclidean space, the inner product is the dot product
- The norm/magnitude/length of a vector is $\|x\| = \sqrt{x^T x}$
- If the norm is one, the vector is a unit vector
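A small numpy sketch (the vectors are chosen arbitrarily to illustrate orthogonality and norms):

```python
import numpy as np

x = np.array([3.0, 4.0])
y = np.array([-4.0, 3.0])

print(x @ y)              # inner (dot) product: 3*(-4) + 4*3 = 0, so x and y are orthogonal
print(np.sqrt(x @ x))     # norm of x: sqrt(9 + 16) = 5.0
print(np.linalg.norm(x))  # the same norm via numpy's built-in
print(np.linalg.norm(x / np.linalg.norm(x)))  # normalising gives a unit vector: 1.0
```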
Linear Independence
Consider a set of vectors all of equal dimensions, $\{x_1, x_2, \dots, x_n\}$. The vector $x_n$ is linearly dependent on the vectors $x_1, \dots, x_{n-1}$ if there exist scalars $\alpha_1, \dots, \alpha_{n-1}$, not all zero, such that:
$$x_n = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_{n-1} x_{n-1}$$
If no such scalars exist, the set of vectors is linearly independent.
Finding the linearly independent rows in a matrix, for example:
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 1 & 0 \\ 3 & 3 & 3 \end{pmatrix}$$
- Row 2 is independent of row 1, since $r_2 \neq \alpha r_1$ for any scalar $\alpha$
- $r_3 = r_1 + r_2$
- Row 3 is linearly dependent on rows 1 and 2
- There are 2 linearly independent rows
- It can also be found that there are two linearly independent columns ($c_3 = -c_1 + 2c_2$)
Any matrix has the same number of linearly independent rows and linearly independent columns
A more formalised approach is to put the matrix into row echelon form, and then count the number of non-zero rows. $A$ in row echelon form may be obtained by Gaussian elimination:
$$A \rightarrow \begin{pmatrix} 1 & 2 & 3 \\ 0 & -3 & -6 \\ 0 & 0 & 0 \end{pmatrix}$$
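A rough numpy sketch of this procedure (a naive forward elimination with partial pivoting, just enough to count non-zero rows; `np.linalg.matrix_rank` is the robust tool in practice):

```python
import numpy as np

def row_echelon(A, tol=1e-12):
    """Reduce A to row echelon form by Gaussian elimination (with partial pivoting)."""
    A = A.astype(float).copy()
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the largest-magnitude entry in this column as the pivot
        p = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
        if abs(A[p, col]) < tol:
            continue  # no pivot in this column
        A[[pivot_row, p]] = A[[p, pivot_row]]  # swap rows
        # Eliminate everything below the pivot
        for r in range(pivot_row + 1, rows):
            A[r] -= (A[r, col] / A[pivot_row, col]) * A[pivot_row]
        pivot_row += 1
    return A

A = np.array([[1, 2, 3],
              [2, 1, 0],
              [3, 3, 3]])
R = row_echelon(A)
print(R)
print(np.sum(np.any(np.abs(R) > 1e-12, axis=1)))  # non-zero rows: 2
print(np.linalg.matrix_rank(A))                   # agrees: 2
```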
Minors, Cofactors, and Determinants
For an $n \times n$ matrix $A$, the determinant is defined as
$$\det A = |A| = \sum_{j=1}^{n} a_{ij} C_{ij}$$
- $i$ denotes a chosen row along which to compute the sum
- $C_{ij}$ is the cofactor of element $a_{ij}$
- $M_{ij}$ is the minor of element $a_{ij}$
- The minor is obtained by calculating the determinant of the matrix obtained by deleting row $i$ and column $j$
- The cofactor is the minor with the appropriate sign, $C_{ij} = (-1)^{i+j} M_{ij}$, from the matrix of signs:
$$\begin{pmatrix} + & - & + & \cdots \\ - & + & - & \cdots \\ + & - & + & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$
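The definition translates directly into a recursive sketch (fine for small matrices; Laplace expansion is $O(n!)$, so `np.linalg.det`, which uses an LU factorisation, is what you'd use in practice):

```python
import numpy as np

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # Minor M_1j: delete row 1 (index 0) and column j
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        # Cofactor C_1j = (-1)^(1+j) * M_1j -- with 0-indexing the sign starts at +
        total += (-1) ** j * A[0, j] * det(minor)
    return total

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 1.0, 0.0],
              [3.0, 3.0, 3.0]])
print(det(A))            # 0.0 -- row 3 is dependent, so the determinant vanishes
print(np.linalg.det(A))  # matches (up to floating-point error)
```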
Determinant Properties
- If a constant scalar times any row/column is added to any other row/column, the determinant is unchanged
- If $A$ and $B$ are of the same order, then $\det(AB) = \det(A)\det(B)$
- $\det A = 0$ iff the rank of $A$ is less than its order, for a square matrix
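The product property is easy to spot-check numerically (random matrices, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# det(AB) = det(A) * det(B) for square matrices of the same order
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
```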
Rank
The rank of a matrix is the number of linearly independent columns/rows
Any non-zero $m \times n$ matrix $A$ has rank $r$ if at least one of its $r$-square minors is non-zero, while every $(r+1)$-square minor is zero.
- $r$-square denotes the order of the determinant used to calculate the minor
For example:
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}$$
- The determinant $|A| = 0$
- The rank is less than 3
- The minor $\begin{vmatrix} 1 & 2 \\ 4 & 5 \end{vmatrix} = 5 - 8 = -3 \neq 0$
- The order of this minor is 2
- Thus, the rank of $A$ is 2
There are two other ways to find the rank of a matrix: via Gaussian elimination into row echelon form, or by the definition of linear independence.
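A quick numpy confirmation of the example above (`matrix_rank` uses the SVD rather than minors, but agrees):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(np.linalg.det(A))          # ~0, so rank < 3
print(np.linalg.det(A[:2, :2]))  # the 2-square minor |1 2; 4 5| = -3, non-zero
print(np.linalg.matrix_rank(A))  # 2
```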
Inverses of Matrices
The inverse $A^{-1}$ of a square matrix $A$ is defined by:
$$AA^{-1} = A^{-1}A = I$$
- $A^{-1}$ is unique
$\operatorname{adj} A$ is the adjoint of $A$, the transpose of the matrix of cofactors:
$$A^{-1} = \frac{\operatorname{adj} A}{|A|}, \qquad \operatorname{adj} A = C^T$$
If $|A| = 0$, $A$ is singular and has no inverse.
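A sketch of the adjoint formula (illustrative only; `np.linalg.inv` is the practical choice):

```python
import numpy as np

def inverse_via_adjoint(A):
    """Inverse from the adjoint: A^-1 = adj(A) / det(A)."""
    n = A.shape[0]
    d = np.linalg.det(A)
    if np.isclose(d, 0.0):
        raise ValueError("matrix is singular: no inverse exists")
    C = np.empty((n, n))  # matrix of cofactors
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / d  # the adjoint is the transpose of the cofactor matrix

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(inverse_via_adjoint(A))
print(np.allclose(inverse_via_adjoint(A) @ A, np.eye(2)))     # True
print(np.allclose(inverse_via_adjoint(A), np.linalg.inv(A)))  # agrees with numpy
```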
Pseudo-inverse of a Non-Square Matrix
Given a more general $m \times n$ matrix $A$, we want some inverse $A^\dagger$ such that $AA^\dagger = I$, or $A^\dagger A = I$.
If $n > m$ (more columns than rows, matrix is fat), and $AA^T$ is invertible, then the right pseudo-inverse is defined as:
$$A^R = A^T (AA^T)^{-1}, \qquad AA^R = I$$
If $m > n$ (more rows than columns, matrix is tall), and $A^T A$ is invertible, then the left pseudo-inverse is defined as:
$$A^L = (A^T A)^{-1} A^T, \qquad A^L A = I$$
For example, the right pseudo-inverse of a fat matrix:
$$A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix}, \qquad A^R = A^T(AA^T)^{-1} = \frac{1}{3}\begin{pmatrix} 2 & -1 \\ -1 & 2 \\ 1 & 1 \end{pmatrix}$$
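Checking this in numpy (and comparing against `np.linalg.pinv`, which computes the Moore-Penrose pseudo-inverse via the SVD and agrees here because $A$ has full row rank):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])  # fat: n > m

A_right = A.T @ np.linalg.inv(A @ A.T)  # right pseudo-inverse
print(A_right * 3)                      # -> [[2 -1], [-1 2], [1 1]], i.e. A^R times 3
print(np.allclose(A @ A_right, np.eye(2)))      # A A^R = I: True
print(np.allclose(A_right, np.linalg.pinv(A)))  # matches Moore-Penrose: True
```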
Symmetric Matrices
A matrix is symmetric if $A^T = A$
A matrix is skew-symmetric if $A^T = -A$
For any square matrix $A$:
- $AA^T$ is a symmetric matrix
- $A + A^T$ is a symmetric matrix
- $A - A^T$ is a skew-symmetric matrix
Every square matrix $A$ can be written as the sum of a symmetric matrix $A_s$ and skew-symmetric matrix $A_{ss}$:
$$A = A_s + A_{ss}, \qquad A_s = \tfrac{1}{2}(A + A^T), \qquad A_{ss} = \tfrac{1}{2}(A - A^T)$$
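A numpy sketch of the decomposition (the matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 7.0, 3.0],
              [2.0, 5.0, 8.0],
              [6.0, 4.0, 9.0]])

A_s  = 0.5 * (A + A.T)  # symmetric part
A_ss = 0.5 * (A - A.T)  # skew-symmetric part

print(np.allclose(A_s, A_s.T))     # symmetric: True
print(np.allclose(A_ss, -A_ss.T))  # skew-symmetric: True
print(np.allclose(A, A_s + A_ss))  # the parts sum back to A: True
```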
Quadratic Forms
Consider a polynomial with variables $x_i$ and constants $a_{ij}$ of the form:
$$Q = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} x_i x_j$$
When expanded:
$$Q = a_{11}x_1^2 + a_{12}x_1x_2 + \dots + a_{1n}x_1x_n + a_{21}x_2x_1 + \dots + a_{nn}x_n^2$$
This is known as a quadratic form, and can be written:
$$Q = x^T A x$$
where $x$ is an $n \times 1$ column vector, and $A$ is an $n \times n$ symmetric matrix. In two variables:
$$Q = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} a_{11} & a_{12} \\ a_{12} & a_{22} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = a_{11}x_1^2 + 2a_{12}x_1x_2 + a_{22}x_2^2$$
Linear forms are also a thing. A general linear form in three variables $x_1$, $x_2$, $x_3$:
$$b^T x = b_1 x_1 + b_2 x_2 + b_3 x_3$$
This allows us to represent any quadratic function as a sum of a quadratic form, a linear form, and a constant:
$$f(x) = x^T A x + b^T x + c$$
For example:
$$f(x_1, x_2) = x_1^2 + 4x_1x_2 + 3x_2^2 + 2x_1 + 5 = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + \begin{pmatrix} 2 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + 5$$
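A small numpy check of this decomposition (using the example above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 3.0]])  # symmetric quadratic-form matrix
b = np.array([2.0, 0.0])
c = 5.0

def f(x):
    """f(x) = x^T A x + b^T x + c"""
    return x @ A @ x + b @ x + c

x = np.array([1.0, 2.0])
print(f(x))  # 1 + 8 + 12 + 2 + 5 = 28
```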