Linear Algebra Review Part 2
I decided that the last method that I was using to review Linear Algebra wasn't working too well, so I am going to try something else. I want to review Linear Algebra before I review the math behind Machine Learning and Deep Learning algorithms.
References
- I am planning on using this YouTube video series by Dr. Trefor Bazett to review linear algebra.
Notes
Introduction
- Linear algebra is the study of Linear Transformations:
- Lines go to lines
- origin stays put
- View matrices as a combination of column vectors
- Linear algebra looks at Linear Equations - all variables have a power of 1.
- Every linear system of equations has zero, one, or infinitely many solutions.
- When writing a system of linear equations in matrix form, there is the coefficient matrix (which is multiplied by the variables) and the portion on the right hand side is referred to as the constant matrix. The entire thing is referred to as the augmented matrix.
Solving Systems of Linear Equations
- Allowed Manipulations to Solve Systems of Linear Equations:
- What you can do to solve systems of linear equations.
- These operations don't change the solution.
- Elementary Row Operations (ERO):
- Multiply a row by any nonzero constant.
- Replace a row by the sum of it and another row times some constant.
- You can interchange two different rows.
- If you reduce the coefficient matrix using EROs and get a row of the coefficient matrix to contain all 0s while the same row's entry in the constant matrix is nonzero, then the system has no solutions.
- If a row of the augmented matrix is all 0s (0s in both the coefficient and constant matrix) and the system is consistent, then there is a free variable, so the system has infinitely many solutions (see the sketch below).
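As a quick check of both cases, here is a minimal sketch using sympy's rref (the example systems are my own, not from the video series):

```python
import sympy as sp

# Augmented matrix [A | b] for x + y = 2, 2x + 2y = 5 (inconsistent).
no_sol = sp.Matrix([[1, 1, 2],
                    [2, 2, 5]])
print(no_sol.rref()[0])  # last row becomes [0, 0, 1], i.e. 0 = 1 -> no solutions

# Augmented matrix for x + y = 2, 2x + 2y = 4 (dependent equations).
inf_sol = sp.Matrix([[1, 1, 2],
                     [2, 2, 4]])
print(inf_sol.rref()[0])  # a row of all 0s -> a free variable -> infinitely many solutions
```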
Row Echelon Form
- What is our goal when doing Elementary Row Operations?
- A matrix is in Row Echelon Form when it meets the following three requirements:
- The first nonzero number in each row (called the leading coefficient) is 1.
- Every leading 1 is to the right of the one above it.
- Any non-zero rows are always above rows with all zeroes.
Examples of Row Echelon Form:
The echelon form of a matrix isn't unique, which means different sequences of row operations can produce different answers. Reduced row echelon form is at the other end of the spectrum; it is unique, which means row reduction will produce the same answer no matter which sequence of row operations you use.
- If the leading coefficient in each row is the only non-zero number in that column, the matrix is said to be in reduced row echelon form.
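To see the uniqueness claim concretely, here is a small sympy sketch (example matrix mine): the same rows in a different order still reduce to the identical RREF.

```python
import sympy as sp

A = sp.Matrix([[2, 4, -2],
               [1, 1,  3]])
B = sp.Matrix([[1, 1,  3],
               [2, 4, -2]])  # same system, rows interchanged

# Row echelon form is not unique, but reduced row echelon form is.
print(A.rref()[0] == B.rref()[0])  # True
```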
Properties of Vectors
- Once a matrix is in row echelon form, you can use Back Substitution to solve for each variable, working from the last equation upward.
- Scalar Multiplication - multiplies every element of the vector by the same scalar, e.g. 2 · (1, 2, 3) = (2, 4, 6)
- Vector Addition - component-wise, e.g. (1, 2) + (3, 4) = (4, 6)
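Both operations in one small numpy sketch (values mine):

```python
import numpy as np

v = np.array([1, 2, 3])
w = np.array([4, 5, 6])

print(2 * v)   # scalar multiplication: [2 4 6]
print(v + w)   # component-wise addition: [5 7 9]
```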
Linear Combinations and Spans
- Linear Combination:
- Let V be a vector space over the field K. As usual, we call elements of V vectors and call elements of K scalars. If v_1, ..., v_n are vectors and a_1, ..., a_n are scalars, then the linear combination of those vectors with those scalars as coefficients is:
- a_1 v_1 + a_2 v_2 + ... + a_n v_n, where each a_i is a scalar in K
- The linear span of a set S of vectors, denoted span(S), is defined as the set of all linear combinations of the vectors in S.
- In mathematics, the standard basis of a coordinate vector space (such as R^n) is the set of vectors, each of whose components are all zero, except for one that equals 1. For example, in the case of the Euclidean plane formed by the pairs of real numbers, the standard basis is formed by the vectors e_1 = (1, 0) and e_2 = (0, 1).
- Any vector can be written as a linear combination of standard basis vectors.
- When asking if a vector (1) is in the span of other vectors (2), you are asking whether that vector (1) can be written as a linear combination of the other vectors (2).
- In other words, if b = c_1 v_1 + ... + c_k v_k for some scalars c_i, then b is in span{v_1, ..., v_k}.
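One way to test span membership programmatically is the rank criterion: b is in the span exactly when appending b as a column doesn't raise the rank. A sketch with example vectors of my own:

```python
import numpy as np

# Is b in span{v1, v2}?  Equivalently: does [v1 v2] x = b have a solution?
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
b  = np.array([2.0, 3.0, 5.0])

A = np.column_stack([v1, v2])
Ab = np.column_stack([A, b])
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(Ab))  # True: b = 2*v1 + 3*v2
```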
The Matrix Equation
- The Matrix Equation: Ax = b, where Ax is defined as the linear combination of the columns of A with the entries of x as weights: Ax := x_1 a_1 + x_2 a_2 + ... + x_n a_n.
- The := notation means you are defining the thing on the left (that you don't yet understand) by the thing on the right (that you do understand).
Algebraic Rules in Linear Algebra
- Distributivity of the Matrix-Vector Product: A(u + v) = Au + Av, and A(cu) = c(Au).
- The Big Theorem
- Geometric Question:
- Is the span of the columns of A all of R^m?
- Algebraic Question:
- Can we find some x so that Ax = b, for every b in R^m?
- The following are equivalent (If one of these is true, then they are all true):
- The columns of A span R^m.
- For every b in R^m, b is some linear combination of the columns of A.
- For every b in R^m, Ax = b has a solution.
- The Reduced Row Echelon form of A has a leading 1 in every row.
- The transformation T(x) = Ax is onto.
- If any of the above are true, then the answer to both the Geometric Question and the Algebraic Question is yes; otherwise it is no (see the sketch below).
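numpy doesn't expose row reduction directly, but the rank carries the same information: the RREF has a leading 1 in every row exactly when rank(A) equals the number of rows. A sketch with a matrix of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

m = A.shape[0]  # number of rows, i.e. the dimension of the codomain
print(np.linalg.matrix_rank(A) == m)  # True: the columns of A span R^2
```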
Homogeneous Systems
- Homogeneous systems are systems where Ax = 0 (the right-hand side is the zero vector).
- Properties of Homogeneous Systems:
- x = 0 (the trivial solution) is always a solution.
- Suppose Ax = 0 and Ay = 0; then A(x + y) = 0.
- Suppose Ax = 0; then A(cx) = 0 for any scalar c.
- Every solution to Ax = b is the sum of one particular solution and a homogeneous solution (see the sketch below).
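A small numpy check of that last point (the system and solutions are my own picks):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([3.0, 6.0])

x_p = np.array([3.0, 0.0])   # a particular solution: A @ x_p == b
x_h = np.array([-1.0, 1.0])  # a homogeneous solution: A @ x_h == 0

print(A @ (x_p + x_h))  # [3. 6.] -- the sum still solves Ax = b
```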
Linear Dependence and Independence
- The equation c_1 v_1 + ... + c_k v_k = 0 can be true in two cases:
- All of the c_i = 0 (the trivial case).
- At least some c_i ≠ 0, in which case there is linear dependence.
Definition:
{v_1, ..., v_k} is:
- Linearly dependent if you can find c_1, ..., c_k not all zero so that c_1 v_1 + ... + c_k v_k = 0.
- Linearly independent if c_1 v_1 + ... + c_k v_k = 0 forces c_1 = ... = c_k = 0.
Determining Linear Independence vs Linear Dependence
- Solve the system of linear equations and that will tell you.
- If asked whether a system of linear equations is linearly independent (asked if some vectors are linearly independent):
- Put the vectors in an augmented matrix
- Set the constant matrix to contain all zeros (i.e., work with the homogeneous system)
- Reduce the matrix to Row Echelon form or Reduced Row Echelon form
- Determine linear independence: if every column has a leading 1 (only the trivial solution exists), the vectors are linearly independent; otherwise they are dependent (see the sketch below)
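Instead of row-reducing by hand, the same test can be phrased with rank: vectors are independent exactly when the rank of the matrix they form equals their count. A sketch, with example vectors of my own:

```python
import numpy as np

def is_independent(*vectors):
    # Independent iff the matrix with these vectors as columns
    # has rank equal to the number of vectors (a pivot in every column).
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_independent([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # False: v3 = v1 + v2
print(is_independent([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # True
```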
Transformations and Matrix Transformations
- A transformation is like a function that transforms some vectors into other vectors.
- T = Transformation
- T: R^n → R^m. A transformation takes in a vector of R^n and transforms it into a vector of R^m.
- In the equation above, R^n is the domain and R^m is the codomain
- To be a transformation, there must be a unique output for every input.
- Example of Transformation:
- An m × n matrix A transforms a vector x in R^n into the vector Ax in R^m.
- Matrix Transformations: taking a vector and multiplying it by a matrix
- A linear transformation is a transformation T such that:
- T(cv) = cT(v) (Respects scalar multiplication)
- T(u + v) = T(u) + T(v) (Respects vector addition)
- Rotation is a linear transformation.
- Every vector is a linear combination of the same standard basis vectors.
- One way to think about a vector is as a description of which combination of standard basis vectors you are talking about.
- Matrix transformations are the same thing as linear transformations.
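Here is a numeric check that rotation respects both linearity properties (the angle and vectors are my own choices):

```python
import numpy as np

theta = np.pi / 4  # rotate by 45 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
c = 2.5

print(np.allclose(R @ (c * u), c * (R @ u)))    # True: respects scalar multiplication
print(np.allclose(R @ (u + v), R @ u + R @ v))  # True: respects vector addition
```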
One to One and Onto
A Transformation T: R^n → R^m.
- Note: For the notes below, even if x is not written with vector notation, it still denotes a vector. The same is true of b.
- One-to-one transformations: A transformation T is one-to-one if, for every vector b in R^m, the equation T(x) = b has at most one solution x in R^n.
- Equivalent ways of saying T is one-to-one:
- For every vector b in R^m, the equation T(x) = b has zero or one solution x in R^n.
- Different inputs of T have different outputs.
- If T(u) = T(v), then u = v.
- Equivalent ways of saying T is not one-to-one:
- There exists some vector b in R^m such that the equation T(x) = b has more than one solution x in R^n.
- There are two different inputs of T with the same output.
- There exist vectors u and v such that u ≠ v but T(u) = T(v).
- A transformation T is onto if, for every vector b in R^m, the equation T(x) = b has at least one solution x in R^n.
- Equivalent ways of saying that T is onto:
- The range of T is equal to the codomain of T.
- Every vector in the codomain is the output of some input vector.
- Equivalent ways of saying that T is not onto:
- The range of T is smaller than the codomain of T
- There exists a vector b in R^m such that the equation T(x) = b does not have a solution.
- There is a vector in the codomain that is not the output of any input vector.
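Both properties can be tested with rank for a matrix transformation T(x) = Ax: one-to-one needs a pivot in every column, onto needs a pivot in every row. A sketch with a matrix of my own:

```python
import numpy as np

def one_to_one(A):
    # One-to-one iff the columns are independent: rank == number of columns.
    return np.linalg.matrix_rank(A) == A.shape[1]

def onto(A):
    # Onto iff the columns span the codomain: rank == number of rows.
    return np.linalg.matrix_rank(A) == A.shape[0]

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])  # a 3x2 matrix: can be one-to-one, never onto R^3

print(one_to_one(A), onto(A))  # True False
```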
Matrix Multiplication
- I still remember how to do this pretty well.
- There is a definition of matrix multiplication in terms of transformations that I didn't really understand.
- The key idea is that multiplying matrices corresponds to composing the linear transformations they represent.
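A quick numpy check of that idea (example matrices mine): multiplying the matrices first, then applying the product, matches applying the two transformations one after the other.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x = np.array([1.0, 5.0])

# (A B) x equals A applied to (B x): multiplication composes transformations.
print(np.allclose((A @ B) @ x, A @ (B @ x)))  # True
```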
Elementary Matrices
- An elementary row operation is going to be equivalent to multiplying by a certain elementary matrix.
- First Elementary Matrix (Do Nothing): Identity Matrix
- The matrix for interchanging rows is the identity matrix with those rows interchanged.
- For example: multiplying a matrix A by an identity matrix with the first two rows swapped will result in a matrix like A, except with the first two rows swapped.
- Multiplying a row by a number = multiplying by the identity matrix, except the corresponding row of the identity matrix has its 1 replaced by the number you want to multiply by.
- Example: Multiply row 1 by k (E_scale in the sketch below).
- Example: Add k times row 1 to row 2 (E_add in the sketch below).
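All three elementary matrices in one numpy sketch (the 2 × 2 matrix A and the constants are my own; E_swap, E_scale, E_add are just names I picked):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

E_swap = np.array([[0.0, 1.0],   # identity with its rows interchanged
                   [1.0, 0.0]])
E_scale = np.array([[5.0, 0.0],  # identity with the first 1 replaced by k = 5
                    [0.0, 1.0]])
E_add = np.array([[1.0, 0.0],    # adds k = 2 times row 1 to row 2
                  [2.0, 1.0]])

print(E_swap @ A)   # rows of A interchanged
print(E_scale @ A)  # row 1 multiplied by 5
print(E_add @ A)    # row 2 replaced by row 2 + 2 * row 1
```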
Invertible Matrices
- Definition: A is invertible if there exists a matrix A^{-1} such that A A^{-1} = A^{-1} A = I.
- Every equation Ax = b has a unique solution (namely x = A^{-1} b) if A is invertible.
- For a matrix to be invertible, it has to be a square matrix.
- The reduced row echelon form of an invertible matrix will be an identity matrix.
- If a matrix is 2 × 2 with entries a, b, c, d (reading across the rows), then the matrix is invertible if and only if ad - bc ≠ 0.
- A linear transformation T is invertible if there is a transformation S such that S(T(x)) = x and T(S(x)) = x for all x.
- Invertible transformations are all one-to-one (and onto).
- Invertible matrices correspond with invertible transformations.
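A sketch of both facts with numpy (the matrix and right-hand side are my own):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.array([3.0, 2.0])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True: A A^{-1} = I

# When A is invertible, Ax = b has the unique solution x = A^{-1} b.
print(A_inv @ b)  # [1. 1.]
```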
Determinants
- For a 2 × 2 matrix A with entries a, b, c, d (reading across the rows), det(A) = ad - bc.
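Checking the formula against numpy's det (matrix mine):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.det(A))       # about -2.0 (floating point)
print(1.0 * 4.0 - 2.0 * 3.0)  # ad - bc = -2.0, the same value
```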
Vector Spaces
- A vector space is a nonempty set with two operations, Vector Addition and Scalar Multiplication, that obey the ten vector space axioms:
- Polynomials are a vector space
- P_n: Polynomials of degree less than or equal to n
- A set {v_1, ..., v_k} in a vector space is linearly independent if c_1 v_1 + ... + c_k v_k = 0 only for all c_i = 0.
- {1, x, x^2, ..., x^n} is the standard basis of P_n.
Subspaces
- A Subspace of a Vector Space V is a subset H such that if u, v are in H and c is any scalar, then u + v is in H and cu is in H (and H contains the zero vector).
- Lines through the origin are subspaces.
- All of R^n is a subspace.
- {0}, the set containing only the zero vector, is a subspace.
- The span of any vectors will always be a subspace.
The Nullspace and Column Space
- For an m × n matrix A with columns a_1, ..., a_n, the Column Space is Col A = span{a_1, ..., a_n}.
- The column space consists of the vectors that can be written as a linear combination of the columns of A.
- For an m × n matrix A, the Nullspace is Nul A = {x : Ax = 0}.
- The Nullspace is the set of all homogeneous solutions, i.e. the x where Ax = 0.
- The null space is in the domain, the column space is in the codomain.
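sympy can produce a nullspace basis directly; a sketch with a rank-1 matrix of my own:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])

null_basis = A.nullspace()  # basis vectors for all solutions of A x = 0
for v in null_basis:
    print(A * v)            # each prints the zero vector
print(len(null_basis))      # 2: two free variables
```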
The Basis of a Subspace, Column Space, Null Space
- Subspace basis:
- A set of vectors {v_1, ..., v_k} is a basis for a subspace if the vectors:
- span the subspace
- are linearly independent
- To find a basis for the column space, reduce the matrix to RREF, find the columns with leading ones, then go back to the original matrix: the vectors in those columns form a basis for the column space (see the sketch below).
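That procedure, written out with sympy (example matrix mine): rref reports the pivot columns, and the basis is taken from the original matrix.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [0, 0, 1],
               [1, 2, 2]])

_, pivot_cols = A.rref()                # indices of columns with leading 1s
basis = [A.col(j) for j in pivot_cols]  # columns of the ORIGINAL matrix
print(pivot_cols)                       # (0, 2)
print(basis)
```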
- Writing a vector in terms of another basis:
The Dimension of a Subspace, Column Space, Null Space
- For a subspace S, dim(S) = the number of vectors in any basis of S. The # of basis vectors is always the same for a given subspace.
- Standard Basis: {e_1, ..., e_n} spans R^n and is linearly independent, so dim(R^n) = n.
- Dimension of Null Space:
- Set Ax = 0.
- Find the basis: the number of free columns in REF = # of basis vectors.
- dim(Nul A) = # of free columns in REF.
- Dimension of Column Space:
- Reduce the matrix to REF.
- Columns with leading 1s (taken from the original matrix) are a basis for Col A.
- dim(Col A) = # of leading 1s.
The Dimension Theorem
- Define rank(A) = dim(Col A) and nullity(A) = dim(Nul A). For an m × n matrix A, the theorem states: rank(A) + nullity(A) = n.
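A quick sympy confirmation on a small example of mine:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])

rank = A.rank()                  # dim(Col A) = 1
nullity = len(A.nullspace())     # dim(Nul A) = 2
print(rank + nullity == A.cols)  # True: 1 + 2 == 3 columns
```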
Conversion Between Bases
Two Bases: B = {b_1, ..., b_n} and C = {c_1, ..., c_n} for the same vector space.
The Change of Basis Matrix: P converts B-coordinates into C-coordinates, [x]_C = P [x]_B; its columns are the B-basis vectors written in C-coordinates.
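A sketch of converting between a basis B and the standard basis with numpy (the basis and coordinates are my own; for two non-standard bases you would compose one such matrix with the inverse of the other):

```python
import numpy as np

# Basis vectors of B as columns: b1 = (1, 0), b2 = (1, 1).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

x_B = np.array([2.0, 3.0])   # coordinates of x relative to B
x = B @ x_B                  # standard coordinates: 2*b1 + 3*b2
print(x)                     # [5. 3.]

# Converting back uses the inverse change of basis matrix.
print(np.linalg.inv(B) @ x)  # [2. 3.]
```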
Eigenvalues and Eigenvectors
- Eigenvalues λ are associated with eigenvectors v through the equation Av = λv.
- Associated with an eigenvalue are infinitely many eigenvectors
- Eigenvectors are always non-zero
- How to Find the Eigenvalues: solve det(A − λI) = 0.
- There can be multiple eigenvalues that satisfy the above.
- You use those eigenvalues to find the corresponding eigenvectors.
- For a matrix in upper triangular form, the eigenvalues can be read off the diagonal (see the next section).
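numpy solves the characteristic equation numerically; a sketch on a symmetric matrix of my own:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues solve det(A - lambda I) = 0; eig also returns eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # 3 and 1 (the order may vary)

lam, v = eigvals[0], eigvecs[:, 0]
print(np.allclose(A @ v, lam * v))  # True: A v = lambda v
```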
Diagonal Matrices
- The rank of a matrix in upper triangular form is the number of nonzero entries on the diagonal.
- The eigenvalues of a matrix in upper triangular form are the values on the diagonal.
The Diagonalization process
- For a matrix A, find a diagonal matrix D and an invertible matrix P so that A = P D P^{-1}.
- Let v_1, ..., v_n be linearly independent eigenvectors for A, with λ_1, ..., λ_n being the corresponding eigenvalues.
- Set P = [v_1 ... v_n]. Note that P is invertible because its columns are linearly independent.
- Then D = diag(λ_1, ..., λ_n), and A = P D P^{-1} (see the sketch below).
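The whole process in numpy (example matrix mine):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors
D = np.diag(eigvals)

# With independent eigenvectors, P is invertible and A = P D P^{-1}.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```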
Similar Matrices
- A and B are similar if there is some invertible P so that B = P A P^{-1}.
- The determinant of a product is the product of the determinants: det(AB) = det(A)det(B).
- A similar matrix can be thought of as the same transformation expressed in a different basis.
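A numeric check that similar matrices share their determinant (A and P are my own picks):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

B = P @ A @ np.linalg.inv(P)  # B is similar to A

# det(B) = det(P) det(A) det(P^{-1}) = det(A).
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True
```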
Dot Products and Length
- v · w = ‖v‖‖w‖cos(θ), where θ is the angle between the vectors
- The dot product is not just a notion of length, but also of angle
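Recovering the angle from the dot product in numpy (vectors mine):

```python
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([1.0, 1.0])

cos_theta = (v @ w) / (np.linalg.norm(v) * np.linalg.norm(w))
print(np.degrees(np.arccos(cos_theta)))  # 45.0 degrees between v and w
```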
Orthogonality
- An orthogonal set has v_i · v_j = 0 for every pair of distinct vectors in the set
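A check over every distinct pair in a small orthogonal set of my own:

```python
import numpy as np

vectors = [np.array([1.0,  1.0, 0.0]),
           np.array([1.0, -1.0, 0.0]),
           np.array([0.0,  0.0, 2.0])]

# Every distinct pair should have a zero dot product.
print(all(np.isclose(u @ v, 0.0)
          for i, u in enumerate(vectors)
          for v in vectors[i + 1:]))  # True
```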
Orthogonal Decomposition Theorem
- The orthogonal complement to a subspace W, written W⊥, is the set of all vectors that are orthogonal to every vector in W.
- Every vector can be written as the sum of two vectors, one in the subspace W and one in W⊥. That is what the Orthogonal Decomposition Theorem states.
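A minimal sketch of the decomposition when W is a line (the vector and subspace are mine): project v onto W, and the leftover piece lands in W⊥.

```python
import numpy as np

w = np.array([1.0, 1.0])  # W = span{w}, a line through the origin
v = np.array([3.0, 1.0])

proj = ((v @ w) / (w @ w)) * w  # the piece of v inside W
perp = v - proj                 # the piece of v inside W-perp

print(proj, perp)   # [2. 2.] [ 1. -1.]
print(proj @ perp)  # 0.0: the two pieces are orthogonal
```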