Matrix Linear Algebra
Required By: cse416a, cse417t, cse468t, cse517a, cse518a, cse559a, cse543t, cse586a
1. Potential Learning Outcomes (Q: is that what existing courses deliver or what we want?)
From here:
Students will have a good understanding of:
- Linear systems, row-reduction, and matrix equations
- Linear transformations, invertibility, rank+nullity
- Determinants
- Vector spaces, basis and dimension
- Eigenvectors and eigenvalues, diagonalization
- Inner products, orthogonality, Gram-Schmidt
- The spectral theorem and quadratic forms
My take on it (Marion):
Vectors:
- understand vectors from both an algebraic and a geometric point of view
- be able to perform vector operations such as addition, scalar multiplication, and dot products (possibly/optional: cross and outer products)
- understand vector projection
- apply Gaussian elimination (possibly Gauss-Jordan elimination) to solve (small) systems of linear equations by hand
- know how at least one iterative method (e.g., Jacobi iteration, the Gauss-Seidel method) for solving (medium to large) linear systems computationally works (see the sketch after this list)
- be aware of numerical issues when solving (medium to large) linear systems computationally
- understand spanning sets and linear independence
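To make the iterative-method bullet concrete, here is a minimal Jacobi iteration sketch, assuming NumPy; the example system, tolerance, and iteration cap are illustrative only:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Solve Ax = b by Jacobi iteration; converges e.g. for diagonally dominant A."""
    D = np.diag(A)                    # diagonal entries of A
    R = A - np.diagflat(D)            # off-diagonal part
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D       # update all components at once
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

# Small diagonally dominant example; result matches np.linalg.solve(A, b)
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([1.0, 2.0])
print(jacobi(A, b))
```

Gauss-Seidel differs only in using each freshly updated component immediately within a sweep, which usually speeds convergence.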
Matrices:
- be able to perform matrix operations (addition, element-wise multiplication, matrix multiplication; possibly/optional: outer products) on small matrices
- possibly/optional: be able to perform operations on block/partitioned matrices
- understand what the following are and how to compute them: matrix powers, transpose of a matrix, inverse of a matrix
- be aware of the properties of matrix operations (commutativity, associativity, distributivity, multiplicative identity, etc.) and of matrix transposes -> all of these are helpful when handling matrix equations
- know when a matrix is invertible
- apply Gauss-Jordan to compute the inverse of small invertible matrices
- be aware of at least one method to compute matrix inverses of medium to large matrices computationally (e.g., via LU factorization; see the sketch after this list)
- be able to compute determinants of small matrices
- understand how determinants can be used to solve (square) systems of linear equations (Cramer's Rule)
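A sketch of the "inverse via LU factorization" idea from above, assuming SciPy is available: factor once, then solve AX = I column by column against the triangular factors. The matrix is illustrative.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_via_lu(A):
    """Invert A by LU-factoring once, then solving A x = e_i for every column of I."""
    lu, piv = lu_factor(A)                          # one O(n^3) factorization
    return lu_solve((lu, piv), np.eye(A.shape[0]))  # n cheap triangular solves

A = np.array([[4.0, 3.0], [6.0, 3.0]])
print(inverse_via_lu(A) @ A)                        # ~ identity, up to rounding
```

(In practice one rarely forms the inverse explicitly; solving against the LU factors directly is cheaper and more accurate.)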
Eigenvalues/Eigenvectors:
- be able to recognize eigenvalue problems
- be able to compute eigenvalues for small matrices using the characteristic polynomial
- understand how matrix diagonalization is helpful for finding eigenvalues and eigenvectors
- be aware of at least one method to compute the eigenvalues/eigenvectors of a medium to large matrix computationally (e.g., via power iteration)
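A sketch of power iteration as one such computational method, assuming NumPy; the seed, iteration count, and matrix are illustrative:

```python
import numpy as np

def power_iteration(A, num_iters=1000):
    """Estimate the dominant eigenpair of A by repeated multiplication."""
    x = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(num_iters):
        x = A @ x
        x /= np.linalg.norm(x)        # renormalize to avoid overflow
    return x @ A @ x, x               # Rayleigh quotient, eigenvector estimate

A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_iteration(A)
print(lam)                            # ~ (5 + sqrt(5))/2, the largest eigenvalue
```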
Orthogonality:
- understand the key concepts of linear algebra: subspace, basis, dimension, rank
- understand what orthogonal and orthonormal sets of vectors, as well as orthogonal matrices are
- apply the Gram-Schmidt process to compute an orthogonal basis
- understand the relation of Gram-Schmidt to QR factorization and its usefulness for computing eigenvalues numerically (QR algorithm); see the sketch below
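A classical Gram-Schmidt sketch, assuming NumPy; note that in floating point the modified variant (or a library QR) is preferred for stability:

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the columns of V by classical Gram-Schmidt."""
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        q = V[:, j].astype(float)
        for i in range(j):
            q = q - (Q[:, i] @ V[:, j]) * Q[:, i]   # subtract projections onto earlier q_i
        Q[:, j] = q / np.linalg.norm(q)
    return Q

V = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = gram_schmidt(V)
print(Q.T @ Q)            # ~ identity: the columns are orthonormal
# np.linalg.qr(V) returns the same Q (up to signs) together with the R factor
```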
Other:
- possibly/optional: understand that matrices can be ill-conditioned and what that means for solving linear systems and other numerical operations
- understand that least-squares approximation is an application of linear algebra
- understand the singular value decomposition and its applications (see the sketch after this list)
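A sketch connecting the last three bullets, assuming NumPy: the SVD of a design matrix yields both a least-squares solution and the condition number that signals ill-conditioning. The data points are made up for illustration.

```python
import numpy as np

# Overdetermined system: fit a line y = c0 + c1 * t to illustrative data
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])    # design matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)
coeffs = Vt.T @ ((U.T @ y) / s)              # least-squares solution via the pseudoinverse
print(coeffs)                                 # ~ [intercept, slope]
print(s[0] / s[-1])                           # 2-norm condition number of A
# np.linalg.lstsq(A, y, rcond=None)[0] gives the same coefficients
```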
From faculty survey:
- ability to recognize when a linear transformation can be represented by a spectral decomposition and to write a formula for the decomposition (see the sketch after this list)
- be comfortable thinking in finite (n-)dimensional inner-product spaces
- be comfortable with handling matrices, including manipulating matrix equations and taking derivatives with respect to vectors (--> I am doubtful that this is currently delivered in MATH 309)
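To make the spectral-decomposition outcome concrete, a minimal sketch assuming NumPy: for a symmetric matrix, eigh returns the factors of A = Q Lambda Q^T.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric, so the spectral theorem applies
eigvals, Q = np.linalg.eigh(A)           # real eigenvalues, orthogonal eigenvectors
Lambda = np.diag(eigvals)
print(Q @ Lambda @ Q.T)                  # reconstructs A: A = Q Lambda Q^T
```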
2. Topics Covered
3. WashU Courses
3.1 MATH 309 (others: ESE 318?)
Content Covered
- Linear Equations
- Matrices, Matrix Operations, Inverse, Determinant, Cramer's Rule, Matrix Rank
- Vector Spaces, Basis, Basis Change, Inner Product Space
- Gram-Schmidt, Orthogonal Complement
- Linear Transformations, Matrices for LTs, Kernel, Range, Transition Matrix, Similarity
- Eigenvalues and Eigenvectors, Diagonalization/Spectral Theorem
- Symmetric Matrices
- Complex Numbers, Division, Polar Form, Complex Vector/Inner Product Spaces
- DeMoivre’s Theorem
- Systems of Linear Differential Equations
- Quadratic Forms
- Unitary Matrices, Hermitian Matrices, Diagonalization
Other semesters:
- Fall 2018: Quadratic Forms, Singular Value Decomposition (SVD)
Pedagogy / Delivery
Lectures, assignments, 2 midterms, 1 final exam
Learning Outcomes
4. Peers
4.1 CMU: 21-241: Matrices and Linear Transformations --> course book: Linear Algebra: A Modern Introduction
Content Covered
- Systems of Linear Equations
- Matrices and Gaussian Elimination
- Logic. What is a Proof? What is a Theorem? Techniques of proof.
- Vector Spaces, Geometry of Linear Equations
- Linear Transformations and Matrices, Matrix Operations
- Matrix Multiplication, Inverses, Finding Matrix Inverses, Determinants
- Cofactor Expansion - Introduction, The Laplace Expansion Formula.
- The Eigenvalue Problem, Properties of Eigenvalues and Eigenvectors
- Subspaces, Properties of Linear Independence, and Bases
- Basis and Dimension of Row/Column Spaces, Null Spaces
- Eigenvalues, Eigenvectors, Eigenspaces, Multiplicity
- Orthogonal Vectors, Subspaces, Projections, Decomposition Theorem, Orthogonal Matrices. The Gram-Schmidt Process
- Least Squares Approximations
- Coordinates, Change of Basis
- Matrix of a Linear Transformation, Coordinate Representations of Linear Transformations
- Similarity
- Diagonalization, Orthogonal Diagonalization of Symmetric Matrices
- Singular Value Decomposition, Computing the Singular Value Decomposition of a Matrix
Learning Outcomes:
- Master the computational techniques required to solve a variety of problems in linear algebra.
- Gain an understanding of the mathematical structures that help us to understand those problems.
- Develop your ability to reason mathematically: to think precisely about mathematical problems and express yourself clearly in writing.
4.2 UIUC
5. Issues/Identified Missing Content
I suggest that after we have identified the content, we have a small targeted survey/chat with instructors of upper-level courses to gauge what, if anything, is missing.
- Applications that showcase the usefulness of mathematical concepts in CS, such as clustering, dimensionality reduction, network centrality, cosine similarity, ... (see the sketch at the end of this page)
- Matrix equations, Matrix derivatives
- Proficiency in handling partial derivatives, transforming a sum representation into a matrix equation and vice versa
- ... ?
- ESE 318 also covers Differential Equations (MATH 217 is actually a prerequisite). I personally don't think it is a good course for CSE majors.
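As one concrete instance of the applications bullet in this section, a cosine-similarity sketch assuming NumPy; the term-count vectors are made up:

```python
import numpy as np

def cosine_similarity(u, v):
    """cos(theta) = (u . v) / (||u|| ||v||): 1 = same direction, 0 = orthogonal."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# e.g., comparing two document term-count vectors
doc1 = np.array([3.0, 0.0, 1.0])
doc2 = np.array([1.0, 2.0, 0.0])
print(cosine_similarity(doc1, doc2))
```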