MATH 307 Learning Goals

-----------------------------------------------------------
Course level learning goals:

After completing this course, students should be able to

1. use linear algebra to answer qualitative and quantitative questions about applications such as
   (*) Interpolation
   (*) Finite difference approximations
   (*) Least Squares
   (*) Fourier series
   (*) FFT
   (*) Power method
   (*) Recursion relations
   (*) the Anderson tight binding model
   (*) Markov chains
   (*) Principal component analysis
2. use a computer algebra system such as MATLAB or Octave to perform reasonably complicated multi-step calculations involving matrices.
3. use MATLAB or Octave to obtain evidence for the truth or falsity of conjectures.

-----------------------------------------------------------
Topic level learning goals:

1. Linear Equations

1.1 Solving Linear Equations

Topics:
1.1.T1 Solving a non-singular system of n equations in n unknowns
1.1.T2 Reduced row echelon form
1.1.T3 Gaussian elimination steps using MATLAB/Octave
1.1.T4 Matrix norm and condition number

Prerequisites:
1.1.P1 Use Gaussian elimination to bring a system of linear equations into upper triangular form and reduced row echelon form (rref).
1.1.P2 Determine whether a system of equations has a unique solution, infinitely many solutions or no solutions, and compute all solutions if they exist.
1.1.P3 Compute the inverse of a matrix when it exists, use the inverse to solve a system of equations, and describe for which systems of equations this is possible.

Learning Goals:
1.1.G1 Know the definitions of the matrix norm and the Hilbert-Schmidt norm; know how to compute the matrix norm of a diagonal matrix and the Hilbert-Schmidt norm of any matrix.
1.1.G2 Understand what the matrix norm says about how a matrix stretches the length of vectors.
1.1.G3 Know the definition of the condition number of a matrix, and use the condition number to estimate relative errors.
1.1.G4 Use MATLAB/Octave to enter matrices and vectors, build larger matrices from smaller blocks, multiply matrices, compute the inverse and transpose, extract elements, rows, columns and submatrices, solve linear equations using A\b, generate random matrices, use tic() and toc() to time operations, and compute norms and condition numbers.
1.1.G5 Use MATLAB/Octave to test conjectures about norms, condition numbers, etc.

1.2 Interpolation

Topics:
1.2.T1 Introduction: what is interpolation?
1.2.T2 Lagrange interpolation
1.2.T3 Cubic splines
1.2.T4 The linear equations for cubic splines

Prerequisites:
1.2.P1 Know how to compute the determinant of a square matrix and what its value says about the existence and uniqueness of solutions.

Learning Goals:
1.2.G1 Know the definition of an interpolation function; understand the idea of obtaining a unique interpolation function by either restricting the class of functions or optimizing over a larger set.
1.2.G2 Know the definition of Lagrange interpolation, be able to set up the system of linear equations for the coefficients, and be able to derive the formula for the determinant of the Vandermonde matrix that appears in this system.
1.2.G3 Be able to explain why Lagrange interpolation is not a practical method for large numbers of points.
1.2.G4 Use the plot command to produce basic plots in MATLAB/Octave.
1.2.G5 Use the functions linspace, vander and polyval to plot the interpolating polynomial in Lagrange interpolation.
1.2.G6 Understand how minimizing the bending energy leads to a description of the shape of the spline as a piecewise polynomial function.
1.2.G7 Be able to write down the system of equations for the coefficients of the polynomials.
1.2.G8 Use MATLAB/Octave to compute and plot cubic splines.
1.2.G9 Use the MATLAB/Octave functions zeros and ones, and m-files.
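As an illustration of goals 1.2.G2-G5, a minimal MATLAB/Octave sketch (the data points are arbitrary example values) that computes and plots the Lagrange interpolating polynomial through given points:

```
% example data points to interpolate through
x = [0 1 2 3]';
y = [1 2 0 5]';

V = vander(x);        % Vandermonde matrix built from the x-values
c = V \ y;            % solve for the polynomial coefficients

xx = linspace(min(x), max(x), 200);   % fine grid for plotting
yy = polyval(c, xx);                  % evaluate the interpolating polynomial

plot(x, y, 'o', xx, yy, '-')          % data points and interpolant
```

Note that vander returns columns in decreasing powers of x, which matches the coefficient ordering polyval expects, so V\b and polyval can be combined directly.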
1.3 Finite difference approximations

Topics:
1.3.T1 Introduction and example
1.3.T2 Discretization

Learning Goals:
1.3.G1 Be able to take a second order linear boundary value problem and write down the corresponding finite difference equation.
1.3.G2 Use the finite difference equation and MATLAB/Octave to compute an approximate solution.
1.3.G3 Use the MATLAB/Octave command diag.

2. Basis and Dimension

2.1 Basis and dimension

Topics:
2.1.T1 Linear dependence and independence
2.1.T2 Subspaces
2.1.T3 Nullspace N(A) and range R(A)
2.1.T4 Basis
2.1.T5 Finding a basis and the dimension of N(A)
2.1.T6 The matrix version of Gaussian elimination
2.1.T7 A basis for R(A)
2.1.T8 The rank of a matrix
2.1.T9 Bases for R(A^T) and N(A^T)

Prerequisites:
.... this needs to be better co-ordinated with 152 and 221 ....

Learning Goals:
2.1.G1 Be able to recast the dependence or independence of a collection of vectors as a statement about the existence of solutions to a set of linear equations.
2.1.G2 Be able to decide whether a collection of vectors is dependent or independent.
2.1.G3 Understand what a subspace is and be able to decide whether a collection of vectors forms a subspace.
2.1.G4 Be able to describe the significance of the two parts (independence and span) of the definition of a basis.
2.1.G5 Be able to check whether a collection of vectors is a basis.
2.1.G6 Be able to express the Gaussian elimination process as a matrix factorization, A = EU.
2.1.G7 Be able to compute bases for each of the four fundamental subspaces N(A), R(A), N(A^T), R(A^T) of a matrix A.
2.1.G8 Be able to compute the rank of a matrix.
2.1.G9 Know the formulas for the dimension of each of the four subspaces and be able to explain why they are true.
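Goals 2.1.G7-G9 can be checked numerically; a minimal MATLAB/Octave sketch (the matrix A is an arbitrary example with dependent rows):

```
A = [1 2 3; 2 4 6; 1 1 1];   % example matrix; rows 1 and 2 are dependent

r   = rank(A)                % rank of A
NA  = null(A)                % orthonormal basis for the nullspace N(A)
RA  = orth(A)                % orthonormal basis for the range R(A)
NAT = null(A')               % basis for N(A^T)
RAT = orth(A')               % basis for R(A^T)

% dimension check from the rank-nullity theorem:
% dim N(A) + rank(A) equals the number of columns of A
size(NA,2) + r == size(A,2)
```

Comparing the computed dimensions against the formulas in 2.1.G9 is exactly the kind of conjecture-testing named in the course-level goals.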
2.2 Graphs and Networks

Topics:
2.2.T1 Directed graphs and their incidence matrix
2.2.T2 Nullspace and range of the incidence matrix and its transpose
2.2.T3 The nullspace N(D)
2.2.T4 The range R(D)
2.2.T5 The nullspace N(D^T)
2.2.T6 The range R(D^T)
2.2.T7 Summary and orthogonality relations
2.2.T8 Resistors and the Laplacian
2.2.T9 Kirchhoff's law and the nullspace of L
2.2.T10 Connecting a battery
2.2.T11 Two resistors in series
2.2.T12 Example: a resistor cube

Prerequisites:

Learning Goals:
2.2.G1 Be able to write down the incidence matrix of a graph, and to draw the graph given the incidence matrix.
2.2.G2 Know the interpretations of each of the four subspaces and their dimensions in terms of voltage and current vectors.
2.2.G3 Know the definition of the Laplace operator for a graph and be able to write it down.
2.2.G4 Understand the connection between Kirchhoff's law and the nullspace of L.
2.2.G5 Be able to calculate the voltages and currents in a resistor network when a battery is attached.
2.2.G6 Be able to calculate the effective resistance between two nodes in a resistor network.
2.2.G7 Know how to re-order rows and columns of a matrix and extract submatrices in MATLAB/Octave.

3. Orthogonality

3.1 Orthogonality and Projections

Topics:
3.1.T1 Orthogonal vectors
3.1.T2 Orthogonal subspaces
3.1.T3 The fundamental subspaces of a matrix revisited
3.1.T4 Projections
3.1.T5 Least squares solutions
3.1.T6 Straight line fit
3.1.T7 Football rankings

Prerequisites:
3.1.P1 Know the formula for the length of the projection of a vector onto a line.
3.1.P2 Know the definition of a unit vector and how to find a unit vector parallel to a given vector.
.... there is a fair amount of overlap with MATH 152 here ....

Learning Goals:
3.1.G1 Be able to compute the inner product of two vectors, the norm of a vector, and the angle between vectors.
3.1.G2 Be able to recognize when two vectors are orthogonal.
3.1.G3 Derive the Cauchy-Schwarz inequality.
3.1.G4 Know when two subspaces are orthogonal, and know the definition of the orthogonal complement.
3.1.G5 Be able to show the orthogonality relations between the fundamental subspaces of a matrix.
3.1.G6 Be able to compute the projection of a vector onto a line and onto a (hyper)plane.
3.1.G7 Be able to find the vector closest to being a solution of a linear system by calculating the least squares solution.
3.1.G8 Apply the least squares method to find straight line fits of data points and to rank sports teams.

3.2 Orthonormal bases, Orthogonal matrices, Gram-Schmidt and QR decomposition

Topics:
3.2.T1 Orthonormal bases
3.2.T2 Orthogonal matrices
3.2.T3 Matrices with orthogonal columns
3.2.T4 Gram-Schmidt procedure
3.2.T5 The QR factorization
3.2.T6 Using MATLAB/Octave for QR

Prerequisites:

Learning Goals:
3.2.G1 Be able to show whether a basis of vectors is orthonormal, and expand a vector in such a basis.
3.2.G2 Be able to recognize when a matrix is orthogonal and when its columns form an orthonormal set.
3.2.G3 Apply the Gram-Schmidt procedure to create a set of orthonormal vectors from any collection of linearly independent vectors.
3.2.G4 Use the Gram-Schmidt procedure to perform the QR factorization of a matrix.
3.2.G5 Use the MATLAB/Octave command qr to find the QR factorization.

3.3 Complex inner product

Topics:
3.3.T1 Complex numbers (review)
3.3.T2 Complex inner product and unitary matrices

Prerequisites:
... how much of complex numbers are the students supposed to know already? ...

Learning Goals:
3.3.G1 Recognize the real and imaginary parts of a complex number, be able to compute the modulus and the complex conjugate of a complex number, and perform basic algebra with complex numbers (addition, multiplication, fraction simplification).
3.3.G2 Know the relationship between the complex exponential and the unit circle in the complex plane; be able to compute the sum, derivative and anti-derivative of complex exponentials.
3.3.G3 Use the inner product of two complex vectors to calculate the norm of a complex vector; know the definitions of the adjoint of a matrix and of a unitary matrix.
3.3.G4 Be able to enter complex matrices in MATLAB/Octave and use the MATLAB/Octave commands real, imag, conj, abs, exp.

3.4 Fourier series

Topics:
3.4.T1 Complex form
3.4.T2 Inner product for a space of functions
3.4.T3 An orthonormal basis
3.4.T4 An example
3.4.T5 Parseval's formula

Prerequisites:

Learning Goals:
3.4.G1 Know the real and complex forms of the expansion of a function in a Fourier series.
3.4.G2 Interpret the Fourier series of a function as the expansion of a vector in an orthonormal basis.
3.4.G3 Be able to compute the Fourier coefficients for a given function and determine how well the series approximates the function.
3.4.G4 Derive Parseval's formula.
3.4.G5 Use MATLAB/Octave scripts to plot partial sums of a Fourier series.

3.5 The Discrete Fourier Transform

Topics:
3.5.T1 Definition
3.5.T2 The Fast Fourier Transform

Prerequisites:

Learning Goals:
3.5.G1 Know the definition of the discrete Fourier transform of a discrete set of points, and derive Parseval's formula for the discrete case.
3.5.G2 Know the FFT algorithm.
[There isn't really a substantial application for the FFT in the course. Maybe something on signal processing?]

4. Eigenvalues and Eigenvectors

4.1 Eigenvalues and Eigenvectors

Topics:
4.1.T1 Definition
4.1.T2 Standard procedure
4.1.T3 Example 1
4.1.T4 Example 2
4.1.T5 Example 3
4.1.T6 Example 4
4.1.T7 A basis of eigenvectors
4.1.T8 When there are not enough eigenvectors
4.1.T9 Diagonalization
4.1.T10 Jordan canonical form
4.1.T11 Eigenvalues, determinant and trace
4.1.T12 Powers of a diagonalizable matrix

Prerequisites:
4.1.P1 How to find the roots of a polynomial.
... some overlap with MATH 152 ...

Learning Goals:
4.1.G1 Know the definition of eigenvalues and eigenvectors and be able to compute them.
4.1.G2 Use the MATLAB/Octave commands poly, roots, eig.
4.1.G3 Understand the multiplicities of eigenvalues and show when the eigenvectors form a basis.
4.1.G4 Use eigenvalues and eigenvectors to diagonalize a matrix and to find the Jordan canonical form of a non-diagonalizable matrix.
4.1.G5 Know the relationship between the eigenvalues and the determinant of a matrix.
4.1.G6 Know the definition of the trace of a matrix and the relationship between eigenvalues and trace for diagonalizable matrices.
4.1.G7 Use eigenvalues to compute powers of a diagonalizable matrix.

4.2 Power Method for Computing Eigenvalues

Topics:
4.2.T1 Eigenvalues of real symmetric matrices
4.2.T2 The power method

Prerequisites:

Learning Goals:
4.2.G1 Know the properties of the eigenvalues of real symmetric matrices.
4.2.G2 Know the definition of a Hermitian matrix.
4.2.G3 Be able to find a single eigenvalue/eigenvector pair using the power method.

4.3 Recursion Relations

Topics:
4.3.T1 Fibonacci numbers
4.3.T2 Other recursion relations

Prerequisites:

Learning Goals:
4.3.G1 Use matrix equations to solve a recurrence relation.
4.3.G2 Be able to compute Fibonacci numbers.

4.4 The Anderson Tight Binding Model

Topics:
4.4.T1 Description of the model
4.4.T2 Recursion relation
4.4.T3 A potential with most values zero
4.4.T4 Conduction bands for a crystal

Prerequisites:

Learning Goals:
4.4.G1 Understand how to discretize the Schrodinger equation for a single electron moving in a one-dimensional semi-infinite crystal.
4.4.G2 Given the potentials, compute the energies for which a bound state exists and identify the conduction bands of the crystal.

4.5 Markov Chains

Topics:
4.5.T1 Random walk
4.5.T2 Properties of stochastic matrices
4.5.T3 (1) P preserves state vectors
4.5.T4 (2) P has an eigenvalue \lambda_1 = 1
4.5.T5 (4) Other eigenvalues of P have modulus \le 1
4.5.T6 (3) The eigenvector v_1 (or some multiple of it) has all non-negative entries
4.5.T7 (3) and (4) vs.
(3') and (4') when P^n has all positive entries
4.5.T8 Google PageRank

Prerequisites:
... co-ordinate with MATH 152 ...

Learning Goals:
4.5.G1 Be able to model random walks (Markov chains) in terms of stochastic matrices and state vectors.
4.5.G2 Know the properties of a stochastic matrix and determine the convergence properties of state vectors for a random walk.
4.5.G3 Apply random walk theory to real-life problems such as the Google PageRank process.

4.6 Principal Coordinates Analysis (PCA)

Topics:
4.6.T1 Introduction
4.6.T2 Definitions and useful properties
4.6.T3 Reconstructing points in R^p
4.6.T4 Reconstructing two points in R^p
4.6.T5 Reconstructing three points in R^p
4.6.T6 An arbitrary number of points in R^p
4.6.T7 Example
4.6.T8 The dimension of the plot
4.6.T9 Real examples

Prerequisites:

Learning Goals:
4.6.G1 Understand the use of Principal Coordinates Analysis to represent objects described by some dissimilarity relations as points in a space of suitably chosen dimension.
4.6.G2 Apply PCA in some real-life situations.
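The reconstruction in 4.6.G1 can be sketched in MATLAB/Octave using the standard double-centering trick; the squared-distance matrix below is an arbitrary example, chosen only to make the computation concrete:

```
% squared-distance (dissimilarity) matrix between 3 objects (example data)
D2 = [0 1 4;
      1 0 1;
      4 1 0];

n = size(D2,1);
J = eye(n) - ones(n)/n;      % centering matrix
B = -J*D2*J/2;               % double-centered Gram matrix

[V, E] = eig(B);             % eigenvectors and eigenvalues of B
[e, idx] = sort(diag(E), 'descend');
V = V(:, idx);

k = sum(e > 1e-10);          % number of significant (positive) eigenvalues
X = V(:,1:k) * diag(sqrt(e(1:k)));   % coordinates of the points in R^k
```

The rows of X are points whose pairwise squared distances reproduce D2, and k is "the dimension of the plot" from topic 4.6.T8.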