CIS 515, Spring 2012
Brief description:
The goal of this course is to provide firm foundations in linear algebra and
optimization techniques that will enable students to analyze and solve problems
arising in various areas of computer science, especially computer vision,
robotics, machine learning, computer graphics, embedded systems, and
market engineering and systems.
Students will acquire a firm theoretical knowledge of these concepts and tools,
and will also learn how to use them in practice by tackling judiciously chosen
projects (from computer vision, etc.).
This course will serve as a basis for more advanced courses in computer vision,
convex optimization, machine learning, robotics, computer graphics,
embedded systems, and market engineering and systems.
Topics covered include:
- Fundamentals of linear algebra: basic concepts; solving linear systems;
  eigenvalues and eigenvectors; introduction to the finite elements method;
  singular value decomposition, pseudo-inverses, PCA.
- Basics of quadratic optimization; the Rayleigh-Ritz ratio.
- Basics of optimization: review of analysis (derivatives, gradient, Hessian,
  Lagrange multipliers).
Syllabus:
(*) means: if time permits.
- Linear Algebra
  - Basics
    - Motivations: Linear independence, rank
    - Vector Spaces
    - Linear independence, Bases
    - Linear maps
    - Image (range) and Kernel (nullspace), rank
    - Linear maps and Matrices
    - Affine maps
    - Direct products, sums and direct sums
    - The dual space E^* and linear forms
    - Hyperplanes and linear forms
    - Transpose of a linear map and of a matrix
    - The four fundamental subspaces
  - Determinants
    - Permutations, signature of a permutation
    - Alternating multilinear maps
    - Definition of a determinant
    - Inverse matrices and determinants
    - Systems of linear equations and determinants
    - Determinant of a linear map
    - The Cayley-Hamilton theorem
  - Solving Linear Systems
    - Gaussian Elimination
    - LU factorization
    - Gaussian elimination of tridiagonal systems
    - Cholesky Decomposition
  - Vector Norms and Matrix Norms
    - Normed vector spaces
    - Matrix norms
    - Condition numbers of matrices
  - Iterative Methods for Solving Linear Systems
    - Convergence of Sequences of Vectors and Matrices
    - Convergence of Iterative Methods
    - Description of the Methods of Jacobi, Gauss-Seidel, and Relaxation
    - Convergence of the Methods of Jacobi, Gauss-Seidel, and Relaxation
  - Euclidean Spaces
    - Inner products
    - Orthogonality, duality
    - Adjoint of a linear map
    - Linear isometries and Orthogonal Matrices
    - The orthogonal group
    - QR-Decomposition for invertible matrices
    - Orthogonal reflections
    - QR-Decomposition using Householder matrices
  - Hermitian Spaces
    - Hermitian spaces, pre-Hilbert spaces
    - Orthogonality, duality, adjoint of a linear map
    - Linear isometries and unitary matrices
    - The unitary group
  - Eigenvalues and Eigenvectors
    - Basic Definitions
    - Algebraic and Geometric multiplicity
    - Reduction to upper triangular form; the Schur decomposition
  - Spectral Theorems
    - Symmetric matrices, Normal matrices
    - Self-adjoint and other special linear maps
    - Normal and other special matrices
  - Introduction to the Finite Elements Method (Ritz, Galerkin, Rayleigh-Ritz)
    - A one-dimensional problem: bending of a beam
    - A two-dimensional problem: an elastic membrane
    - Time-dependent boundary problems: the wave equation
  - Singular Value Decomposition (SVD) and Polar Form
    - Symmetric Positive Definite Matrices
    - Symmetric Positive Semi-Definite Matrices
    - SVD for square matrices
    - SVD for rectangular matrices
Applications of SVD and Pseudo-Inverses
-
Least squares problems and the Pseudo-Inverse
-
Data compression and SVD
-
The Rayleigh-Ritz ratio and the Courant-Fisher theorem
-
Principal component analysis (PCA)
-
Best affine approximation
-
Basic Quadratic Optimization
-
Minimizing transpos{x} A x + 2 transpos{x} b,
where A is a positive definite symmetric matrix.
-
(*) Minimizing transpos{x} A x + 2 transpos{x} b,
where A is a positive semidefinite symmetric matrix
using a pseudo-inverse
-
(*) Minimizing transpos{x} A x on the unit sphere
subject to linear constraints
-
(*)The Schur complement
-
(*) Applications to computer vision (finding contours)
- Basics of Optimization
  - Review of Analysis
    - First and second derivative of a function
    - Gradient, Hessian
  - Extrema of real functions
  - Lagrangians and Lagrange multipliers
  - Extrema of real functions: using second derivatives
  - Extrema of real functions: using convexity
  - Newton's Method
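As referenced in the Basic Quadratic Optimization item above, here is a minimal
worked sketch of the positive definite case, using only a standard
completion-of-the-square argument (x^T denotes the transpose of x and A^+ the
pseudo-inverse of A):

  For A a symmetric positive definite matrix,
    f(x) = x^T A x + 2 x^T b
         = (x + A^{-1} b)^T A (x + A^{-1} b) - b^T A^{-1} b,
  so the first term is nonnegative and vanishes exactly when x = -A^{-1} b;
  hence f has the unique minimizer x = -A^{-1} b, with minimum value
  -b^T A^{-1} b. When A is only positive semidefinite and b lies in the range
  of A, the minimizers are the solutions of A x = -b, and x = -A^+ b is the
  one of smallest norm; if b is not in the range of A, f is unbounded below.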