EE546/STAT593C - Sparse Representations: Theory, Algorithms, and Applications
In the last few years, a set of methods developed with contributions from
statistics, signal processing, mathematics and computer science has led to
spectacular results in data acquisition, analysis, and modeling. This set
of methods deals with "sparse representations": in other words, models
of data with a very large number of parameters, features, or variables, of
which only a few are active or relevant. Recent research shows that such models
can be estimated and analyzed with surprising efficiency.
The course will cover the mathematical and statistical theory
underlying sparse representations, the relevant algorithms, and some
of the many possible extensions of these ideas to fields as varied as:
compressed sensing, signal processing, machine learning, control, and
stream data processing.
- 5/11: Project presentations scheduled for May 25, 27 and June 1, 3. If you don't have a date scheduled,
please email Maryam. Final project reports are due by Sunday, June 6th.
- 4/27: Project proposals due Tues May 4th. We'll have a projects "discussion session" on Monday May 3rd,
1-2:30pm, Room EE 403 (EE bldg).
- 4/26: There'll be no office hours on Monday 4/26 (due to travel).
- 4/10: Papers for week 3 posted.
- 4/8: Notes for lectures 2 and 3 posted.
- 4/3: Papers for week 2 posted.
- 3/31: Papers for the second class are posted.
Basic course information
Credit: 2-3 units
Lectures: Tuesdays and Thursdays, 4:30-5:50pm, EE Building room 042.
Maryam's office: Paul Allen Center, Room CSE 230. Phone: (206) 616-4781.
Office hours: Mondays 1:30-2:30pm, or by appointment.
Marina's office: Padelford B-321. Phone: (206) 543-8484.
Office hours: by appointment.
Prerequisites: Convex optimization (LP, SDP, duality), probability theory, linear algebra, statistics.
- Homework, class participation: Some assignments will be posed in class throughout the quarter;
students can volunteer to present their answer in the following class.
- Final project and presentation: a 30-minute presentation in the last two weeks of class
(before finals week) and a written report. Project proposals are due at the beginning of May.
Grading: final project/presentation, class participation (weights TBD, project will have a large weight).
Lectures and readings
Slides, papers, and other reading material for lectures will be posted here. Please try to read the papers before each class.
- Introduction and Overview. Slides (pdf)
- The Johnson-Lindenstrauss Lemma and Restricted Isometry Property (RIP).
Dasgupta and Gupta, An Elementary Proof of a Theorem of Johnson and Lindenstrauss
Baraniuk et al., A Simple Proof of the Restricted Isometry Property for Random Matrices
Lecture notes 2
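The Dasgupta-Gupta proof shows that a random Gaussian projection to k = O(log n / eps^2) dimensions preserves all pairwise distances up to a factor of 1 +/- eps. A quick numerical illustration of this concentration (a NumPy sketch; the dimensions and seed here are arbitrary, not from the course materials):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300   # n points in R^d, projected down to R^k

X = rng.standard_normal((n, d))
Phi = rng.standard_normal((k, d)) / np.sqrt(k)   # scaled so E||Phi u||^2 = ||u||^2
Y = X @ Phi.T

def pairwise_sq_dists(Z):
    # Squared distances via the Gram matrix; keep the strict upper triangle
    G = Z @ Z.T
    sq = np.diag(G)[:, None] + np.diag(G)[None, :] - 2 * G
    return sq[np.triu_indices(len(Z), k=1)]

# Ratio of projected to original squared distances: concentrated near 1
ratio = pairwise_sq_dists(Y) / pairwise_sq_dists(X)
print(ratio.min(), ratio.max())
```

Shrinking k widens the spread of the ratios, matching the 1/eps^2 dependence in the lemma.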
- Sparse vector recovery by l1 minimization: RIP-1, expanders
Berinde et al., Combining geometry and combinatorics: A unified approach to sparse signal recovery
(pdf), theorems 1, 2 and 3 only (pages 6-10).
Lecture notes 3
- Sparse recovery by l1 minimization: RIP, nullspace conditions
Candes, The Restricted Isometry Property and Its Implications for Compressed Sensing
Candes et al, Stable Signal Recovery from Incomplete and Inaccurate Measurements
Zhang, Theory of Compressive Sensing via l1 minimization: a non-RIP Analysis and Extensions
(pdf) (sections 2,3, Theorem 4.2)
Lecture notes 4
- Algorithms (and algorithmic ideas) for sparse vector recovery
Overview: Tropp, Wright, Computational Methods for Sparse Solution of Linear Inverse Problems
Subgradient methods: Stephen Boyd, Course notes for EE364b, Stanford University
Needell, Tropp, CoSaMP: Iterative Signal Recovery from Incomplete and Inaccurate Samples
Tropp, Gilbert, Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit (pdf)
Lecture notes 9
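Orthogonal matching pursuit, one of the greedy methods above, repeatedly picks the column most correlated with the residual and re-fits by least squares on the chosen support. A bare-bones sketch (sizes, seed, and coefficient distribution are arbitrary choices for illustration):

```python
import numpy as np

def omp(A, y, s):
    """Orthogonal Matching Pursuit: greedily select s columns of A."""
    r, support = y.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(A.T @ r))))   # best-correlated column
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef                      # residual after re-fit
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(2)
m, n, s = 80, 200, 5
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)            # unit-norm columns
x_true = np.zeros(n)
idx = rng.choice(n, s, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], s) * (1 + rng.random(s))
y = A @ x_true
x_hat = omp(A, y, s)
print(np.linalg.norm(x_hat - x_true))     # ~ 0 if the support was found
```

Because the residual is kept orthogonal to the selected columns, no column is picked twice; CoSaMP refines this idea by selecting and pruning several columns per iteration.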
- Matrix rank problems: Low-rank matrix recovery
Recht, Fazel, Parrilo, Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization (pdf)
Candes, Plan, Tight Oracle Bounds for Low-rank Matrix Recovery from a Minimal Number of Random Measurements (pdf) (Theorem 2.3)
Background (matrix norms and subgradients): Lewis, The convex analysis of unitarily invariant matrix functions
- Low-rank Matrix Completion problem:
Recht, A Simpler Approach to Matrix Completion (pdf)
A nice note about the concentration inequality used (the non-commutative Bernstein inequality, due to Ahlswede and Winter)
- Low-rank + Sparse decompositions (and robust PCA):
Chandrasekaran, et al, Rank-sparsity Incoherence for Matrix Decomposition
(pdf), and a short conference version.
Candes, et al, Robust Principal Component Analysis?
Application in graphical models
- Algorithms for low-rank matrix problems (many algorithms, here's a sample):
Convex relaxation: interior-point methods; first-order methods (e.g., SVT, FPCA)
"Greedy" methods: AdMIRA, SVP
Special case methods: OptSpace (for matrix completion)
Related methods: weighted nuclear norm minimization
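The first-order methods above (e.g., SVT and FPCA) are built around the proximal operator of the nuclear norm, which simply soft-thresholds the singular values. A minimal sketch of that subroutine (the matrix sizes, rank, noise level, and threshold are arbitrary illustrative choices):

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the prox of tau * (nuclear norm) at X."""
    U, sig, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(sig - tau, 0.0)) @ Vt   # shrink, then rebuild

rng = np.random.default_rng(3)
L = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))   # rank-2 matrix
noisy = L + 0.01 * rng.standard_normal((30, 30))

denoised = svt(noisy, tau=1.0)
print(np.linalg.matrix_rank(denoised))   # singular values below tau are zeroed
```

Iterating this prox step with a data-fidelity gradient step gives the basic SVT/FPCA-style algorithms for the problems above.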
- A few last topics, and wrap up...
Message passing algorithms for compressed sensing: Chandar, Shah, Wornell, A Simple Message-Passing Algorithm for Compressed Sensing
Donoho, Maleki, Montanari, Message-passing algorithms for compressed sensing
(pdf), and this version.
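The AMP algorithm of Donoho, Maleki, and Montanari can be viewed as iterative soft thresholding (IST) plus an extra "Onsager" correction term in the residual update. A minimal sketch of the plain IST baseline only (step size from the spectral norm; sizes, regularization weight, and seed are arbitrary choices for illustration):

```python
import numpy as np

def soft(v, t):
    """Entrywise soft-thresholding operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=1000):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - step * (A.T @ (A @ x - y)), step * lam)
    return x

rng = np.random.default_rng(4)
m, n, s = 50, 100, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, s, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], s) * (1 + rng.random(s))
y = A @ x_true
x_hat = ista(A, y, lam=0.05)
```

AMP's correction term lets the per-iteration effective noise stay Gaussian, which is what enables its sharp asymptotic analysis; plain IST, as sketched here, converges but without that guarantee.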
Some upcoming topics/papers:
- Matrix rank minimization: Low-rank recovery, Matrix completion, Low-rank+Sparse decomposition:
Candes, Recht, Exact Matrix Completion via Convex Optimization.
Dvijotham, Fazel, A Nullspace Analysis of the Nuclear Norm Heuristic for Rank Minimization.
Student Presentations Schedule
Presentations should be about 25 minutes, with 5 minutes for questions. They will be in our usual classroom
and time slot, 4:30pm-6pm (or 6:30pm on the 25th). Final written reports are due on Sunday June 6th.
Talk slides will be posted here.
Tuesday 5/25: Dennis Meng (Cardinality and convex envelopes);
Hoyt Koepke (Compressed Sensing and Gravimetric Inversion);
Alfred Gui (Video Inpainting using Hankel rank minimization);
Sasha Aravkin (Compressed sensing with applications to wave computation)
Thursday 5/27: Ting Kei Pong (Alternating Directions Methods for l1 minimization);
Mikala Johnson (Exploring Singular Value Projection);
Kristi Tsukida (Robust PCA and applications in genetics)
Tuesday 6/1: Andrew Clark (Channel sensing and encryption keys);
Phillip Lee (Connections between channel coding and compressed sensing)
Thursday 6/3: Mark Contois (Principal Component Pursuit for document indexing and search);
Karthik Mohan (Iterative Reweighted Least Squares for matrix rank minimization);
Amol Kapila (DNA microarrays using compressed sensing)
References on Low-Rank Matrix Recovery and Completion via Convex Optimization at the University of Illinois (a good, up-to-date list of papers on low-rank matrix problems)
Compressed sensing resources at Rice University (good listing of papers)
Nuit Blanche, a compressed sensing blog
Some related applications:
Face Recognition via Sparse Representation (Yi Ma's lab at UIUC)
Video inpainting using matrix rank minimization (Mario Sznaier's lab at Northeastern)