EE546/STAT593C  Sparse Representations: Theory, Algorithms, and Applications
In the last few years, a set of methods developed with contributions from
statistics, signal processing, mathematics and computer science has led to
spectacular results in data acquisition, analysis, and modeling. This set
of methods deals with "sparse representations": models of data that have very
large numbers of parameters, features, or variables, of which only a few are
active or relevant. Recent research shows that such models
can be estimated and analyzed with surprising efficiency.
The course will cover the mathematical and statistical theory
underlying sparse representations, the relevant algorithms, and some
of the many possible extensions of these ideas to fields as varied as:
compressed sensing, signal processing, machine learning, control, and
streaming data processing.
Announcements
 5/11: Project presentations scheduled for May 25, 27 and June 1, 3. If you don't have a date scheduled
please email Maryam. Final project reports are due by Sunday June 6th.
 4/27: Project proposals due Tues May 4th. We'll have a projects "discussion session" on Monday May 3rd,
12:30pm, Room EE 403 (EE bldg).
 4/26: There'll be no office hours on Monday 4/26 (due to travel).
 4/10: Papers for week 3 posted.
 4/8: Notes for lectures 2 and 3 posted.
 4/3: Papers for week 2 posted.
 3/31: Papers for the second class are posted.
Basic course information
Credit: 2-3 units
Lectures: Tuesdays and Thursdays, 4:30-5:50pm, EE Building room 042.
Instructors:
Maryam Fazel
and
Marina Meila
Maryam's office: Paul Allen Center, Room CSE 230. Phone: (206) 616-4781.
Office hours: Mondays 1:30-2:30pm, or by appointment.
Marina's office: Padelford B321. Phone: (206) 543-8484.
Office hours: by appointment.
Prerequisites: Convex optimization (LP, SDP, duality), probability theory, linear algebra, statistics.
Course requirements:
 Homework, class participation: Some assignments will be posed in class throughout the quarter;
students can volunteer to present their answers in the following class.
 Final project and presentation: a 30-minute presentation in the last two weeks of class
(before finals week) and a written report. Project proposals are due at the beginning of May.
Grading: final project/presentation, class participation (weights TBD, project will have a large weight).
Lectures and readings
Slides, papers, and other reading material for lectures will be posted here. Please try to read the papers before each class.
 Introduction and Overview. Slides (pdf)
 The Johnson-Lindenstrauss Lemma and the Restricted Isometry Property (RIP).
Papers:
Dasgupta and Gupta, An Elementary Proof of a Theorem of Johnson and Lindenstrauss
(pdf)
Baraniuk et al., A Simple Proof of the Restricted Isometry Property for Random Matrices
(pdf)
Lecture notes 2
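The Dasgupta-Gupta result is easy to check numerically: a scaled Gaussian random matrix preserves pairwise distances up to small distortion. A minimal numpy sketch (the dimensions, seed, and target distortion below are illustrative choices, not taken from the papers):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 20, 1000, 200                 # n points in R^d, projected to R^k

X = rng.standard_normal((n, d))         # rows are the data points
Phi = rng.standard_normal((k, d)) / np.sqrt(k)   # scaled Gaussian projection
Y = X @ Phi.T                           # projected points, one per row

def pdists(Z):
    # pairwise Euclidean distances between the rows of Z
    G = np.sum(Z**2, axis=1)
    D2 = G[:, None] + G[None, :] - 2 * Z @ Z.T
    return np.sqrt(np.maximum(D2, 0.0))

iu = np.triu_indices(n, 1)              # distinct pairs only
ratio = pdists(Y)[iu] / pdists(X)[iu]
print(ratio.min(), ratio.max())         # both near 1 for k ~ log(n)/eps^2
```

Larger k tightens the distortion at the cost of dimension, matching the lemma's k = O(log(n)/eps^2) trade-off.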
 Sparse vector recovery by l1 minimization: RIP-1, expanders
Papers:
Berinde et al., Combining geometry and combinatorics: A unified approach to sparse signal recovery
(pdf), Theorems 1, 2, and 3 only (pages 6-10).
Lecture notes 3
 Sparse recovery by l1 minimization: RIP, nullspace conditions
Candes, The Restricted Isometry Property and Its Implications for Compressed Sensing
(pdf)
Candes et al, Stable Signal Recovery from Incomplete and Inaccurate Measurements
(pdf)
Zhang, Theory of Compressive Sensing via l1 minimization: a non-RIP Analysis and Extensions
(pdf) (sections 2,3, Theorem 4.2)
Lecture notes 4
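Basis pursuit, min ||x||_1 subject to Ax = b, can be posed as a linear program by splitting x into positive and negative parts, which makes a small recovery experiment easy to run. A sketch with scipy (the problem sizes, sparsity level, and seed are arbitrary illustrative choices; recovery is typically exact for sufficiently sparse x0):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n, k = 40, 100, 3                    # measurements, ambient dim, sparsity

A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x0

# Split x = xp - xm with xp, xm >= 0; then ||x||_1 = 1'(xp + xm)
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]
print(np.max(np.abs(x_hat - x0)))       # typically ~0 up to solver tolerance
```

The LP view is what the nullspace conditions above analyze: recovery succeeds exactly when no vector in the nullspace of A has too much of its l1 mass on k coordinates.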
 Algorithms (and algorithmic ideas) for sparse vector recovery
Overview: Tropp, Wright, Computational Methods for Sparse Solution of Linear Inverse Problems
(pdf)
Subgradient methods: Stephen Boyd, Course notes for EE364b, Stanford University:
Subgradients,
Subgradient algorithms
Greedy methods:
Needell, Tropp, CoSaMP: Iterative Signal Recovery from Incomplete and Inaccurate
Samples (pdf)
Tropp, Gilbert, Signal Recovery From Random Measurements
Via Orthogonal Matching Pursuit (pdf)
Lecture notes 9
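The greedy idea behind Tropp-Gilbert's OMP fits in a few lines: repeatedly pick the column most correlated with the residual, then re-fit by least squares on the support found so far. A minimal sketch (the ±1 spike model, sizes, and seed are illustrative assumptions, chosen so greedy recovery succeeds with high probability):

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily pick the column of A most
    correlated with the residual, then least-squares re-fit on the support."""
    support, r = [], b.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(2)
m, n, k = 150, 200, 3
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
b = A @ x0

x_hat = omp(A, b, k)
print(np.max(np.abs(x_hat - x0)))
```

CoSaMP refines this template by selecting several columns per iteration and pruning the support, which is what gives its RIP-based guarantees.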
 Matrix rank problems: Low-rank matrix recovery
Recht, Fazel, Parrilo, Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear
Norm Minimization (pdf)
Candes, Plan, Tight Oracle Bounds for Low-rank Matrix Recovery
from a Minimal Number of Random Measurements (pdf) (Theorem 2.3)
Background (matrix norms and subgradients): Lewis, The convex analysis of unitarily invariant matrix functions,
(pdf)
 Low-rank Matrix Completion problem:
Recht, A Simpler Approach to Matrix Completion (pdf)
A nice note about the concentration inequality used (the noncommutative Bernstein inequality, due to Ahlswede and Winter)
is here.
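A quick way to get a feel for nuclear-norm-based completion is a soft-impute-style iteration built on singular value soft-thresholding. This is a simplified sketch, not the algorithm analyzed in the papers above; the matrix, observation rate, threshold tau, and iteration count are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)

M = np.outer(np.arange(1.0, 6.0), np.arange(1.0, 11.0))   # 5x10, rank 1
mask = rng.random(M.shape) < 0.8                          # ~80% entries observed

def svt(Y, tau):
    # singular value soft-thresholding: shrink every singular value by tau
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

X = np.zeros_like(M)
for _ in range(500):
    Y = np.where(mask, M, X)     # keep observed entries, impute the rest
    X = svt(Y, tau=0.5)          # shrink toward low rank

rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(rel_err)
```

The shrinkage step is the proximal operator of the nuclear norm, the same object that underlies SVT-type methods listed later.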
 Low-rank + Sparse decompositions (and robust PCA):
Chandrasekaran, et al, Rank-sparsity Incoherence for Matrix Decomposition
(pdf), and short conference version:
(pdf)
Candes, et al, Robust Principal Component Analysis?
(pdf)
Application in graphical models
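The low-rank + sparse split can be illustrated with a naive alternating-shrinkage scheme: soft-threshold singular values to update L, then soft-threshold entries to update S. This is not the augmented Lagrangian method of the papers above, just a toy sketch whose sizes, corruption model, and thresholds are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30

L0 = np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank 1
S0 = np.zeros((n, n))
corrupt = rng.random((n, n)) < 0.05                 # ~5% gross corruptions
S0[corrupt] = 10 * rng.standard_normal(corrupt.sum())
M = L0 + S0

def svt(Y, tau):
    # shrink singular values (proximal step for the nuclear norm)
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(Y, lam):
    # entrywise shrinkage (proximal step for the l1 norm)
    return np.sign(Y) * np.maximum(np.abs(Y) - lam, 0.0)

L = np.zeros_like(M)
S = np.zeros_like(M)
for _ in range(200):
    L = svt(M - S, tau=1.0)
    S = soft(M - L, lam=0.5)

print(np.linalg.norm(L - L0) / np.linalg.norm(L0))  # approximate recovery of L
```

Each update is a proximal step, so the loop is block coordinate descent on a relaxed principal component pursuit objective; the ALM algorithm in Candes et al. enforces M = L + S exactly via multipliers instead.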
 Algorithms for low-rank matrix problems (many algorithms, here's a sample):
Convex relaxation: interior-point methods; first-order methods (e.g., SVT, FPCA)
"Greedy" methods: ADMiRA, SVP
Special case methods: OptSpace (for matrix completion)
Related methods: weighted nuclear norm minimization
 A few last topics, and wrap up...
Message-passing algorithms for compressed sensing: Chandar, Shah, Wornell,
A Simple Message-Passing Algorithm for Compressed Sensing
(pdf)
Donoho, Maleki, Montanari, Message-passing algorithms for compressed sensing
(pdf), and this version.
Some upcoming topics/papers:
 Matrix rank minimization: Low-rank recovery, Matrix completion, Low-rank + Sparse decomposition:
Candes, Recht, Exact Matrix Completion via Convex Optimization.
(pdf)
Dvijotham, Fazel, A Nullspace Analysis of the Nuclear Norm Heuristic for Rank Minimization.
(pdf)
Student Presentations Schedule
Presentations should be about 25 minutes, with 5 minutes for questions. They will be in our usual classroom
and time slot, 4:30-6pm (or until 6:30pm on the 25th). Final written reports are due on Sunday June 6th.
Talk slides will be posted here.
Tuesday 5/25: Dennis Meng (Cardinality and convex envelopes);
Hoyt Koepke (Compressed Sensing and Gravimetric Inversion);
Alfred Gui (Video Inpainting using Hankel rank minimization);
Sasha Aravkin (Compressed sensing with applications to wave computation)
Thursday 5/27: Ting Kei Pong (Alternating Directions Methods for l1 minimization);
Mikala Johnson (Exploring Singular Value Projection);
Kristi Tsukida (Robust PCA and applications in genetics)
Tuesday 6/1: Andrew Clark (Channel sensing and encryption keys);
Paramjit Sandhu;
Phillip Lee (Connections between channel coding and compressed sensing)
Thursday 6/3: Mark Contois (Principal Component Pursuit for document indexing and search);
Karthik Mohan (Iterative Reweighted Least Squares for matrix rank minimization);
Amol Kapila (DNA microarrays using compressed sensing)
Links
General resources:
References on Low-Rank Matrix Recovery and Completion via Convex Optimization at the University of Illinois (a good and up-to-date
list of papers on low-rank matrix problems)
Compressed sensing resources at Rice University (good listing of papers)
Nuit Blanche, a compressed sensing blog
Some related applications:
Face Recognition via Sparse Representation (Yi Ma's lab at UIUC)
Video inpainting using matrix rank minimization (Mario Sznaier's lab at Northeastern)
