EE546, Special Topics: Convex Optimization Algorithms

Instructor: Maryam Fazel
Spring 2016

Modern large-scale convex optimization algorithms have had an immense impact in areas including machine learning, signal processing, and engineering design. The objectives of this course are to

  • Study classes of convex optimization algorithms along with their complexity analysis
  • Discuss structural convex optimization, to develop the capability of designing customized algorithms by exploiting problem structure
  • Expose students to research frontiers in convex optimization and its applications

Acknowledgement: Course material was prepared in collaboration with Dr. Lin Xiao, Researcher at Microsoft Research, Redmond, WA.


Announcements

  • Final report due on June 8th. Use Canvas to upload your submission. The final report should be at most 8 pages.
  • Poster session for course projects: Wed June 1st, 2:30-4pm (poster set up 2-3:30pm), in Paul Allen Center Atrium. Poster boards, easels, coffee+cookies are provided.
  • HW2 due Mon, May 30. Use Canvas to access the HW files and to upload your submission. You can use the Canvas discussion board to post questions about the homework.
  • Use Canvas to submit your project proposal.
  • Mid-quarter project report will be due on Fri May 13. See details in the project section.
  • Welcome to EE 546!


Lecture notes

1. Introduction

Smooth optimization:
2. Gradient methods
3. Optimal gradient methods

Non-smooth optimization:
4. Subgradients
5. Subgradient methods

Proximal gradient methods and smoothing:
6. Proximal mapping
7. Proximal gradient methods
8. Smoothing methods

Decomposition and splitting methods:
9. Dual decomposition and dual algorithms
10. Augmented Lagrangian, alternating direction method of multipliers (ADMM)

Interior-point methods:
11. Newton's method, self-concordant analysis
12. Interior-point methods

13. Stochastic and online optimization
14. Coordinate descent methods (not covered; here are old slides from 2014)

15. Conclusions


Homeworks

There will be two homework sets, covering some theory but focusing on implementation (in Matlab) and on insights into the algorithms discussed in class; a short illustrative sketch of such an implementation appears after the list below.

Here is a helpful Matlab tutorial, including object-oriented features: Yagtom

  • HW 1 is assigned, due Fri May 6th by midnight. Use Canvas to access the HW files and to upload your submission.
  • HW 2 will be assigned on Mon May 16 and is due on Fri May 27th.
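As a rough flavor of the kind of Matlab implementation exercise involved (not taken from the actual homework), here is a minimal sketch of a basic gradient method, the topic of lecture 2, applied to a least-squares problem; the problem data, step size, and stopping rule are illustrative placeholders.

    % Minimal sketch: gradient method for  minimize f(x) = 0.5*||A*x - b||^2
    % (illustrative data and parameters only, not from the homework)
    rng(0);
    A = randn(50, 10);
    b = randn(50, 1);

    L = norm(A)^2;        % Lipschitz constant of the gradient: largest eigenvalue of A'*A
    t = 1 / L;            % fixed step size 1/L
    x = zeros(10, 1);

    for k = 1:500
        g = A' * (A * x - b);   % gradient of f at x
        x = x - t * g;          % gradient step
        if norm(g) < 1e-6       % simple stopping criterion on the gradient norm
            break;
        end
    end

The fixed step size 1/L, with L the Lipschitz constant of the gradient, is the standard choice analyzed in the smooth-optimization lectures; the actual homeworks may ask for different problems, step-size rules, or convergence plots.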


Course project

Projects can be done individually or in groups of 2.
  • Project timeline: Proposals (2 pages) due on April 22nd, mid-quarter report due on May 13th, poster presentation on June 1st, final report due on June 8th.

  • Mid-quarter progress report: The progress report should be at most 4 pages long and describe the work you have done so far, intermediate results, and next steps.


Course information

Credit: 3 units
Lectures: Mondays and Wednesdays, 9-10:20am in EEB 042.

Instructor: Maryam Fazel, office: Paul Allen Center, Room CSE 230.

Office hours: Wednesdays 10:30-11:45am

Teaching Assistant: Reza Eghbali

Office hours: Tuesdays 2:30 pm - 4 pm, EE 431

Prerequisites: EE 578 (Convex Optimization) or Math 516 (Numerical Optimization). If you have not taken a prerequisite, you will need the instructor's permission to take the course.

Course requirements:

  • Homeworks
  • Final project: project proposal, mid-way report, presentation or poster, and final report

  • Grading: homeworks 30%, final project 65%, participation 5%.


References

The lecture notes are largely based on the following books, and on the lecture notes of Lieven Vandenberghe for EE236C at UCLA and of Stephen Boyd for EE364B at Stanford University. Note that the second, third, and fourth books are freely available online.