UBC Submodular Optimization CPSC 536S, Winter term 1, 2023

Course Outline

Submodular set functions have played a central role in the development of combinatorial optimization and can be viewed as the discrete analogue of convex functions. Submodularity has also been a useful model in areas such as economics and supply chain management, and more recently in algorithmic game theory and machine learning. There has been a large body of recent work on approximation algorithms for various constrained submodular optimization models arising in practice, perhaps most prominently the social welfare maximization problem. We develop the basic properties of submodular functions and then present both classical methods and recent trends. Topics include: algorithms for unconstrained submodular maximization and minimization, polymatroids, local greedy algorithms, multilinear extensions and pipage rounding, the Lovász extension and convex minimization, matroid constraints, multi-agent optimization, and many applications.
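As a quick illustration of what submodularity means, the following sketch (not part of the course materials; the small coverage function and the ground set used here are made up for illustration) checks the diminishing-returns form of submodularity, f(S ∪ {e}) − f(S) ≥ f(T ∪ {e}) − f(T) for all S ⊆ T and e ∉ T, on a toy coverage function.

```python
# Illustrative sketch only: check diminishing returns for a small coverage function
# f(S) = |union of the sets indexed by S|. The ground set below is hypothetical.
from itertools import chain, combinations

universe_sets = {
    'a': {1, 2, 3},
    'b': {3, 4},
    'c': {4, 5, 6},
    'd': {1, 6},
}

def f(S):
    """Coverage function: number of points covered by the sets indexed by S."""
    return len(set().union(*(universe_sets[e] for e in S))) if S else 0

def powerset(items):
    items = list(items)
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

# Verify f(S + e) - f(S) >= f(T + e) - f(T) for every S ⊆ T and e outside T.
ground = set(universe_sets)
submodular = all(
    f(set(S) | {e}) - f(set(S)) >= f(set(T) | {e}) - f(set(T))
    for T in powerset(ground)
    for S in powerset(T)
    for e in ground - set(T)
)
print("coverage function is submodular:", submodular)  # prints True
```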

The course will present a number of fundamental optimization problems associated with submodular objective functions and constraints. For each model we either present an algorithm or an inapproximability bound (or both). The course is theoretically focused in that our objective is usually to prove guarantees on the existence (or non-existence) of good algorithms. Hence a reasonable mathematical background (say in linear algebra) is recommended.

Instructor

Bruce Shepherd
Email: fbrucesh@cs.ubc.ca
Webpage: bshepherd.ca
Office hours: Thursday 14:00-15:00 (ICICS X839), or email to set up an appointment at another time.

Lectures

Time: Monday and Wednesday, 13:00-14:30
Location: Ponderosa Commons 1011

Teaching Assistant

Abner Turkieltaub
Office hours:

Grading

The course grade is based on: 4 assignments (15% each), 3 quizzes (10% each), and class participation (10%).

Quiz Dates

All quizzes are in class, tentatively on the last class day of September, October, and November. The first quiz is now scheduled for Wednesday, October 4.

Reading

There is no course textbook. The following may be useful as supporting material on optimization in general.

Course materials are shared only on Piazza and Gradescope. The course has similar content to the previous (2017) offering, although some topics were dropped (notably DR-submodularity and continuous submodularity) and some new lectures were added (notably the Fujishige-Wolfe algorithm).