Large-scale convex and nonconvex optimization

Lecturer

Panos Patrinos, KU Leuven

Schedule and place

This 15-hour course will take place in five sessions on April 8, 10, 12, 15, and 17, 2019, at KU Leuven, Room ESAT B91.300 (AULA C), Kasteelpark Arenberg 10, Heverlee.

Schedule: 3 hours per day, from 9:30 to 12:45 (including a 15-minute coffee break at 10:30).

Parking is available at Parking The Molen (see the map here), Celestijnenlaan, Leuven.
A daily access code is required for the parking:
April 8: code 5967#
April 10: code 6327#
April 12: code 5455#
April 15: code 3363#
April 17: code 2751#
The code must be entered both to enter and to leave the parking.

Travel instructions are available here.

Description

Optimization problems are ubiquitous in many engineering and science disciplines, such as machine learning, signal processing, data science, communications, control, and robotics. Optimization methods must cope with the demanding requirements of modern applications: handling large numbers of variables and constraints, being amenable to distributed or parallel implementation, and being simple enough to be embedded in hardware devices with limited storage and computational capabilities.

The purpose of the course is to introduce, derive, and analyze a wide range of classical and modern algorithms for structured, nonsmooth convex and nonconvex optimization. After a review of fundamental topics in convex analysis and duality, the course presents first-order algorithms under the unifying framework of monotone operators and fixed-point theory. Students will become familiar with essential algorithms such as the proximal gradient method, forward-backward and Douglas-Rachford splitting, the proximal point method, the augmented Lagrangian method, ADMM, and several primal-dual proximal algorithms. Examples of problem formulations and templates from various engineering disciplines will be demonstrated. Extensions of the algorithms to the nonconvex setting will also be presented. The course will also cover block-coordinate and incremental algorithms, which are nowadays very popular for optimization problems arising in machine learning and data science.
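To give a flavor of the algorithms listed above, the following is a minimal sketch of one of them, the proximal gradient method (ISTA), applied to a lasso problem. It is included purely as an illustration and is not part of the course material; the problem data, regularization weight, and step-size choice are assumptions made for this example.

```python
import numpy as np

# Proximal gradient (ISTA) sketch for the lasso problem
#   minimize 0.5*||A x - b||^2 + lam*||x||_1
# Illustrative only: data, lam, and iteration count are arbitrary choices.

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_gradient_lasso(A, b, lam, num_iters=500):
    """ISTA: gradient step on the smooth term, then a prox step on the l1 term."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)           # gradient of 0.5*||A x - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = prox_gradient_lasso(A, b, lam=0.1)
    print("nonzero entries recovered:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```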

 

Course material

To be determined.

Evaluation

To be determined.