Dr. Bogdan Savchynskyy, WiSe 2024/25
Summary
This lecture replaces the previously offered "Optimization for machine learning".
This lecture belongs to the Master in Physics (specialization Computational Physics, code "MVSpec"), the Masters of Data and Computer Science, Applied Informatics, and the Master Mathematics programs, but is also open to students of Scientific Computing and anyone else interested.
The course is devoted to combinatorial optimization, which includes, but is not limited to, algorithms on graphs, integer linear programming, pseudo-Boolean optimization, matroids, and submodularity.
A distinctive feature of this course is its motivation by machine learning applications, which shifts the emphasis of optimization from attaining an optimal solution of a problem to obtaining a sufficiently accurate solution very fast. The reason for this shift is the complexity of the models used in modern artificial-intelligence-related fields and the lesson they teach us: better results are attained more easily with more accurate models than with more accurate optimization.
To build an accurate problem model, the model must be learnable. To be usable in learning pipelines, combinatorial algorithms must be fast. To attain the best practical results, the algorithms must be sufficiently accurate.
Fast, sufficiently accurate, and learnable algorithms are the three aspects we address in this lecture.
The selection of material for the course stems from the lecturer's experience with combinatorial optimization in computer vision and machine learning.
The goals of the course:
- competently model real-life combinatorial problems and use existing software packages to solve them;
- learn typical combinatorial optimization techniques and acquire sufficient background for an independent literature search;
- learn how scalable, fast, and accurate combinatorial solvers are built;
- understand the basics of convex analysis, convex optimization, convex duality theory, and (integer) linear programs and their geometry;
- learn the cutting-edge results in the area of learning combinatorial solvers, i.e., estimating cost vectors and problem structure from a training set.
Schedule and Information
The lectures and exercises will be given in English.
Venue:
All lectures and exercises will take place in Mathematikon B (Berliner Str. 43), SR B128. Enter through the door at the side of Berliner Strasse and ring the doorbell labelled "HCI am IWR" to be let in. The seminar room is on the 3rd floor.
Lecture schedule:
- Lecture: Mon, 11:00 – 13:00
- Lecture: Thu, 11:00 – 13:00; the first lecture will be given on October 14
- Exercises: Tue, 11:00 – 13:00; the first exercise will take place on October 22
Contact: Dr. Bogdan Savchynskyy, Sebastian Stricker (email addresses available in Müsli)
If you contact me via email, the subject line should contain the tag [ACO]. Emails without this tag have a very high chance of being lost and therefore ignored!
The seminar Combinatorial Optimization and Applications complements this lecture by taking a closer look at recent results and developments. We highly recommend it to all students interested in the topic.
Registration
Please register for the course in Müsli.
Course Material and Exercises
Course materials and exercises are regularly uploaded to HeiBox.
Errata for the script can be uploaded to https://heibox.uni-heidelberg.de/d/8bfde51d0c6e4b1d9b81/; the password is the same as for the general HeiBox repository.
Table of Contents
- Linear and integer linear programs and their geometry: Convexity, polyhedra, LP relaxation (a short illustration follows this list).
- Lifting of variables: Quadratic-to-linear problem transformation, Sherali–Adams hierarchy.
- Lagrange duality: Subgradient, optimality conditions, relation to LP relaxation, reduced costs.
- Systematic exact combinatorial methods: Branching and cutting.
- Scalable dual techniques: Non-smooth first order methods, smoothing, primal-dual algorithm.
- Greedy algorithms: (Sub-)Optimality, matroids.
- Quadratic pseudo-Boolean optimization: Algorithms, applications, submodularity.
- Scalable primal heuristics: Greedy generation, local search and optimal recombination. Memetic algorithms.
- Min-cost-flow: Problem subclasses, theoretical properties and practical algorithms.
- Learning parameters of combinatorial problems from training data: Black-box differentiation and recent advances in the literature.
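To give a flavour of the first topic, here is a minimal sketch of an integer linear program and its LP relaxation on a toy 0/1 knapsack instance. This is purely illustrative and not part of the course material: it assumes Python with SciPy ≥ 1.9, and the item values and weights are made up.

```python
# A minimal sketch: a 0/1 knapsack ILP and its LP relaxation.
# Assumes SciPy >= 1.9 (for scipy.optimize.milp); all numbers are illustrative.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds

values = np.array([10.0, 13.0, 7.0])     # item values
weights = np.array([[4.0, 5.0, 3.0]])    # item weights (single capacity constraint)
capacity = np.array([8.0])

# LP relaxation: x in [0,1]^3. SciPy minimizes, so we negate the objective.
lp = linprog(-values, A_ub=weights, b_ub=capacity,
             bounds=[(0, 1)] * 3, method="highs")
print("LP relaxation optimum:", -lp.fun, "x =", lp.x)   # 20.5, fractional x

# Integer program: the same model with x restricted to {0,1}^3.
ilp = milp(-values,
           constraints=LinearConstraint(weights, ub=capacity),
           integrality=np.ones(3), bounds=Bounds(0, 1))
print("ILP optimum:", -ilp.fun, "x =", ilp.x)           # 20.0, integral x
```

The fractional optimum (20.5) exceeds the integral one (20); bounding and closing such gaps is what the relaxation, cutting, and branching topics above are about.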