Dr. Bogdan Savchynskyy, WiSe 2025/26
Summary
This lecture will be offered as a block-course from September 29 till October 10, 2025.
This lecture replaces the previously offered course "Optimization for machine learning".
This lecture belongs to the Master in Physics (specialization Computational Physics, code "MVSpec"), Master in Data and Computer Science, Applied Informatics, and Master in Mathematics programs, but it is also open to students of Scientific Computing and anyone else interested.
The course is devoted to combinatorial optimization, which includes, but is not limited to, algorithms on graphs, integer linear programming, pseudo-Boolean optimization, matroids, and submodularity.
A distinctive feature of this course is its motivation by machine learning applications, which shifts the accent of optimization from attaining an optimal solution to obtaining a sufficiently accurate solution very fast. The reason for this shift is the complexity of the models used in modern artificial-intelligence-related fields and the lesson they teach us: better results are attained more easily with more accurate models than with more accurate optimization.
To build an accurate model of a problem, the model must be learnable. To be usable in learning pipelines, combinatorial algorithms must be fast. And to attain the best practical results, the algorithms must be sufficiently accurate.
Fast, sufficiently accurate, and learnable algorithms are the three aspects we address in this lecture.
The selection of material for the course stems from the lecturer's experience with combinatorial optimization in computer vision and machine learning.
The goals of the course are to:
- competently model real-life combinatorial problems and use existing software packages to solve them;
- learn typical combinatorial optimization techniques and acquire a sufficient background for an independent literature search;
- understand how scalable, fast, and accurate combinatorial solvers are built;
- understand the basics of convex analysis, convex optimization, convex duality theory, and (integer) linear programs and their geometry;
- learn the cutting-edge results in the area of learning combinatorial solvers, i.e., estimating cost vectors and problem structure from training data.
Schedule and Information
The lectures and exercises will be given in English.
Venue:
All lectures and exercises will take place in Mathematikon B (Berliner Str. 43), SR B128. Enter through the door at the side of Berlinerstrasse and ring the doorbell labelled "HCI am IWR" to be let in. The lecture room is on the 3rd floor.
Lecture schedule:
- Lectures: Mon–Fri, 9:00 – 16:00, block-course, from September 29 till October 10, 2025
- Exercises: Tue, 11:00 – 13:00, weekly in WiSe 25/26; the first exercise will take place on October 14, 2025
Contact: Dr. Bogdan Savchynskyy, Sebastian Stricker.
If you contact us via email, the subject should contain the tag [ACO]. Emails without this tag have a very high chance of getting lost and therefore being ignored!
The seminar Optimization in Machine Learning and Vision complements this lecture by taking a closer look at recent results and developments. We highly recommend it to all students interested in the topic.
Registration
Please register for the course in Müsli.
Course Material and Exercises
Will be uploaded to HeiBox.
Table of Contents
- Linear and integer linear programs and their geometry: Convexity, polyhedra, LP relaxation (see the illustrative sketch after this list).
- Lifting of variables: Quadratic-to-linear problem transform, Sherali-Adams method.
- Lagrange duality: Subgradient, optimality conditions, relation to LP relaxation, reduced costs.
- Systematic exact combinatorial methods: Branching and cutting.
- Scalable dual techniques: Non-smooth first order methods, smoothing, primal-dual algorithm.
- Greedy algorithms and their (sub-)optimality.
- Quadratic pseudo-Boolean optimization: Algorithms, applications, submodularity.
- Scalable primal heuristics: Greedy generation, local search and optimal recombination. Memetic algorithms.
- Learning parameters of combinatorial problems from training data: Bayesian optimization, Black-box differentiation and recent advances in the literature.
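To give a flavour of the first topic, here is a minimal sketch, assuming Python with SciPy >= 1.9 (an illustration only, not part of the course material): it models a toy integer linear program with an existing software package and compares it with its LP relaxation. The fractional LP optimum (1.5) strictly exceeds the best integral value (1), illustrating the integrality gap that relaxation-based methods have to cope with.

```python
# Minimal sketch (assumes SciPy >= 1.9; illustrative, not course material):
# the toy ILP   max x1 + x2   s.t.  2*x1 + 2*x2 <= 3,  x in {0, 1}^2,
# and its LP relaxation, where x is only required to lie in [0, 1]^2.
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, linprog, milp

c = np.array([-1.0, -1.0])      # SciPy minimizes, so negate the objective
A = np.array([[2.0, 2.0]])      # single inequality constraint 2*x1 + 2*x2 <= 3

# LP relaxation: continuous variables in [0, 1].
lp = linprog(c, A_ub=A, b_ub=[3.0], bounds=[(0.0, 1.0)] * 2)

# Integer program: the same model with both variables declared integral.
ilp = milp(c,
           constraints=LinearConstraint(A, ub=3.0),
           bounds=Bounds(lb=0.0, ub=1.0),
           integrality=np.ones(2))

print("LP relaxation:", lp.x, "value", -lp.fun)    # fractional vertex, value 1.5
print("ILP optimum:  ", ilp.x, "value", -ilp.fun)  # integral solution, value 1.0
```

The systematic exact methods listed above (branching and cutting) are precisely the tools general-purpose integer programming solvers use to close such gaps.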