Dr. Bogdan Savchynskyy, Prof. Dr. Carsten Rother, SoSe 2020
Summary
Machine learning techniques are tightly coupled with optimization methods. Many techniques become practical only when a suitable supporting optimization tool is available.
In the seminar we will discuss a number of recent articles on combinatorial optimization with applications in computer vision and machine learning.
The topic of this semester is
Neural Networks meet Combinatorial Optimization
In particular, we will consider methods for
- training the parameters of combinatorial optimization algorithms with machine learning techniques,
- combinatorial-optimization-based loss functions for deep learning.
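A recurring building block in the papers below is the blackbox-differentiation scheme of Vlastelica et al.: the combinatorial solver is treated as a black box in the forward pass, and an informative "gradient" for its linear costs is obtained from a single extra solver call on perturbed costs. The following is a minimal sketch in NumPy; the toy one-hot argmin solver, the loss, and the value of the hyperparameter `lam` are illustrative assumptions, not taken from the papers:

```python
import numpy as np

def solver(w):
    """Toy combinatorial solver: returns the one-hot vector y minimizing
    the linear cost <w, y> (a stand-in for e.g. a graph-matching or
    shortest-path solver)."""
    y = np.zeros_like(w)
    y[np.argmin(w)] = 1.0
    return y

def blackbox_grad(w, grad_output, lam=10.0):
    """Backward pass in the style of Vlastelica et al.: call the solver
    once more on costs perturbed by the incoming loss gradient and return
    a finite-difference direction for the costs w."""
    y = solver(w)                          # forward solution
    y_lam = solver(w + lam * grad_output)  # solution for perturbed costs
    return -(y - y_lam) / lam              # gradient w.r.t. the costs w

# A gradient step on w pulls the solver output toward the target solution.
w = np.array([3.0, 1.0, 2.0])
target = np.array([1.0, 0.0, 0.0])
grad_output = solver(w) - target   # dL/dy for L = 0.5 * ||y - target||^2
g = blackbox_grad(w, grad_output)  # gradient: [0.1, -0.1, 0.0]
```

Descending along `g` lowers the cost of the target's index and raises that of the current solution, so repeated steps drive the solver toward the target, despite the solver itself being piecewise constant.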
General Information
The seminars will be held online via HeiConf. Technical details will be provided soon via Müsli.
The first seminar will take place on Wednesday, April 22 at 16:00. Please make sure to participate!
- Seminar: Wed, 16:00 – 18:00
- Credits: 2 / 4 / 6 CP depending on course of study
Schedule
The seminars always start at 16:00 sharp.
20.05.2020
Talk Do's and Don'ts
Bogdan Savchynskyy and Lisa Kruse
10.06.2020
Vlastelica et al. - Differentiation of blackbox combinatorial solvers
Kevin Kiefer (30 minutes)
Weed - An explicit analysis of the entropic penalty in linear programming
Isabel Vinterbladh (45 minutes)
17.06.2020
Zanfir and Sminchisescu - Deep learning of graph matching
Christopher Klugmann (30 minutes)
Wang et al. - Learning combinatorial embedding networks for deep graph matching
Wenzhe Yin (45 minutes)
24.06.2020
Wang et al. - Neural graph matching network: Learning Lawler's quadratic assignment problem with extension to hypergraph and multiple-graph matching
Yiwen Lu (45 minutes)
Jiang et al. - GLMNet: Graph Learning-Matching Networks for Feature Matching
Max Hirsch (30 minutes)
01.07.2020
Yu et al. - Learning deep graph matching with channel-independent embedding and Hungarian attention
Marinko Balikic (30 minutes)
08.07.2020
Deep graph matching wrap-up
Rolínek et al. - Deep graph matching via black-box differentiation of combinatorial solvers
Bogdan Savchynskyy (30 minutes)
Knöbelreiter et al. - End-to-end training of hybrid CNN-CRF models for stereo
Erdi Düzel (30 minutes)
15.07.2020
Lorberbom et al. - Direct optimization through arg max for discrete variational auto-encoder
Armand Rousselot (30 minutes)
Zheng et al. - Conditional random fields as recurrent neural networks
Dzelila Siljak (45 minutes)
22.07.2020
Schulter et al. - Deep network flow for multi-object tracking
Christopher Klammt (30 minutes)
Amos and Kolter - OptNet: Differentiable optimization as a layer in neural networks
Achita Prasertwaree (45 minutes)
Registration
Please register for the seminar in Müsli. If you have trouble registering, drop an email to lisa.kruse@iwr.uni-heidelberg.de.
Topics
You can find the papers together with the introductory presentation in the HeiBOX.
The papers for presentation and discussion are, in general, pre-selected. A short introduction will be given at the first seminar session, during which the papers will also be assigned.
The following list of papers is incomplete and may be extended to match the number of enrolled students:
M. Vlastelica et al. - Differentiation of blackbox combinatorial solvers
Y. Kim et al. - Structured attention networks
P. Knöbelreiter et al. - End-to-end training of hybrid CNN-CRF models for stereo
P. Mohapatra et al. - Efficient optimization for rank-based loss functions
D. McAllester et al. - Direct loss minimization for structured prediction
G. Lorberbom et al. - Direct optimization through arg max for discrete variational auto-encoder
Y. Song et al. - Training deep neural networks via direct loss minimization
D. Marin et al. - Beyond gradient descent for regularized segmentation losses
M. Tang et al. - On regularized losses for weakly-supervised CNN segmentation
S. Zheng et al. - Conditional random fields as recurrent neural networks
J. Song et al. - End-to-end learning for graph decomposition
S. Wang et al. - End-to-end training of CNN-CRF via differentiable dual-decomposition
L. Chen et al. - Learning deep structured models
S. Schulter et al. - Deep network flow for multi-object tracking
M. Rolínek et al. - Optimizing Rank-based Metrics with Blackbox Differentiation
Amos and Kolter - OptNet: Differentiable optimization as a layer in neural networks
Subtopic: Learnable Graph Matching
Rolínek et al. - Deep graph matching via black-box differentiation of combinatorial solvers
Weed - An explicit analysis of the entropic penalty in linear programming
Zanfir and Sminchisescu - Deep learning of graph matching
Wang et al. - Learning combinatorial embedding networks for deep graph matching
Wang et al. - Neural graph matching network: Learning Lawler's quadratic assignment problem with extension to hypergraph and multiple-graph matching
Jiang et al. - GLMNet: Graph Learning-Matching Networks for Feature Matching
Yu, Wang et al. - Learning deep graph matching with channel-independent embedding and Hungarian attention
Fey et al. - Deep graph matching consensus