Dr. Bogdan Savchynskyy, WiSe 2025/26
This seminar belongs to the Master in Physics (specialization Computational Physics, code "MVSem") and the Master of Data and Computer Science, Applied Informatics (code "IS"), but is also open to students of Scientific Computing and anyone else who is interested.
Summary
The topic of this semester is
Efficient Parameter Tuning with Bayesian Optimization. Bayesian optimization is a well-established and powerful framework for tuning parameters and hyperparameters in settings where evaluating the objective function is expensive, time-consuming, or noisy. It is particularly valuable when the search space is continuous or high-dimensional, and gradient information is unavailable or unreliable. By modeling the objective function probabilistically—typically using Gaussian processes or other surrogate models—Bayesian optimization balances exploration (searching uncertain regions) and exploitation (refining promising areas), often achieving near-optimal results with far fewer evaluations than exhaustive search.
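To make this loop concrete, here is a minimal sketch of Bayesian optimization on a one-dimensional toy problem, assuming scikit-learn's Gaussian process regressor. The objective, the Matérn kernel, the Expected Improvement acquisition, and all settings are illustrative choices, not a reference implementation:

```python
# Minimal Bayesian-optimization loop (illustrative sketch).
# Assumes scikit-learn; the 1-D toy objective and all settings are made up.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    """Expensive black-box function (here a cheap stand-in)."""
    return np.sin(3 * x) + 0.1 * x**2

rng = np.random.default_rng(0)
bounds = (-3.0, 3.0)

# Start with a few random evaluations.
X = rng.uniform(*bounds, size=(4, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    # Fit the surrogate model to everything observed so far.
    gp.fit(X, y)
    # Score a dense grid of candidates with Expected Improvement.
    cand = np.linspace(*bounds, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    imp = best - mu                      # improvement over incumbent (minimization)
    z = imp / np.maximum(sigma, 1e-12)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)].reshape(1, 1)
    # Evaluate the expensive function only at the chosen point.
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[y.argmin()].item(), "best f:", y.min())
```

Each iteration spends a single evaluation of the expensive function at the point where Expected Improvement is highest, which is exactly the trade-off between exploration and exploitation described above.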
The applications of Bayesian optimization span a broad spectrum:
- Machine learning – hyperparameter tuning for models like neural networks, SVMs, and gradient boosting.
- Engineering – optimizing design parameters in simulations or experiments.
- Robotics – tuning control policies where real-world trials are costly.
- Scientific computing – calibrating complex physical models.
- Finance and operations research – optimizing trading strategies or decision processes.
Numerous ready-to-use software packages, such as Spearmint, GPyOpt, scikit-optimize, and Ax, make these techniques widely accessible. The papers we discuss in the seminar provide a deep dive into the theoretical foundations, algorithmic variations, and practical considerations of Bayesian optimization, enabling you to apply it effectively to diverse real-world problems.
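For example, scikit-optimize wraps the whole loop behind a single function call. A minimal usage sketch follows; the toy objective stands in for, e.g., a cross-validated model error, and the parameter names and ranges are invented for illustration:

```python
# Hyperparameter tuning via scikit-optimize's gp_minimize (illustrative).
from skopt import gp_minimize
from skopt.space import Real

def objective(params):
    learning_rate, reg = params
    # In practice: train a model and return its validation loss.
    return (learning_rate - 0.1) ** 2 + (reg - 1.0) ** 2

result = gp_minimize(
    objective,
    dimensions=[Real(1e-4, 1.0, prior="log-uniform", name="learning_rate"),
                Real(1e-3, 10.0, prior="log-uniform", name="reg")],
    n_calls=30,          # total number of (expensive) evaluations
    random_state=0,
)
print("best parameters:", result.x, "best loss:", result.fun)
```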
General Information
Please register for the seminar in Müsli. The first session will take place on Thursday, October 16, at 11:00. Please make sure to attend!
- Seminar: Thu, 11:00 – 13:00 in Mathematikon B (Berliner Str. 43), SR B128
Entrance through the door at the side facing Berliner Straße. Ring the doorbell labelled "HCI am IWR" to be let in. The seminar room is on the 3rd floor.
- Credits: 4/6 CP, depending on the course of study.
Seminar Repository:
The slides and the schedule of the seminar will be placed in [HeiBox].
Papers to Choose from (the list will be expanded soon):
Tutorials
Sampling strategies
- Tree-structured Parzen Estimator (TPE), 2011 – co-authored by Yoshua Bengio; a method still in wide use
- Multi-objective TPE, 2022
Heuristic studies
- Random Search for Hyper-Parameter Optimization, 2012 – J. Bergstra (also a TPE author; his most-cited paper)
- Bayesian optimization is superior to random search for machine learning hyperparameter tuning: Analysis of the black-box optimization challenge, 2021
Speeding up / Pruning
- Speeding up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves, 2015
- Learning Curve Prediction with Bayesian Neural Networks, 2017
Contact
Dr. Bogdan Savchynskyy
If you contact me via email, the subject line must contain the tag [SemOMLV]. Emails without this tag are very likely to get lost and therefore be ignored!