WP5: Optimization
WP5 advances exascale-ready optimization methods for continuous, combinatorial, and mixed problems, including surrogate-based and AI-driven approaches.
Recent Highlights (2024-2025)
- Bayesian at LUMI: Bayesian optimization with fractal decomposition on 8,000 GPUs of the LUMI supercomputer
- WP2 Integration: surrogate-based workflows coupled with WP2 ML models
- Simulation-Based: high-performance simulation-based optimization
- Deep GPs: multi-fidelity modeling with deep Gaussian processes
- AutoML for HPC: AutoML experiments on HPC software parameter tuning
1. Objectives
WP5 focuses on designing and implementing exascale optimization algorithms for large-scale problems:
- Combinatorial, continuous, and mixed optimization using exact and approximate algorithms
- Surrogate-based optimization using multi-fidelity models
- Shape optimization for multiphysics problems
- AutoML for the automatic design of deep neural networks
3. Key Tasks
T5.1: Exascale Combinatorial and Continuous Optimization
- Design general exascale optimization algorithms
- Exact algorithms: branch and bound, tree search
- Approximate algorithms: evolutionary algorithms, swarm intelligence
- Decomposition-based exascale optimization for large-scale problems (LOPs)
- Define efficient decomposition strategies in decision and objective spaces
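Exact tree-search methods such as branch and bound prune every subtree whose optimistic bound cannot beat the incumbent solution; an exascale version would decompose the pool of open subproblems across workers. A minimal serial sketch on a toy 0/1 knapsack instance (the problem and all names here are illustrative, not WP5 code):

```python
# Illustrative serial branch and bound for a toy 0/1 knapsack problem.
# An exascale variant would distribute the open subproblems across
# workers; here everything runs depth-first on one core.

def knapsack_bb(values, weights, capacity):
    """Return the best total value reachable within `capacity`."""
    n = len(values)
    # Sort items by value density so the fractional bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    best = 0

    def bound(i, cap, val):
        # Optimistic bound: fill the remaining capacity fractionally.
        while i < n and w[i] <= cap:
            cap -= w[i]
            val += v[i]
            i += 1
        if i < n:
            val += v[i] * cap / w[i]
        return val

    def dfs(i, cap, val):
        nonlocal best
        best = max(best, val)
        if i == n or bound(i, cap, val) <= best:
            return  # prune: the bound cannot beat the incumbent
        if w[i] <= cap:
            dfs(i + 1, cap - w[i], val + v[i])  # branch: take item i
        dfs(i + 1, cap, val)                    # branch: skip item i

    dfs(0, capacity, 0)
    return best
```

On the instance `knapsack_bb([60, 100, 120], [10, 20, 30], 50)` the optimum takes the last two items (value 220); the fractional bound lets the search discard most of the remaining tree early, which is exactly what makes distributing the surviving subtrees worthwhile.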
T5.2: Exascale Surrogate-Based Optimization
- Adapt SBO algorithms to exploit exascale HPC systems
- Evaluate multiple candidate solutions in each iteration
- Exploit multi-fidelity models (MFMs)
- Design new parallel SBO algorithms considering:
  - The surrogate
  - The optimizer and its sampling strategy
  - The coupling between them
- Use machine learning for expensive functions via SBO and MFMs
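One way to picture a batch-parallel SBO iteration: fit a cheap surrogate to all evaluated points, rank many candidate solutions on it, and spend the expensive budget only on the most promising batch. The sketch below is a minimal single-machine illustration with an invented 1-D test function; the least-squares quadratic stands in for the Gaussian-process surrogates the task actually targets, and the batch loop stands in for concurrent evaluations on separate nodes:

```python
import random

# Hypothetical sketch of a batch-parallel SBO loop on a 1-D test problem.
# `expensive` stands in for a costly simulation; the surrogate is a cheap
# least-squares quadratic, and `batch` candidates are re-evaluated per
# iteration (in a real setting, concurrently on separate compute nodes).

def expensive(x):
    return (x - 0.3) ** 2  # placeholder for an hours-long simulation

def fit_quadratic(X, Y):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations."""
    # Build the 3x3 normal system A t = r for t = (a, b, c).
    p = [sum(x ** k for x in X) for k in range(5)]
    A = [[p[4], p[3], p[2]], [p[3], p[2], p[1]], [p[2], p[1], p[0]]]
    r = [sum(y * x ** k for x, y in zip(X, Y)) for k in (2, 1, 0)]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        piv = max(range(i, 3), key=lambda j: abs(A[j][i]))
        A[i], A[piv] = A[piv], A[i]
        r[i], r[piv] = r[piv], r[i]
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            for k in range(i, 3):
                A[j][k] -= f * A[i][k]
            r[j] -= f * r[i]
    t = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        t[i] = (r[i] - sum(A[i][k] * t[k] for k in range(i + 1, 3))) / A[i][i]
    return t

def sbo(iters=5, batch=4, seed=0):
    rng = random.Random(seed)
    X = [rng.uniform(-1.0, 1.0) for _ in range(batch)]
    Y = [expensive(x) for x in X]
    for _ in range(iters):
        a, b, c = fit_quadratic(X, Y)
        # Rank many cheap candidate points on the surrogate...
        cands = sorted((rng.uniform(-1.0, 1.0) for _ in range(200)),
                       key=lambda x: a * x * x + b * x + c)
        # ...and spend the expensive budget on the `batch` most promising.
        for x in cands[:batch]:
            X.append(x)
            Y.append(expensive(x))
    return min(Y)  # best objective value found
```

The design point the task highlights is visible even in this toy: the surrogate, the candidate-sampling optimizer, and the coupling rule (here "take the top q") can each be swapped independently.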
T5.3: Exascale Shape Optimization
- Develop a toolbox for shape optimization at exascale
- Study different meshes for the state variables and design variables
- Use reduced models from WP2 (e.g., neural networks) for faster evaluations
- Find the trade-off between cost and accuracy
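The cost/accuracy trade-off in the last point can be pictured as a screen-then-refine loop: a cheap reduced model ranks many shape candidates, and only a shortlist pays for the high-fidelity solve. Everything below is an invented toy; the deliberate bias in `reduced_model` mimics the approximation error of a WP2-style reduced model:

```python
import random

# Hypothetical sketch of the cost/accuracy trade-off in shape optimization.
# `fine_model` stands in for an expensive multiphysics solve over the shape
# parameter t; `reduced_model` is a cheap but biased approximation, like a
# neural-network reduced model trained offline.

def fine_model(t):
    """Expensive high-fidelity objective (ground truth, minimum at t=1.2)."""
    return (t - 1.2) ** 2 + 0.1 * (t - 1.2) ** 4

def reduced_model(t):
    """Cheap reduced model: right trend, but a biased minimum at t=1.1."""
    return (t - 1.1) ** 2

def screen_then_refine(n_cheap=500, n_fine=5, seed=0):
    """Return the chosen shape parameter after two-level screening."""
    rng = random.Random(seed)
    cands = [rng.uniform(0.0, 3.0) for _ in range(n_cheap)]
    # Rank every candidate with the reduced model (near-free)...
    shortlist = sorted(cands, key=reduced_model)[:n_fine]
    # ...then spend the expensive budget only on the shortlist.
    return min(shortlist, key=fine_model)
```

Note how the final accuracy is capped by the reduced model's bias: the shortlist clusters around the cheap minimum near 1.1 rather than the true optimum at 1.2. Choosing how much high-fidelity budget to spend against that bias is precisely the trade-off the task studies.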
T5.4: Exascale Optimization for AutoML
- Develop optimization approaches for the automatic design of deep neural networks (DNNs)
- Optimize hyper-parameters (AutoML)
- Address complex problems (dataset and network size)
- Improve DNN accuracy; reduce energy consumption and inference time
- Improve robustness and solve large-scale/complex learning tasks
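As a toy picture of the optimization side of AutoML, the sketch below runs a small evolutionary search over two hyper-parameters (learning rate and depth) against a placeholder validation loss. The loss surface and all names are invented; in practice each `val_loss` call would be a full DNN training run, which is why the population is evaluated in parallel across nodes:

```python
import math
import random

# Hypothetical sketch of evolutionary hyper-parameter search (AutoML).
# `val_loss` stands in for "train the DNN with this config and measure
# validation loss"; here it is a smooth invented surface whose optimum
# is lr = 1e-2 with 4 layers.

def val_loss(cfg):
    lr, layers = cfg
    return (math.log10(lr) + 2) ** 2 + 0.05 * (layers - 4) ** 2

def mutate(cfg, rng):
    # Perturb lr in log-space and depth by +/-1, within fixed bounds.
    lr, layers = cfg
    lr = min(1e-1, max(1e-5, lr * 10 ** rng.uniform(-0.5, 0.5)))
    layers = min(12, max(1, layers + rng.choice((-1, 0, 1))))
    return (lr, layers)

def evolve(pop_size=8, generations=15, seed=0):
    rng = random.Random(seed)
    pop = [(10 ** rng.uniform(-5, -1), rng.randint(1, 12))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=val_loss)             # one parallel evaluation sweep
        parents = pop[: pop_size // 2]     # truncation selection (elitist)
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return min(pop, key=val_loss)          # best (lr, layers) found
```

The same loop structure carries over when the search space also covers architecture choices, and when the fitness mixes accuracy with energy or inference-time objectives, as the task intends.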
5. Addressed Exascale Bottlenecks
WP5 targets bottlenecks B7, B9, B10, B13:
- B7 (Exascale algorithms): Redesigning optimization algorithms to improve scalability
- B9 (Resilience, robustness and accuracy): Ensuring robust optimization with verifiable results
- B10 (Scientific productivity): Providing tools for scientists to use exascale systems productively
- B13 (Opportunity to integrate uncertainties): Optimization under uncertainty