Continuous Benchmarking on HPC Systems

From Git repository to large-scale runs: Exa-MA industrializes the deployment of HPC applications in compliance with NumPEx guidelines, delivering HPC as a Service.

Identity Card

Authors

  • Christophe Prud’homme (Unistra–Cemosis)

  • Vincent Chabannes (Unistra–Cemosis)

  • Javier Cladellas (Unistra–Cemosis)

  • Exa-MA WP7 Team

Collaboration

  • CoE HiDALGO2

  • PEPR NumPEx / Exa-MA

  • National & EuroHPC centers (Karolina, LUMI, MeluXina, Discoverer)

Date: September 2025

Context and Objective

HPC applications require reproducibility, portability, and large-scale testing, but the path from code to supercomputer remains long and heterogeneous across sites.

Objective: Unify the Exa-MA application framework and automate build, tests, and deployments in compliance with NumPEx guidelines.

Key Result and Innovation

Development of an Exa-MA application framework and HPC CI/CD pipeline:

Framework Components

  • Templates: Standardized application structure

  • Metadata: Machine-readable configuration

  • V&V: Verification & Validation tests

  • Packaging: Spack + Apptainer/Singularity
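The metadata component can be illustrated with a minimal sketch. The schema below (`application`, `version`, `machines`, `resources`) is a hypothetical example for illustration only, not the actual Exa-MA format.

```python
import json

# Hypothetical machine-readable benchmark metadata (illustrative schema only)
METADATA = """
{
  "application": "my-solver",
  "version": "1.2.0",
  "machines": ["karolina", "lumi", "meluxina"],
  "resources": {"nodes": [1, 2, 4, 8], "tasks_per_node": 128}
}
"""

REQUIRED_KEYS = {"application", "version", "machines", "resources"}

def load_metadata(text: str) -> dict:
    """Parse and minimally validate benchmark metadata."""
    meta = json.loads(text)
    missing = REQUIRED_KEYS - meta.keys()
    if missing:
        raise ValueError(f"missing metadata keys: {sorted(missing)}")
    return meta

meta = load_metadata(METADATA)
print(meta["application"], meta["resources"]["nodes"])
```

Keeping such metadata next to the code lets the CI pipeline decide, per machine, what to build and at which scales to run without manual edits.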

Performance Gains

  • Code → large-scale execution: days → < 24h

  • Zero manual intervention on target site

  • Automated non-regression tests (strong/weak scaling)

  • Future: profiling artifacts
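The strong-scaling non-regression check mentioned above can be sketched as follows; the timings and the 70% efficiency threshold are illustrative assumptions, not Exa-MA's actual criteria.

```python
def parallel_efficiency(timings: dict) -> dict:
    """Strong-scaling efficiency E(p) = p0*T(p0) / (p*T(p)), relative to the smallest run."""
    p0 = min(timings)
    t0 = timings[p0]
    return {p: (t0 * p0) / (p * t) for p, t in timings.items()}

def check_non_regression(timings: dict, threshold: float = 0.70):
    """Flag a regression if efficiency drops below the threshold at any scale."""
    eff = parallel_efficiency(timings)
    failures = {p: e for p, e in eff.items() if e < threshold}
    return len(failures) == 0, eff

# Illustrative timings: seconds per run at 1, 2, 4, 8 nodes
timings = {1: 100.0, 2: 52.0, 4: 27.5, 8: 15.0}
ok, eff = check_non_regression(timings)
print(ok, {p: round(e, 2) for p, e in eff.items()})
```

A weak-scaling variant would instead compare T(p) directly against T(p0) at constant work per node.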

Pipeline: GitHub Actions → ReFrame/SLURM submission → French & EuroHPC supercomputers
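One step of this pipeline, generating a SLURM submission per target machine, can be sketched as below; the partition and account names are placeholders, not real site values.

```python
# Hypothetical per-site SLURM settings (placeholder partitions/accounts)
SITES = {
    "karolina": {"partition": "qcpu", "account": "PROJECT-ID"},
    "lumi": {"partition": "standard", "account": "project_000000"},
}

def sbatch_script(site: str, nodes: int, image: str) -> str:
    """Render a minimal sbatch script launching a containerized benchmark."""
    cfg = SITES[site]
    return "\n".join([
        "#!/bin/bash",
        f"#SBATCH --partition={cfg['partition']}",
        f"#SBATCH --account={cfg['account']}",
        f"#SBATCH --nodes={nodes}",
        f"srun apptainer exec {image} ./run_benchmark",
    ])

print(sbatch_script("karolina", 4, "solver.sif"))
```

In the real pipeline this role is played by ReFrame, which additionally handles environment selection, sanity checks, and performance logging.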

Benchmarking Platform

The platform provides three main entry points for exploring benchmark results: Applications, Supercomputers, and Use Cases.

Benchmarking Dashboard
Figure 1. Benchmarking dashboard with toolboxes for Applications, Supercomputers, and Use Cases

Multi-Site Deployment

Benchmarks are automatically deployed across multiple EuroHPC and national HPC centers: Discoverer (Sofia), Gaya (Strasbourg), Karolina (Ostrava), LUMI (Kajaani), MeluXina (Bissen), and more.

Machine Views
Figure 2. Auto-generated page providing access to benchmark reports by machine

Hundreds of benchmarks are deployed automatically, with performance tracked across all sites.

Automated Benchmarks
Figure 3. Benchmark reports on EuroHPC/Karolina with performance graphs and scaling analysis

Deployment Workflow

Figure 4. Feel++ application deployment workflow on EuroHPC systems

The workflow automates the entire pipeline, from building new Apptainer images via GitHub Actions to benchmark execution on multiple HPC systems, with results feeding the dashboard website.
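The final step, per-machine results feeding the dashboard, can be sketched as a merge of report files into a single index; the file layout and field names here are assumptions for illustration.

```python
import json
import tempfile
from pathlib import Path

def build_index(report_dir: Path) -> list:
    """Collect per-machine report JSON files into one dashboard index."""
    index = []
    for path in sorted(report_dir.glob("*.json")):
        report = json.loads(path.read_text())
        index.append({"machine": path.stem, "n_benchmarks": len(report["results"])})
    return index

# Demo with two synthetic report files
with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    (d / "karolina.json").write_text(json.dumps({"results": [{"t": 1.0}, {"t": 2.0}]}))
    (d / "lumi.json").write_text(json.dumps({"results": [{"t": 3.0}]}))
    index = build_index(d)

print(index)
```

Generating the site from such an index keeps the machine and application views in sync with the latest runs, with no manual curation.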

Impact and Next Steps

Current Impact

  • Accelerated onboarding of Exa-MA applications

  • Quality improvement through systematic testing

  • Full traceability of builds and runs

Next Steps

  • Publication of all Exa-MA applications

  • Multi-site performance dashboard

  • Extension to additional EuroHPC centers

Valorization

Documentation

  • Exa-MA Guides (WP7)

  • Multi-site demonstration materials

  • NumPEx compliance checklists

Resources

WP7

Showroom & Benchmarking — CI/CD, demonstrators, co-design