WP7: Showroom, Benchmarking, and Co-Design Coordination
WP7 integrates, tests, and benchmarks methods and software from all WPs, coordinates co-design activities with ExaDIP, and delivers the showroom, CI/CD infrastructure, and training materials.
Recent Highlights (2024-2025)

- CI/CD Automation: CI/CD pipelines validate new methods and software deliveries
- 24h Turnaround: automated path from code to EuroHPC runs in under 24 hours
- Showroom Updated: showroom content updated with meshing, solver, and ML workflows
- Containers Published: containers and packages published via Spack, Guix, Docker, and Apptainer
- Training Available: training modules on CI/CD and benchmarking best practices
1. Objectives
WP7 provides the infrastructure and coordination for:
- Software testing, from simple tests to advanced benchmarking
- Verification of exascale capabilities and handling of the identified challenges (B1-B13)
- Delivery of software packages through a CI/CD framework
- Coordination of co-design activities within Exa-MA and with ExaDIP
- A showroom of Exa-MA results and training material
3. Key Tasks
T7.1: Testing and Benchmarking Environment
- Identify each relevant demonstrator to define a validation laboratory
- Three types of demonstrators:
  - Level 1: covers one to two WPs (e.g., AMR techniques)
  - Level 2: covers three to four WPs
  - Level 3: potentially covers all WPs
- Some demonstrators retained by PC5 benefit from "mini-apps" development and support
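The demonstrator classification and benchmarking workflow above can be sketched in a few lines. This is a minimal illustration, not Exa-MA's actual tooling; the `Demonstrator` class, the WP lists, and the workload are hypothetical names invented for the example.

```python
import time
from dataclasses import dataclass

@dataclass
class Demonstrator:
    """A demonstrator registered in the validation laboratory (hypothetical model)."""
    name: str
    wps_covered: list  # work packages exercised, e.g. ["WP1", "WP3"]

    @property
    def level(self) -> int:
        """Classify by WP coverage: 1 (one-two WPs), 2 (three-four WPs), 3 (more)."""
        n = len(self.wps_covered)
        if n <= 2:
            return 1
        if n <= 4:
            return 2
        return 3

def run_benchmark(demo: Demonstrator, workload) -> dict:
    """Run a demonstrator workload and record raw benchmarking data."""
    start = time.perf_counter()
    result = workload()
    elapsed = time.perf_counter() - start
    return {"demonstrator": demo.name, "level": demo.level,
            "clock_time_s": elapsed, "result": result}

# A Level-1 demonstrator (covers two WPs), with a stand-in workload.
amr_demo = Demonstrator("AMR techniques", ["WP1", "WP3"])
record = run_benchmark(amr_demo, lambda: sum(i * i for i in range(10_000)))
```

A real validation laboratory would replace the stand-in workload with the demonstrator's solver run and persist the records for later comparison.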
T7.2: Co-design Activities Coordination
- Integration process based on all demonstrators, broken down into non-regression, verification, and validation processes
- Each demonstrator must deliver its current test process to guarantee integrity
- Guarantee non-regression while evaluating new features
- Record cases as new non-regression, verification, or validation tests
- Add new tests based solely on the functionality tested
- Enrich the original test base if necessary to better evaluate the contribution of new methods
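The non-regression principle described above (record a case as a reference, then require every re-run to reproduce it) can be sketched as a minimal harness. This is an assumption-laden illustration: the `NonRegressionSuite` class, the case name, and the tolerance are all hypothetical, not part of the Exa-MA CI/CD framework.

```python
import math

class NonRegressionSuite:
    """Minimal non-regression harness (hypothetical): each case stores a
    reference value; a re-run must reproduce it within tolerance."""

    def __init__(self, rtol: float = 1e-9):
        self.baseline = {}  # case name -> recorded reference value
        self.rtol = rtol

    def record(self, name: str, value: float) -> None:
        """Register a case's result as a new non-regression reference."""
        self.baseline[name] = value

    def check(self, name: str, value: float) -> bool:
        """Verify a re-run against the recorded reference."""
        return math.isclose(value, self.baseline[name], rel_tol=self.rtol)

suite = NonRegressionSuite()
suite.record("poisson_l2_error", 1.234e-6)      # hypothetical case name
unchanged = suite.check("poisson_l2_error", 1.234e-6)  # True: no regression
regressed = suite.check("poisson_l2_error", 2.0e-6)    # False: regression
```

Enriching the base, as the last point suggests, amounts to calling `record` for each new case so that future feature evaluations are checked against it.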
T7.3: Showroom Coordination
- Describe the obtained results in a unique format
- Present them on a dedicated web page
- Compare results with the initial objectives in terms of performance
- Illustrate with a figure of merit and raw data (clock time, resources used, computer)
- Systematically compare with the initial performance of the demonstrator
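A showroom entry pairing raw data with a figure of merit, compared against the demonstrator's initial performance, could look like the following sketch. The function name, dictionary keys, and numbers are hypothetical, chosen only to illustrate the comparison.

```python
def figure_of_merit(initial: dict, current: dict) -> dict:
    """Compare a run against the demonstrator's initial performance,
    keeping raw data (clock time, resources, computer) next to the
    derived speedup (hypothetical showroom schema)."""
    speedup = initial["clock_time_s"] / current["clock_time_s"]
    return {
        "computer": current["computer"],
        "nodes": current["nodes"],
        "clock_time_s": current["clock_time_s"],
        "speedup_vs_initial": round(speedup, 2),
        "meets_objective": speedup >= current["target_speedup"],
    }

initial_run = {"clock_time_s": 120.0}  # hypothetical initial performance
new_run = {"computer": "EuroHPC system", "nodes": 64,
           "clock_time_s": 30.0, "target_speedup": 2.0}
entry = figure_of_merit(initial_run, new_run)
```

Keeping the raw values alongside the derived speedup lets the web page show both the figure of merit and the data it was computed from.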
T7.4: Training
- Produce training material on exascale toolboxes and mini-apps
- Document best practices for exascale computing
- Create reproducible examples and tutorials
4. Leads & Partners
- Lead Institution: CEA (initially UNISTRA)
- Co-Leaders: UNISTRA, Inria, École Polytechnique, Sorbonne Université
- Duration: Months 1-60+
5. Addressed Exascale Bottlenecks
WP7 addresses all bottlenecks (B1-B13) through:

- Benchmarking to verify exascale capabilities
- Handling of all identified challenges
- Non-regression, verification, and validation testing
- CI/CD integration with ExaDIP
Bottlenecks: B1 (Energy efficiency), B2 (Interconnect), B3 (Memory), B4 (System software), B5 (Programming systems), B6 (Data Management), B7 (Exascale Algorithms), B8 (Discovery/design/decision), B9 (Resilience/robustness/accuracy), B10 (Scientific productivity), B11 (Reproducibility/replicability), B12 (Pre/Post processing), B13 (Integrate uncertainties)