GNBG-III Competition GECCO 2026 & WCCI 2026
🏆 Competition • 24 GNBG-III instances • 30 runs each • Black-box

Numerical Global Optimization Competition on GNBG–III Property Aware Test Suite

A property-controlled test suite spanning unimodal to multi-component multimodal landscapes (conditioning, asymmetry, interactions, basin response, deceptiveness).

Stop criterion: f1–f24: 500k FEs • Code: MATLAB / Python

News

Benchmark code (MATLAB / Python)

GNBG-III instances + wrappers. Treat instances as black-box.

Note: keep parameters unchanged across all GNBG-III instances.

Call for abstracts

Submit your algorithm entry and short abstract describing the method, settings, and reproducibility details.

  • Include algorithm name + short description.
  • Provide parameter settings (fixed across all instances).
  • Attach result files + runnable code package for verification.

Important dates

  • Submission deadline: 31 March 2026
  • Verification period: late April 2026
  • Results: June–July 2026
Separate submissions are required for WCCI 2026 and GECCO 2026. All abstracts must be submitted no later than 31 March 2026, and full papers by the respective conference proceedings deadlines.

Competition overview

This competition invites researchers to test their global optimization algorithms against a meticulously curated set of 24 problem instances from the Generalized Numerical Benchmark Generator (GNBG). The GNBG-III competition introduces the next generation of property-aware and computationally hard numerical benchmarks, designed to expose the behavior of global optimization algorithms under controlled structural conditions. The suite encompasses:

24 GNBG-III generated instances: unimodal → single-component multimodal → multi-component multimodal.

This competition presents problems with diverse characteristics such as modality, ruggedness, asymmetry, conditioning, and deceptiveness, providing a thorough test of algorithmic performance. Beyond solution quality, the emphasis is on understanding how algorithms reach solutions. Participants will explore how algorithms handle deceptive landscapes, traverse valleys, and adapt to varying problem difficulties, offering deeper insight into optimization in complex numerical environments.

Reference: Rohit Salgotra, Kalyanmoy Deb, Amir H. Gandomi (2026). Numerical Global Optimization Competition on GNBG–III Property Aware Test Suite. Submitted to the Proceedings of the Genetic and Evolutionary Computation Conference Companion.


Controlled properties

conditioning • asymmetry • basin linearity • variable interactions • deceptiveness

The suite composition

  • Foundational Tests: f1–f6
  • Coupled Interactions Evaluation: f7–f10
  • Multimodality and Asymmetry: f11–f16
  • Hard Static Hybrids: f17–f22
  • Robustness and Dynamic Adaptations: f23–f24

Submission Details


Rules & compliance

  • Open to all researchers in continuous numerical optimization.
  • Both unpublished and previously published algorithms are eligible.
  • No per-instance tuning; parameters must remain consistent across all instances.
  • Direct use of GNBG internal parameters is forbidden; instances must be treated as black boxes.
  • No modifications are permitted to parameter settings of the instances in the .mat files.
  • Winners must share source code for verification (kept confidential).

Evaluation metrics

Algorithms are tested on 24 GNBG-III benchmark problems with a budget of 500,000 function evaluations (FEs) per run (fixed seeds for reproducibility). Performance is measured using best-so-far error: e(FE) = |f_best(FE) − f*|.

  • Checkpoint error: 10K, 50K, 100K, …, 500K FEs
  • FE-to-target & success rate: 1e−1, 1e−3, 1e−5, 1e−8
  • ERT: expected running time per target (failures count as full budget)
Full harness (PDF)
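The metrics above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the official scoring code: the function names, the 1-indexed FE convention, and the per-evaluation error-trace layout are assumptions; the harness PDF is authoritative.

```python
import math

def best_so_far_error(errors):
    # e(FE) = |f_best(FE) - f*|: running minimum of per-evaluation absolute errors
    bsf, best = [], math.inf
    for e in errors:
        best = min(best, e)
        bsf.append(best)
    return bsf

def checkpoint_errors(errors, checkpoints):
    # Best-so-far error at selected FE counts (FEs assumed 1-indexed)
    bsf = best_so_far_error(errors)
    return {fe: bsf[fe - 1] for fe in checkpoints}

def ert(runs, target, budget):
    # Expected running time for one target: total FEs spent across all runs
    # divided by the number of successful runs; a failed run costs the full
    # budget. Returns inf if no run reaches the target.
    total_fes, successes = 0, 0
    for errors in runs:
        bsf = best_so_far_error(errors)
        hit = next((i + 1 for i, e in enumerate(bsf) if e <= target), None)
        if hit is not None:
            total_fes += hit
            successes += 1
        else:
            total_fes += budget
    return total_fes / successes if successes else math.inf
```

For example, with two 3-FE runs `[[1.0, 0.5, 0.05], [1.0, 0.9, 0.8]]` and target `0.1`, only the first run succeeds (at FE 3), so `ert` returns `(3 + 3) / 1 = 6.0`.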

Submission package

One zipped folder named after your algorithm.

Include

  • Documentation: title, authors, affiliations, emails, summary + mean±std tables.
  • 24 result files, one per instance (e.g. f10.dat).
  • Each file contains 30 runs with 2 columns: absolute error and FEs-to-threshold.

Submit to

Deadline: 31 March 2026
(full-paper submission follows the CEC/GECCO conference deadlines)

Competition results

To be announced in June–July 2026 at the WCCI and GECCO conference venues.

Organizing Committee

Contact