Competition on Dynamic Multiobjective Optimisation

Rio de Janeiro, Brazil
July 8-13, 2018

Introduction

The past decade has witnessed growing research interest in dynamic multiobjective optimisation, a challenging yet very important topic that deals with problems that are both multiobjective and time-dependent. Due to the presence of dynamics, dynamic multiobjective problems (DMOPs) are more complex and challenging than static multiobjective problems, and evolutionary algorithms (EAs) consequently face great difficulties in solving them. Generally speaking, DMOPs pose at least three main challenges. First, environmental changes can exhibit any kind of dynamics. Different dynamics pose different levels of difficulty to algorithms, and no single change-reaction mechanism can handle them all. Second, diversity, the key driving force of population-based algorithms, is sensitive to dynamics and therefore difficult to maintain well. Finally, more often than not, the time available to respond to an environmental change is rather tight. This time restriction requires algorithms to strike a good balance between diversity and convergence so that any environmental change can be handled promptly and the time-varying Pareto fronts or sets closely tracked. All this suggests a great need for new methodologies for tackling DMOPs.

Benchmark problems are of great importance to algorithm analysis, as they help algorithm designers and practitioners better understand the strengths and weaknesses of evolutionary algorithms. In dynamic multiobjective optimisation, several test suites are widely used, including FDA and dMOP. However, these suites represent only one or a few aspects of real-world scenarios. For example, the FDA and dMOP functions present no change-detection difficulty: the environmental changes they involve can be detected with a single re-evaluation of a random population member (see the sketch below). Real-life environmental changes are rarely this simple. It has also been recognised that most existing DMOPs are direct modifications of popular static test suites, e.g. ZDT and DTLZ. As a result, these DMOPs are more or less the same in their problem properties and are therefore of limited use when a comprehensive algorithm analysis is pursued. Furthermore, another worrying characteristic of most existing DMOPs is that static problem properties heavily outweigh the dynamics. A problem property (e.g. strong variable dependency) that is challenging for static multiobjective optimisation may not be a good candidate for dynamic multiobjective optimisation, because an algorithm's failure on such a DMOP is due not to the presence of dynamics but to the static property itself. Conclusions about algorithm performance drawn from such DMOPs are therefore likely to be misleading. In a nutshell, a set of diverse and unbiased benchmark test problems for the systematic study of evolutionary algorithms is greatly needed in this area.
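To make the detection mechanism concrete, the following C++ sketch (our illustration only, not part of the official benchmark code; the evaluate function pointer and the tolerance eps are placeholders) re-evaluates a few stored sentinel solutions at the current time step and reports a change whenever any objective value deviates from its recorded value:

    #include <cmath>
    #include <cstddef>
    #include <vector>

    using Decision   = std::vector<double>;  // decision vector
    using Objectives = std::vector<double>;  // objective vector

    // Returns true if re-evaluating any sentinel solution yields objective
    // values that differ from those stored before, i.e. the environment moved.
    bool changeDetected(const std::vector<Decision>& sentinels,
                        const std::vector<Objectives>& storedObjs,
                        Objectives (*evaluate)(const Decision&, double),
                        double t, double eps = 1e-12) {
        for (std::size_t i = 0; i < sentinels.size(); ++i) {
            const Objectives fresh = evaluate(sentinels[i], t);
            for (std::size_t m = 0; m < fresh.size(); ++m)
                if (std::fabs(fresh[m] - storedObjs[i][m]) > eps)
                    return true;   // objective value changed under this solution
        }
        return false;  // no deviation observed: assume the environment is static
    }

Detection of this kind succeeds on FDA and dMOP precisely because every change alters the objective values of essentially any solution; benchmarks whose changes affect only parts of the objective space make such cheap detection unreliable.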

Competition Guidelines

In this competition, a total of 14 benchmark functions are introduced, covering diverse properties that nicely represent various real-world scenarios, such as time-dependent PF/PS geometries, irregular PF shapes, disconnectivity, knee regions, and so on. By suggesting a set of benchmark functions that represent various real-world scenarios well, we aim to promote research on evolutionary dynamic multiobjective optimisation.
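To illustrate what a time-dependent geometry means, consider the well-known FDA1 problem from the FDA suite mentioned above (shown here for illustration only; the 14 competition functions are defined in the technical report below):

    \min f_1(\mathbf{x}_I) = x_1, \qquad \min f_2(\mathbf{x}_{II}) = g \cdot h,
    g(\mathbf{x}_{II}, t) = 1 + \sum_{x_i \in \mathbf{x}_{II}} \bigl(x_i - G(t)\bigr)^2, \qquad h(f_1, g) = 1 - \sqrt{f_1/g},
    G(t) = \sin(0.5\pi t), \qquad t = \frac{1}{n_t} \left\lfloor \frac{\tau}{\tau_t} \right\rfloor,

where \tau is the generation counter, \tau_t is the number of generations per change, and n_t controls the severity of change. The optimal PS shifts with G(t) over time while the PF (f_2 = 1 - \sqrt{f_1}) stays fixed, so an algorithm must keep relocating the PS even though the PF appears unchanged; the competition functions extend this idea to time-dependent PF shapes as well.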

The benchmark functions used for the competition are detailed in the following technical report (download here), which includes the information necessary to understand the problems, how solutions are represented, and how the fitness functions are evaluated. Please inform me (math4neu@gmail.com) if you encounter any problems.

S. Jiang, S. Yang, X. Yao, K. C. Tan, M. Kaiser, and N. Krasnogor, "Benchmark Problems for CEC'2018 Competition on Dynamic Multiobjective Optimisation," Technical Report, Newcastle University, U.K., December 2017.

All the benchmark functions have been implemented in MATLAB (download here) and C++ (download here). Source code for sampling points on the PF is also provided (download here). Please note that the sampled PF points should be truncated properly to guarantee uniformity; the k-th nearest neighbour truncation method from SPEA2 could be used for this purpose (a sketch is given below). Interested participants are welcome to report their approaches and results in a paper submitted to the CEC'2018 submission system. Alternatively, results can be submitted as a brief technical report sent directly to Shouyong Jiang. Submissions in both forms will be considered as entries and will be ranked according to the competition evaluation criteria.
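For reference, here is a simplified C++ sketch of SPEA2-style truncation (our illustration; the provided sampling code may organise this differently). At each step it removes the point whose sorted vector of distances to all remaining points is lexicographically smallest, i.e. the point in the most crowded region, until the target size is reached:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    using Point = std::vector<double>;

    // Squared Euclidean distance; squaring preserves the ordering of distances.
    static double sqDist(const Point& a, const Point& b) {
        double d = 0.0;
        for (std::size_t k = 0; k < a.size(); ++k)
            d += (a[k] - b[k]) * (a[k] - b[k]);
        return d;
    }

    // SPEA2-style truncation: repeatedly drop the most crowded point (smallest
    // nearest-neighbour distance, ties broken by the next distances) until
    // only `target` points remain.
    void truncateToSize(std::vector<Point>& pts, std::size_t target) {
        while (pts.size() > target) {
            std::size_t worst = 0;
            std::vector<double> worstDists;
            for (std::size_t i = 0; i < pts.size(); ++i) {
                std::vector<double> d;  // sorted distances from point i to the rest
                for (std::size_t j = 0; j < pts.size(); ++j)
                    if (j != i) d.push_back(sqDist(pts[i], pts[j]));
                std::sort(d.begin(), d.end());
                if (i == 0 || std::lexicographical_compare(
                                  d.begin(), d.end(),
                                  worstDists.begin(), worstDists.end())) {
                    worst = i;          // most crowded point found so far
                    worstDists = d;
                }
            }
            pts.erase(pts.begin() + static_cast<std::ptrdiff_t>(worst));
        }
    }

This naive version recomputes all pairwise distances after each removal, which is acceptable for PF sampling done once offline; an incrementally updated distance matrix would make it faster if needed.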

Important Dates

For participants who want to submit a paper to the 2018 IEEE Congress on Evolutionary Computation
For other participants (result entry only, without a paper)

Organisers

