2019
DOI: 10.1109/tevc.2018.2865495

Learning From a Stream of Nonstationary and Dependent Data in Multiobjective Evolutionary Optimization

Abstract: Evolutionary algorithms (EAs) have been well acknowledged as a promising paradigm for solving optimisation problems with multiple conflicting objectives in the sense that they are able to locate a set of diverse approximations of Pareto optimal solutions in a single run. EAs drive the search for approximated solutions through maintaining a diverse population of solutions and by recombining promising solutions selected from the population. Combining machine learning techniques has shown great potentials since t…

Cited by 29 publications (11 citation statements). References 55 publications.
“…Considering the non-stationary nature of the data generated by a multi-objective evolutionary algorithm, Sun et al [26] designed an environmental selection operator based on online clustering for learning from non-stationary and dependent data. The operator combines the iterative process of online agglomerative clustering with the evolutionary process of the algorithm.…”
Section: Clustering Based Reproduction Operators
confidence: 99%
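To illustrate the mechanism described in the statement above, here is a minimal, hypothetical sketch of interleaving agglomerative merging with environmental selection: the combined parent and offspring objective vectors are repeatedly reduced by merging the closest pair until the population size is reached. The merge rule and the quality proxy are assumptions for illustration only, not the operator of Sun et al [26].

```python
import numpy as np

def agglomerative_environmental_selection(objs, pop_size):
    """Reduce a combined parent+offspring set to pop_size by repeatedly
    merging the closest pair of points in objective space.

    objs: (n, m) array of objective vectors, n >= pop_size.
    Returns the indices of the retained solutions.
    Hypothetical sketch; not the operator of Sun et al [26].
    """
    alive = list(range(len(objs)))
    while len(alive) > pop_size:
        # Find the closest surviving pair (positions within `alive`).
        best = (None, None, np.inf)
        for i in range(len(alive)):
            for j in range(i + 1, len(alive)):
                d = np.linalg.norm(objs[alive[i]] - objs[alive[j]])
                if d < best[2]:
                    best = (i, j, d)
        i, j, _ = best
        # Merge the pair: drop the member with the larger objective sum
        # (a crude quality proxy for this sketch) and keep the other
        # as the pair's representative.
        drop = i if objs[alive[i]].sum() > objs[alive[j]].sum() else j
        alive.pop(drop)
    return alive

# Usage: keep 4 of 8 random bi-objective points.
rng = np.random.default_rng(0)
objs = rng.random((8, 2))
print(agglomerative_environmental_selection(objs, 4))
```

In the streaming view suggested by the quoted statement, each generation's offspring would be treated as newly arriving, dependent samples, so the clustering state would be updated online across generations rather than rebuilt from scratch; the sketch above rebuilds it each call purely for brevity.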
“…An efficient EMO method should make use of such problem knowledge to guide its search directions. With this consideration, clustering-learning-based mating restriction strategies are a popular practice [25][26][27]. In our previous work, adaptive population structure learning and multi-source mating restriction [28,29] both perform well when solving complex MOPs; however, the clustering operation also brings considerable computational overhead.…”
Section: Introduction
confidence: 99%
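As a rough, hedged illustration of the clustering-based mating restriction mentioned in the statement above (the use of scikit-learn's KMeans and the pairing rule are assumptions, not the strategies of [25]-[29]): parents are grouped in decision space and recombination partners are drawn only from within the same cluster.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_restricted_pairs(decision_vars, n_clusters=3, seed=0):
    """Group the population in decision space and return mating pairs
    drawn from within each cluster (a simple mating restriction).
    Illustrative sketch only; not the strategies of [25]-[29].
    """
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(decision_vars)
    pairs = []
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        rng.shuffle(members)
        # Pair consecutive members of the same cluster.
        for a, b in zip(members[::2], members[1::2]):
            pairs.append((int(a), int(b)))
    return pairs

# Usage: pair up a random population of 12 five-dimensional solutions.
pop = np.random.default_rng(1).random((12, 5))
print(cluster_restricted_pairs(pop))
```

The computational overhead the authors mention comes from rerunning the clustering step each generation, which is exactly the fitting cost visible in this sketch.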
“…Hundreds of metrics are reportedly used for a single experiment at Microsoft (Kevic et al 2017;Machmouchi and Buscher 2016). For optimization, multi-objective optimization (Nardi et al 2019;Sun et al 2018) can be applied offline at compile time, where someone can manually make a trade-off between metrics from a Pareto front. However, for the online bandit optimization algorithms, a single metric is required to serve as rewards (though that metric can be a scalar index).…”
Section: Considerations For What Metric and Changes To Optimize For
confidence: 99%
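A minimal sketch of the scalar-index idea in the last sentence of the statement above: several metrics are collapsed into one weighted score that an online bandit can use as its reward, whereas offline multi-objective optimization would keep the metrics separate and present a Pareto front for a manual trade-off. The metric names and weights below are hypothetical.

```python
def scalar_reward(metrics, weights):
    """Weighted scalar index over several experiment metrics.

    metrics: dict of metric name -> observed value.
    weights: dict of metric name -> weight (sign encodes direction).
    Hypothetical example, not a formulation from the cited works.
    """
    return sum(weights[name] * value for name, value in metrics.items())

# Usage: one metric to reward, one to penalise (placeholder names).
metrics = {"clicks_per_session": 3.2, "page_load_ms": 840.0}
weights = {"clicks_per_session": 1.0, "page_load_ms": -0.001}
print(scalar_reward(metrics, weights))  # 3.2 - 0.84 = 2.36
```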
“…In this experiment, the adaptive refining and the evolution memory strategy are not used, so as to amplify the effect of the adaptive selection strategy. The algorithms without and with the adaptive mutation strategy are denoted as AMODEEM 3 and AMODEEM 4 respectively, and in AMODEEM 3 a mutation operator is randomly selected from the four operators. The mean and standard deviation of the IGD metric for the two algorithms are presented in Table 2.…”
Section: B. Efficiency Analysis Of Adaptive Selection On Mutation Operators
confidence: 99%
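For concreteness, a small sketch of the baseline behaviour attributed to AMODEEM 3 in the statement above, where one of four mutation operators is chosen uniformly at random each time mutation is applied; the operators and parameters here are placeholders, not the paper's operators.

```python
import random

# Four placeholder mutation operators on a real-valued vector.
def gaussian(x, sigma=0.1):
    return [xi + random.gauss(0.0, sigma) for xi in x]

def uniform_reset(x, low=0.0, high=1.0):
    return [random.uniform(low, high) if random.random() < 0.1 else xi for xi in x]

def creep(x, step=0.01):
    return [xi + random.choice((-step, step)) for xi in x]

def swap(x):
    y = list(x)
    i, j = random.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

OPERATORS = [gaussian, uniform_reset, creep, swap]

def random_mutation(x):
    """Pick one of the four operators uniformly at random (the
    non-adaptive baseline); an adaptive scheme would instead bias the
    choice towards operators with recent success."""
    return random.choice(OPERATORS)(x)

# Usage on a small placeholder solution vector.
print(random_mutation([0.2, 0.5, 0.8, 0.1]))
```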