2023
DOI: 10.1109/jas.2022.105437

A Fast Clustering Based Evolutionary Algorithm for Super-Large-Scale Sparse Multi-Objective Optimization

Cited by 36 publications (5 citation statements)
References 54 publications
“…Moreover, owing to the sparsity of the correlation matrix of channels, the channel selection problem discussed in this paper also falls within the category of sparse large-scale MOPs. To assess the effectiveness of the proposed algorithm, TS-MOEA is compared with several advanced large-scale MOEAs, including SparseEA2 (Zhang et al., 2021), SLMEA (Tian et al., 2023), S-ECSO (Wang et al., 2022), CMMO (Ming et al., 2023), and S-NSGA-II (Kropp et al., 2023). Among these comparison algorithms, SparseEA2 is an effective sparse multi-objective optimization algorithm, whereas S-NSGA-II and S-ECSO are specialized for large-scale multi-objective optimization tasks.…”
Section: Results (mentioning, confidence: 99%)
“…The crossover probability is 1, while the mutation probability is 1/D; both crossover and mutation have a distribution index of 20 (Tian et al., 2023).…”
Section: SLMEA (mentioning, confidence: 99%)
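The settings quoted above (mutation probability 1/D, distribution index 20) match the standard polynomial-mutation operator commonly paired with SBX crossover in MOEAs. A minimal sketch under that assumption — `polynomial_mutation` is an illustrative helper, not code from SLMEA or the citing paper:

```python
import random

def polynomial_mutation(x, lower, upper, eta=20.0):
    """Polynomial mutation with distribution index eta.

    Each of the D variables mutates with probability 1/D, matching the
    quoted parameter settings (a common default, assumed here).
    """
    D = len(x)
    y = list(x)
    for i in range(D):
        if random.random() < 1.0 / D:
            u = random.random()
            if u < 0.5:
                delta = (2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0
            else:
                delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0))
            # Scale the perturbation by the variable's range, then clamp.
            y[i] = min(max(y[i] + delta * (upper[i] - lower[i]), lower[i]),
                       upper[i])
    return y
```

With eta = 20 the perturbation distribution is sharply peaked at zero, so mutated children stay close to their parents; smaller eta values produce larger jumps.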
“…• Some other operators can be embedded to extend this framework to solve other kinds of MOPs. For example, the CSO [13] or other enhanced operators [52] can be embedded to solve large-scale CMOPs [53]. In addition, some advanced learning-based optimizers such as switching particle swarm optimizers [54], [55] can be used as actions to enhance performance.…”
Section: Discussion (mentioning, confidence: 99%)
“…In this section, we compared the performance of the DSAG algorithm with eight MOEAs on the DTLZ and MaF benchmark functions, including NSGA-III (Deb and Jain 2013), MOEA/D (Zhang and Li 2007), RVEA (Cheng, Jin et al 2016), MOEA/D-D (Li, Deb et al 2014), NSGA-II-SDR (Tian, Cheng et al 2018), 2REA-VCEM (Liang, Zeng et al 2021), DGEA (He, Cheng et al 2020), SLMEA (Tian, Feng et al 2022). Furthermore, we also recorded the IGD results of each algorithm in 30 independent experiments for different numbers of objective functions in each benchmark function.…”
Section: Compared With Excellent Traditional MOEAs (mentioning, confidence: 99%)
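The IGD results recorded in the quoted comparison measure the mean distance from sampled reference Pareto-front points to their nearest obtained solution (lower is better). A minimal sketch of the indicator — `igd` is a hypothetical helper, not the benchmark suite's own implementation:

```python
import math

def igd(reference_front, obtained_set):
    """Inverted Generational Distance: for each reference point, take the
    Euclidean distance to the nearest obtained solution, then average."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return (sum(min(dist(r, s) for s in obtained_set)
                for r in reference_front) / len(reference_front))
```

Because IGD averages over the reference set, it penalizes both poor convergence and poor coverage of the front, which is why it is the standard single-number summary over the 30 independent runs mentioned above.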