2023
DOI: 10.1109/tpami.2022.3230414
Multi-Label Classification via Adaptive Resonance Theory-Based Clustering

Cited by 12 publications (6 citation statements)
References 62 publications
“…The time complexity for training CA (Algorithm 1) is O(ndK) [30], where n is the number of d-dimensional training instances, and K is the number of nodes. In addition, RVEA-CA generates a CA network on the M -dimensional objective space by using the union of the population and the archive set as O(N ) training instances.…”
Section: Computational Complexity 1) Algorithm (mentioning)
confidence: 99%
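The statement above puts the cost of training CA at O(ndK): each of the n d-dimensional instances is compared against all K nodes. A minimal sketch of a clustering pass with that cost structure follows; it uses a plain Euclidean-distance winner search with a hypothetical `similarity_threshold`, as an illustration of the complexity only, not as the actual CA algorithm of [30].

```python
import numpy as np

def train_ca_sketch(X, similarity_threshold):
    """Illustrative ART-like clustering pass (not the actual CA of [30]).

    Each of the n instances is compared against every existing node in
    d dimensions, so one pass costs O(n * d * K), where K is the number
    of nodes -- matching the cited complexity.
    """
    nodes = []  # list of d-dimensional prototype vectors
    for x in X:                                   # n iterations
        if nodes:
            dists = [np.linalg.norm(x - w) for w in nodes]  # O(d * K)
            j = int(np.argmin(dists))
            if dists[j] <= similarity_threshold:
                # winner is similar enough: move it toward the input
                nodes[j] = nodes[j] + 0.5 * (x - nodes[j])
                continue
        # no sufficiently similar node: create a new one
        nodes.append(x.copy())
    return nodes
```

With two well-separated blobs and a threshold between the within- and between-cluster distances, the sketch settles on one node per blob, so K stays small relative to n.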
“…RVEA-CA has two hyperparameters, i.e., the ratio (probability) p mat for the mating selection and the interval λ for the learning process of CA. Among the hyperparameters in the learning procedure of CA (see Algorithm 1), the similarity threshold V has a much larger impact on the performance of CA than the interval λ [30], so we introduce an adjusting procedure for the similarity threshold V into RVEA-CA. On the other hand, we used a fixed ratio p mat = 0.5 for the computational experiments reported in Section V-D.…”
Section: ) Performance Analysis With Convergence and Diversity (mentioning)
confidence: 99%
“…In general, a clustering algorithm can be applied to classification tasks by using a clustering result (e.g., cluster centroids, topological networks) as a classifier [16], [40], [41], [42]. Note that, in many cases, a clustering-based classification algorithm can inherit the continual learning ability from a clustering algorithm [14], [24], [43].…”
Section: Clustering-based Classification Algorithms Capable Of Contin... (mentioning)
confidence: 99%
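The statement above notes that a clustering result (e.g., cluster centroids) can serve directly as a classifier. A minimal sketch of that idea, assuming centroids that have already been assigned class labels (a generic nearest-prototype rule, not the specific mechanism of [16], [40], [41], [42]):

```python
import numpy as np

def predict_from_prototypes(x, prototypes, labels):
    """Classify x by the label of its nearest cluster prototype.

    `prototypes` is a (K, d) array of cluster centroids and `labels`
    gives the class attached to each centroid -- an assumed setup for
    illustration, not the exact cited algorithms.
    """
    dists = np.linalg.norm(prototypes - x, axis=1)
    return labels[int(np.argmin(dists))]
```

Because new prototypes can be added as new data arrive (as in the sketchy clustering pass), a classifier of this form can keep learning incrementally, which is the continual-learning property the quoted passage refers to.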
“…However, similar to ART-based clustering algorithms, ART-based classification algorithms also suffer from the difficulty of specifying parameters to maintain good classification performance. A small number of studies have focused on specification/adjustment methods for strongly data-dependent parameters [16], [48], [49], [50], [51]. However, these studies still have parameters that need to be adjusted for each dataset.…”
Section: Clustering-based Classification Algorithms Capable Of Contin... (mentioning)
confidence: 99%