2022
DOI: 10.1007/s13042-022-01565-z
Stochastic configuration networks for imbalanced data classification

Cited by 10 publications (3 citation statements)
References 41 publications
“…Deep binary descriptors (DeepBit) [19] uses VGGNet [20] to extract image features and learns the hashing codes with a combined objective function of quantization loss, balanced regularization and a rotation-invariant objective. Stochastic generative hashing (SGH) [21] learns hashing codes by the minimum description length principle, so as to maximally compress the dataset while being able to regenerate outputs from the codes. Semantic structure-based unsupervised hashing (SSDH) [22] uses two half-Gaussian distributions to estimate pairwise cosine distances of data points and assigns any two data points with a markedly smaller distance as a semantically similar pair.…”
Section: A. Hashing Methods
confidence: 99%
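The quantization loss mentioned for DeepBit above has a simple form that is common to many binary hashing methods: penalize the gap between a real-valued embedding and its sign-binarized code. The sketch below is an illustrative assumption, not code from any cited paper; all function names are hypothetical.

```python
import numpy as np

def quantization_loss(features: np.ndarray) -> float:
    """Mean squared gap between real-valued features and their
    binary codes in {-1, +1} (illustrative quantization loss)."""
    codes = np.sign(features)
    codes[codes == 0] = 1  # map exact zeros to +1
    return float(np.mean((features - codes) ** 2))

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Hamming distance between two {-1, +1} code vectors,
    the distance actually used at retrieval time."""
    return int(np.sum(a != b))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # 4 samples, 8-bit codes
loss = quantization_loss(x)   # small loss => features already near binary
codes = np.sign(x)
```

Minimizing this loss pushes the embedding toward the binary hypercube, so that thresholding with `sign()` loses little information.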
“…We incorporate five state-of-the-art traditional hashing methods into our scheme, i.e., ITQ [3], GHS [34], IMH [12], SGH [21] and SSDH [22]. ITQ, GHS and IMH are shallow hashing methods, while SGH and SSDH adopt deep neural networks.…”
Section: B. Baselines
confidence: 99%
“…In order to extract more information through DNNs, some researchers have introduced popular self-supervised techniques into deep unsupervised hashing, such as autoencoders in SGH [9], TBH [10] and BDNN [11], and generative adversarial networks (GANs) in HashGAN [12], BGAN [13] and BinGAN [14], etc.…”
Section: Deep Self-supervised Image Retrieval
confidence: 99%
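The autoencoder-based hashing idea referenced above can be sketched as: encode data to a low-dimensional latent, take the sign of the latent as the hash code, and check that a decoder can regenerate the input from that latent. The following is a minimal toy sketch under assumed random linear maps, standing in for trained encoder/decoder networks; it is not the implementation of SGH, TBH or BDNN.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(16, 10))        # 16 samples, 10-dim features

# Random linear encoder/decoder stand in for trained networks
# (hypothetical weights, for illustration only).
W_enc = rng.normal(size=(10, 4))     # maps features to 4-bit latent
W_dec = rng.normal(size=(4, 10))     # regenerates features from latent

latent = X @ W_enc
codes = np.where(latent >= 0, 1, -1) # binary hash codes in {-1, +1}
recon = latent @ W_dec               # decoder reconstruction of the input
mse = float(np.mean((X - recon) ** 2))
```

Training would jointly minimize the reconstruction error and a binarization penalty on `latent`, so the codes preserve enough information to regenerate the data, which is the self-supervised signal these methods exploit.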