2022
DOI: 10.1007/978-3-031-16437-8_19

Dynamic Bank Learning for Semi-supervised Federated Image Diagnosis with Class Imbalance

Cited by 20 publications (4 citation statements)
References 16 publications
“…To counter this, the authors decomposed the model parameters θ into two variables, θ = ψ + ρ, and utilized a separate updating strategy: only ψ is updated during unsupervised learning and, similarly, only ρ is updated during supervised learning. In a real-world scenario, Jiang et al. [76] addressed the challenge of imbalanced class distributions among unlabeled clients in the context of medical image diagnosis. They proposed a novel scheme called dynamic bank learning, which collects confident samples and subsequently divides them into sub-banks with varying class proportions.…”
Section: Federated Semi-supervised Learning
confidence: 99%
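The separate updating strategy described in the statement above can be sketched as follows; the function name, learning rate, and toy gradients are illustrative assumptions, not details from the cited work:

```python
import numpy as np

def split_update(psi, rho, grad, lr, supervised):
    """Separate updating for decomposed parameters theta = psi + rho:
    the supervised-loss gradient moves only rho, while the
    unsupervised-loss gradient moves only psi (hypothetical sketch)."""
    if supervised:
        rho = rho - lr * grad   # supervised step updates rho only
    else:
        psi = psi - lr * grad   # unsupervised step updates psi only
    return psi, rho, psi + rho  # theta is always the sum psi + rho

# toy usage: an unsupervised step moves psi but leaves rho unchanged
psi, rho = np.zeros(3), np.zeros(3)
psi, rho, theta = split_update(psi, rho, np.ones(3), lr=0.1, supervised=False)
```

Because θ is reconstructed as the sum after every step, the two parameter sets can be trained on disjoint objectives without interfering with each other's updates.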
“…CReRF [33] recalibrates the federated classifier in the server using synthesized balanced features. FedIRM [63] and imFedSemi [62] resolve class imbalance in semi-supervised settings by sharing class relation matrices and highly confident unlabeled samples, respectively. However, sharing additional information beyond the model weight updates is undesirable in medical domains due to the potential risk of privacy leakage [71].…”
Section: B Class Imbalance Learning
confidence: 99%
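A minimal sketch of the confident-sample collection step underlying such banking schemes; the threshold value and function name are assumptions for illustration, and the sub-bank partitioning by class proportion used in the actual method is omitted:

```python
import numpy as np

def collect_confident(probs, threshold=0.95):
    """Keep unlabeled samples whose maximum predicted class probability
    meets a confidence threshold, together with their pseudo labels.
    Simplified sketch; threshold and name are illustrative."""
    conf = probs.max(axis=1)        # per-sample max class probability
    pseudo = probs.argmax(axis=1)   # pseudo label = most likely class
    keep = np.flatnonzero(conf >= threshold)
    return keep, pseudo[keep]

# toy usage: 3 unlabeled samples, 2 classes
probs = np.array([[0.98, 0.02],
                  [0.60, 0.40],
                  [0.05, 0.95]])
idx, labels = collect_confident(probs)  # sample 1 is too uncertain to bank
```

Only the confident samples (here indices 0 and 2) would enter the bank; an imbalance-aware scheme would then redistribute them into sub-banks with varying class proportions rather than train on them directly.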
“…A smooth optimum is one in which the loss is consistent over an entire neighbourhood region surrounding the local optimal solution. In addition, the authors of [21] proposed an FL approach that can perform inside and outside model personalization via an insubstantial-gradient method, making use of the local customized model by gathering global updates for general knowledge and local updates for client-specific optimization. Moreover, the authors of [22] identified and solved a novel problem setting known as federated domain generalization, which enables learning a federated model from several remote source domains in such a way that it can directly generalise to previously unseen domains.…”
Section: B Federated Learning For Domain Adaptation
confidence: 99%