2023
DOI: 10.1109/tmi.2023.3235757
FedDM: Federated Weakly Supervised Segmentation via Annotation Calibration and Gradient De-Conflicting

Cited by 10 publications (9 citation statements) · References 41 publications
“…For a comprehensive evaluation of FedA³I on addressing heterogeneous annotation noise, a set of SOTA methods are selected for comparison, including GCE (Zhang and Sabuncu 2018), SCE (Wang et al 2019), ELR (Liu et al 2020), ADELE (Liu et al 2022), RMD (Fang et al 2023), NR-Dice (Wang et al 2020), FedProx, FedMix (Wicaksana et al 2022), FedDM (Zhu, Chen, and Yuan 2023), FedCorr (Xu et al 2022b), and FedNoRo (Wu et al 2023a). The introduction and implementation details of these methods can be found in Appendix B.…”
Section: Comparison With SOTA Methods
Confidence: 99%
“…It confirms the necessity of specific designs for the two-level Non-IID annotation noise in FMIS. In addition, different from those methods introducing complex correction (Xu et al 2022b; Liu et al 2022; Zhu, Chen, and Yuan 2023) or regularization (Liu et al 2020; Wu et al 2023a), no additional training overhead is introduced in FedA³I, making it more easy-to-deploy in practice.…”
Section: Comparison With SOTA Methods
Confidence: 99%