Proceedings of the ACM Web Conference 2023
DOI: 10.1145/3543507.3583500
FedACK: Federated Adversarial Contrastive Knowledge Distillation for Cross-Lingual and Cross-Model Social Bot Detection

Abstract: Social bot detection is of paramount importance to the resilience and security of online social platforms. State-of-the-art detection models are siloed and have largely overlooked a variety of data characteristics from multiple cross-lingual platforms. Meanwhile, the heterogeneity of data distributions and model architectures makes it intricate to devise an efficient cross-platform and cross-model detection framework. In this paper, we propose FedACK, a new federated adversarial contrastive knowledge distillation…

Cited by 16 publications (1 citation statement)
References 33 publications
“…Wu et al. [70] specifically designed an algorithm tailored for multi-access edge computing in a real-world scenario, leveraging knowledge distillation as a key component. FedACK [68] applies knowledge distillation to the cross-lingual social bot detection domain, showcasing a novel application that combines knowledge distillation and federated learning. This application demonstrates the potential for knowledge distillation to inspire more useful applications within this emerging field.…”
Section: Federated Learning With Knowledge Distillation
confidence: 99%
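The citation above notes that FedACK combines knowledge distillation with federated learning. The core distillation ingredient is a loss that pushes a student model's softened output distribution toward a teacher's. This is a minimal sketch of that loss (Hinton-style temperature-scaled KL divergence) in plain NumPy; it is an illustration of the general technique, not FedACK's actual implementation, and the function names and temperature value are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T yields a flatter distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The T**2 factor keeps gradient magnitudes comparable across
    temperatures, following the standard distillation formulation.
    """
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's soft predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student matches the teacher exactly the loss is zero, and it grows as the two soft distributions diverge; in a federated setting, a loss of this shape lets clients learn from a global (teacher) model's outputs without sharing raw data.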