2021
DOI: 10.48550/arxiv.2107.12603
Preprint
Federated Learning Meets Natural Language Processing: A Survey

Ming Liu,
Stella Ho,
Mengqi Wang
et al.

Abstract: Federated Learning aims to learn machine learning models from multiple decentralized edge devices (e.g. mobile phones) or servers without sacrificing local data privacy. Recent Natural Language Processing techniques rely on deep learning and large pre-trained language models. However, both large deep neural networks and language models are trained on huge amounts of data, which often resides on the server side. Since text data widely originates from end users, in this work we look into recent NLP models and techniques which …

Cited by 17 publications (24 citation statements) | References 56 publications
“…Even though many federated training approaches have been proposed, most of them focus on computer vision, and only a handful provide a federated solution tailored for NLP applications (Liu et al, 2021; Lin et al, 2021). For instance, federated learning has been applied to tackle next-word keyboard prediction (Hard et al, 2018; Stremmel and Singh, 2021; Yang et al, 2018), speech recognition (Leroy et al, 2019), and health text mining.…”
Section: Related Work
confidence: 99%
“…Each device sends only its local model parameters to the FL server, never the raw data. Most centralized setups simply assume IID training and test data, but in a decentralized, federated learning based setup, non-IID data poses the problem of high skewness across devices due to their differing data distributions (Liu et al, 2021).…”
Section: Federated Learning
confidence: 99%
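The mechanism described in the citation above — clients sending only model parameters to the server, which aggregates them — can be illustrated with a minimal FedAvg-style sketch. This is a hypothetical, NumPy-only simulation (function names `local_update` and `fedavg` are my own, not from the survey); the non-IID skew is mimicked by giving each simulated client a differently shifted data distribution.

```python
# Minimal FedAvg sketch (hypothetical, NumPy only): each client runs local
# training and sends ONLY its model parameters to the server, never raw data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: logistic-regression gradient steps."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)       # mean gradient
        w -= lr * grad
    return w  # only the parameters leave the device

def fedavg(client_weights, client_sizes):
    """Server-side aggregation: average parameters, weighted by data size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Simulated non-IID clients: each holds a differently skewed data slice.
rng = np.random.default_rng(0)
global_w = np.zeros(4)
clients = []
for shift in (-1.0, 0.0, 2.0):  # a different input distribution per device
    X = rng.normal(shift, 1.0, size=(50, 4))
    y = (X.sum(axis=1) > 0).astype(float)
    clients.append((X, y))

for _ in range(10):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = fedavg(updates, [len(y) for _, y in clients])
```

The weighted average is the standard FedAvg aggregation rule; the raw `(X, y)` pairs never appear on the server side, only the `updates` list of parameter vectors.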
“…Text Classification extends to many NLP applications, including sentiment analysis, question answering, and topic labeling. For example, financial or government institutions that wish to train a chatbot for their clients cannot upload all text data from the client side to their central server due to strict privacy protection statements (Liu et al, 2021). Here the federated learning paradigm offers a way out of this dilemma: through its advances in privacy preservation and collaborative training, the central server can train a powerful model on the labeled data held locally at client devices without uploading the raw data, addressing the public's growing privacy concerns.…”
Section: Introduction
confidence: 99%
“…Its emergence provides a promising solution to the data-silo problem without compromising the privacy of local data. Since federated learning was proposed in 2016 [4], it has been applied to many domains, such as natural language processing (NLP) [5–8], healthcare [9–16], and the Internet of Things (IoT) [17–19].…”
Section: Introduction
confidence: 99%