Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.672

Discrete and Soft Prompting for Multilingual Models

Abstract: It has been shown for English that discrete and soft prompting perform strongly in few-shot learning with pretrained language models (PLMs). In this paper, we show that discrete and soft prompting perform better than finetuning in multilingual cases: cross-lingual transfer and in-language training of multilingual natural language inference. For example, with 48 English training examples, finetuning obtains 33.74% accuracy in cross-lingual transfer, barely surpassing the majority baseline (33.33%). In contrast, di…
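To make the abstract's terminology concrete, the following is a minimal sketch of discrete (cloze-style) prompting for NLI with a masked multilingual PLM. The template wording, the label verbalizers (Yes/Maybe/No), and the choice of xlm-roberta-base are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of discrete (cloze-style) prompting for NLI with a masked
# multilingual PLM. Template wording, verbalizers, and model choice are
# illustrative assumptions, not the paper's exact setup.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

premise = "A man is playing a guitar."
hypothesis = "A person is making music."
# The task is recast as filling the mask with a label word.
text = f"{premise}? {tokenizer.mask_token}, {hypothesis}"

# Verbalizer: map label words to NLI classes (assumed wording).
verbalizer = {"Yes": "entailment", "Maybe": "neutral", "No": "contradiction"}

inputs = tokenizer(text, return_tensors="pt")
mask_idx = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero()[0]
with torch.no_grad():
    logits = model(**inputs).logits[0, mask_idx]  # (1, vocab_size)

# Score only the label words and pick the highest-scoring one.
word_ids = {w: tokenizer.convert_tokens_to_ids(tokenizer.tokenize(w)[0])
            for w in verbalizer}
prediction = verbalizer[max(word_ids, key=lambda w: logits[0, word_ids[w]])]
print(prediction)
```

Note that no new classification head is introduced; the PLM's own masked-word distribution does the work, which is what allows such prompts to transfer across languages with very few training examples.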

Cited by 23 publications (25 citation statements). References 21 publications (40 reference statements).
“…As suggested by Perez et al. (2021) and Zhao & Schütze (2021), we randomly select the same number of samples from the training and validation sets, making it a reasonable few-shot scenario. Checkpoints are selected via early stopping on the selected validation set, and the stopping metric is the default metric for each task.…”
Section: Few-shot Domain Transfer
confidence: 99%
“…Other work exploring cross-lingual transfer learning and parameter-efficient methods includes Zhao and Schütze (2021). They find that prompts can effectively be used in cross-lingual settings, but their work is constrained to classification while ours focuses on generative tasks.…”
Section: Related Work
confidence: 99%
“…Prompting Methods Prompting is a technique that aims to make better use of pre-trained knowledge by reformulating the tasks at hand (Liu et al., 2021b). The advent of prompting allows us to do more with one system, i.e., unifying signals across tasks (Sanh et al., 2021), languages (Zhao and Schütze, 2021), and even modalities. In this paper, we continue to expand what one system can do by proposing multilingual multitask learning with prompting methods, connecting geographically diverse languages and linguistically different tasks.…”
Section: Multi-lingual Learning With Pretrained Models
confidence: 99%
“…Although prompting methods have proven effective in many NLP scenarios, their effectiveness comes at the cost of prompt engineering (Liu et al., 2021b), as various factors influence the prompt design process. Existing work has studied manual prompts (Schick and Schütze, 2021), soft (trainable) prompts (Lester et al., 2021), and mixed prompts that combine the two (Gu et al., 2021; Zhao and Schütze, 2021). The situation becomes more complicated in the multilingual setting, and in this paper we focus on the languages and uniformity of prompt template design.…”
Section: Prompt Design
confidence: 99%
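As a complement to the statement above, here is a minimal sketch of soft (trainable) prompting, in which a few prompt embeddings are learned while the PLM itself stays frozen. The prompt length, initialization, learning rate, and model name are assumptions for illustration; a "mixed" prompt additionally keeps a manual template in the input text, as shown.

```python
# Minimal sketch of soft prompting: trainable prompt vectors are prepended
# to the token embeddings while the PLM stays frozen. Prompt length,
# initialization, and model choice are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
for p in model.parameters():  # freeze all PLM weights
    p.requires_grad = False

n_prompt = 4  # number of trainable prompt vectors (assumed)
soft_prompt = nn.Parameter(torch.randn(n_prompt, model.config.hidden_size) * 0.02)

def logits_with_soft_prompt(text: str) -> torch.Tensor:
    ids = tokenizer(text, return_tensors="pt").input_ids
    tok_emb = model.get_input_embeddings()(ids)   # (1, seq, dim)
    prompt = soft_prompt.unsqueeze(0)             # (1, n_prompt, dim)
    emb = torch.cat([prompt, tok_emb], dim=1)
    return model(inputs_embeds=emb).logits

# A mixed prompt keeps a manual template in the text as well:
text = f"A man plays a guitar? {tokenizer.mask_token}, a person makes music."
logits = logits_with_soft_prompt(text)

# During training, only the prompt vectors would receive gradients:
optimizer = torch.optim.Adam([soft_prompt], lr=1e-3)
```

Because only a handful of vectors are updated, the same frozen multilingual PLM can serve many languages and tasks, which is the parameter efficiency the citation statements above point to.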