2022
DOI: 10.48550/arxiv.2210.16489

STPrompt: Semantic-guided and Task-driven prompts for Effective Few-shot Classification

Cited by 1 publication (1 citation statement)
References 0 publications
“…We compare our proposed model with several state-of-the-art baseline methods, which are described as follows: 1) RNN [4]: this model is based on a recurrent neural network (RNN) with gated recurrent units (GRU) for learning relevant post features over time in rumor detection; 2) AttLSTM [26]: a long short-term memory (LSTM) model that uses an attention mechanism to weigh the importance of words in the relevant posts; 3) FNDML [13]: this model employs multitask learning (ML) methodologies to train reliable classifiers to detect fake news; 4) FT+ERNIE [27]: we use an existing fine-tuning technique based on the ERNIE pretrained language model; 5) FakeBERT [7]: this model combines parallel blocks of a single-layer deep CNN, with different kernel sizes and filters, with BERT; 6) ParallelBERT [14]: this model uses two parallel BERT networks to perform fake news detection; one BERT network encodes the news, and the other encodes news-related knowledge; 7) PT-* [28]: we improve an existing prompt-based tuning technique on the ERNIE PLM for fake news detection and extend it for our task. The * in PT-* represents different extensions, including knowledgeable prompt learning (KPL) [20], supervised contrastive learning (SCL) [29], and our proposed Heterogeneous Graph Augmented (HGA) framework.…”
Section: B Baseline Model
confidence: 99%
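The ParallelBERT baseline described in the statement above (two parallel encoders, one for the news text and one for news-related knowledge, with their outputs combined for classification) can be sketched schematically. This is a toy illustration only, not the cited implementation: the hash-based bag-of-words encoder stands in for a real BERT network, and all function names, dimensions, and weights are assumptions made for the sketch.

```python
# Schematic sketch of the parallel two-encoder idea: encode news and
# knowledge separately, concatenate, then apply a linear classifier head.
# Toy stand-in encoders keep the sketch self-contained (stdlib only).
import hashlib

DIM = 8  # toy embedding dimension (illustrative; real BERT uses 768)

def toy_encode(text: str) -> list[float]:
    """Stand-in for a BERT encoder: mean of hash-derived token vectors."""
    vec = [0.0] * DIM
    tokens = text.lower().split()
    for tok in tokens:
        digest = hashlib.md5(tok.encode()).digest()
        for i in range(DIM):
            vec[i] += digest[i] / 255.0
    n = max(len(tokens), 1)
    return [v / n for v in vec]

def parallel_encode(news: str, knowledge: str) -> list[float]:
    """Run the two parallel encoders and concatenate their outputs."""
    return toy_encode(news) + toy_encode(knowledge)

def classify(features: list[float], weights: list[float], bias: float) -> int:
    """Linear classifier head over the joint representation: 1 = fake, 0 = real."""
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return 1 if score > 0 else 0

features = parallel_encode("miracle cure found", "no clinical trials exist")
print(len(features))  # 16: two DIM-sized encoder outputs concatenated
```

In the real model the two BERT networks are trained jointly with the classifier head; here the point is only the data flow: two parallel encoders, one concatenation, one classification layer.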