2021
DOI: 10.1109/lsp.2021.3097241
Real-Time Text Steganalysis Based on Multi-Stage Transfer Learning

Cited by 34 publications (6 citation statements)
References 17 publications
“…Most researchers focus on the teacher-student interaction behavior [48]. Offline distillation is widely applied in TSC [24], image classification [48], facial expression recognition [49], text steganography [50], and so on.…”
Section: Offline Distillation
confidence: 99%
“…Following [45], we finetune the BERT base-cased model with the default settings in Hugging Face Transformers 4 as the detection part. For each experiment, we randomly split 10,000 original natural texts and their stego versions, to three disjoint subsets: training set (60%), validation set (10%), and testing set (30%).…”
Section: B. Evaluation Metrics
confidence: 99%
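The split described in the quote above (10,000 cover texts and their stego versions, partitioned into 60% training, 10% validation, and 30% testing) can be sketched as follows. This is a minimal illustration, not the cited authors' code; the function name, seed, and dummy data are hypothetical.

```python
import random

def split_pairs(cover_texts, stego_texts, seed=0):
    """Randomly partition paired cover/stego texts into three disjoint
    subsets with the 60%/10%/30% ratio described in the quote.
    (Hypothetical helper for illustration.)"""
    pairs = list(zip(cover_texts, stego_texts))
    rng = random.Random(seed)
    rng.shuffle(pairs)  # random split, as in the quoted protocol
    n = len(pairs)
    n_train, n_val = int(0.6 * n), int(0.1 * n)
    train = pairs[:n_train]
    val = pairs[n_train:n_train + n_val]
    test = pairs[n_train + n_val:]
    return train, val, test

# Dummy corpus of 10,000 pairs, matching the set sizes in the quote.
covers = [f"cover text {i}" for i in range(10000)]
stegos = [f"stego text {i}" for i in range(10000)]
train, val, test = split_pairs(covers, stegos)
print(len(train), len(val), len(test))  # 6000 1000 3000
```

Splitting paired texts jointly (rather than covers and stegos independently) keeps each cover and its stego version in the same subset, which avoids leaking near-duplicate content across the train/test boundary.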
“…Zou et al [59] employed BERT and GloVe components to extract word features, then a Bi-LSTM with an attention mechanism was used to extract contextual relations. Peng et al [36] proposed a real-time text steganalysis method and used transfer learning to improve detection accuracy. Yang et al [48] applied a densely connected LSTM with feature pyramids to integrate additional low-level features to detect generation-based text steganography.…”
Section: Text Steganalysis
confidence: 99%
“…Yang et al [48] applied a densely connected LSTM with feature pyramids to incorporate extra low-level features to deal with generation-based text steganography. Peng et al [36] modeled the discrepancy between normal and stego texts with a fine-tuned BERT. These methods tackle the limitations of traditional feature extraction and achieve preferable detection results, demonstrating that neural models have good potential in text steganalysis.…”
Section: Introduction
confidence: 99%
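The detection part described in the quotes above — a BERT model fine-tuned to separate normal from stego texts — can be sketched with the standard Hugging Face Transformers sequence-classification API. This is an assumption-laden sketch, not the cited authors' implementation: the checkpoint name, example sentence, and two-label setup are illustrative, and the training loop (which Transformers' `Trainer` would normally handle) is omitted.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative checkpoint; the quote in Section B mentions BERT base-cased.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=2  # label 0: cover text, label 1: stego text
)
model.eval()

# Score a single (made-up) sentence; fine-tuning would precede this step.
inputs = tokenizer(["an innocuous example sentence"],
                   return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): cover vs. stego scores
```

With an untrained classification head the scores are meaningless; after fine-tuning on the labeled cover/stego pairs, `logits.argmax(-1)` gives the predicted class.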