2021
DOI: 10.1109/lsp.2020.3042413
Linguistic Steganography: From Symbolic Space to Semantic Space

Cited by 16 publications (5 citation statements)
References 16 publications
“…Different structures correspond to different spaces, such as the vector space, Euclidean space and so on. In this paper, we mainly deal with the semantic space [36,37], signal space and channel space.…”
Section: "N + 1 Dimensionality" Eaj Methodsmentioning
confidence: 99%
“…With the development of deep learning technology, advanced neural language models are able to generate more natural text by encoding the conditional probability and selecting words from the candidate pool according to the secret bitstream [4,10,12,44,54,57,58]. To further improve the model's security and imperceptibility, studies are focused on seeking more compelling models or building a dynamic encoding method to embed secret information [50,56]. In particular, stego text distribution differs significantly based on different steganography algorithms, embedding capacities, and domains of the language models, presenting a massive challenge for text steganalysis.…”
Section: Preliminaries 2.1 Text Steganography (citation type: mentioning)
confidence: 99%
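
The "candidate pool" mechanism summarized in the excerpt above (the next word is chosen from a ranked pool according to the secret bitstream) can be illustrated with a minimal sketch. The toy bigram table, the pool size of two, and the embed/extract helpers below are hypothetical stand-ins for a neural language model's conditional distribution, not the algorithm of any of the surveyed papers:

```python
# Minimal sketch of candidate-pool text steganography: at each step the two
# most probable next words form a pool, and one secret bit selects which word
# is emitted.  The bigram table is a hypothetical toy model; it only covers
# four-step sequences starting from "<s>".

# Hypothetical bigram candidate lists, sorted by descending probability.
CANDIDATES = {
    "<s>":     ["the", "a"],
    "the":     ["meeting", "report"],
    "a":       ["summary", "draft"],
    "meeting": ["starts", "ends"],
    "report":  ["is", "was"],
    "summary": ["is", "was"],
    "draft":   ["is", "was"],
    "starts":  ["soon", "today"],
    "ends":    ["soon", "today"],
    "is":      ["ready", "late"],
    "was":     ["ready", "late"],
}

def embed(bits):
    """Generate cover text whose word choices encode the secret bits."""
    context, words = "<s>", []
    for bit in bits:
        pool = CANDIDATES[context]   # top-2 candidate pool for this context
        word = pool[bit]             # the current secret bit picks the word
        words.append(word)
        context = word
    return " ".join(words)

def extract(text):
    """Recover the bits by re-ranking the same candidate pools."""
    context, bits = "<s>", []
    for word in text.split():
        bits.append(CANDIDATES[context].index(word))
        context = word
    return bits

secret = [1, 0, 1, 1]
stego = embed(secret)            # e.g. "a summary was late"
assert extract(stego) == secret
```

Because sender and receiver rank the identical candidate pool at every step, extraction is just a rank lookup; practical systems share a neural language model instead of a fixed table and often apply Huffman or arithmetic coding over larger pools to raise the embedding capacity.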
“…Latent Space Steganography. Apart from utilizing the ConProc framework to generate steganographic text, a recent work proposed utilizing a latent semantic space to encode secrets (Zhang et al. 2020). The model maps the secret message to a discrete semantic space, defined by natural language semantemes (themes/topics), and the corresponding semantic vector α is fed to a conditional text generation model, where the model generates stegotext x conditioned on α.…”
Section: Conditional Probability Based Framework (ConProc) (citation type: mentioning)
confidence: 99%
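
The pipeline described in this excerpt (secret message → discrete semantic vector α → text generated conditioned on α) can be sketched as follows. The topic list, canned templates, and keyword-based recovery step are illustrative assumptions standing in for the semanteme space, conditional generator, and receiver-side inference of Zhang et al. (2020), not their actual implementation:

```python
# Minimal sketch of the latent-semantic-space idea: the secret bits pick a
# point alpha in a small discrete semantic space (here one of four topics),
# and a conditional generator produces stegotext about that topic.  All
# tables below are hypothetical illustrations.

TOPICS = ["sports", "weather", "finance", "travel"]   # 2 bits of capacity

# Hypothetical stand-in for a conditional text generator p(x | alpha).
TEMPLATES = {
    "sports":  "The home team practiced hard before the final match.",
    "weather": "A cold front should bring heavy rain this weekend.",
    "finance": "Markets closed slightly higher after the earnings report.",
    "travel":  "The coastal route offers the best views in early summer.",
}

# Hypothetical stand-in for the receiver's topic classifier.
KEYWORDS = {"sports": "team", "weather": "rain",
            "finance": "markets", "travel": "route"}

def bits_to_alpha(bits):
    """Map two secret bits to a point in the discrete semantic space."""
    return TOPICS[bits[0] * 2 + bits[1]]

def generate(alpha):
    """Generate stegotext x conditioned on the semantic vector alpha."""
    return TEMPLATES[alpha]

def recover(text):
    """Receiver: infer alpha from the text, then map it back to bits."""
    for topic, keyword in KEYWORDS.items():
        if keyword in text.lower():
            index = TOPICS.index(topic)
            return [index // 2, index % 2]
    raise ValueError("semantic vector could not be recovered")

stego = generate(bits_to_alpha([1, 0]))   # finance-themed sentence
assert recover(stego) == [1, 0]
```

In the scheme the excerpt describes, α conditions a neural generation model and the receiver recovers the secret by inferring the semanteme from the received text; the keyword lookup above only mimics that inference step.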
“…This work is one of the first approaches towards latent space steganography, where the secret message is encoded with a latent space and mapped to the symbolic space. The authors propose that hiding secrets in an implicit manner can lead to better concealment, as long as the prior distribution of the latent space remains unchanged (Zhang et al. 2020).…”
Section: Conditional Probability Based Framework (ConProc) (citation type: mentioning)
confidence: 99%