2019
DOI: 10.29304/jqcm.2019.11.2.573
Convolution Neural Networks for Blind Image Steganalysis: A Comprehensive Study

Abstract: Recently, Convolutional Neural Networks have been widely applied to image classification, object detection, scene labeling, speech, natural language processing, and other fields. This comprehensive study surveys a variety of scenarios and efforts from 2014 to date, presenting the architecture, performance, and limitations of CNN-based blind image steganalysis methods in order to guide future researchers toward further improvements. The long-standing and important difficulties in image steganalysis mainly lie in …

Cited by 1 publication (3 citation statements) | References 13 publications
“…Multi-Layer Perceptron (MLP): a feed-forward artificial neural network structure containing one or more hidden layers, where each layer consists of a simple combination of mathematically connected nodes known as neurons, as shown in Fig. 2 [12][13][14].…”
Section: The Multi-Layer Perceptron Structure (MLP)
confidence: 99%
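For orientation, the following is a minimal sketch of the feed-forward pass the excerpt describes. The layer sizes, NumPy implementation, and use of ReLU in the hidden layers are illustrative assumptions, not details taken from the cited papers.

    import numpy as np

    def relu(x):
        # Rectified Linear Unit: passes positive values, zeroes out negatives
        return np.maximum(0, x)

    class MLP:
        """Minimal feed-forward multi-layer perceptron (illustrative sizes)."""
        def __init__(self, sizes=(784, 128, 64, 10), seed=0):
            rng = np.random.default_rng(seed)
            # One weight matrix and bias vector per pair of connected layers
            self.weights = [rng.standard_normal((m, n)) * 0.01
                            for m, n in zip(sizes[:-1], sizes[1:])]
            self.biases = [np.zeros(n) for n in sizes[1:]]

        def forward(self, x):
            # Each hidden layer: affine transform followed by the activation
            for W, b in zip(self.weights[:-1], self.biases[:-1]):
                x = relu(x @ W + b)
            # Final layer left linear (e.g., fed into a softmax / loss)
            return x @ self.weights[-1] + self.biases[-1]

    x = np.random.rand(784)      # dummy input vector
    logits = MLP().forward(x)    # output of shape (10,)

Each layer is simply a weighted sum of the previous layer's outputs plus a bias, passed through an activation; stacking such layers gives the "forward feeding structure" the statement refers to.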
“…The main purpose of the activation function is to introduce non-linearity into the neurons, enabling the network to learn and carry out complex tasks and making back-propagation effective. The most widely used activation function is the Rectified Linear Unit (ReLU), and Equation (3) illustrates its representation [14].…”
Section: Activation Functions
confidence: 99%
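Equation (3) itself is not reproduced in the excerpt; the standard ReLU definition, which is presumably what it refers to, is

    f(x) = max(0, x)

That is, negative inputs are mapped to zero while positive inputs pass through unchanged. This supplies the non-linearity mentioned above while keeping the gradient trivial for back-propagation: 1 for x > 0 and 0 for x < 0.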