2019 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2019.00350

Information Entropy Based Feature Pooling for Convolutional Neural Networks

Cited by 29 publications (15 citation statements)
References 24 publications
“…In Moldovan et al. (2020), the authors use the so-called transfer entropy between network nodes to guide backpropagation. Other works use statistics for feature extraction (Finnegan and Song, 2017), feature pooling (Wan et al., 2019), or network compression (Wiedemann et al., 2019). Related to that is the research on the distribution of activations, which often treats all neurons as independent stochastic variables and has proven helpful for deriving initialization schemes and methods that help with training (Glorot and Bengio, 2010; He et al., 2015; Ioffe and Szegedy, 2015; Salimans and Kingma, 2016).…”
Section: Discussion
confidence: 99%
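To make the last point concrete, here is a minimal numerical sketch (not code from any of the cited papers; sizes and seed are arbitrary) of how treating activations as independent stochastic variables yields the He et al. (2015) initialization scale for ReLU layers:

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out, n = 512, 512, 4096  # arbitrary demo sizes

# Treating the fan_in terms w_j * x_j of each output as independent gives
# E[y^2] = fan_in * Var(w) * E[x^2]. A ReLU applied to a zero-mean,
# unit-variance input has E[x^2] = 1/2, so choosing Var(w) = 2 / fan_in
# keeps the signal scale E[y^2] = 1 from layer to layer.
W = rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
x = np.maximum(rng.standard_normal((n, fan_in)), 0.0)  # ReLU activations

y = x @ W
print((y ** 2).mean())  # ≈ 1.0, so activations neither explode nor vanish
```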
“…The idea of entropy has been introduced to convolutional neural networks for different purposes. For instance, in [19] the authors use information entropy for semantic-aware feature pooling. In [20], an entropy measure is employed for the quantization of different deep learning models, including CNNs.…”
Section: Related Work
confidence: 99%
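As a rough illustration of what entropy-based pooling can look like (a hypothetical sketch of the general idea, not the exact formulation of [19] or of the paper under review; the function name and weighting scheme are assumptions):

```python
import numpy as np

def entropy_weighted_pool(feature_map, eps=1e-8):
    """Pool a (C, H, W) non-negative feature map (e.g. post-ReLU) into a
    C-dimensional descriptor, weighting each channel's global average by
    the Shannon entropy of its spatial activation distribution."""
    c = feature_map.shape[0]
    flat = feature_map.reshape(c, -1)
    p = flat / (flat.sum(axis=1, keepdims=True) + eps)  # per-channel distribution
    entropy = -(p * np.log(p + eps)).sum(axis=1)        # Shannon entropy per channel
    weights = entropy / (entropy.sum() + eps)           # normalize entropies to weights
    return weights * flat.mean(axis=1)                  # entropy-weighted average pool

fmap = np.maximum(np.random.default_rng(0).standard_normal((256, 7, 7)), 0.0)
print(entropy_weighted_pool(fmap).shape)  # (256,)
```

Whether high- or low-entropy channels should be emphasized is itself a design choice; this sketch simply up-weights channels whose activations are spread over many spatial positions.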
“…where $F_i^m$ and $F_i^d$ stand for the multi-scale mean feature and standard-deviation feature of frame $i$. However, it may not be feasible to concatenate the two pooled features straightforwardly for quality regression, because $F_i^m$ is highly relevant to the semantic information [46]. As a result, the learned model tends to overfit to the specific scenes in the training set.…”
Section: A. Attention Based Multi-scale Feature Extraction
confidence: 99%
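The pooled statistics themselves are straightforward to compute; below is a minimal sketch of $F_i^m$ and $F_i^d$ for a single scale (the shapes and the function name are assumptions, not taken from the cited paper):

```python
import numpy as np

def mean_std_pool(frame_features):
    """Compute the mean-pooled feature F_i^m and the standard-deviation
    feature F_i^d of one frame's (C, H, W) feature map by reducing over
    the spatial dimensions."""
    c = frame_features.shape[0]
    flat = frame_features.reshape(c, -1)
    return flat.mean(axis=1), flat.std(axis=1)  # F_i^m, F_i^d

f_mean, f_std = mean_std_pool(np.random.default_rng(0).random((128, 14, 14)))
print(f_mean.shape, f_std.shape)  # (128,) (128,)
```

A multi-scale version would concatenate these per-scale statistics across the levels of the feature pyramid, which is what makes the two pooled features high-dimensional and prone to the overfitting noted above.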