Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d18-1109

Convolutional Neural Networks with Recurrent Neural Filters

Abstract: We introduce a class of convolutional neural networks (CNNs) that utilize recurrent neural networks (RNNs) as convolution filters. A convolution filter is typically implemented as a linear affine transformation followed by a nonlinear function, which fails to account for language compositionality. As a result, it limits the use of high-order filters that are often warranted for natural language processing tasks. In this work, we model convolution filters with RNNs that naturally capture compositionality and lo…
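The abstract's core idea can be sketched in a few lines: instead of a linear filter (a dot product over each k-gram window), run an RNN over the window's token vectors and take the final hidden state as the filter response, then max-pool over positions as in a standard CNN text classifier. This is a minimal illustrative sketch, not the authors' implementation; all names, dimensions, and the choice of a bias-free vanilla tanh RNN are assumptions for brevity.

```python
import math

def rnn_step(h, x, W_h, W_x):
    # One vanilla-RNN step: h' = tanh(W_h h + W_x x); bias omitted for brevity.
    return [math.tanh(sum(W_h[i][j] * h[j] for j in range(len(h))) +
                      sum(W_x[i][j] * x[j] for j in range(len(x))))
            for i in range(len(h))]

def rnf_convolve(tokens, k, W_h, W_x, hidden):
    # Slide a width-k window over the token vectors; for each window,
    # run the RNN from a zero state and keep the final hidden state
    # as that position's filter response (the "recurrent neural filter").
    outputs = []
    for start in range(len(tokens) - k + 1):
        h = [0.0] * hidden
        for x in tokens[start:start + k]:
            h = rnn_step(h, x, W_h, W_x)
        outputs.append(h)
    return outputs

def max_pool(outputs):
    # Standard max-over-time pooling, as in CNN text classifiers.
    return [max(h[i] for h in outputs) for i in range(len(outputs[0]))]
```

Because the filter reads the window left to right and feeds each step's output into the next, it can model order and composition within the window, which a single affine transformation followed by a nonlinearity cannot.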

Cited by 27 publications (27 citation statements). References 9 publications.
“…Wang et al. [27] further introduced discrete cosine transform (DCT) bases and converted convolution filters into the frequency domain to achieve higher compression and speed-up ratios. Yang et al. [28] used a set of Lego filters to build efficient CNNs.…”
Section: Data-Driven Network Compression
confidence: 99%
“…On the ImageNet dataset, this network achieves better classification accuracy at lower computational cost. LegoNet [28] designs new filter modules assembled from a set of shared Lego filters, which are usually much smaller in size. Using LegoNet as part of the network structure reduces parameters and speeds up the network.…”
Section: Related Work
confidence: 99%
“…Wang et al. [19] proposed RNN-Capsule, a capsule model based on a recurrent neural network (RNN) for sentiment analysis. Yang [20] presented recurrent neural filters (RNFs), a new class of convolution filters based on recurrent neural networks. McCann et al. [21] introduced an approach for transferring knowledge from an encoder pretrained on machine translation to a variety of downstream natural language processing (NLP) tasks.…”
Section: Literature Review
confidence: 99%