A survey on the application of recurrent neural networks to statistical language modeling
2015
DOI: 10.1016/j.csl.2014.09.005

Cited by 222 publications (121 citation statements).
References 34 publications.
“…A better option might be to extend the approach presented here in the direction of convolutional or recurrent neural networks. Artificial recurrent neural networks have been applied successfully to sound recognition problems [60], and it is well known that feedback projections are common features of the auditory pathway. Developing recurrent versions of the NRFs introduced here is therefore likely to be important, particularly if we hope to develop successful models of higher order auditory cortical neurons.…”
Section: Discussion (mentioning, confidence: 99%)
“…[26], [27], [28]), object recognition (e.g. [29]), speech modeling [30], and in medical image analysis (e.g. [31], [32], [33], [34]).…”
Section: Methods (mentioning, confidence: 99%)
“…Liu et al. developed a recursive recurrent neural network (R²NN) to fit the end-to-end decoding process for an application of statistical machine translation. Mulder et al. presented a survey that covers the applications of RNNs in statistical language modeling and introduced some recent significant extensions of RNNs that address their long training times and the constraints on the context words. Socher and Lin developed a max-margin structure based on RNNs as a syntactic parser to predict natural language sentences from the Penn Treebank dataset.…”
Section: Introduction (mentioning, confidence: 99%)
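The citing excerpts above all refer back to the surveyed technique, the recurrent neural network language model, which predicts the next word from the full word history carried in a recurrent hidden state. As a rough illustration of what those papers are citing, the following is a minimal sketch of a plain Elman-style RNN language model forward pass; the dimensions, parameter names, and toy inputs are illustrative assumptions, not taken from the survey.

```python
import numpy as np

# Minimal Elman-style RNN language model (forward pass only).
# All sizes and parameter names are illustrative, not from the survey.
rng = np.random.default_rng(0)
vocab_size, embed_dim, hidden_dim = 10, 8, 16

E = rng.normal(0, 0.1, (vocab_size, embed_dim))      # word embeddings
W_xh = rng.normal(0, 0.1, (hidden_dim, embed_dim))   # input-to-hidden weights
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))  # hidden-to-hidden recurrence
W_hy = rng.normal(0, 0.1, (vocab_size, hidden_dim))  # hidden-to-output weights
b_h = np.zeros(hidden_dim)
b_y = np.zeros(vocab_size)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def next_word_distribution(word_ids):
    """Run the RNN over a word-id sequence and return P(next word | history)."""
    h = np.zeros(hidden_dim)
    for w in word_ids:
        x = E[w]
        # The recurrent state h summarizes the entire history seen so far,
        # rather than a fixed window of context words.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return softmax(W_hy @ h + b_y)

probs = next_word_distribution([3, 1, 4])
print(probs.shape, round(probs.sum(), 6))  # (10,) 1.0
```

In a trained model the parameters would be fit by backpropagation through time to maximize the likelihood of a corpus; the extensions mentioned in the excerpt (faster training, relaxed context constraints) modify this basic recurrence rather than replace it.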