2021
DOI: 10.1007/978-3-030-86380-7_29

Separation of Memory and Processing in Dual Recurrent Neural Networks

Cited by 3 publications (3 citation statements)
References 23 publications
“…The dual architecture was first proposed as an alternative that reduces the computational load on the recurrent layer, letting it concentrate on modeling the temporal dependencies only. From a more abstract point of view, it has been argued that the dual architecture can be understood as a sort of Mealy machine, where the output explicitly depends on both the hidden state and the input [16]. Our results show that this explicit dependence on the input can indeed lead to better performance on language modeling tasks.…”
Section: Discussion
Confidence: 63%
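
The Mealy/Moore distinction the citing authors draw can be written out in standard state-machine notation. A minimal sketch, using generic RNN symbols rather than the exact notation of [16]:

h_t = f(h_{t-1}, x_t)    (state update, common to both)
y_t = g(h_t)             (Moore machine: output from the state only)
y_t = g(h_t, x_t)        (Mealy machine, as in the dual architecture: output from state and current input)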
“…In this work, we have presented a new network design for the Language Modeling task based on the dual network proposed by [16]. This network adds a direct connection between the input and the output, skipping the recurrent module, and can be adapted to any of the traditional Embedding-Recurrent-Softmax (ERS) models, opening the way to new approaches for this task.…”
Section: Discussion
Confidence: 99%
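
To make the direct input-to-output connection concrete, here is a minimal PyTorch sketch of a dual-style language model in which the output layer receives both the recurrent state and the current input embedding. The class name, layer choices, and dimensions are illustrative assumptions, not the reference implementation of [16] or of the citing work:

import torch
import torch.nn as nn

class DualRNNLM(nn.Module):
    # Dual architecture sketch: the recurrent layer only models temporal
    # dependencies; the output layer sees [hidden state ; input embedding],
    # i.e. a direct connection from input to output that skips the RNN.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim + embed_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)                      # (batch, seq, embed_dim)
        h, _ = self.rnn(x)                          # (batch, seq, hidden_dim)
        # Mealy-style output: depends on state h_t and input x_t.
        return self.out(torch.cat([h, x], dim=-1))

# Usage: next-token logits for a toy batch of token ids.
model = DualRNNLM(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 16)))     # shape (2, 16, 1000)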