2018
DOI: 10.48550/arxiv.1807.03402
Preprint

IGLOO: Slicing the Features Space to Represent Sequences

Vsevolod Sourkov

Abstract: Until recently, recurrent neural networks (RNNs) were the standard go-to component for processing sequential data with neural networks. Issues related to the vanishing gradient have been partly addressed by long short-term memory (LSTM) and gated recurrent unit (GRU) cells, but in practice, experiments show that very long-term dependencies (beyond 1000 time steps) remain difficult to learn. We introduce IGLOO, a new neural network architecture which aims at being faster than both LSTM and GRU but also their respec…
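The abstract's core idea — slicing the feature space rather than stepping through it recurrently — can be illustrated with a minimal sketch. This is an illustrative reading of the approach, not the paper's implementation: the patch count `K`, patch size `S`, and the use of random placeholder weights are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature map: T time steps, F features (e.g. the output of a 1D convolution).
T, F = 100, 8
feature_map = rng.standard_normal((T, F))

# IGLOO-style slicing (a sketch under assumptions): draw K patches, each
# gathering S time steps from anywhere in the sequence, so a single patch
# can relate distant positions directly instead of stepping through them.
K, S = 16, 4
patch_indices = rng.integers(0, T, size=(K, S))  # non-local time indices
patches = feature_map[patch_indices]             # shape (K, S, F)
patches = patches.reshape(K, S * F)              # flatten each patch

# A learned projection would map each patch to a scalar activation; the
# weights below are random placeholders standing in for trained parameters.
W = rng.standard_normal((K, S * F))
b = rng.standard_normal(K)
activations = np.sum(patches * W, axis=1) + b    # shape (K,)
print(activations.shape)
```

Because every patch can sample arbitrary time steps, the representation's reach does not degrade with sequence length the way a recurrent state can.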

Cited by 2 publications (2 citation statements)
References 12 publications
“…To generate vector representations of the inputs, geNomad employs an encoder based on the IGLOO architecture 11 , which is able to extract patterns that are useful for classification from the sequence data and encode them into an embedding space (Figure 1B, Supplementary Figure 1). The IGLOO encoder begins processing one-hot-encoded matrices by applying 128 convolutional filters to generate sequence feature maps.…”
Section: Results (confidence: 99%)
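The encoder's first step described in the quote — one-hot encoding followed by 128 convolutional filters producing a feature map — can be sketched as follows. The nucleotide alphabet and the filter width of 6 are assumptions for illustration; the filters here are untrained random placeholders, not geNomad's learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hot encode a nucleotide sequence (alphabet A, C, G, T assumed).
alphabet = "ACGT"
seq = "ACGTGCATTACG"
one_hot = np.zeros((len(seq), 4))
one_hot[np.arange(len(seq)), [alphabet.index(c) for c in seq]] = 1.0

# Apply 128 1D convolutional filters (width 6 is an assumed value) to
# produce a sequence feature map, mirroring the encoder's first step.
n_filters, width = 128, 6
filters = rng.standard_normal((n_filters, width, 4))  # placeholder weights

t_out = len(seq) - width + 1                 # valid (no-padding) convolution
feature_map = np.empty((t_out, n_filters))
for t in range(t_out):
    window = one_hot[t:t + width]            # (width, 4) slice of the input
    # Each filter's response is the sum over its width x channels window.
    feature_map[t] = np.tensordot(filters, window, axes=([1, 2], [0, 1]))

print(feature_map.shape)
```

The resulting `(time, 128)` feature map is the kind of input the IGLOO slicing stage would then draw its non-local patches from.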
“…The IGLOO architecture has demonstrated superior performance compared to traditional alternatives (such as recurrent neural networks and convolutional neural networks) when applied to sequence data. This is attributed to its capability to gather information from non-local relationships across the entire sequence to create a global representation 11, 12 .…”
Section: Results (confidence: 99%)