1999
DOI: 10.1002/(SICI)1520-684X(199904)30:4<20::AID-SCJ3>3.0.CO;2-E

Phoneme boundary estimation using bidirectional recurrent neural networks and its applications

Cited by 12 publications (6 citation statements)
References 18 publications

“…Bi-directional RNNs (BRNNs) (Schuster and Paliwal, 1997; Schuster, 1999) are designed for input sequences whose starts and ends are known in advance, such as spoken sentences to be labeled by their phonemes; compare (Fukada et al., 1999). To take both past and future context of each sequence element into account, one RNN processes the sequence from start to end, the other backwards from end to start.…”
Section: Supervised Recurrent Very Deep Learner (LSTM RNN)
Citation type: mentioning
Confidence: 99%
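
The excerpt above describes the core bidirectional scheme: one RNN reads the input from start to end, a second from end to start, so every sequence element gets both past and future context. A minimal numpy sketch of those two passes, assuming a plain tanh transition and made-up dimensions (12-dim frames, 8 hidden units); none of this is the paper's actual architecture:

```python
import numpy as np

def rnn_pass(x, W_in, W_rec, reverse=False):
    """Run a simple tanh RNN over a sequence of input vectors.

    With reverse=True the sequence is processed from end to start,
    and the hidden states are stored back in input order.
    """
    steps = range(len(x) - 1, -1, -1) if reverse else range(len(x))
    h = np.zeros(W_rec.shape[0])
    states = [None] * len(x)
    for t in steps:
        h = np.tanh(W_in @ x[t] + W_rec @ h)
        states[t] = h
    return np.stack(states)

rng = np.random.default_rng(0)
T, n_in, n_hid = 20, 12, 8                     # hypothetical sizes
x = rng.standard_normal((T, n_in))             # e.g. acoustic feature frames

# Separate weights for the forward and backward networks.
W_in_f = 0.1 * rng.standard_normal((n_hid, n_in))
W_rec_f = 0.1 * rng.standard_normal((n_hid, n_hid))
W_in_b = 0.1 * rng.standard_normal((n_hid, n_in))
W_rec_b = 0.1 * rng.standard_normal((n_hid, n_hid))

h_fwd = rnn_pass(x, W_in_f, W_rec_f)                 # start -> end
h_bwd = rnn_pass(x, W_in_b, W_rec_b, reverse=True)   # end -> start

# At frame t, h_fwd[t] summarises the past and h_bwd[t] the future,
# which is why the whole sequence must be available before labelling.
context = np.concatenate([h_fwd, h_bwd], axis=1)     # shape (T, 2 * n_hid)
```

Note that the backward pass requires the full sequence up front, which matches the excerpt's point that BRNNs suit inputs whose starts and ends are known in advance.
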
“…This provides the network with complete, symmetrical, past and future context for every point in the input sequence, without displacing the inputs from the relevant targets. BRNNs have previously given improved results in various domains, notably protein secondary structure prediction (Baldi et al., 2001; Chen and Chaudhari, 2004) and speech processing (Schuster, 1999; Fukada et al., 1999). In this thesis we find that BRNNs consistently outperform unidirectional RNNs on real-world sequence labelling tasks.…”
Section: Bidirectional RNNs
Citation type: mentioning
Confidence: 99%
“…Both hidden layers are connected to the same output layer, providing it with access to the past and future context of every point in the sequence. BRNNs have outperformed standard RNNs in several sequence learning tasks, notably protein structure prediction [43] and speech processing [41], [44].…”
Section: Bidirectional Recurrent Neural Network
Citation type: mentioning
Confidence: 99%
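
The last excerpt adds the detail that both hidden layers feed a single shared output layer. A hedged continuation of the sketch above; the softmax readout and all sizes are illustrative assumptions, not the cited architecture:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
T, n_hid, n_classes = 20, 8, 5    # hypothetical: 5 per-frame labels

# Stand-ins for the forward/backward hidden states of the earlier sketch.
h_fwd = rng.standard_normal((T, n_hid))
h_bwd = rng.standard_normal((T, n_hid))

# One output layer receives BOTH hidden states at every frame, so each
# per-frame prediction is conditioned on past and future context at once.
W_out_f = 0.1 * rng.standard_normal((n_classes, n_hid))
W_out_b = 0.1 * rng.standard_normal((n_classes, n_hid))

y = np.stack([softmax(W_out_f @ h_fwd[t] + W_out_b @ h_bwd[t])
              for t in range(T)])
print(y.shape)  # (20, 5): a label distribution for every sequence position
```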