1998
DOI: 10.1109/3477.662756

Toward the border between neural and Markovian paradigms

Abstract: A new tendency in the design of modern signal processing methods is the creation of hybrid algorithms. This paper gives an overview of different signal processing algorithms situated halfway between Markovian and neural paradigms. A new systematic way to classify these algorithms is proposed. Four specific classes of models are described. The first one is made up of algorithms based upon either one of the two paradigms, but including some parts of the other one. The second class includes algorithms proposing a…


Cited by 11 publications (8 citation statements)
References 64 publications
“…In the early 1990s this fact led to the idea of combining HMMs and ANNs within a single, novel model, broadly known as hybrid HMM/ANN [38,70,15,87,93,44,118,8]. A recently published survey paper [125] investigates the ‘border’ between ANNs and Markovian models (HMMs for ASR and Markov random fields for image processing), summarizing approaches where ANNs emulate Markov models and vice versa, and reporting hybrid architectures. The hybrid paradigm relies on maintaining an underlying HMM structure, capable of modeling long-term dependencies, with the integration of ANNs, which provide non-parametric universal approximation, probability estimation, discriminative training algorithms, fewer parameters to estimate than those usually required in standard HMMs, efficient computation of outputs at recognition time, and efficient hardware implementability.…”
Section: Introduction
confidence: 99%
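The hybrid HMM/ANN idea quoted above, in the classic Bourlard–Morgan formulation, keeps the HMM's transition structure but replaces its emission densities with "scaled likelihoods": the ANN's state posterior p(q|x) divided by the state prior p(q). The following sketch illustrates that decoding step only, with hypothetical toy values standing in for the ANN's softmax outputs, and is not a reconstruction of any specific system from the surveyed papers:

```python
import math

def viterbi_scaled(posteriors, priors, trans, init):
    """Viterbi decoding in log-space, scoring each state with the
    scaled likelihood log p(q|x_t) - log p(q) (Bourlard-Morgan style)."""
    T, S = len(posteriors), len(priors)
    emit = [[math.log(posteriors[t][s]) - math.log(priors[s])
             for s in range(S)] for t in range(T)]
    delta = [math.log(init[s]) + emit[0][s] for s in range(S)]
    back = []
    for t in range(1, T):
        prev, back_t, delta = delta, [], []
        for s in range(S):
            # best predecessor state for state s at frame t
            best = max(range(S), key=lambda r: prev[r] + math.log(trans[r][s]))
            delta.append(prev[best] + math.log(trans[best][s]) + emit[t][s])
            back_t.append(best)
        back.append(back_t)
    # backtrace from the best final state
    path = [max(range(S), key=lambda s: delta[s])]
    for back_t in reversed(back):
        path.append(back_t[path[-1]])
    path.reverse()
    return path

# Toy example: 2 states, 3 frames; `posteriors` stands in for an
# ANN classifier's per-frame softmax output (hypothetical values).
posteriors = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
priors = [0.5, 0.5]                      # state priors from training alignments
trans = [[0.7, 0.3], [0.3, 0.7]]         # HMM transition probabilities
init = [0.5, 0.5]                        # initial state distribution
print(viterbi_scaled(posteriors, priors, trans, init))  # → [0, 0, 1]
```

Dividing by the prior converts the discriminatively trained posterior into a quantity proportional to the likelihood p(x|q), which is what the HMM's Viterbi recursion expects; this is one concrete reading of "probability estimation" and "discriminative training" in the quoted passage.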
“…Exactly like in (6), we can use gradient ascent to maximize by taking (23) for a generic weight . From (22)…”
Section: B. Imposing a Probabilistic Constraint on the ANN Weights
confidence: 99%
“…A number of significant, different hybrid paradigms for ASR were proposed in the literature, e.g., [14]- [22]. Surveys on the topic can be found in [5], [23]. The approach presented in this paper is related to some extent to Bourlard and Morgan's hybrid [4], [24], [25], as well as to Bengio's optimization scheme [26].…”
Section: Introduction
confidence: 99%
“…Because individual methods are not perfect for attaining the goal, two or more methodologies are combined in a way that their weaknesses are compensated by each other, and advantages of both methodologies are used to achieve the solution. Earlier approaches to such models are summarized in Willinski et al. (1998). In this research, both the methods are distinct, and only intermediary results are exchanged.…”
Section: Introduction
confidence: 99%
“…In this research, both the methods are distinct, and only intermediary results are exchanged. Earlier approaches to such models are summarized in Willinski et al. (1998). The authors have used ANNs and FISs as the soft computing methods for the prediction of time series.…”
Section: Introduction
confidence: 99%