2010
DOI: 10.1007/s10618-010-0171-0
Using interesting sequences to interactively build Hidden Markov Models

Abstract: The paper presents a method of interactive construction of global Hidden Markov Models (HMMs) based on local sequence patterns discovered in data. The method is based on finding interesting sequences whose frequency in the database differs from that predicted by the model. The patterns are then presented to the user who updates the model using their intelligence and their understanding of the modelled domain. It is demonstrated that such an approach leads to more understandable models than automated approaches…
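The abstract's core loop can be illustrated with a minimal sketch: score candidate sequences by how far their empirical frequency deviates from the frequency the current HMM predicts, and surface the largest deviations to the user. The absolute-difference interestingness measure, the toy HMM parameters, and the helper names below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hmm_sequence_prob(seq, pi, A, B):
    """P(observation sequence | HMM) via the standard forward algorithm."""
    alpha = pi * B[:, seq[0]]          # initialize with first observation
    for obs in seq[1:]:
        alpha = (alpha @ A) * B[:, obs]  # propagate and emit
    return alpha.sum()

def interestingness(seq, database, pi, A, B):
    """|empirical frequency - model-predicted probability| of seq."""
    empirical = sum(1 for s in database if s == seq) / len(database)
    predicted = hmm_sequence_prob(seq, pi, A, B)
    return abs(empirical - predicted)

# Toy 2-state HMM over a binary alphabet {0, 1} (made-up parameters).
pi = np.array([0.6, 0.4])                 # initial state distribution
A = np.array([[0.7, 0.3], [0.2, 0.8]])    # state transition matrix
B = np.array([[0.9, 0.1], [0.2, 0.8]])    # emission probabilities

database = [(0, 0), (0, 1), (0, 0), (1, 1), (0, 0), (1, 1)]
candidates = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Sequences whose frequency the model mispredicts most would be shown first.
ranked = sorted(candidates,
                key=lambda s: interestingness(s, database, pi, A, B),
                reverse=True)
print(ranked)
```

In the interactive setting described by the paper, the user would inspect the top-ranked sequences, adjust the model, and re-rank; here the ranking is computed once for illustration.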

Cited by 8 publications (3 citation statements); references 24 publications.
“…Finally, Jaroszewicz (2010) describes the use of local patterns in the form of event sequences for the interactive construction of a global Hidden Markov Model. The local pattern discovery focuses on sequences that occur more frequently than it would be expected from the constructed model.…”
Section: The Special Issue
confidence: 99%
“…Markov models in all their variants are frequently used to mine patterns in sequential data. Consider, for instance, 1st-order Markov chains (Wilks 1999; Pirolli and Pitkow 1999; Sarukkai 2000), Hidden Markov Models (HMM) (Jaroszewicz 2010; Peharz et al 2014; Meier et al 2015; Bueno et al 2019) and Dynamic Bayesian Networks (DBN) (Dagum et al 1992; Bueno et al 2020). All these models are called memoryless if they satisfy the Markov property: given the data at time t − 1, the data at time t is independent of the data before time t − 1.…”
Section: Introduction
confidence: 99%
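The Markov property quoted in that citation statement can be checked empirically on a simulated first-order chain: conditioned on X at time t − 1, the distribution of X at time t should not depend on what happened at t − 2. The toy transition matrix and the helper `cond_prob` below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.4, 0.6]])  # toy 1st-order transition matrix

# Simulate a long binary chain X_0, X_1, ..., X_{n-1}.
n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.random() < A[x[t - 1], 1]  # 1 with prob A[x[t-1], 1]

def cond_prob(x, prev2, prev1):
    """Empirical P(X_t = 1 | X_{t-1} = prev1, X_{t-2} = prev2)."""
    mask = (x[:-2] == prev2) & (x[1:-1] == prev1)
    return x[2:][mask].mean()

# Same X_{t-1} = 0, different X_{t-2}: both estimates should be
# close to A[0, 1] = 0.1, demonstrating memorylessness.
p_00 = cond_prob(x, 0, 0)  # history ..., 0, 0
p_10 = cond_prob(x, 1, 0)  # history ..., 1, 0
print(p_00, p_10)
```

A higher-order or hidden-state model would not pass this check on its observed symbols, which is exactly what distinguishes the HMM and DBN variants listed in the citation.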