2015
DOI: 10.1016/j.ymssp.2014.07.010

Bayesian system identification of a nonlinear dynamical system using a novel variant of Simulated Annealing

Abstract: This work details the Bayesian identification of a nonlinear dynamical system using a novel MCMC algorithm: 'Data Annealing'. Data Annealing is similar to Simulated Annealing in that it allows the Markov chain to easily clear 'local traps' in the target distribution. To achieve this, training data are fed into the likelihood such that their influence over the posterior is introduced gradually; this allows the annealing procedure to be conducted with reduced computational expense. Additionally, Data…

Cited by 63 publications (34 citation statements)
References 27 publications
“…A possible solution to this problem is to use the Data Annealing algorithm [14]. Noting that, when employing a variant of Simulated Annealing, one is essentially using β to modulate (and increase) the influence of the data on the target distribution, Data Annealing achieves a similar result simply via the gradual introduction of data points into the likelihood.…”
Section: Motivation
confidence: 99%
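The β-modulation described in this excerpt is, in the usual Simulated Annealing setup, a tempering of the likelihood. A minimal sketch of the two targets being compared, using notation assumed here (p(θ) the prior, D = {d_1, …, d_{N_T}} the full data set):

```latex
% Tempered target (Simulated Annealing variant): beta is raised from 0 to 1
\pi_\beta(\theta) \propto p(\theta)\, p(D \mid \theta)^{\beta}

% Data Annealing target: data are introduced point by point, N raised from 0 to N_T
\pi_N(\theta) \propto p(\theta)\, p(d_1, \dots, d_N \mid \theta)
```

In both cases the early targets are diffuse (close to the prior), so the chain moves freely past local traps, and the influence of the data grows as the schedule advances.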
“…using MCMC (a standard Metropolis update was employed in [14]). Once a sufficient number of samples have been generated, N can then be increased such that additional data points are included in the likelihood.…”
Section: Motivation
confidence: 99%
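The procedure described in the excerpt above (Metropolis sampling at a fixed N, then enlarging N so that more data points enter the likelihood) can be sketched as follows. This is an illustrative toy problem, inferring the mean of Gaussian data, not the paper's nonlinear dynamical system; all names and schedule values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: infer the mean `theta` of Gaussian data.
data = rng.normal(loc=2.0, scale=1.0, size=200)

def log_prior(theta):
    # Broad zero-mean Gaussian prior.
    return -0.5 * theta ** 2 / 100.0

def log_likelihood(theta, n):
    # Likelihood built from only the first n data points (unit noise variance).
    return -0.5 * np.sum((data[:n] - theta) ** 2)

def data_annealing(n_schedule, steps_per_stage=500, step_size=0.5):
    theta = 0.0
    for n in n_schedule:
        # More data enter the likelihood at each stage, sharpening the target.
        lp = log_prior(theta) + log_likelihood(theta, n)
        for _ in range(steps_per_stage):
            # Standard Metropolis update, as in the excerpt above.
            prop = theta + step_size * rng.normal()
            lp_prop = log_prior(prop) + log_likelihood(prop, n)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
    return theta

# Gradually introduce data: first 10 points, then 50, then all 200.
theta_hat = data_annealing([10, 50, 200])
```

Early stages, with few data points, leave a diffuse target that the chain explores cheaply; by the final stage the chain is already near the high-probability region of the full posterior.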
“…Consequently, the accuracy of the information estimates will depend on the accuracy of one's prior knowledge; this is something which seems intrinsically Bayesian. Throughout this work, with the aim of improving one's prior estimates of θ₀, the Data Annealing (DA) algorithm [6] has been utilised. Essentially, the DA algorithm is similar to the well-known Simulated Annealing algorithm except that, to save computational cost, the annealing procedure is achieved through the gradual introduction of training data into the likelihood.…”
Section: Most Probable Parameter Estimates
confidence: 99%
“…Furthermore, they are very computationally expensive for medium- to large-dimensional problems. Ongoing research focuses on deriving novel algorithms that more efficiently explore the parameter space [Data Annealing principle, see Green (2015); sampling from a sequence of intermediate distributions which converge to the posterior, see, e.g., Beck and Au (2002), Ching and Chen (2007), and Beck and Zuev (2013)].…”
Section: Introduction
confidence: 99%