2019
DOI: 10.1109/tit.2018.2876863

Asymptotic Optimality of Mixture Rules for Detecting Changes in General Stochastic Models

Abstract: The paper addresses a sequential changepoint detection problem for a general stochastic model, assuming that the observed data may be non-i.i.d. (i.e., dependent and non-identically distributed) and the prior distribution of the change point is arbitrary. Tartakovsky and Veeravalli (2005), Baron and Tartakovsky (2006), and, more recently, Tartakovsky (2017) developed a general asymptotic theory of changepoint detection for non-i.i.d. stochastic models, assuming a certain stability of the log-likelihood ratio…

Cited by 12 publications (18 citation statements). References 44 publications.
“…In order to synthesize the algorithm for detecting an abrupt change in the band center of the process ξ(t), let us single out two possible hypotheses [5], [12]. For the specified alternatives, analytical expressions should be found for the decision statistics (logarithms of the likelihood-ratio functionals).…”
Section: The Synthesis of the Detection Algorithm (mentioning)
confidence: 99%
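The quoted passage builds its decision statistic from logarithms of likelihood-ratio functionals for two hypotheses about the band center of ξ(t). A minimal sketch of that idea, assuming a hypothetical Gaussian mean-shift model in place of the cited paper's spectral model, with a CUSUM-style recursion on the log-likelihood ratio:

```python
import numpy as np
from scipy.stats import norm

def cusum_llr(x, f0_mean, f1_mean, sigma, threshold):
    """CUSUM-style recursion on the log-likelihood ratio of two simple hypotheses.

    The Gaussian pre-/post-change densities are stand-ins for the actual
    band-center model in the cited paper; they only illustrate how a decision
    statistic is built from log-likelihood ratios.
    """
    w = 0.0
    for n, xn in enumerate(x, start=1):
        llr = norm.logpdf(xn, f1_mean, sigma) - norm.logpdf(xn, f0_mean, sigma)
        w = max(0.0, w + llr)   # resetting at zero keeps the statistic nonnegative
        if w >= threshold:
            return n            # first time the statistic crosses the threshold
    return None                 # no change declared within the sample
```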
“…The problem of detecting the moments when the properties of random processes change arises in applications such as control and monitoring, technical and medical diagnostics, and measurement data processing [1]-[5]. Often, the observed random process fluctuates rapidly and has the same intensity within the working frequency band [6]-[9].…”
Section: Introduction (mentioning)
confidence: 99%
“…A substantial part of the development of quickest (sequential) change-point detection has been directed towards establishing optimality and asymptotic optimality of certain detection procedures such as CUSUM, Shiryaev, Shiryaev-Roberts, and EWMA, and their mutual comparison in various settings (Bayesian, minimax, etc.); see, e.g., [1, 2, 4, 7, 8, 10-12, 14-26]. The present article is concerned with the problem of minimizing the moments of the detection delay, $\mathcal{R}^r_{\nu,\theta}(\tau) = \mathrm{E}_{\nu,\theta}[(\tau - \nu)^r \mid \tau > \nu]$, in pointwise (i.e., for all change points $\nu$) and minimax (i.e., for a worst change point) settings, among all procedures for which the probability of a false alarm $\mathsf{P}_\infty(k \le \tau < k + m \mid \tau \ge k)$ is fixed and small.…”
Section: Introduction and Basic Notation (mentioning)
confidence: 99%
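The risk being minimized above is the r-th conditional moment of the detection delay, $\mathcal{R}^r_{\nu,\theta}(\tau) = \mathrm{E}_{\nu,\theta}[(\tau - \nu)^r \mid \tau > \nu]$. A Monte Carlo sketch of how such a moment could be estimated for a given stopping rule, with `simulate` and `detector` as hypothetical placeholders for a path generator with change point ν and a detection procedure:

```python
import numpy as np

def delay_moment(detector, simulate, nu, r=1, runs=2000, seed=0):
    """Monte Carlo estimate of R^r_nu(tau) = E[(tau - nu)^r | tau > nu].

    `simulate(rng)` draws one observation path with a change at time `nu`;
    `detector(x)` returns the alarm time or None.  Both are hypothetical
    stand-ins for a concrete model and procedure.  Runs with tau <= nu are
    discarded, since the conditioning is on no false alarm before the change.
    """
    rng = np.random.default_rng(seed)
    delays = []
    for _ in range(runs):
        tau = detector(simulate(rng))
        if tau is not None and tau > nu:
            delays.append((tau - nu) ** r)
    return float(np.mean(delays)) if delays else float("nan")
```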
“…In Section 3, we consider the Bayesian version of the problem in the class of procedures with a given weighted probability of false alarm. Based on the recent results of [23], we establish asymptotic pointwise and minimax properties of the WSR procedure. These results allow us to establish the main theoretical results in Section 4 regarding asymptotic optimality in the class of procedures with a local false alarm probability constraint (in a fixed window).…”
Section: Introduction and Basic Notation (mentioning)
confidence: 99%
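The WSR (weighted Shiryaev-Roberts) procedure referred to above mixes Shiryaev-Roberts statistics over a weight on the unknown post-change parameter. A minimal sketch, assuming a Gaussian mean-shift model and a discrete parameter grid as illustrative stand-ins for the general setting of the cited works:

```python
import numpy as np
from scipy.stats import norm

def weighted_sr_stopping_time(x, thetas, weights, sigma=1.0, threshold=1e4):
    """Weighted (mixture) Shiryaev-Roberts rule, sketched for a Gaussian mean shift.

    For each candidate post-change mean theta the SR statistic follows the
    recursion R_n(theta) = (1 + R_{n-1}(theta)) * Lambda_n(theta); the mixture
    statistic is the weighted sum over the grid.  The Gaussian model and the
    grid of `thetas` are illustrative assumptions, not the papers' exact model.
    """
    R = np.zeros(len(thetas))
    for n, xn in enumerate(x, start=1):
        lam = np.exp(norm.logpdf(xn, thetas, sigma) - norm.logpdf(xn, 0.0, sigma))
        R = (1.0 + R) * lam                     # per-theta Shiryaev-Roberts recursion
        if float(np.dot(weights, R)) >= threshold:
            return n                            # alarm at the first threshold crossing
    return None
```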
“…(ii) If the head-start $\omega_c$ and the mean of the prior distribution $\nu_c$ approach infinity at such a rate that (6.4) $\lim_{c\to 0} \log(\omega_c + \nu_c)/|\log c| = 0$, and if the threshold $A = A_{c,r}$ of the procedure $T^W_A$ is the solution of the equation (6.5) $r D_{0,r} A (\log A)^{r-1} = (\omega_c b_c + \nu_c)/c$, then (6.6) $\rho^{\pi,p,W}_{c,r}(T_{A_{c,r}}) \sim D_{0,r}\, c\, |\log c|^r$ as $c \to 0$, i.e., $T^W_{A_{c,r}}$ is first-order asymptotically Bayes as $c \to 0$ in the class of priors $\mathcal{C}(\mu = 0)$. PROOF. The proof is based on the technique used by Tartakovsky for proving Theorems 5 and 6 in [12] for the single-stream problem with an unknown post-change parameter, for the prior with fixed $\mu_c = \mu$ for all $c$ in condition (3.8) and with positive $\mu_c$ that vanishes as $c \to 0$. A more general prior considered in this article is handled analogously. The proof of part (i)…”
(mentioning)
confidence: 99%
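Equation (6.5) defines the threshold $A_{c,r}$ implicitly through $r D_{0,r} A (\log A)^{r-1} = (\omega_c b_c + \nu_c)/c$. Since the left-hand side is increasing in $A$ for $A > 1$, the equation can be solved numerically; a small bisection sketch, with $D_{0,r}$ and the right-hand side treated as given constants:

```python
import math

def solve_threshold(r, D0r, rhs, lo=1.0 + 1e-9, hi=1e12):
    """Solve r * D0r * A * (log A)^(r-1) = rhs for A by bisection on [lo, hi].

    The left-hand side is increasing in A for A > 1, so bisection converges;
    D0r and rhs stand for whatever constants equation (6.5) prescribes.
    """
    f = lambda A: r * D0r * A * math.log(A) ** (r - 1) - rhs
    for _ in range(200):
        mid = math.sqrt(lo * hi)        # geometric midpoint suits the wide range
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)
```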