“…Also for Gaussian $f_0$ and $f_1$, $\theta \in [a_1, a_2]$, $S_n = \sum_{k=1}^{n} X_{k,l}$, $\hat{\theta}_n = \max\{a_1, \min[S_n/n, a_2]\}$. At time $N$ decide upon $H_0$ or $H_1$ according as $\hat{\theta}_N \le \theta^*$ or $\hat{\theta}_N \ge \theta^*$, where $\theta^*$ is obtained by solving $I(\theta^*, \theta_0) = I(\theta^*, \theta_1)$, and $I(\theta, \lambda)$ is the Kullback-Leibler information number, i.e., the K-L divergence $I(f_\theta, f_\lambda)$ in (9). Here, since the threshold $g(cn)$ is a time-varying, decreasing function, the quantisation (7) is changed in the following way:…”
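As a rough illustration of the decision rule quoted above (not of the modified quantisation, which is not reproduced here), the following Python sketch computes the constrained MLE $\hat{\theta}_N$ and compares it with $\theta^*$. It assumes a single node $l$ with i.i.d. Gaussian observations of common known variance $\sigma^2$, in which case $I(\theta,\lambda) = (\theta-\lambda)^2/(2\sigma^2)$ and the equation $I(\theta^*,\theta_0)=I(\theta^*,\theta_1)$ reduces to the midpoint $\theta^* = (\theta_0+\theta_1)/2$. The function names `kl_gaussian` and `decide` are hypothetical and not taken from the paper.

```python
import numpy as np

def kl_gaussian(theta, lam, sigma=1.0):
    # K-L information number I(theta, lambda) for two Gaussians with
    # means theta, lambda and common variance sigma^2 (assumed known).
    return (theta - lam) ** 2 / (2.0 * sigma ** 2)

def decide(x, theta0, theta1, a1, a2):
    # x: observations X_{1,l}, ..., X_{N,l} at one node (1-D array).
    N = len(x)
    S_N = np.sum(x)
    # Constrained MLE of theta, clipped to the parameter interval [a1, a2].
    theta_hat = max(a1, min(S_N / N, a2))
    # For equal-variance Gaussians, I(t, theta0) = I(t, theta1)
    # gives |t - theta0| = |t - theta1|, i.e. the midpoint.
    theta_star = (theta0 + theta1) / 2.0
    decision = "H0" if theta_hat <= theta_star else "H1"
    return decision, theta_hat, theta_star

# Example usage (illustrative values only):
# rng = np.random.default_rng(0)
# x = rng.normal(loc=0.8, scale=1.0, size=50)
# print(decide(x, theta0=0.0, theta1=1.0, a1=0.0, a2=1.0))
```

This is only a sketch under the symmetric-Gaussian assumption; for general $f_0$, $f_1$, $\theta^*$ would have to be found by solving the K-L equality numerically.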