The problem of estimating a parameter of a quantum system through a series of measurements performed sequentially on a quantum probe is analyzed in the general setting where the underlying statistics is explicitly non-i.i.d. We present a generalization of the central limit theorem in the present context, which under fairly general assumptions shows that, as the number N of measurement data increases, the probability distribution of functionals of the data (e.g. the average of the data) through which the target parameter is estimated becomes asymptotically normal and independent of the initial state of the probe. At variance with previous studies (Guţă M 2011 Phys. Rev. A 83 062324; van Horssen M and Guţă M 2015 J. Math. Phys. 56 022109), we take a diagrammatic approach, which allows one to compute not only the leading orders in N of the moments of the average of the data but also those of the correlations among subsequent measurement outcomes. In particular, our analysis points out that the latter, which are not available in usual i.i.d. data, can be exploited to improve the accuracy of the parameter estimation. An explicit application of our scheme is discussed by studying how the temperature of a thermal reservoir can be estimated via sequential measurements on a quantum probe in contact with the reservoir.

See figure 1(a). In this context, the ultimate limits on the attainable precision in the estimation of g, optimized with respect to the most general detection strategy, can be computed, resulting in the so-called quantum Cramér-Rao bound, which exhibits the functional dependence upon g via the quantum Fisher information; see, e.g., [2, 3, 7–12]. In many situations of physical interest, however, the possibility of reinitializing the setup to the same state is not necessarily guaranteed. In the present study we are going to consider a different scheme, in which a single probing system undergoes multiple applications of the parameter-dependent transformation while being monitored during the process without being