We introduce an entropy analysis of time series, i.e., repeated measurements of statistical observables, based on an Eulerian homogeneous degree-one entropy function $\Phi(t,n)$ of time $t$ and number of events $n$. The duality of $\Phi$, expressed through the conjugate variables $\eta = -\Phi_t$ and $\mu = \Phi_n$, yields an "equation of state" (EoS) in differential form that resembles the Gibbs-Duhem relation in classical thermodynamics: $t\,\mathrm{d}\eta - n\,\mathrm{d}\mu = 0$. For simple Poisson counting with rate $r$, $\eta = r(e^{\mu}-1)$. The conjugate variable $\eta$ is then identified with the Hamiltonian function in a Hamilton-Jacobi equation for $\Phi(t,n)$. Applying the same logic to the entropy function of time-correlated events yields a Hamiltonian given by the principal eigenvalue of a matrix; in the time-reversible case this matrix is the sum of a symmetric Markovian part $\sqrt{\pi_i}\,q_{ij}/\sqrt{\pi_j}$ and the diagonal matrix of conjugate variables $\mu_i\delta_{ij}$. The corresponding eigenvector, interpreted as a posterior relative to the naive counting measure taken as the prior, suggests a set of intrinsic characteristics of Markov states.
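
As a short worked check of the stated structure, consider the sketch below under the assumption (not made explicit above) that $\Phi(t,n)$ is the Poisson large-deviation rate function $\Phi(t,n) = n\ln\!\big(n/(rt)\big) - n + rt$:
\begin{align*}
  \mu &= \Phi_n = \ln\frac{n}{rt}, &
  \eta &= -\Phi_t = \frac{n}{t} - r = r\!\left(e^{\mu}-1\right),\\
  0 &= \Phi_t + H\!\left(\Phi_n\right), &
  H(\mu) &= r\!\left(e^{\mu}-1\right),
\end{align*}
and Euler homogeneity, $\Phi = t\Phi_t + n\Phi_n = -t\eta + n\mu$, combined with $\mathrm{d}\Phi = -\eta\,\mathrm{d}t + \mu\,\mathrm{d}n$, reproduces the stated relation $t\,\mathrm{d}\eta - n\,\mathrm{d}\mu = 0$.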
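
The eigenvalue statement can likewise be illustrated numerically. The sketch below is a non-authoritative check with an arbitrarily chosen three-state reversible rate matrix $Q$ (the matrix, the state count, and the particular values of $\mu_i$ are assumptions, not taken from the paper): it forms the symmetric matrix with entries $\sqrt{\pi_i}\,q_{ij}/\sqrt{\pi_j} + \mu_i\delta_{ij}$, takes its principal eigenvalue as the Hamiltonian $\eta(\mu)$, and reads off the principal eigenvector as weights over the Markov states.
\begin{verbatim}
import numpy as np

# Build an illustrative reversible rate matrix Q from an assumed stationary
# distribution pi and a symmetric "conductance" matrix C:
# q_ij = C_ij / pi_i (i != j) satisfies detailed balance pi_i q_ij = pi_j q_ji.
pi = np.array([0.5, 0.3, 0.2])
C = np.array([[0.00, 0.06, 0.04],
              [0.06, 0.00, 0.03],
              [0.04, 0.03, 0.00]])
Q = C / pi[:, None]
np.fill_diagonal(Q, -Q.sum(axis=1))      # rows of a rate matrix sum to zero

assert np.allclose(pi @ Q, 0.0)                          # pi is stationary
assert np.allclose(pi[:, None] * Q, (pi[:, None] * Q).T) # detailed balance

# Conjugate variables mu_i (arbitrary illustrative values, one per state).
mu = np.array([0.4, -0.1, 0.2])

# Symmetric matrix sqrt(pi_i) q_ij / sqrt(pi_j) + mu_i delta_ij.
S = np.sqrt(pi)[:, None] * Q / np.sqrt(pi)[None, :] + np.diag(mu)
assert np.allclose(S, S.T)               # symmetric by time reversibility

# Hamiltonian eta(mu) = principal eigenvalue; eigenvector gives state weights.
evals, evecs = np.linalg.eigh(S)         # ascending eigenvalues
eta = evals[-1]
v = np.abs(evecs[:, -1])

# Same principal eigenvalue as the similar, non-symmetric tilt Q + diag(mu).
assert np.isclose(eta, np.linalg.eigvals(Q + np.diag(mu)).real.max())

# At mu = 0 the tilt disappears and the principal eigenvalue of Q is zero
# (probability conservation), so eta(0) = 0.
print("eta(mu) =", eta)
print("normalized principal-eigenvector weights:", v / v.sum())
\end{verbatim}
The check that the symmetrized matrix and $Q + \mathrm{diag}(\mu)$ share the same principal eigenvalue uses only the diagonal similarity $D^{1/2}(Q+\mathrm{diag}(\mu))D^{-1/2}$ with $D = \mathrm{diag}(\pi)$, which is why the symmetric form is convenient in the reversible case.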