Long-term persistence (LTP) in geophysical time series, or the tendency of runs of above- or below-average years to be unusually long, was first quantified by Hurst (1951, 1956) using a coefficient H which characterizes LTP in the range 0.5 < H < 1, with H = 0.5 corresponding to the independent (white noise) case, that is, no persistence. Hurst analyzed a wide set of geophysical time series and found an average value of H = 0.73, with a standard deviation of 0.09. In particular, he reported a value of H = 0.9 for annual flows of the river Nile at Aswan, reflecting strong LTP. The disparity between these results and the then-current theory, which predicted H = 0.5 based on the increments of Brownian motion, has come to be known as the Hurst Phenomenon. Over the years, a number of stochastic approaches to modeling LTP have emerged, e.g., fractional Gaussian noise (Mandelbrot & Wallis, 1968, 1969), ARMA models (O'Connell, 1974a, 1974b), shifting mean models (Boes & Salas, 1978) and fractionally differenced models (Hosking, 1984). Koutsoyiannis (2011a) has shown that those models exhibiting Hurst behavior asymptotically can be encapsulated within a Hurst-Kolmogorov (HK) stochastic dynamics framework characterized by a simple scaling law, acknowledging the contribution of Kolmogorov.
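To make the scaling idea concrete, the sketch below illustrates one common way H can be estimated from a series: under simple (HK) scaling, the standard deviation of the block-averaged series at aggregation scale k behaves as k^(H-1), so H follows from the slope of a log-log fit. This is a minimal illustration of the aggregated-variance (climacogram) approach, not Hurst's original rescaled-range analysis nor any specific estimator used in the works cited above; the function name and NumPy dependency are assumptions for the example.

```python
import numpy as np

def estimate_hurst_climacogram(x, max_scale=None):
    """Estimate H from the scaling of the standard deviation of
    block-averaged values: StDev[mean over k values] ~ k**(H - 1),
    so H = 1 + slope of log(std) versus log(k)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_scale is None:
        max_scale = n // 10            # keep enough blocks at the largest scale
    scales, stds = [], []
    for k in range(1, max_scale + 1):
        m = n // k                     # number of complete blocks of length k
        if m < 10:                     # stop when too few blocks for a stable std
            break
        block_means = x[:m * k].reshape(m, k).mean(axis=1)
        scales.append(k)
        stds.append(block_means.std(ddof=1))
    slope, _ = np.polyfit(np.log(scales), np.log(stds), 1)
    return 1.0 + slope

# White noise (no persistence) should yield H close to 0.5.
rng = np.random.default_rng(42)
print(estimate_hurst_climacogram(rng.standard_normal(10_000)))
```

Applied to a strongly persistent record such as annual Nile flows, an estimator of this kind would return values well above 0.5, consistent with Hurst's reported H = 0.9.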