Estimating the generative structure of sequence data is becoming increasingly important for preventing such data sources from turning into a flood of disorganized information. Fitting hidden Markov models (HMMs) has been a central method for structuring such data, but its standard estimation algorithm, Baum-Welch, is known to converge slowly. In this study, we devise generalized, fast estimation methods for HMMs by employing a geometric information measure associated with a function called the alpha-logarithm. Using the alpha-logarithmic likelihood ratio, we exploit past iterations to guide rapid convergence, with the parameter alpha adjusting how strongly previous information is utilized. A fixed-point formulation based on a causal shift and a series expansion is responsible for this gain. For software implementations, we present probability scaling to avoid underflow, generalizing corrections of flaws in the de facto standard procedure. For the update mechanism, we begin with a method called shotgun surrogates, defined in relation to the parameter alpha, and then obtain a dynamic version that controls and undoes alpha during estimation. Experiments on biological sequences and brain signals with practical state models demonstrate that a significant speedup is achieved over the Baum-Welch method. The effects of restricting the state models are also reported.

Index Terms- Alpha-hidden Markov model estimation, message passing, shotgun surrogates, dynamic surrogate, convergence speedup.
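The abstract does not define the alpha-logarithm itself. A common parameterization in the alpha-EM literature, which this sketch assumes, is L^(alpha)(x) = (2/(1+alpha)) (x^((1+alpha)/2) - 1), which reduces to the natural logarithm as alpha -> -1, so the ordinary log-likelihood ratio of Baum-Welch is recovered as a special case. The function names below are illustrative, not the paper's.

```python
import numpy as np

def alpha_log(x, alpha):
    """Alpha-logarithm L^(alpha)(x); reduces to log(x) as alpha -> -1.

    Assumes the parameterization common in the alpha-EM literature:
    L^(alpha)(x) = 2/(1+alpha) * (x^((1+alpha)/2) - 1).
    """
    if np.isclose(alpha, -1.0):
        return np.log(x)  # limiting case: the ordinary logarithm
    return 2.0 / (1.0 + alpha) * (np.power(x, (1.0 + alpha) / 2.0) - 1.0)

def alpha_log_likelihood_ratio(lik_new, lik_old, alpha):
    """Alpha-logarithmic likelihood ratio between successive iterates.

    At alpha = -1 this is the usual log-likelihood ratio, so the alpha
    family strictly generalizes the EM/Baum-Welch objective.
    """
    return alpha_log(lik_new / lik_old, alpha)
```

Here `lik_new` and `lik_old` would be likelihoods at the current and previous iterations; moving alpha away from the logarithmic limit reweights the previous iterate's information, which is the mechanism the abstract credits for the speedup.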
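The "probability scaling to avoid underflow" mentioned above is, in the de facto standard treatment, the device of renormalizing the forward variables at every time step and accumulating the logarithms of the normalizers instead of the raw probability product. The following is a minimal sketch of such a scaled forward pass for a discrete-output HMM; the variable names are ours, not the paper's.

```python
import numpy as np

def scaled_forward(pi, A, B, obs):
    """Scaled forward pass for a discrete-output HMM.

    pi  : (N,)   initial state distribution
    A   : (N, N) transition matrix, A[i, j] = P(state j at t+1 | state i at t)
    B   : (N, M) emission matrix,  B[i, k] = P(symbol k | state i)
    obs : (T,)   integer observation sequence

    Returns the scaled forward variables and the log-likelihood,
    computed without underflow by renormalizing at every step.
    """
    T, N = len(obs), len(pi)
    alpha_hat = np.zeros((T, N))

    a = pi * B[:, obs[0]]        # unscaled forward variable at t = 0
    c = a.sum()                  # scaling factor: P(prefix so far)
    alpha_hat[0] = a / c
    log_lik = np.log(c)

    for t in range(1, T):
        a = (alpha_hat[t - 1] @ A) * B[:, obs[t]]
        c = a.sum()
        alpha_hat[t] = a / c
        log_lik += np.log(c)     # accumulate logs, never the raw product

    return alpha_hat, log_lik
```

Without the per-step division by `c`, the forward variables shrink geometrically in T and underflow for sequences of even moderate length, which is the failure mode the scaling is designed to prevent.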