The time-variant Lyapunov equation (TVLE) plays an important role in many fields owing to its ubiquity, and numerous neural dynamics models have been developed to obtain its online solution. Among prevalent methods, gradient neural dynamics (GND) models suffer from large residual errors due to the lack of predictive computing, whereas zeroing neural dynamics (ZND) models incur high computational complexity because they require inverting the mass matrix. To mitigate these deficiencies, an adaptive parameter containing the time derivatives of the time-variant parameters of the TVLE is added to the GND model to form the adaptive GND (AGND) model, which endows the AGND model with the predictive computing of ZND models while inheriting the inverse-free property of GND models. Moreover, two strategies are proposed to design accelerated AGND (AAGND) models that enjoy a faster convergence rate. The accuracy and convergence rate of the AAGND models are analyzed theoretically, showing that they achieve zero residual error and faster convergence. In addition, numerical simulations and two applications are provided to verify the theoretical analyses and the efficiency of the AAGND models. The experimental results demonstrate that the AAGND model solves the TVLE with high accuracy and has great potential in applications.
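For reference, the following is a minimal sketch of the problem and the two baseline designs contrasted above, written in standard notation; the exact symbols and sign conventions in the paper may differ, and the adaptive term of the AGND model (which, per the abstract, involves the derivatives of the time-variant parameters) is not reproduced here.

```latex
% Sketch only: standard notation assumed, not the paper's exact formulation.
% Time-variant Lyapunov equation (TVLE), to be solved online for X(t):
\[
  A^{\mathrm{T}}(t)X(t) + X(t)A(t) + C(t) = 0 .
\]
% Residual matrix used by both model families:
\[
  E(t) = A^{\mathrm{T}}(t)X(t) + X(t)A(t) + C(t).
\]
% GND design: gradient descent on \tfrac{1}{2}\|E(t)\|_F^2.
% Inverse-free, but the residual lags because \dot A(t), \dot C(t) are unused:
\[
  \dot X(t) = -\gamma\bigl(A(t)E(t) + E(t)A^{\mathrm{T}}(t)\bigr), \qquad \gamma > 0 .
\]
% ZND design: impose \dot E(t) = -\lambda E(t); in vectorized form this requires
% inverting the mass matrix M(t) = I \otimes A^{\mathrm{T}}(t) + A^{\mathrm{T}}(t) \otimes I:
\[
  M(t)\,\mathrm{vec}\bigl(\dot X(t)\bigr)
  = -\,\mathrm{vec}\bigl(\dot A^{\mathrm{T}}(t)X(t) + X(t)\dot A(t) + \dot C(t)\bigr)
    - \lambda\,\mathrm{vec}\bigl(E(t)\bigr), \qquad \lambda > 0 .
\]
```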