Wave-particle interactions are essential to understanding the intricate evolution of the Earth's radiation belt electrons. Kinetic simulations, which solve the Fokker-Planck equation based on quasilinear theory, are commonly used to model the dynamic evolution of radiation belt electrons. However, the global wave and plasma density distributions required by such kinetic simulations are difficult to obtain directly from satellite measurements. Here we present a new model that integrates machine learning techniques with kinetic simulations to analyze the spatiotemporal evolution of radiation belt electrons scattered by lower band chorus (LBC). Compared to observations, our integrated model effectively reproduces the global distributions of the plasmapause location, plasma density, and LBC intensity, and quantitatively assesses the scattering effect driven by LBC waves at different magnetic local times (MLT), L-shells (the McIlwain L-parameter), and times. Incorporating the effect of radiation belt electron drift, we further use the 2-D Fokker-Planck equation to simulate the variation of electron phase space density in different MLT sectors at a fixed L, and find that the integrated model reasonably replicates the multi-MeV electron acceleration at L = 4.5 during the period from the main phase to the early recovery phase of the storm. Our results demonstrate that such an integrated model, combining machine learning techniques with kinetic simulations, provides a valuable means for improving understanding of the global dynamic evolution of the Earth's radiation belt electrons.
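For reference, the 2-D Fokker-Planck equation invoked above is sketched here in its conventional bounce-averaged quasilinear form in equatorial pitch angle and momentum; this is a standard form from the literature, and the abstract does not specify the exact drift term or coefficient set used in this study:

\begin{equation}
\frac{\partial f}{\partial t}
= \frac{1}{G}\,\frac{\partial}{\partial \alpha_0}\!\left[ G \left( D_{\alpha_0\alpha_0}\,\frac{\partial f}{\partial \alpha_0} + D_{\alpha_0 p}\,\frac{\partial f}{\partial p} \right) \right]
+ \frac{1}{G}\,\frac{\partial}{\partial p}\!\left[ G \left( D_{p\alpha_0}\,\frac{\partial f}{\partial \alpha_0} + D_{pp}\,\frac{\partial f}{\partial p} \right) \right],
\qquad
G = p^{2}\, T(\alpha_0)\, \sin\alpha_0 \cos\alpha_0 ,
\end{equation}

where $f$ is the electron phase space density, $\alpha_0$ the equatorial pitch angle, $p$ the momentum, $T(\alpha_0)$ the normalized bounce period, and $D_{\alpha_0\alpha_0}$, $D_{\alpha_0 p}$, and $D_{pp}$ the bounce-averaged diffusion coefficients, here driven by the LBC wave distributions inferred from the machine learning model.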