Fiber amplifiers are essential devices for optical communication and laser physics, yet the intricate nonlinear dynamics they exhibit pose significant challenges for numerical modeling. In this study, we propose a bidirectional long short-term memory (bi-LSTM) neural network to predict the evolution of optical pulses along a fiber amplifier, accounting for the dynamically changing gain profile and Raman scattering. The bidirectional architecture allows the network to learn from both past and future data, in keeping with the fundamental physical principles governing pulse evolution. We conducted experiments over a diverse set of initial pulse parameters, with the ratio of dispersion length to nonlinear length spanning 0.25 to 250. This deliberate choice produced a wide variety of propagation regimes, from smooth attractor-like to noise-like behavior. Through a comprehensive evaluation of the network's performance, we demonstrate its ability to generalize across these propagation regimes. Notably, the proposed neural network evaluates the intensity evolution map about 2000 times faster than the numerical solution of the nonlinear Schrödinger equation (NLSE) with the split-step Fourier method.
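To make the abstract's architectural idea concrete, the following is a minimal, hedged sketch of a bidirectional LSTM that maps a pulse intensity profile at one propagation step to the profile at the next step. All names, layer sizes, and the sequence-length choice (treating the temporal grid of the pulse as the LSTM sequence dimension) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions): a bi-LSTM predicting the pulse intensity
# profile at the next propagation step z_{k+1} from the profile at z_k.
# Layer sizes, grid size, and the use of PyTorch are illustrative only.
import torch
import torch.nn as nn

class BiLSTMPulsePredictor(nn.Module):
    def __init__(self, hidden=128, num_layers=2):
        super().__init__()
        # The temporal grid of the pulse serves as the sequence dimension,
        # so the bidirectional LSTM sees "past" and "future" time bins at once.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            num_layers=num_layers, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, intensity):
        # intensity: (batch, n_time_bins) normalized pulse intensity at z_k
        x = intensity.unsqueeze(-1)        # (batch, n_time_bins, 1)
        features, _ = self.lstm(x)         # (batch, n_time_bins, 2*hidden)
        return self.head(features).squeeze(-1)  # predicted profile at z_{k+1}

# Hypothetical usage: roll the network along z to assemble an intensity map.
model = BiLSTMPulsePredictor()
pulse = torch.rand(1, 256)                 # example input on a 256-point grid
next_profile = model(pulse)                # predicted profile after one step
```

Rolling such a predictor step by step along the fiber replaces the per-step split-step Fourier integration of the NLSE, which is where the reported speedup originates.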