The physics-informed neural network (PINN) is a data-driven solver for ordinary and partial differential equations (ODEs/PDEs) that provides a unified framework for both forward and inverse problems. However, the complexity of its objective function often leads to training failures, particularly when solving high-frequency and multi-scale problems. We propose using transfer learning to improve the robustness and convergence of PINN training: training begins on a low-frequency problem and gradually approaches the high-frequency target through fine-tuning. Through two case studies, we find that transfer learning can effectively train PINNs to approximate solutions from low-frequency to high-frequency problems without increasing the number of network parameters, while requiring fewer data points and less training time. We compare the PINN results using direct differences and the relative L2 error, showing the advantage of the transfer-learning technique. We describe our training strategy in detail, including optimizer selection, and suggest guidelines for using transfer learning to train neural networks on more complex problems.
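
To illustrate the low-to-high-frequency fine-tuning strategy described above, the following is a minimal sketch, assuming a simple 1D ODE with a frequency parameter (u'(x) = w cos(wx), u(0) = 0), a small fully connected network, and PyTorch with Adam. The frequency schedule, network size, and optimizer settings are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of transfer-learning-based PINN training (assumed setup):
# the ODE u'(x) = w*cos(w*x), u(0) = 0 (exact solution sin(w*x)) stands in
# for a frequency-parameterized problem; the frequency schedule and
# hyperparameters below are illustrative, not the paper's configuration.
import torch
import torch.nn as nn

class PINN(nn.Module):
    def __init__(self, width=64, depth=4):
        super().__init__()
        layers, in_dim = [], 1
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def pinn_loss(model, omega, n_points=256):
    # Collocation points on [0, 1]; residual of u'(x) - omega*cos(omega*x) = 0
    x = torch.rand(n_points, 1, requires_grad=True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du - omega * torch.cos(omega * x)
    # Initial condition u(0) = 0
    ic = model(torch.zeros(1, 1))
    return residual.pow(2).mean() + ic.pow(2).mean()

def train(model, omega, steps, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = pinn_loss(model, omega)
        loss.backward()
        opt.step()
    return loss.item()

model = PINN()
# Stage 1: train from scratch on the low-frequency problem.
train(model, omega=2.0, steps=2000, lr=1e-3)
# Later stages: reuse the trained weights and fine-tune on progressively
# higher frequencies, instead of reinitializing the network each time.
for omega in (4.0, 8.0, 16.0):
    train(model, omega=omega, steps=1000, lr=5e-4)
```

In a from-scratch baseline, the network would be reinitialized for each frequency; here the weights learned on the low-frequency problem serve as the starting point for every subsequent fine-tuning stage.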