With recent progress in areas such as head-mounted displays and vision-correcting devices, there is growing interest in fast, personalized algorithms for simulating aberrated human vision. Existing vision simulation approaches are generally hindered by a lack of personalization, the computational cost of rendering, and the limited range of supported aberrations. This paper presents a fast vision simulation method with interactive personalization capabilities for simulating arbitrary central and peripheral aberrations of the human eye. First, we describe a novel, neural network-based solution for efficiently estimating the physical structure of the simulated eye and for calculating the Zernike aberration coefficients needed to compute point-spread functions (PSFs) across varying pupil sizes, focus distances, and incidence angles. Our new approach operates in the sub-second regime and produces highly accurate outputs, facilitating interactive personalization of the vision simulation. Next, we present an improved PSF interpolation method for an existing tiled PSF splatting rendering algorithm. The proposed method significantly improves the computational performance and memory efficiency of the previous approach, enabling the simulation of peripheral vision with arbitrary visual aberrations in low-latency applications. Finally, we evaluate the performance and simulation accuracy of our techniques on several eye conditions and test scenarios, and compare our results with previous vision simulation algorithms.
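As background for the PSF computation referenced above, a standard Fourier-optics formulation (a sketch of common practice, not necessarily the exact model used in this work) expands the wavefront aberration over the pupil in Zernike polynomials and derives the PSF from the generalized pupil function:

\[
W(\rho, \theta) = \sum_{n,m} c_n^m \, Z_n^m(\rho, \theta),
\qquad
\mathrm{PSF}(u, v) \propto \left| \mathcal{F}\!\left\{ P(\rho, \theta)\, e^{\, i \frac{2\pi}{\lambda} W(\rho, \theta)} \right\} \right|^2,
\]

where \(c_n^m\) are the Zernike aberration coefficients, \(Z_n^m\) the Zernike polynomials, \(P\) the pupil aperture function, \(\lambda\) the wavelength, and \(\mathcal{F}\) the Fourier transform. In this formulation, per-condition changes in pupil size, focus distance, and incidence angle enter through the coefficients \(c_n^m\) and the aperture \(P\).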