Low Earth Orbit (LEO) constellations have recently gained tremendous attention in the navigation field due to their larger constellation sizes, faster geometry variations, and higher signal power levels than Global Navigation Satellite Systems (GNSS), making them favourable for Position, Navigation, and Timing (PNT) purposes. Satellite signals are heavily attenuated by the atmospheric layers, especially the ionosphere. Ionospheric delays are, however, expected to be smaller for signals from LEO satellites than for GNSS due to their lower orbital altitudes and higher carrier frequencies. Nevertheless, unlike for GNSS, there are currently no standardized models for correcting ionospheric errors in LEO signals. In this paper, we derive a new model, called the Interpolated and Averaged Memory Model (IAMM), starting from existing International GNSS Service (IGS) data and based on the observation that ionospheric effects repeat every 11 years, following the solar cycle. Our IAMM model can be used for ionospheric corrections of signals from any satellite constellation, including LEO. The model is constructed by averaging multiple sets of ionospheric data reflecting the electron content inside the ionosphere. The IAMM model's primary advantage is its ability to be used both online and offline without needing real-time input parameters, thus making it easy to store in a device's memory. We compare this model with two benchmark models, the Klobuchar and International Reference Ionosphere (IRI) models, using GNSS measurement data from 24 scenarios acquired in several European countries with both professional GNSS receivers and Android smartphones. The model's behaviour is also evaluated on LEO signals using simulated data (as measurement data based on LEO signals are not yet available in the open-access community); we show a significant reduction in ionospheric delays in LEO signals compared to GNSS.
Finally, we highlight the remaining open challenges toward viable ionospheric-delay models in an LEO-PNT context.
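The core construction summarized above, averaging ionospheric (TEC) maps from epochs roughly one solar cycle apart and then interpolating within the averaged grid, can be illustrated with a minimal sketch. All names below (`average_tec_maps`, `bilinear_interp`) and the toy 2x2 grids are hypothetical; the paper's actual IAMM construction from IGS data is more involved.

```python
# Illustrative sketch (hypothetical names) of a memory-based ionospheric
# model: average TEC maps from epochs ~11 years apart, then interpolate
# within the averaged grid to obtain a correction at a given location.

def average_tec_maps(maps):
    """Element-wise average of several TEC maps on the same lat x lon grid."""
    n = len(maps)
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[sum(m[i][j] for m in maps) / n for j in range(cols)]
            for i in range(rows)]

def bilinear_interp(grid, lat_frac, lon_frac):
    """Bilinear interpolation at fractional grid indices (lat_frac, lon_frac)."""
    i0, j0 = int(lat_frac), int(lon_frac)
    di, dj = lat_frac - i0, lon_frac - j0
    i1 = min(i0 + 1, len(grid) - 1)
    j1 = min(j0 + 1, len(grid[0]) - 1)
    return ((1 - di) * (1 - dj) * grid[i0][j0]
            + (1 - di) * dj * grid[i0][j1]
            + di * (1 - dj) * grid[i1][j0]
            + di * dj * grid[i1][j1])

# Two toy 2x2 "TEC maps" (TEC units) from epochs one solar cycle apart:
maps = [[[10.0, 12.0], [14.0, 16.0]],
        [[12.0, 14.0], [16.0, 18.0]]]
avg = average_tec_maps(maps)          # averaged "memory" map
tec = bilinear_interp(avg, 0.5, 0.5)  # TEC at the cell centre
print(avg, tec)
```

Because the averaged map is a small static grid, it can be precomputed and stored in device memory, which matches the offline-usability property the abstract attributes to the IAMM.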