Despite the proven effectiveness of Transformer neural networks across multiple domains, their performance on Electronic Health Records (EHR) is less clear-cut. The unique, multidimensional sequential nature of EHR data can make even simple linear models with carefully engineered features more competitive. As a result, the advantages of Transformers, such as efficient transfer learning and improved scalability, are not always fully exploited in EHR applications. In this work, we aim to forecast the demand for healthcare services by predicting the number of patient visits to healthcare facilities. The challenge is amplified for divergent patient subgroups, such as those with rare diseases, which follow unique health trajectories and are typically small in size. To address this, we employ a self-supervised pretraining strategy, Generative Summary Pretraining (GSP), which predicts future summary statistics of a patient from their past health records. Our models are pretrained on a health registry of nearly one million patients and then fine-tuned for specific subgroup prediction tasks, showcasing their potential to handle the multifaceted nature of EHR data. In our evaluation, the proposed SANSformer model consistently surpasses strong EHR baselines, and GSP pretraining notably amplifies performance, particularly within smaller patient subgroups. Our results highlight the potential of tailored attention-free models and self-supervised pretraining to refine healthcare utilization predictions across diverse patient demographics.

Impact Statement
Large neural networks have demonstrated success in a variety of predictive tasks using Electronic Health Records (EHR). However, in small, divergent patient cohorts, such as those with rare diseases, their performance often falls short of simpler linear models due to the substantial data requirements of large models. To address this limitation, we introduce the SANSformers architecture, specifically designed for forecasting healthcare utilization from EHR data. Unlike traditional Transformers, SANSformers utilize attention-free mechanisms, thereby reducing complexity. We also present Generative Summary Pretraining (GSP), a self-supervised learning technique that enables large neural networks to maintain predictive performance even with smaller patient subgroups. Through extensive evaluation on two real-world datasets, we provide a comparative analysis with existing state-of-the-art EHR prediction models, offering a new perspective on predicting healthcare utilization.