Keeping a memory of evolving stimuli is ubiquitous in biology, one example being immune memory for evolving pathogens. However, learning and memory storage for dynamic patterns still pose challenges in machine learning. Here, we introduce an analytical energy-based framework to address this problem. By accounting for the tradeoff between the utility of keeping a high-affinity memory and the risk of forgetting some of the diverse stimuli, we show that a moderate tolerance for risk enables a repertoire to robustly classify evolving patterns, with little fine-tuning. Our approach offers a general guideline for learning and memory storage in systems that interact with diverse and evolving stimuli.