Among silicon-based solar cells, heterojunction cells hold the world efficiency record. However, their market acceptance is hindered by an initial 0.5% per year degradation of their open-circuit voltage, which doubles the overall cell degradation rate. Here, we study the performance degradation of crystalline-Si/amorphous-Si:H heterojunction stacks. First, we experimentally track the interface defect density, the primary driver of the degradation, over the course of a year. Second, we develop SolDeg, a multiscale, hierarchical simulator that analyzes this degradation by combining Machine Learning, Molecular Dynamics, Density Functional Theory, and Nudged Elastic Band methods with analytical modeling. We discover that the chemical potential for mobile hydrogen develops a gradient, forcing hydrogen to drift away from the interface and leaving behind recombination-active defects. We find quantitative correspondence between the calculated and experimentally determined defect generation dynamics. Finally, we propose a reversed Si-density gradient architecture for the amorphous-Si:H layer that promises to reduce the initial open-circuit voltage degradation from 0.5% per year to 0.1% per year.
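As an illustrative sketch only (the notation $J_\mathrm{H}$, $c_\mathrm{H}$, $D_\mathrm{H}$, $\mu_\mathrm{H}$ is assumed here, not taken from the paper), the drift invoked above can be written in the standard chemical-potential-driven flux form,
\[
J_\mathrm{H} \;=\; -\,\frac{D_\mathrm{H}\, c_\mathrm{H}}{k_\mathrm{B} T}\,\nabla \mu_\mathrm{H},
\]
where $J_\mathrm{H}$ is the mobile-hydrogen flux, $c_\mathrm{H}$ its concentration, $D_\mathrm{H}$ its diffusivity, and $\mu_\mathrm{H}$ its chemical potential. A $\mu_\mathrm{H}$ that increases toward the crystalline-Si/amorphous-Si:H interface drives a net hydrogen flux away from it, depassivating interface bonds and generating the recombination-active defects described above.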