Automated driving is widely promoted as a means of advancing traffic safety, frequently on the assumption that human error is the main cause of accidents and that automation will therefore markedly reduce them. This assumption, however, is too simplistic: it neglects potential side effects and adaptations within the socio-technical system that road traffic represents. A differentiated analysis, including an understanding of how the road system develops and avoids accidents, is therefore required to prevent adverse automation surprises, yet such an analysis is currently lacking. This paper, therefore, argues in favour of Resilience Engineering, applying the functional resonance analysis method (FRAM) to reveal these mechanisms in an overtaking scenario on a rural road, comparing the contributions of the human driver and potential automation in order to derive system design recommendations. In doing so, it demonstrates how FRAM can be used for a systemic allocation of driving-task functions between humans and automation. An in-depth FRAM model was developed for both agents based on knowledge elicitation from documents as well as observations and interviews in a driving simulator, and was validated in a focus group with peers. Performance variabilities were then identified through structured interviews with human drivers and automation experts, complemented by observations in the driving simulator. Next, the aggregation and propagation of variability were analysed using a semi-quantitative approach combined with a Space-Time/Agency framework, focusing on interaction and complexity in the system. Finally, design recommendations for managing performance variability were proposed in order to enhance system safety. The outcomes show that the current automation strategy should focus on adaptive automation based on human-automation collaboration, rather than on full automation.
In conclusion, the FRAM analysis supports decision-makers in enhancing safety by identifying non-linear and complex risks.