Summary
Directly affecting the user experience, performance is a crucial aspect of today's software applications. Representative load testing makes it possible to effectively test and preserve performance before delivery by mimicking the workload actually expected in production. In the literature, various approaches have been proposed for extracting representative load tests from recorded user sessions. However, these approaches require manual parameterization for specifying input data and adjusting static properties such as a request's domain name. This manual effort accumulates whenever load tests need to be updated due to changing production workloads and APIs.
In this paper, we address reducing the maintenance effort of representative load testing. We introduce input data and properties annotations (IDPAs), which store manual parameterizations and can be evolved automatically. Hence, experts only need to parameterize extracted load tests initially. To deal with API changes, we develop approaches that evolve IDPAs for the types of changes described in the literature. We evaluated our approach in two experimental studies, by deriving effort estimation models, and in an industrial case study comprising four different software projects. Our evaluation shows that IDPAs can parameterize generated load tests to restore their representativeness, especially for applications whose workloads are dominated by request orders and rates. For a typical mix of API changes, the cumulative maintenance effort over time is reduced from quadratic to linear growth. Furthermore, we were able to express all parameterizations required by the industrial projects using IDPAs, but we also had to integrate extensions using the provided extension mechanisms.