Testers face challenges in keeping their test suites up to date with the evolution of the application source code under test. These challenges are amplified in globally distributed software development contexts. The growing daily testing demand also makes the maintenance of such suites difficult. Test suites are increasingly automated to save human testers' time, but automated suites require maintenance as well. Exploratory testing offers a trade-off between test case maintenance and human expertise and flexibility; unfortunately, it is generally a manual task. In this work, we designed a strategy called AETing, which automatically performs screen navigation and Monkey testing, aiming to maximize coverage of the code changes between two versions of a given Android application. We developed and evaluated our approach in a real testing operation environment at Motorola Mobility, through an agreement between CIn-UFPE and the company supported by the Brazilian Informatics Law. The evaluation consisted of testing four different Motorola Android applications and yielded promising results when comparing the code coverage achieved by AETing with that of expert exploratory testers. We discuss in detail how AETing works and the results achieved.
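As a rough illustration of the Monkey-testing component mentioned above, the sketch below drives Android's stock Monkey tool through adb with a fixed seed, so that the same pseudo-random event stream can be replayed on two versions of an application. The package name, event counts, and the coverage-collection step are hypothetical placeholders, not the paper's actual implementation.

```python
# Minimal sketch (not the paper's implementation): send pseudo-random UI
# events to an installed app build via Android's stock Monkey tool, using a
# fixed seed so the same event stream can be replayed on another app version.
import subprocess

def run_monkey(package: str, seed: int = 42, events: int = 500, throttle_ms: int = 300) -> None:
    """Send `events` pseudo-random UI events to `package` via adb."""
    subprocess.run(
        [
            "adb", "shell", "monkey",
            "-p", package,                   # restrict events to this package
            "-s", str(seed),                 # fixed seed -> reproducible event stream
            "--throttle", str(throttle_ms),  # pause between events, in milliseconds
            "-v",                            # verbose output
            str(events),                     # number of events to inject
        ],
        check=True,
    )

if __name__ == "__main__":
    # Hypothetical package name; run once per installed app version and
    # collect coverage (e.g., with an instrumentation tool such as JaCoCo)
    # between runs to compare coverage of the changed code.
    run_monkey("com.example.app")
```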