[Context] Automated test case design and execution at the GUI level of applications is still not common in industrial practice; tests are mainly designed and executed manually. In previous work we described TESTAR, a tool that enables fully automatic testing at the GUI level of applications to find severe faults such as crashes or non-responsiveness. [Method] This paper evaluates TESTAR in an industrial case study. The study was conducted at SOFTEAM, a French software company, while testing their Modelio SaaS system, a cloud-based system for managing virtual machines that run their popular graphical UML editor Modelio. [Goal] The goal of the study was to evaluate how the tool performs within the context of SOFTEAM and on their software application. In addition, we were interested in how easy or difficult it is to learn and adopt our academic prototype in an industrial setting. [Results] The effectiveness and efficiency of the automated tests generated with TESTAR can compete with those of the manual test suite. [Conclusions] The training materials as well as the user and installation manuals of TESTAR need to be improved using the feedback received during the study. Finally, the need to program Java code to create sophisticated test oracles caused some initial problems and resistance. However, this was resolved by explaining why these oracles are needed and comparing them to the alternative of more expensive and complex human oracles. Raising awareness that automated testing involves programming solved most of the initial problems.