Learning SQL can be surprisingly difficult, given the relative simplicity of its syntax. Automated tools for teaching and assessing SQL have existed for over two decades. Early tools were designed only for teaching, offering rich feedback and personalised learning but no summative assessment. More recently, the trend has shifted towards automated assessment, with learning as a side effect; these tools offer more limited feedback and little personalisation. In this paper, we present SQL Tester, an online assessment tool, together with an evaluation of its impact. We show that students engaged with SQL Tester as a learning tool, taking an average of 10 practice tests each and spending over 4 hours actively engaged in those tests. A student survey also found that over 90% of students agreed that they wanted to keep trying practice tests until they got a "good" mark. Finally, we present evidence that taking practice tests increased student achievement, with a strong correlation between the number of practice tests a student took and their score on the assessed test.