Automatic assessment can reduce teacher workload and offer flexibility for students, but if the teacher does not assess the exams manually, the teacher's view of the students' competence and exam-related behaviours remains meagre. This drawback can be mitigated through appropriate analytics. To support the design of an analytics module for an electronic assessment system, this paper investigates what kinds of analyses are useful for multiple-choice exams and how these analyses can be implemented. Three types of analysis were found useful: 1) descriptive statistics of exam answers, 2) analysis of errors in answers, and 3) analysis of students' exam-taking behaviours. Though these analyses are to some extent generalisable, analysis needs vary, for example, by time, exam type and user. It is therefore suggested that, to enable user-specific analyses in a resource-efficient manner, assessment software providers should facilitate access to assessment data in a structured format.
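
To make the first type of analysis concrete, the sketch below shows how per-question descriptive statistics (answer counts, proportion correct, option distribution, mean answering time) could be computed once assessment data is available in a structured, row-per-answer format. This is a minimal illustrative sketch, not any system's actual export schema: the field names (student_id, question_id, chosen_option, correct_option, time_spent_s) and the describe_exam function are assumptions introduced here for illustration.

    # A minimal sketch, assuming exam answers are exported as dicts with
    # the (hypothetical) fields: student_id, question_id, chosen_option,
    # correct_option, time_spent_s. All names are illustrative.
    from collections import defaultdict
    from statistics import mean

    def describe_exam(answers):
        """Per-question descriptive statistics: number of answers,
        share of correct answers, distribution of chosen options,
        and mean time spent on the question."""
        per_question = defaultdict(list)
        for row in answers:
            per_question[row["question_id"]].append(row)

        stats = {}
        for qid, rows in per_question.items():
            n_correct = sum(
                1 for r in rows if r["chosen_option"] == r["correct_option"]
            )
            option_counts = defaultdict(int)
            for r in rows:
                # Counting every chosen option also supports simple error
                # analysis: wrong options with high counts are attractive
                # distractors worth a closer look.
                option_counts[r["chosen_option"]] += 1
            stats[qid] = {
                "n_answers": len(rows),
                "p_correct": n_correct / len(rows),
                "option_counts": dict(option_counts),
                "mean_time_s": mean(r["time_spent_s"] for r in rows),
            }
        return stats

The same structured export would let different users run their own analyses (for example, a teacher inspecting one exam versus an administrator aggregating across courses) without the assessment software having to implement every report itself, which is the resource-efficiency argument made above.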