In this article we propose a framework for reporting mathematics results from national assessment surveys (NAS) such that effective use of the resulting reports can enhance teaching and learning. We explored the literature on factors that may contribute to the non-utilisation of assessment data as a basis for decision-making. In the South African context, we identified the form and format in which NAS results are reported as a possible limiting factor to the effective use of summative assessment results for formative purposes. As an alternative, we propose a standards-based reporting framework that will ensure accurate measurement of, and meaningful feedback on, what learners know and can do. We illustrate how, within a properly designed reporting framework, the results of an NAS in mathematics can be used for formative purposes to enhance teaching and learning and, possibly, improve learner performance.
This paper reports on primary school teachers' perceptions and experiences of the challenges and prospects of using data from the Annual National Assessments (ANAs). While the majority stated that information from the ANAs can help teachers improve learning, responses on the use of the ANAs in the classroom were mixed: most reported that teachers did not know how to use ANA results to improve learning and that no plans were in place at their schools for the use of ANA data. A significant proportion also indicated that they received little or no support from the school district on how to use ANA results. These findings were consistent across the school quintiles as well as the foundation and intermediate phases. Given the potential value of the ANAs, the paper highlights two initiatives aimed at enhancing the meaningful use of ANA results to improve learning and teaching in schools.
The study reported on here contributes to the growing body of knowledge on the use of standard setting methods to improve the reporting and utility value of assessment results in South Africa and to address the conceptual shortcomings of the Curriculum and Assessment Policy Statement (CAPS) reporting framework. Using data from the “verification” version of the Annual National Assessments (ANAs), we explored the technical and conceptual factors relevant to applying standard setting methods. Two sets of panellists were trained to generate cut scores for Grade 6 mathematics and English First Additional Language (FAL), one using the Angoff method and the other the Objective Standard Setting (OSS) method. The findings indicate that the two methods generated different sets of cut scores across the performance levels for both subjects. While these cut scores had significant implications for the percentage of learners classified at each performance level, they were consistent with findings from other studies. We also identified four key factors to address when undertaking standard setting exercises: engagement with test content, resource requirements, requisite expertise and software, and collective accountability. We conclude that standard setting approaches should be preferred over the CAPS reporting framework when reporting assessment results in South Africa. More importantly, the decision on the most appropriate method for the South African context depends largely on the extent to which the four key factors identified can be addressed.
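To make the cut-score arithmetic concrete, the sketch below illustrates how a cut score is typically derived under the Angoff method, in which each panellist estimates the probability that a minimally proficient learner would answer each item correctly and the cut score is obtained by averaging those estimates per item and summing across items. The panellist ratings, number of items and score values here are hypothetical illustrations, not data from the study, and the code is a minimal sketch of the standard Angoff calculation rather than the procedure the panels actually followed.

# Minimal Angoff-style cut score sketch (hypothetical ratings, not study data).
# ratings[p][i] = panellist p's probability estimate that a minimally
# proficient learner answers item i correctly.
ratings = [
    [0.6, 0.4, 0.7, 0.5, 0.8],   # panellist 1
    [0.5, 0.5, 0.6, 0.4, 0.7],   # panellist 2
    [0.7, 0.3, 0.8, 0.5, 0.9],   # panellist 3
]

n_items = len(ratings[0])

# Average the panellists' estimates item by item, then sum across items
# to obtain a cut score on the raw-score scale of the test.
item_means = [sum(panellist[i] for panellist in ratings) / len(ratings)
              for i in range(n_items)]
cut_score = sum(item_means)

print(f"Angoff cut score: {cut_score:.2f} out of {n_items} raw-score points")

# A learner's raw score can then be classified against the cut score,
# e.g. "at or above the standard" if raw_score >= cut_score.

In practice, a full standard setting exercise would repeat this calculation for each performance-level boundary and typically across multiple rating rounds, which is part of why the resource and expertise factors identified in the study matter.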