In a mathematical examination on paper, partial credit is normally awarded for an answer that is not correct but nevertheless contains some correct working. Assessment by computer normally marks an incorrect answer wrong and awards no marks. This can lead to discrepancies between the marks awarded for the same examination delivered in the two different media. The current paper presents possible solutions to this problem and the results of experiments designed to test how well these solutions work in practice. In light of the findings, developments to the assessment engine have been made and some questions have been redesigned for use in real automated examinations. The results were obtained as part of the Project for Assessments in Scotland using Information Technology (PASS-IT), a major collaborative programme involving the leading educational agencies in Scotland (see http://www.pass-it.org.uk for more details).

PASS-IT has demonstrated that the computer can help measure lower-order student skill profiles, provided the computer assessment package is sophisticated enough. Optional steps to mimic partial credit, randomisation of parameters for practice and the avoidance of copying, the ability to capture and automatically mark mathematical expressions and short free-text responses, and delivery in a number of feedback modes are all vital ingredients of an automatic assessment system. PASS-IT has shown how to ensure that education drives technology and not vice versa. Finally, collaboration has been paramount within PASS-IT and should continue: no single group has all the keys to unlock the future of computer-aided assessment. Scotland is well placed to move forward and e-assess where its students e-learn, in a large range of subjects such as those delivered via the SCHOLAR Programme (see http://scholar.hw.ac.uk).
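The paper does not give implementation details for the assessment engine, but the two mechanisms named above can be illustrated with a minimal sketch: a question template whose parameters are randomised per candidate, and a marking function that awards partial credit for a correct optional intermediate step. All function names and the marking scheme here are illustrative assumptions, not the PASS-IT engine's actual design.

```python
import random

def make_question(seed):
    """Generate a randomised question: solve a*x + b = c for x.

    Randomising a, b, c per candidate supports repeated practice
    and discourages copying (illustrative scheme, not PASS-IT's).
    """
    rng = random.Random(seed)
    a = rng.randint(2, 9)
    b = rng.randint(1, 9)
    c = a * rng.randint(1, 9) + b  # constructed so x is a whole number
    return {"a": a, "b": b, "c": c, "answer": (c - b) // a}

def mark_with_steps(q, final_answer, step_answer=None):
    """Mark out of 2, mimicking paper-style partial credit.

    A wrong final answer can still earn 1 mark if the candidate
    completes the optional step correctly (here, the value of a*x).
    """
    if final_answer == q["answer"]:
        return 2  # full credit for the correct final answer
    if step_answer is not None and step_answer == q["c"] - q["b"]:
        return 1  # partial credit for correct intermediate working
    return 0
```

A candidate who gets the final answer wrong but shows the correct intermediate value thus receives 1 of 2 marks, rather than the 0 a naive right/wrong marker would award.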
It should do so by ensuring that teachers remain central to the learning cycle, supporting the demanding work they do through the supply of suitably filtered data on student performance. Special educational needs can also be addressed more effectively by the use of technology.

Background

When changing from assessment on paper to assessment by computer, many differences may be introduced; for example, it may be necessary to reword the question for computer delivery, and the keyboard and/or mouse must be employed to answer the question rather than pen on paper. If the outcome of the assessment on paper turns out to be different from that of the examination delivered by computer, it would not be clear which (if any) of these factors was the cause. Because of the large number of factors involved in the change from paper-based testing to electronic assessment, it is necessary to separate the differences. Experiments can then be designed to determine whether any particular difference contributed to a change in performance. One such experiment on mathematics examinations was undertaken by Fiddes et al (2002), in which the effect of the delivery m...
This article presents results of a comparison between paper and computer tests of ability in Chemistry and Computing. A statistical model is employed to analyse the experimental data from almost 200 candidates. It is shown that there is no medium effect when specific traditional paper examinations in Chemistry and Computing are transferred into electronic format. The effect of rewording for computer-delivered test questions is also investigated and again the conclusion is that no evidence of a difference could be found. These results were obtained as part of the Project for Assessments in Scotland using Information Technology (PASS-IT).
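The article states that a statistical model was applied to the scores of almost 200 candidates but does not reproduce it here. As a minimal illustration of one simple way such a medium-effect comparison could be run (an assumption for exposition, not the model the authors used), the mean scores of the paper and computer groups can be compared with Welch's t statistic:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(paper_scores, computer_scores):
    """Welch's t statistic for the difference in mean score between
    a paper group and a computer group (unequal variances assumed).

    A value near zero indicates no evidence of a medium effect.
    """
    m1, m2 = mean(paper_scores), mean(computer_scores)
    v1, v2 = variance(paper_scores), variance(computer_scores)
    n1, n2 = len(paper_scores), len(computer_scores)
    return (m1 - m2) / sqrt(v1 / n1 + v2 / n2)
```

A small |t| (judged against the appropriate t distribution) is consistent with the article's conclusion that transferring the examinations to electronic format produced no detectable difference.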
At present most examinations are delivered on paper, but there is a growing trend in many subjects to deliver some or all of these examinations by computer. It is therefore important to know whether there are any differences in the results obtained by candidates sitting examinations taken by computer compared with those obtained by candidates sitting conventional examinations using pen and paper. The purpose of this article is to describe the outcome of a pilot study designed to investigate possible causes of any differences in results from the use of different modes of delivery in a mathematics examination. One outcome of this study was that the process of translating examination questions into a format required for use on the computer (but keeping this as a pen-and-paper test) can have a significant effect on examination results. However, the main conclusion is that changing the medium alone has no effect on the results in mathematics examinations.