2001
DOI: 10.21061/jcte.v18i1.600

Testing in a Computer Technology Course: An Investigation of Equivalency in Performance Between Online and Paper and Pencil Methods

Abstract: This experiment sought to examine the equivalence of online and paper and pencil testing methods as related to student performance in a computer technology course. Test score and completion time were the dependent variables that were used to assess students' performance. The study utilized a quasi-experimental design. Test scores were not significantly different on the variables of pretest, age, class standing, ethnicity, and gender. The findings showed that test scores were equivalent in both groups; however,…

Cited by 32 publications (30 citation statements, 2005–2022)
References 8 publications
“…First, given no practical differences in students' perceptions of web-based testing based on the variables investigated, physics educators who are working with similar groups may find online testing to be a viable alternative to traditional paper and pencil testing. Several studies have reported no differences in student performance when online and paper and pencil test scores have been compared (e.g., Alexander et al., 2001; Bicanich, Slivinski, Hardwicke, & Kapes, 1997; Bonham et al., 2003). Second, while students' perceptions of web-based testing are generally positive, in some areas physics educators may have to adapt the online testing process to better fit the desires of students.…”
Section: Implications and Discussion (mentioning)
confidence: 99%
“…However, the pervasiveness of computers in schools reverses the concerns educators and assessors once had about the use of computers in writing assessment. With computers being extensively used for most students' writing, projects, and assignments (Alexander et al., 2001), many students have become adept computer users. This situation prompts researchers to question the fairness and validity of the results of writing assessments in which students who are adroit computer users are constrained to handwrite their essays (Way, Davis, & Strain-Seymour, 2008; Lee, 2004; Russell & Haney, 1997).…”
Section: The Need for Transitioning (mentioning)
confidence: 99%
“…Several studies report instances where students found that word processors enabled them to make revisions that improved the quality of their writing more easily than they could when they handwrote their essays (Li, 2006). Also, computer-based testing saves the costs associated with printing and shipping paper test materials (Way, Davis, & Fitzpatrick, 2006) and eliminates the constraints of time and geographical location in test administration (Alexander et al., 2001). Hence, it seems logical that transitioning to computer-based writing assessment is the right way to go.…”
Section: The Need for Transitioning (mentioning)
confidence: 99%
“…This research interest in computer-based testing is likely a result of the many advantages associated with its use (Goldberg & Pedulla, 2002). A number of researchers have reported on the advantages of computer-based testing (e.g., Alderson, 2000; Alexander, Bartlett, Truell, & Ouwenga, 2001; Barkley, 2002; Bocij & Greasley, 1999; DeSouza & Fleming, 2003; Goldberg & Pedulla, 2002; Greenberg, 1998; Shermis & Lombard, 1998; Shermis, Mzumara, & Bublitz, 2001; Song, 1998; Stephens, 2001; Truell & Davis, 2003). Often-cited advantages of computer-based testing include decreased testing costs, effective records management, increased assessment options, improved scoring precision, instant feedback to students, more instructional time, more test administration choices, and reduced testing time.…”
Section: Introduction (mentioning)
confidence: 99%
“…The majority of test item exposure control research has focused on the impact of the test items selected from large item pools for exposure to a test taker. Further, computer-based testing systems have led some researchers to call for confirmation of their equivalency with traditional testing techniques (Alexander et al., 2001; Bugbee & Bernt, 1990; Bugbee, 1996; Truell & Joyner, 2003; Truell, 2005). Finally, Truell (2005) recommended research on the various settable interface formats available to faculty using computer-based testing systems.…”
Section: Introduction (mentioning)
confidence: 99%