2020
DOI: 10.1080/2372966x.2020.1844548

Comparing Paper and Tablet Modalities of Math Assessment for Multiplication and Addition

Cited by 14 publications (18 citation statements)
References 25 publications
“…For both measures, average computer-based scores were lower than scores on the parallel paper/pencil measures. For the fact measure, average score on the paper/pencil probe was 30% higher, and for complex addition, it was 99% higher, similar in pattern and magnitude to previous studies (Aspiranti et al, 2021; Hensley et al, 2017). Results suggested that within-condition generalizability and dependability was strong.…”
Section: Discussion (supporting)
confidence: 84%
“…Although computer-based assessment has been in use in schools for some time, only a small number of studies have been conducted to directly examine computer-based versus paper/pencil administration of curriculum-based measurements (CBMs). Limited evidence to date suggests that paper/pencil assessment generally results in higher scores than assessment conducted via computer or other learning device and that gains attained in one format of instruction (e.g., computer-based) may not generalize to the other format (e.g., in-person; Aspiranti et al, 2021; Hensley et al, 2017). Hensley et al examined scores on identical measures of a fact skill (multiplication 0–12) with a large sample of students in Grades 3–5.…”
Section: Prior Research (mentioning)
confidence: 99%
“…Further, the students completed all assessments online, which may have negatively impacted data collection, particularly for the correct digits per minute on fluency probes. As noted, previous researchers documented students performed less well on computer-based assessments than paper-pencil assessments, which may have contributed to the lower rates of cdpm (Aspiranti et al, 2020; Hensley et al, 2017). Researchers may seek to replicate this study but focus on paper-pencil assessments.…”
Section: Discussion (mentioning)
confidence: 86%
“…Although the rates are considerably lower, suggesting the VRA instructional sequence is not designed to support fluency as measured on timed computational assessments, one must also consider that students took these assessments online. Previous researchers found lower rates of mathematical fluency in elementary students when they solved multiplication or addition problems in a computer-based word document or on a tablet with a stylus than with paper and pencil (Aspiranti et al, 2020; Hensley et al, 2017).…”
Section: Discussion (mentioning)
confidence: 93%
“…In addition to the five articles featured in this current issue of School Psychology Review, there were four articles featured in 2020 (Aspiranti et al, 2020; Song et al, 2020; Stifel et al, 2020; Wendel et al, 2020) and additional articles are available online (Anderson et al, 2021; Gregus et al, 2021; King et al, 2021; McIntyre et al, 2021; Ogg et al, 2021; Ye et al, 2021) (see Table 1). School Psychology Review will continue to feature contemporary scholarship focused on COVID-19 that further advances and informs the field of school psychology.…”
Section: Psychosocial and Mental Health Concerns and Special Topic Contributions (mentioning)
confidence: 99%