2021
DOI: 10.1021/acs.jchemed.1c00749
Response Process Validity Evidence in Chemistry Education Research

Abstract: Response process validity evidence can provide researchers with insight into how and why participants interpret items on instruments (e.g., tests, questionnaires). In the chemistry education research literature and in the social sciences more broadly, there has been variable use and reporting of response process aspects of studies. This manuscript's objective is to support researchers in developing purposeful, theory-driven protocols to investigate response processes. We highlight key considerations for resear…

Cited by 25 publications (22 citation statements)
References: 73 publications
“…The inclusion of these two items, paired with follow-up interviews, was determined to be the best methodology to approach this topic, as a broad overview of students' attitudes was desired. Follow-up interviews served the purpose of allowing the researchers to ask clarifying questions as needed (Deng et al, 2021). The ATOC also included questions regarding demographic information, including gender identity, racial identity, age, career aspirations, if the student had taken organic chemistry before, first-generation college student status, and if the student had any non-traditional aspects about their college experience (i.e., has dependents, works 40 hours a week, is financially self-supporting, is working an off-campus job).…”
Section: Instrument Design (mentioning)
confidence: 99%
“…Follow-up, response process interviews were also conducted to gain a more robust understanding of students' responses (Deng et al, 2021). Students were randomly selected and invited to participate for compensation in the form of a nominal gift card.…”
Section: Follow-up Interviews (mentioning)
confidence: 99%
“…For example, Arjoon, Xu, and Lewis clearly describe the different types of validity and reliability as well as methods to establish them for surveys and assessment tools. Deng, Streja, and Flynn take a deep dive on methods to establish response process validity (i.e., ensuring that responders understand the constructs targeted in the instrument as intended by the designers of the instrument). Watts and Finkenstaedt-Quinn focus on processes to establish the reliability of qualitative data.…”
Section: Common Criteria Considered By Reviewers (mentioning)
confidence: 99%
“…For example, several past Associate Editors for this Journal edited how-to books. In recent years, we have seen a rise in peer-reviewed manuscripts focused on educating the community about particular methodological and analytical approaches (for example, refs and ). During my time as an Associate Editor for this Journal, the need to compile and make these resources more readily available became apparent.…”
(mentioning)
confidence: 99%
“…Therefore, the information about a given assessment instrument may appear across numerous publications, further increasing the time required to make a decision about the instrument’s implementation based on the most comprehensive information possible. Lastly, the chemistry education community continues to grow in its use of different frameworks/models for assessment instrument development and in the research methods used to evaluate the data generated by these tools. Additionally, the terminology used when describing validity and reliability has evolved and has not always been standardized, which complicates the review and synthesis of studies published across time. Up to now, no coordinated effort has been made in the chemistry education community to develop a centralized list of assessment instruments or to compile the evidence supporting the administration of these tools.…”
Section: Introduction (mentioning)
confidence: 99%