2021
DOI: 10.31234/osf.io/rsdwg
Preprint

MouseView.js: Reliable and valid attention tracking in web-based experiments using a cursor-directed aperture

Abstract: Psychological research is increasingly moving online, where web-based studies allow for data collection at scale. Behavioural researchers are well-supported by existing tools for participant recruitment, and for building and running experiments with decent timing. However, not all techniques are portable to the internet: While eye tracking works in tightly controlled conditions in the lab, webcam-based eye tracking suffers from high attrition and poorer quality due to basic limitations like webcam availability…


Cited by 6 publications (8 citation statements) | References 35 publications
“…By moving their cursor around, participants can continuously uncover small sections of the target image at a time, imitating foveation. MouseView measures have been shown to significantly predict real human fixation measures (Anwyl-Irvine et al., 2021). In the current experiment, all MouseView images were presented at a size of 500 × 500 pixels with an aperture of 5% of the viewing window.…”
Section: Methods (mentioning; confidence: 99%)
“…Procedure. To determine how people viewed these images, we used MouseView, a mouse-tracking method analogous to eye tracking (Anwyl-Irvine et al., 2021). Generally, a target image is obscured by a white overlay, and MouseView creates a circular aperture that moves with the computer mouse to reveal the image underneath it (Fig.…”
Section: Methodsmentioning
confidence: 99%
“…This experiment employed MouseView.js (Anwyl-Irvine, Armstrong, & Dalmaijer, 2021), which allowed us to record the incremental reading of sentences in an online experiment.…”
Section: Methods (mentioning; confidence: 99%)
“…In Experiment 2, we investigated whether this folk etymology effect (the association of words that overlap in form and meaning) also influences sentence comprehension. Participants made semantic relatedness decisions about pairs of sentences containing those cue and target words (e.g., The kids dodged something followed by She tried to evade/elude the officer), and we measured fixation duration on the target words (using MouseView.js; Anwyl-Irvine, Armstrong, & Dalmaijer, 2021). If people make use of similar-sounding attractor words to facilitate meaning access at the sentence level, then participants should read the target word faster (as measured in first-pass fixation) when the second sentence contains a non-arbitrary target compared to an arbitrary target.…”
Section: The Present Study (mentioning; confidence: 99%)
“…While artificial attention collected in this way cannot be considered equivalent to eye-tracking metrics, this cursor-controlled spotlight method of capturing spatial attention in online experiments has been explored in other work. In particular, a validation study was performed by the authors of MouseView.js, a JavaScript library supporting experiments of this kind, finding that patterns of dwell time were similar to those collected with the same stimuli in an eye-tracking experiment (Anwyl-Irvine et al., 2021). Note that this study uses its own implementation of the cursor spotlight method, since MouseView.js was released publicly after completion of data collection.…”
(mentioning; confidence: 98%)