Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2005
DOI: 10.1145/1054972.1055006

Individual differences in multimodal integration patterns

Abstract: Techniques for information fusion are at the heart of multimodal system design. To develop new user-adaptive approaches for multimodal fusion, the present research investigated the stability and underlying cause of major individual differences that have been documented between users in their multimodal integration pattern. Longitudinal data were collected from 25 adults as they interacted with a map system over six weeks. Analyses of 1,100 multimodal constructions revealed that everyone had a dominant integrat…

Cited by 60 publications (36 citation statements)
References 12 publications
“…This use of commands indicates individual touch-first and speech-first response patterns that were relatively consistent and stable over time. This agrees with the results of Oviatt, Lunsford and Coulston [8], who also found consistent, stable individual differences in how users submitted commands in multimodal displays. However, when describing these individual differences, Oviatt, Lunsford and Coulston [8] and Oviatt, Coulston and Lunsford [7] noted that differences in the timing of responses (sequential or simultaneous) were the dominant trend, but did not describe any trend involving the preferred order of touch or speech output.…”
Section: Discussion (supporting, confidence: 92%)
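The sequential-versus-simultaneous distinction discussed above is straightforward to operationalize: two inputs are simultaneous when their time intervals overlap, and sequential when the second begins only after the first ends. A minimal Python sketch of how a logger might classify each construction and label a user's dominant pattern (the class, function names, and majority rule are illustrative assumptions, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class Input:
    modality: str   # e.g. "speech" or "touch"
    onset: float    # seconds from session start
    offset: float   # seconds from session start

def integration_pattern(first: Input, second: Input) -> str:
    """Classify one multimodal construction: simultaneous if the two
    signals overlap in time, sequential if the second starts after
    the first has ended."""
    return "simultaneous" if second.onset < first.offset else "sequential"

def dominant_pattern(constructions: list[tuple[Input, Input]]) -> str:
    """Label a user by the majority pattern across their constructions."""
    labels = [integration_pattern(a, b) for a, b in constructions]
    sim = labels.count("simultaneous")
    return "simultaneous" if sim >= len(labels) / 2 else "sequential"
```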
“…The time between the onset of a first control action (e.g., a speech command) and the onset of a second sequential, dependent control action (e.g., a consequent touch command) can be defined operationally as temporal binding. It is critical for the system to correctly interpret the operator's intent, as well as to support smoother fusion of commands and to reduce system error. Although Oviatt, Lunsford and Coulston [8] suggested that identifying the time between control actions is important, neither they nor other researchers have used sequential speech and touch controls. In addition, few researchers have explored user issues that might affect temporal binding.…”
(mentioning, confidence: 99%)
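As defined in the excerpt, temporal binding is the onset-to-onset lag between the two control actions, and a fusion engine can use it to decide whether two inputs belong to one multimodal command. A hedged sketch of that decision (the function names and the 4-second fusion window are assumptions for illustration, not values from the literature):

```python
def temporal_binding(first_onset: float, second_onset: float) -> float:
    """Onset-to-onset lag in seconds between two sequential control actions."""
    return second_onset - first_onset

def should_fuse(first_onset: float, second_onset: float,
                window: float = 4.0) -> bool:
    """Treat the two inputs as one multimodal command only if the second
    action begins within the fusion window after the first."""
    return 0.0 <= temporal_binding(first_onset, second_onset) <= window
```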
“…When the system the user interacts with supports multimodality, more information about the user can be extracted in real time. For example, the system may learn about interaction pattern preferences [1] or detect the user's emotional state and adapt its interaction behavior, its interface, and its feedback accordingly. The body, and what the user is doing with it, is becoming important to the system, even more so when the user is allowed to move around and interact from different positions and with various objects, perhaps with other users and with parts of a computer-supported or monitored environment.…”
Section: Modeling Partners, Participants and Inhabitants (mentioning, confidence: 99%)
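One way a system could learn interaction pattern preferences, as this excerpt describes, is to keep a running tally of each user's observed integration pattern and adapt its fusion window to the majority pattern. A speculative sketch under that assumption (the class name and per-pattern window lengths are invented for illustration):

```python
class AdaptiveFuser:
    """Tracks a user's observed integration patterns and adapts the
    fusion window: sequential integrators get a longer wait, since
    their second input reliably lags the first."""

    # Illustrative window lengths in seconds; not taken from the paper.
    WINDOWS = {"simultaneous": 1.0, "sequential": 4.0}

    def __init__(self) -> None:
        self.history: list[str] = []

    def observe(self, first_offset: float, second_onset: float) -> None:
        """Record one construction: simultaneous if the second signal
        starts before the first one ends, sequential otherwise."""
        pattern = "simultaneous" if second_onset < first_offset else "sequential"
        self.history.append(pattern)

    def fusion_window(self) -> float:
        """Window matching the user's majority pattern so far."""
        if not self.history:
            return self.WINDOWS["sequential"]  # conservative default
        sim = self.history.count("simultaneous")
        majority = "simultaneous" if sim >= len(self.history) / 2 else "sequential"
        return self.WINDOWS[majority]
```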