Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology
DOI: 10.1145/1178823.1178882
Straw-like user interface

Abstract: The Straw-like User Interface is a novel interface system that allows us to virtually experience the sensations of drinking. These sensations are created by referencing sample data of the actual pressure, vibration, and sound produced by drinking through an ordinary straw attached to the system. This research on presenting virtual drinking sensations to the mouth and lips is the first such attempt in the world, and carries high academic expectations. Moreover, due to the high sensitivity of the mouth an…
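The abstract describes a sample-based approach: real pressure, vibration, and sound recordings from ordinary straw drinking are referenced to render the virtual sensation. A minimal sketch of that lookup idea is below; all names, data values, and the nearest-pressure selection rule are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of sample-based drinking-sensation playback:
# pre-recorded (pressure, vibration, sound) samples are looked up by the
# user's measured suction pressure. Values are fabricated for illustration.
from dataclasses import dataclass, field


@dataclass
class DrinkSample:
    pressure_kpa: float              # suction pressure at recording time
    vibration: list = field(default_factory=list)  # actuator waveform
    sound: list = field(default_factory=list)      # audio waveform


# Toy library of pre-recorded samples (placeholder waveforms).
SAMPLES = [
    DrinkSample(1.0, [0.1, 0.2], [0.01, 0.02]),
    DrinkSample(3.0, [0.4, 0.5], [0.05, 0.06]),
    DrinkSample(6.0, [0.8, 0.9], [0.09, 0.10]),
]


def select_sample(measured_kpa: float) -> DrinkSample:
    """Return the recorded sample closest to the measured suction pressure."""
    return min(SAMPLES, key=lambda s: abs(s.pressure_kpa - measured_kpa))
```

For example, a measured suction of 2.4 kPa would select the sample recorded at 3.0 kPa, whose vibration and sound waveforms would then drive the actuator and speaker.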

Cited by 20 publications (2 citation statements); references 6 publications.
“…It is perhaps little wonder, then, that those trying to emulate more real-life experiences have focused on designing technologies that allow the integration and controllability of inputs associated with multiple sensory modalities (e.g., Kita and Rekimoto, 2013). For example, the “Straw-like User Interface (SUI)” augments the user’s drinking experiences based on multisensory inputs (e.g., using pressure, vibration, and sound, see Hashimoto et al, 2006; see also Ranasinghe et al, 2014). Another example comes from Ikeno et al (2013) who developed a system that combines vibrations and sounds (e.g., an auditory “glug” characteristic of a Sake bottle when a drink is poured) to influence the subjective impression of a liquid.…”
Section: Flavor Perception and Augmentation
confidence: 99%
“…Such headsets might enable brands to deliver targeted experiences in VR. Whilst, at present, this approach appears more as a curiosity than anything else, we anticipate that it might one day become an extension of the total product experience, in that any given product might have its own customized multisensory experience(s) in VR (Lingle, 2017; Michail, 2017). Such experiences may be designed based on research showing the influence of visual atmospheric cues (e.g., lighting, environment) on flavor perception (Stroebele and De Castro, 2004;.…”

[A table from the citing paper spilled into the excerpt above; its recoverable rows are:
Tactile/haptic — Gravitamine spice — Cutlery weight — Hirose et al., 2015
Vibration system — Vibrations associated with beverage pouring — Ikeno et al., 2015
Multi-sense — Straw-like User Interface (SUI) — Pressure, vibration, and sound during drinking — Hashimoto et al., 2006
Audio-haptic rendering — Vibrations and sounds during beverage pouring — Ikeno et al., 2013
A further reference, Kadomura et al., 2013, belongs to a row truncated in the excerpt.]
Section: Visual Augmentation
confidence: 99%