2020
DOI: 10.1109/toh.2020.2966192
Towards Multisensory Perception: Modeling and Rendering Sounds of Tool-Surface Interactions

Cited by 10 publications (15 citation statements)
References 28 publications
“…[25] Accordingly, there is an obvious need and opportunity to leverage vibration cues in tactile pseudo-haptics for rendering manual interactions with virtual objects. Moreover, multisensory pseudo-haptics may further benefit from the addition of auditory feedback, which conveys rich information about object-based interactions [26] and systematically shapes tactile vibration perception. [27,28] Finally, our experiments explored only how user performance with the combination of referred haptic feedback at the wrist and visual pseudo-haptics compares to each modality alone.…”
Section: Discussion
confidence: 99%
“…The actuator is placed inside the stylus under the user's grip position and presents the vibrotactile sensation to the user's finger through the stylus's body. The stylus device is compatible with other data-driven haptic rendering methods for stylus-shaped devices [47], [48], [49], [50], [51]. They synthesize vibrotactile patterns of various textures using prerecorded vibration patterns.…”
Section: Haptic Devices Controlled By PVLC
confidence: 99%
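The excerpt above describes data-driven texture rendering from prerecorded vibration patterns. As a rough illustration (not the cited papers' actual algorithms), one common simplification is to replay a recorded vibration at a rate scaled by the user's scan speed, so faster strokes raise the pattern's apparent frequency; `ref_speed` and the looping playback are assumptions of this sketch:

```python
import numpy as np

def synthesize_vibration(recorded, fs, scan_speed, ref_speed=0.1, duration=0.5):
    """Replay a prerecorded texture vibration, time-compressed in
    proportion to scan speed (a common data-driven simplification)."""
    rate = max(scan_speed / ref_speed, 1e-3)     # playback-rate scaling
    n_out = int(duration * fs)
    # fractional read positions into the recording, looped at its end
    idx = (np.arange(n_out) * rate) % len(recorded)
    return np.interp(idx, np.arange(len(recorded)), recorded)

# Example: a 100 Hz recorded "texture", scanned at twice the reference speed,
# comes out as a 200 Hz vibration.
fs = 2000
t = np.arange(fs) / fs
recorded = np.sin(2 * np.pi * 100 * t)
out = synthesize_vibration(recorded, fs, scan_speed=0.2)
```

Real systems additionally modulate amplitude with contact force and interpolate between recordings made at different speed/force conditions.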
“…Due to recent breakthroughs in computer graphics research, the transition from traditional 2D visual content to adaptive 3D mixed reality worlds is straightforward and showing promising results [1]. In haptics, most existing works for vibrotactile feedback rendering or sound rendering use a tool-based texture interaction approach, where the user moves a tool over the virtual texture and vibrotactile feedback is rendered via an actuator [2–4], or sound is rendered through headphones [5]. Independently, both vibrotactile feedback rendering and sound rendering have demonstrated sufficient reconstruction accuracy when applying the data-driven paradigm.…”
Section: Introduction
confidence: 99%
“…A parameter estimation algorithm is presented for sound synthesis in [12], which overcomes a limitation of linear modal synthesis (i.e., the frictional multibody contact formulation [10]). However, the introduction of more sophisticated textures (e.g., anisotropic textures) has led to challenges with the intricate physical models and computational demands of real-time physics-based simulations of tool-surface interactions [5], [17], [18]. Simplification attempts have been made, but at the cost of reduced perceptual realism, impacting the authenticity of the virtual experience.…”
Section: Introduction
confidence: 99%
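The linear modal synthesis mentioned in the excerpt above models an impact sound as a sum of exponentially decaying sinusoids, one per resonant mode of the struck object. A minimal sketch, with entirely hypothetical mode frequencies, dampings, and amplitudes:

```python
import numpy as np

def modal_synthesis(freqs_hz, dampings, amps, fs=44100, duration=0.3):
    """Linear modal synthesis: sum of damped sinusoids
    s(t) = sum_i a_i * exp(-d_i * t) * sin(2*pi*f_i*t)."""
    t = np.arange(int(fs * duration)) / fs
    s = np.zeros_like(t)
    for f, d, a in zip(freqs_hz, dampings, amps):
        s += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return s

# Three made-up modes of a struck object: frequency (Hz), damping (1/s), amplitude
sound = modal_synthesis([440.0, 1230.0, 2770.0],
                        [30.0, 60.0, 90.0],
                        [1.0, 0.5, 0.25])
```

The linearity that keeps this cheap is exactly what the excerpt flags as a limitation: mode parameters stay fixed, so effects such as frictional multibody contact or anisotropic textures require the more expensive physics-based formulations the cited works address.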