2017
DOI: 10.1016/j.ifacol.2017.08.1829
Interacting With a Mobile Robot with a Natural Infrastructure-Less Interface

Abstract: In this paper we introduce a novel approach that enables users to interact with a mobile robot in a natural manner. The proposed interaction system does not require any specific infrastructure or device, but relies on commonly utilized objects while leaving the user's hands free. Specifically, we propose to utilize a smartwatch (or a sensorized wristband) for recognizing the motion of the user's forearm. Measurements of accelerations and angular velocities are exploited to recognize the user's gestures and define …

Cited by 16 publications (13 citation statements)
References 20 publications
“…Regarding gesture-based interaction with robots, Neto et al. [4] proposed the use of five IMUs and an ultra-wideband positioning system to capture the human upper-body shape and the relative position between the human and the robot. Villani et al. [2], [12] used inertial data recorded with a smartwatch to control both wheeled and aerial robots. Gestures are used to provide high-level commands, such as take off, land, or stop, whereas the robot velocity is determined by mapping the user's wrist movements.…”
Section: Gesture Recognition for HRI
confidence: 99%
“…Then, having these constraints in mind, in Section III-B we propose an algorithmic pipeline to implement gesture recognition in HRI. To show and discuss the application of the proposed pipeline, we consider the experimental scenario presented in [2], [12], and [13]. In particular, we assess the capability of the pipeline to generalize across multiple subjects having different levels of acquaintance with the use of gestures.…”
confidence: 99%
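The snippet above does not detail the recognition pipeline itself. A generic IMU gesture-recognition pipeline of the kind it refers to can be sketched as sliding-window feature extraction followed by a classifier; everything below (window sizes, features, the nearest-centroid classifier, and the gesture labels) is an illustrative assumption, not the authors' actual method:

```python
import statistics

def window_features(signal, win=50, step=25):
    """Slide a fixed-length window over a 1-D inertial signal
    and extract simple per-window statistics (mean, std)."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append((statistics.fmean(w), statistics.pstdev(w)))
    return feats

def classify(feat, centroids):
    """Nearest-centroid classifier: assign the gesture label whose
    feature centroid is closest (illustrative stand-in only)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(feat, centroids[label]))

# Hypothetical centroids: a still wrist has low variance,
# a waving gesture has high variance.
centroids = {"still": (0.0, 0.05), "wave": (0.0, 1.0)}
```

Generalization across subjects, as discussed in the snippet, would then amount to fitting the centroids on some users and evaluating classification accuracy on held-out ones.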
“…It is easier to understand and reduces the users' cognitive burden as much as possible [6,7]. In the interaction system [8], users do not need special interactive devices, but rely on a commonly used smartwatch to sense gestures and interact with a mobile robot. Sunjie Chen et al. developed a robot control system based on gesture recognition [9].…”
Section: Related Work
confidence: 99%
“…On the other hand, the approach holds valid for any interaction system, provided that a suitable mapping between the user's forearm motion and interaction commands is found. Specifically, it has been proposed for single aerial and ground robots (Villani et al. 2017a, b, 2018c) and could be exploited for interacting with industrial automatic machines, as in Villani et al. (2016), in an enhanced scenario of affective computing for an inclusive work environment (Villani et al. 2018b).…”
Section: Proposed System
confidence: 99%
“…where v and ω are the robot's linear and angular velocities, K_r > 0 and K_p > 0 are constants defined in such a way that the maximum angle achievable with the motion of the wrist corresponds to the maximum velocity of the mobile robot, ϑ_r ∈ [−π/2, π/2] is the roll angle, and ϑ_p ∈ [−π/2, π/2] is the pitch angle. Further details can be found in Villani et al. (2017a).…”
Section: Natural Mapping: Teleoperation
confidence: 99%
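The definitions in the snippet above (gains scaled so the maximum wrist angle ±π/2 yields the maximum robot velocity) can be sketched as a minimal teleoperation mapping. The maximum-velocity values, the clamping, and the assignment of pitch to linear and roll to angular velocity are assumptions for illustration; the snippet does not state them:

```python
import math

# Assumed robot velocity limits (not given in the snippet).
V_MAX = 0.5   # max linear velocity [m/s]
W_MAX = 1.0   # max angular velocity [rad/s]

# Gains chosen so that the maximum wrist angle (pi/2)
# maps to the maximum robot velocity, as the snippet describes.
K_p = V_MAX / (math.pi / 2)
K_r = W_MAX / (math.pi / 2)

def wrist_to_velocity(theta_r, theta_p):
    """Map wrist roll/pitch angles (rad, each in [-pi/2, pi/2])
    to (linear, angular) velocity commands for the mobile robot."""
    # Clamp angles to the valid range before scaling.
    theta_r = max(-math.pi / 2, min(math.pi / 2, theta_r))
    theta_p = max(-math.pi / 2, min(math.pi / 2, theta_p))
    v = K_p * theta_p   # pitch drives linear velocity (assumption)
    w = K_r * theta_r   # roll drives angular velocity (assumption)
    return v, w
```

With this scaling, a fully pitched-forward wrist commands `V_MAX` and a fully rolled wrist commands `W_MAX`, matching the constraint on K_r and K_p stated in the snippet.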