An individual’s internal state is communicated mainly through nonverbal cues, such as facial expressions, gestures and tone of voice, which reflect the underlying emotional state. Tracked over the long term, these emotions can therefore be used to form a picture of an individual’s overall personality. This can be capitalized on in many human–robot interaction (HRI) scenarios, such as an assisted-living robotic platform, where a human’s mood may require the adaptation of a robot’s actions. To that end, we introduce a novel approach that gradually maps and learns the personality of a human by perceiving and tracking the individual’s emotional variations throughout the interaction. The proposed system extracts the facial landmarks of the subject, which are used to train a suitably designed deep recurrent neural network architecture. This architecture estimates the two continuous dimensions of emotion, arousal and valence, following Russell’s widely known circumplex model. Finally, a user-friendly dashboard presents both the momentary and the long-term fluctuations of a subject’s emotional state. The result is a practical tool for HRI scenarios in which a robot’s activity must adapt to enhance interaction performance and safety.
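The abstract does not specify the exact network design, so the following is only a minimal sketch of the idea, assuming 68-point 2-D facial landmarks per frame and a GRU-based recurrent backbone in PyTorch; the class name, layer sizes and sequence length are all illustrative, not the authors’ architecture:

```python
import torch
import torch.nn as nn

class ArousalValenceRNN(nn.Module):
    """Hypothetical sketch: maps a sequence of facial landmarks
    (68 points -> 136 values per frame) to continuous arousal and
    valence, the two dimensions of Russell's circumplex model."""

    def __init__(self, n_landmarks=68, hidden_size=128, num_layers=2):
        super().__init__()
        self.gru = nn.GRU(input_size=n_landmarks * 2,
                          hidden_size=hidden_size,
                          num_layers=num_layers,
                          batch_first=True)
        # Two regression heads; tanh bounds both outputs to [-1, 1].
        self.head = nn.Sequential(nn.Linear(hidden_size, 2), nn.Tanh())

    def forward(self, x):
        # x: (batch, frames, n_landmarks * 2)
        _, h = self.gru(x)
        return self.head(h[-1])  # (batch, 2) -> [arousal, valence]

model = ArousalValenceRNN()
clips = torch.randn(4, 30, 68 * 2)   # 4 clips of 30 landmark frames
arousal_valence = model(clips)
print(arousal_valence.shape)          # torch.Size([4, 2])
```

The tanh head keeps both outputs in [-1, 1], a common normalisation for the arousal and valence coordinates of Russell’s model.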
The integration of exponential technologies into traditional manufacturing processes has been a noteworthy trend of the past two decades, reshaping the industrial environment. This digital transformation, driven by the Industry 4.0 initiative, affects not only individual manufacturing assets but also the human workforce involved. Since human operators should be placed at the centre of this revolution, they ought to be equipped with new tools and thorough engineering solutions that improve their efficiency. In addition, vivid visualization techniques should be employed to support them comprehensively during their daily operations. To this end, we describe a user-centred methodology that uses augmented reality (AR) and computer vision (CV) techniques to support low-skilled operators in maintenance procedures. The proposed mobile augmented reality maintenance assistant (MARMA) uses the handheld device’s camera to locate the asset on the shop floor and generate AR maintenance instructions. We evaluate the performance of MARMA in a real use-case scenario on an automotive industrial asset provided by a collaborating manufacturer. During the evaluation, the manufacturer’s experts confirmed that the application can effectively support maintenance engineers.
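The abstract does not detail how MARMA locates the asset in the camera image; one common CV approach for anchoring AR content is feature matching against a reference image of the asset followed by homography estimation. The OpenCV sketch below illustrates that general idea only; the file names and parameters are placeholders and do not describe MARMA’s actual pipeline:

```python
import cv2
import numpy as np

# Hypothetical sketch: find a known asset in a camera frame via ORB
# feature matching, then estimate a homography so AR instructions can
# be anchored to the asset. Image paths are placeholders.
reference = cv2.imread("asset_reference.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

# Hamming distance suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)

if len(matches) >= 4:  # a homography needs at least four correspondences
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # H maps reference-image coordinates into the live frame, giving the
    # pose at which virtual maintenance instructions could be overlaid.
```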