As a consequence of the COVID-19 emergency, frail citizens felt isolated because of social distancing measures, suspended or strongly reduced home assistance, and limited access to hospitals. In this sense, assistive technology could play a pivotal role in empowering frail older adults and reducing their isolation, as well as in reinforcing the work of formal caregivers and professionals. In this context, the goal of this paper is to present four pilot studies, conducted from March 2020 to April 2021, that responded promptly to COVID-19 by providing assistive technology solutions aiming to (1) guarantee high-quality service to older adults at home or in residential facilities, (2) promote social inclusion, and (3) reduce virus transmission. In particular, four services, namely telepresence, remote monitoring, virtual visits, and environmental disinfection, were designed, implemented, and tested in real environments involving 85 end-users to assess the user experience and/or preliminarily assess technical feasibility. The results showed that all the proposed services were generally accepted by older adults and professionals. In addition, the use of telepresence robots in private homes and residential facilities increased enjoyment and reduced anxiety, whereas the monitoring service supported clinicians in following up discharged COVID-19 patients. It is also worth mentioning that two new services/products were developed, one to disinfect the environment and one to enable virtual visits within the framework of a hospital information system; the virtual visit service offered the opportunity to expand the portfolio of hospital services. The main barriers were found in education, technology interoperability, and ethical/legal/privacy compliance. Appropriate design and customer needs analysis also played a key role, since not all assistive devices were originally designed for older persons.
Notable advances have been made in providing amputees with sensation through invasive and non-invasive haptic feedback systems, such as mechano-, vibro-, and electrotactile as well as hybrid systems. Purely mechanically driven feedback approaches, however, have been little explored. In this paper, we present a haptic feedback system that requires no external power source (such as batteries) or other electronic components. The system is low-cost, lightweight, adaptable, and robust against external influences (such as water), making it sustainable in many respects. We used state-of-the-art multimaterial 3D printing (Stratasys Objet500 Connex3) to fabricate a soft sensor and a mechano-tactile feedback actuator from a rubber-like material (TangoBlack Plus) and a rigid plastic (VeroClear). When force is applied to the fingertip sensor, the fluidic pressure inside the system acts on the membrane of the feedback actuator, resulting in a mechano-tactile sensation. We present the design, fabrication, and validation of the proposed haptic feedback system. Our ∅7 mm feedback actuator transmits forces between 0.2 N (the median touch threshold) and 2.1 N (the maximum force transmitted by the feedback actuator at a 3 mm indentation), corresponding to forces of 1.2–18.49 N exerted on the fingertip sensor.
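To make the reported force mapping concrete, the following minimal Python sketch converts a measured fingertip-sensor force into the corresponding feedback-actuator force, assuming a roughly linear transmission characteristic between the two reported operating ranges. The linearity assumption and the function and variable names are illustrative only and are not taken from the paper.

```python
# Illustrative sketch (assumption: roughly linear force transmission between the
# reported fingertip-sensor range and the feedback-actuator range).

SENSOR_MIN_N, SENSOR_MAX_N = 1.2, 18.49     # force range at the fingertip sensor [N]
ACTUATOR_MIN_N, ACTUATOR_MAX_N = 0.2, 2.1   # force range at the feedback actuator [N]

def actuator_force(sensor_force_n: float) -> float:
    """Estimate the mechano-tactile feedback force for a given fingertip force."""
    # Clamp the input to the characterized operating range.
    f = min(max(sensor_force_n, SENSOR_MIN_N), SENSOR_MAX_N)
    # Linear interpolation between the two ranges (modelling assumption).
    scale = (ACTUATOR_MAX_N - ACTUATOR_MIN_N) / (SENSOR_MAX_N - SENSOR_MIN_N)
    return ACTUATOR_MIN_N + (f - SENSOR_MIN_N) * scale

if __name__ == "__main__":
    for f in (1.2, 5.0, 10.0, 18.49):
        print(f"fingertip {f:5.2f} N -> actuator {actuator_force(f):.2f} N")
```

Any nonlinearity in the membrane response would require replacing the interpolation with a calibration curve measured on the actual device.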
This paper makes the VISTA database, composed of inertial and visual data, publicly available for gesture and activity recognition. The inertial data were acquired with the SensHand, which captures the movement of the wrist, thumb, index, and middle fingers, while the RGB-D visual data were acquired simultaneously from two different points of view, frontal and lateral. The VISTA database was acquired in two experimental phases: in the first, participants were asked to perform 10 different actions; in the second, they executed five scenes of daily living, each corresponding to a combination of the selected actions. In both phases, the Pepper robot interacted with the participants, and the two camera viewpoints mimic Pepper's possible points of view. Overall, the dataset includes 7682 action instances for the training phase and 3361 action instances for the testing phase. It can serve as a framework for future studies on artificial intelligence techniques for activity recognition, using inertial-only data, visual-only data, or a sensor-fusion approach.
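As a starting point for working with such a dataset, the following Python sketch indexes paired inertial recordings and the two RGB-D viewpoints of each action instance for a sensor-fusion pipeline. The directory layout, file names, and field names are hypothetical assumptions for illustration and do not describe the actual VISTA release format.

```python
# Illustrative sketch (hypothetical file layout, not the actual VISTA release format):
# pair the SensHand inertial stream with the two RGB-D viewpoints of each instance.

from dataclasses import dataclass
from pathlib import Path

@dataclass
class ActionInstance:
    label: str                 # one of the 10 actions (or a daily-living scene segment)
    inertial_csv: Path         # SensHand wrist/finger inertial streams
    rgbd_front_dir: Path       # frames from the frontal viewpoint
    rgbd_side_dir: Path        # frames from the lateral viewpoint

def index_instances(root: Path) -> list[ActionInstance]:
    """Collect instances, assuming <root>/<label>/<instance_id>/ folders."""
    instances = []
    for label_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        for inst_dir in sorted(p for p in label_dir.iterdir() if p.is_dir()):
            instances.append(ActionInstance(
                label=label_dir.name,
                inertial_csv=inst_dir / "senshand.csv",
                rgbd_front_dir=inst_dir / "front",
                rgbd_side_dir=inst_dir / "side",
            ))
    return instances
```

Actual loading of the inertial streams and RGB-D frames would depend on the formats chosen for the public release.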