The recent proliferation of virtual reality (VR) applications in autism therapy, aimed at promoting learning and positive behavior, has produced encouraging results in developing a variety of skills and abilities in children with autism. Dolphin-assisted therapy has also become a topic of public and research interest for autism intervention and treatment. This paper presents the design and development of an innovative Virtual Dolphinarium for potential autism intervention. Instead of simulating swimming with dolphins, our virtual dolphin interaction program allows children with autism to act as dolphin trainers at the poolside and to learn nonverbal communication through hand gestures with the virtual dolphins. Immersive visualization and gesture-based interaction are implemented to engage children with autism within an immersive room equipped with a curved screen spanning 320° and a high-end five-panel projection system. This paper also reports a pilot study that establishes a trial protocol of autism screening to explore participants' readiness for virtual dolphin interaction. This research offers two potential benefits: helping children with autism and protecting an endangered species.
This paper presents a variational algorithm for feature-preserving mesh denoising. At the heart of the algorithm is a novel variational model composed of three components: fidelity, regularization, and fairness, each designed with an intuitive role. In particular, the fidelity is formulated as an L1 data term, which makes the regularization process less dependent on the exact values of outliers and noise. The regularization is formulated as the total absolute edge-length-weighted supplementary angle of the dihedral angle, making the model capable of reconstructing meshes with sharp features. In addition, an augmented Lagrangian method is provided to efficiently solve the proposed variational model. Compared to the prior art, the new algorithm has crucial advantages in handling large-scale noise, noise along random directions, and different kinds of noise, including random impulsive noise, even in the presence of sharp features. Both visual and quantitative evaluations demonstrate the superiority of the new algorithm.
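The robustness of an L1 data term and the structure of an augmented Lagrangian (ADMM-style) solver can be illustrated on a much simpler 1D analogue. The sketch below is not the paper's mesh model: it substitutes a quadratic smoothness regularizer for the edge-based absolute dihedral-angle term and operates on a 1D signal rather than a mesh, and all function names and parameter values are illustrative assumptions. What it does share with the paper's approach is the L1 fidelity (handled by soft-thresholding) and the alternating primal/auxiliary/dual updates typical of augmented Lagrangian solvers.

```python
import numpy as np

def l1_fidelity_denoise(y, lam=2.0, rho=1.0, n_iter=200):
    """Denoise a 1D signal via ADMM (illustrative sketch, not the paper's model).

    Minimizes ||x - y||_1 + (lam/2) * ||D x||^2, where D is the
    forward-difference operator. The L1 fidelity makes the result
    robust to impulsive outliers: large residuals are absorbed by
    the auxiliary variable z instead of dragging the solution.
    """
    n = len(y)
    D = np.diff(np.eye(n), axis=0)            # forward-difference matrix, (n-1) x n
    A = lam * D.T @ D + rho * np.eye(n)       # system matrix for the x-update
    z = np.zeros(n)                           # auxiliary variable, z = x - y at convergence
    u = np.zeros(n)                           # scaled dual variable
    x = y.copy()
    for _ in range(n_iter):
        # x-update: quadratic subproblem -> linear solve
        x = np.linalg.solve(A, rho * (y + z - u))
        # z-update: soft-thresholding, the proximal operator of the L1 norm
        v = x - y + u
        z = np.sign(v) * np.maximum(np.abs(v) - 1.0 / rho, 0.0)
        # dual update
        u += x - y - z
    return x

# Piecewise-constant signal corrupted by a few impulsive outliers
t = np.linspace(0, 1, 100)
clean = np.where(t < 0.5, 0.0, 1.0)
noisy = clean.copy()
noisy[[10, 40, 70]] += 5.0                    # impulsive (salt-and-pepper-like) noise
denoised = l1_fidelity_denoise(noisy)
```

Because the fidelity is L1 rather than L2, the three spikes are absorbed by the auxiliary variable and the recovered signal stays close to the clean one at the outlier locations; with an L2 data term the same spikes would be smeared into the neighborhood.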
humanoids, [8] rehabilitation devices, [9] and augmented and virtual reality. [10] The advent of the Internet of actions (IoA) is accelerating the mechanosensory revolution, [11][12][13][14][15] because mechanosensation is believed to be at its core. Over the last few years, various types of mechanical tactile sensors have been fabricated to emulate the functions of the mechanoreceptors found in human skin. Based on the underlying sensing principle, they can be classified as resistive, [16][17][18][19][20][21] capacitive, [22][23][24][25] inductive, [26][27][28] piezoelectric, [29][30][31] and optical types. [32][33][34] Resistive mechanical tactile sensors, in particular, offer several advantages, including low cost, good durability, and a simple structure. [35] However, current limitations of these sensors include a small pressure measurement range, slow response time, and high hysteresis. The majority of reported resistive tactile pressure sensors have high sensitivity at low pressures (less than 10 kPa), allowing for ultra-sensitive detection. [36][37][38][39][40][41] However, in applications such as object manipulation, the pressures produced exceed 10 kPa, and at these higher pressures the sensitivity of such tactile sensors declines dramatically. The slow response time of these sensors (typically more than 30 ms) is also an obstacle to their practical usage. [25,35,40,41] Maintaining sensitivity comparable to or greater than that of human skin over a wide pressure range, with a fast response time, is therefore required. Another critical issue that needs to be addressed with current skin-inspired sensors is how to achieve multi-point sensing capability, i.e., the simultaneous detection of touch at multiple locations within a sensor. This necessitates a higher density of sensing nodes on the sensor. However, increasing the density of sensing nodes beyond a certain limit has drawbacks.
First, it increases the number of interconnecting wires. These wires restrict the motion of the robotic artifact because they can become entangled during movement. Second, crosstalk between neighboring interconnects corrupts the sensor output signals. [42] Array-type tactile sensor designs solve the problem of interconnecting wires to some extent, but crosstalk between the sensing nodes remains an issue with the majority of previously reported tactile sensors. [43,44] A possible solution to this problem is to optimize the density of sensing nodes and carefully plan their spatial distribution on the sensor. Skin-inspired sensors, which emulate the human skin's ability to sense stimuli such as temperature and pressure, are increasingly popular in robotic applications. Herein, the fabrication of ultra-low-cost (<$1.5), ultra-thin, wide-range, and crosstalk-free skin-inspired tactile sensors is presented. The sensors consist of piezoresistive pressure-sensing elements sandwiched between 3D-printed silver nanoparticle electrodes...
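The wiring trade-off described above can be made concrete with some simple accounting. The sketch below is purely illustrative, and the wiring conventions (one signal wire per node plus a shared ground for individual addressing; one line per row and per column for matrix addressing) are assumptions for the sake of the comparison, not taken from any cited sensor design. It shows why array-type designs alleviate the interconnect problem even though the shared row/column lines are exactly what introduces crosstalk.

```python
def wire_counts(rows, cols):
    """Compare interconnect counts for a rows x cols grid of sensing nodes.

    Hypothetical accounting, for illustration only:
    - individually addressed nodes: one signal wire per node plus a
      single shared ground line;
    - row/column (matrix) addressing: one shared line per row and one
      per column, with each node read at a row-column crossing.
    """
    n_nodes = rows * cols
    individual = n_nodes + 1   # one wire per node + common ground
    matrix = rows + cols       # shared row and column lines
    return n_nodes, individual, matrix

# A modest 16 x 16 tactile array already shows the gap:
nodes, indiv, mat = wire_counts(16, 16)
# 256 nodes -> 257 wires individually wired, but only 32 shared lines
```

Individual wiring grows linearly with the node count, while matrix addressing grows only with the grid perimeter, which is why increasing node density quickly makes individual wiring impractical and pushes designs toward shared lines, and hence toward the crosstalk problem the text describes.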