The first wireless camera pills created a revolutionary new perspective for engineers and physicians, demonstrating for the first time the feasibility of achieving medical objectives deep within the human body from a swallowable, wireless platform. The approximately ten years since the first camera pill have been a period of great innovation in swallowable medical devices. Many modules and integrated systems have been devised to enable and enhance the diagnostic and even robotic capabilities of capsules working within the gastrointestinal (GI) tract. This article begins by reviewing the motivation for, and challenges of, creating devices to work in the narrow, winding, and often inhospitable GI environment. The basic modules of modern swallowable wireless capsular devices are then described, and the state of the art in each is discussed. The article concludes with a perspective on the future potential of swallowable medical devices to enable advanced diagnostics beyond the capability of human visual perception, and even to deliver surgical tools and therapy non-invasively to interventional sites deep within the GI tract.
To the best of our knowledge, this is the first time that a robot has been used to perform a mastoidectomy. Although significant hurdles remain to translate this technology to clinical use, we have shown that it is feasible. The prospect of reducing surgical time and enhancing patient safety by replacing human hand-eye coordination with machine precision motivates future work toward translating this technique to clinical use.
Image-guided robots have been widely used for bone shaping and percutaneous access to interventional sites. However, due to high accuracy requirements and proximity to sensitive nerves and brain tissue, the adoption of robots in inner-ear surgery has been slower. In this paper, the authors present their recent work toward developing two image-guided industrial robot systems for accessing challenging inner-ear targets. Features of the systems include optical tracking of the robot base and tool relative to the patient, and Kalman filter-based fusion of redundant sensory information (from encoders and optical tracking systems) for enhanced patient safety. The approach enables control of differential rather than absolute robot positions, permitting simplified calibration procedures and reducing the system's reliance on robot calibration to ensure overall accuracy. Lastly, the authors present the results of two phantom validation experiments simulating the use of image-guided robots in inner-ear surgeries such as cochlear implantation and petrous apex access.
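The fusion of redundant position measurements described above can be illustrated with a toy one-dimensional Kalman filter. This is a minimal sketch, not the authors' implementation: the sensor variances, process noise, and measurement sequences below are invented for illustration, and a real system would fuse full 6-DOF poses.

```python
def fuse_position(z_encoder, z_optical, var_encoder=0.04, var_optical=0.01,
                  x0=0.0, var0=1.0, q=1e-4):
    """Fuse redundant 1-D position readings (encoder + optical tracker)
    into a single estimate via sequential Kalman measurement updates.

    Returns (estimate, estimate variance). All noise parameters are
    illustrative assumptions, not values from the paper.
    """
    x, p = x0, var0
    for ze, zo in zip(z_encoder, z_optical):
        # Predict: static-position model with small process noise q.
        p += q
        # Update with the (noisier) encoder measurement.
        k = p / (p + var_encoder)
        x += k * (ze - x)
        p *= (1.0 - k)
        # Update with the (more accurate) optical measurement.
        k = p / (p + var_optical)
        x += k * (zo - x)
        p *= (1.0 - k)
    return x, p
```

Because both sensors observe the same quantity, each update shrinks the estimate variance; a large disagreement between the two channels can additionally be flagged as a safety fault before the robot is allowed to move.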
We propose the use of a haptic touchscreen to convey graphical and mathematical concepts through aural and/or vibratory tactile feedback. We hypothesize that an important application of such a display will be in teaching visually impaired students concepts that are traditionally learned almost entirely visually. This paper describes initial feasibility studies using a commercially available haptic touchscreen to display grids, points, lines, and shapes, some of the first visual graphical entities students encounter in K-12 mathematics education, and from which more complex lessons can be constructed. We conducted user studies designed to evaluate perception of these objects through haptic feedback alone, auditory feedback alone, and combinations of the two. Our results indicate that both sensory channels can be valuable in user perception.
Swallowable capsule-based cameras (e.g., the Given Imaging PillCam and competitors) are rapidly becoming the gold standard for diagnosis in the gastrointestinal (GI) tract. However, definitive diagnosis is still often precluded by the inability to control capsule position and orientation. This has inspired a number of active positioning strategies, including augmenting the capsule with legs or other appendages, or incorporating magnets that can apply forces and torques in response to an external magnetic field. Furthermore, the loose, mucus-coated, elastic intestine is generally deflated during capsule passage, making it challenging to view the entire internal surface adequately without the insufflation that is relied upon in traditional endoscopy. To address these challenges, we propose a new fluid-powered system that permits insufflation from a wireless capsule platform. This is accomplished by carrying a small reservoir of biocompatible liquid onboard the capsule, which vaporizes and expands when released through a small onboard solenoid valve. The internal components of the capsule consist of two 3 V lithium coin-cell batteries (VL621, Panasonic, Inc.), which charge three tantalum capacitors (TAJB157M006R, AVX Corporation, Inc.) that fire the solenoid valve (S120, Lee Company, Inc.). In our initial proof-of-concept study, we have packaged all components in a 26 mm long by 11 mm diameter capsule. The fluid used in initial experiments is biocompatible perfluoropentane, although any of a variety of biocompatible fluids that can be liquefied with light pressurization may be used. Perfluoropentane, developed for lung lavage, is a liquid at room temperature and becomes gaseous at body temperature. We note that pneumatic pressure produced in this way may be used for a variety of objectives, including powering biopsy collection devices or other mechanisms within the capsule, or being vented to inflate the intestine.
In initial experiments, we have harnessed the pressure to inflate a balloon at the front of the capsule which can distend tissue and thereby improve image quality. In experimental tests, only 0.2 ml of fluid was consumed in inflating the balloon to sufficient pressure to distend porcine intestine (see http://research.vuse.vanderbilt.edu/MEDLab for images of these experiments). Optimization of the capsule body and electrical components is currently underway. Including a wireless camera, all components are expected to fit within the dimensions of a commercial PillCam.