Lower esophageal sphincter (LES) pressure shows axial and circumferential asymmetry, the reasons for which are not clear. Our aim was to determine whether the muscle thickness and shape of the LES account for the axial and circumferential asymmetry in LES pressure. High-frequency, catheter-based intraluminal ultrasonography was performed to obtain images of the human LES and esophagus. Station pull-through manometry was performed to record the axial and circumferential asymmetry of LES pressure. The circular and longitudinal muscles were thicker in the LES than in the esophagus. There was a close correlation between the axial asymmetry in LES pressure and circular muscle thickness. LES muscle thickness increased and decreased with increases and decreases in LES pressure, respectively. The circumferential asymmetry in resting LES pressure was related to the noncircular shape of the LES. Swallow-induced LES relaxation was followed by contraction. During the contraction phase, the LES and esophagus showed relative symmetry in pressure and shape. We conclude that the axial asymmetry of LES pressure is explained by its muscle thickness, whereas the circumferential asymmetry is related to the noncircular shape of the LES.
This paper aims to integrate artificial intelligence (AI) with medical science to develop a classification tool that recognizes Covid-19 infection and other lung ailments. Four conditions were evaluated: Covid-19 pneumonia, non-Covid-19 pneumonia, pneumonia, and normal lungs. The proposed AI system is divided into two stages. Stage 1 classifies chest X-rays into pneumonia and non-pneumonia classes. Stage 2 takes as input the X-rays that stage 1 assigns to the pneumonia class and further classifies them as Covid-19 positive or Covid-19 negative.
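A minimal sketch of the two-stage cascade described above, assuming a PyTorch setup: the SmallCNN architecture, class labels, and input size are illustrative placeholders, not the authors' actual networks or data pipeline.

```python
# Illustrative two-stage cascade: stage 1 screens for pneumonia,
# stage 2 runs only on pneumonic cases to flag Covid-19.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Tiny stand-in classifier for a single-channel chest X-ray."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Stage 1: pneumonia vs. non-pneumonia. Stage 2: Covid-19 positive vs. negative.
stage1 = SmallCNN(num_classes=2)
stage2 = SmallCNN(num_classes=2)

def classify(xray: torch.Tensor) -> str:
    """Run the cascade on one X-ray tensor of shape (1, 1, H, W)."""
    with torch.no_grad():
        pneumonic = stage1(xray).argmax(dim=1).item() == 1
        if not pneumonic:
            return "non-pneumonia"
        covid = stage2(xray).argmax(dim=1).item() == 1
        return "Covid-19 positive" if covid else "Covid-19 negative"

# Example with a dummy 224x224 grayscale image (untrained weights).
print(classify(torch.randn(1, 1, 224, 224)))
```

The point of the cascade is that the binary Covid-19 decision is only made for images the first stage already judged pneumonic, which mirrors the stage 1 / stage 2 split in the abstract.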
Low-cost virtual reality (VR) headsets powered by smartphones are becoming ubiquitous. Their unique position on the user's face opens interesting opportunities for interactive sensing. In this paper, we describe EyeSpyVR, a software-only eye sensing approach for smartphone-based VR, which uses a phone's front-facing camera as a sensor and its display as a passive illuminator. Our proof-of-concept system, built on a commodity Apple iPhone, enables four sensing modalities: detecting when the VR headset is worn, detecting blinks, recognizing the wearer's identity, and coarse gaze tracking - features typically found in high-end or specialty VR headsets. We demonstrate the utility and accuracy of EyeSpyVR in a series of studies with 70 participants, finding a worn detection accuracy of 100%, a blink detection rate of 95.3%, a family user identification accuracy of 81.4%, and a mean gaze tracking error of 10.8° when calibrated to the wearer (12.9° without calibration). These sensing abilities can be used by developers to enable new interactive features and more immersive VR experiences on existing, off-the-shelf hardware.