Background Pain is a complex experience that involves sensory-discriminative and cognitive-emotional neuronal processes. It has long been known across cultures that pain can be relieved by mindful breathing (MB). There is a common assumption that MB exerts its analgesic effect through interoception. Interoception refers to consciously refocusing the mind’s attention on the physical sensation of internal organ function. Objective In this study, we dissected the cortical analgesic processes by imaging the brains of healthy subjects practicing traditional MB (TMB) and compared them with a second group whose MB was augmented into an external sensory experience via virtual reality breathing (VRB). Methods The VRB protocol involved in-house–developed virtual reality 3D lungs that synchronized with the participants’ breathing cycles in real time, providing them with an immersive visual-auditory exteroception of their breathing. Results We found that both breathing interventions led to a significant increase in pain thresholds after week-long practices, as measured by a thermal quantitative sensory test. However, the underlying analgesic brain mechanisms were opposite, as revealed by functional near-infrared spectroscopy data. In the TMB practice, the anterior prefrontal cortex uniquely modulated the premotor cortex. This increased its functional connection with the primary somatosensory cortex (S1), thereby facilitating the S1-based sensory-interoceptive processing of breathing but inhibiting its other role in sensory-discriminative pain processing. In contrast, virtual reality induced an immersive 3D exteroception with augmented visual-auditory cortical activations, which diminished the functional connection with the S1 and consequently weakened the pain processing function of the S1.
Conclusions In summary, our study suggested two analgesic neuromechanisms of VRB and TMB practices—exteroception and interoception—that distinctly modulated the S1 processing of the ascending noxious inputs. This is in line with the concept of dualism (Yin and Yang).
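The VRB protocol's core mechanic is driving a 3D lung model from the participant's breathing cycle in real time. The abstract does not describe the implementation, so the sketch below is purely illustrative: it assumes a normalized respiration amplitude signal (here faked with an idealized sinusoid; a real system would sample a respiration belt or airflow sensor) and maps it to a uniform scale factor for the lung mesh at each render frame. All names and parameters are hypothetical.

```python
import math

def lung_scale(amplitude: float, base: float = 1.0, gain: float = 0.35) -> float:
    """Map a normalized respiration amplitude (0 = fully exhaled,
    1 = fully inhaled) to a uniform scale factor for the 3D lung mesh."""
    if not 0.0 <= amplitude <= 1.0:
        raise ValueError("amplitude must be normalized to [0, 1]")
    return base + gain * amplitude

def breathing_amplitude(t: float, period: float = 4.0) -> float:
    """Idealized sinusoidal breathing cycle: a hypothetical stand-in for a
    real-time respiration signal. Returns 0 at t = 0 (exhaled) and peaks
    at half the period (inhaled)."""
    return 0.5 * (1.0 - math.cos(2.0 * math.pi * t / period))

# At each render frame, the lung mesh would be rescaled to follow the breath:
for t in (0.0, 1.0, 2.0, 3.0):
    scale = lung_scale(breathing_amplitude(t))
```

In a real implementation the amplitude would come from sensor hardware and the scale would be applied to the mesh transform by the VR engine; the point here is only the shape of the real-time mapping.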
BACKGROUND For many years, clinicians have been seeking objective pain assessment solutions via neuroimaging techniques, focusing on the brain to detect human pain. Unfortunately, most of those techniques are not applicable in the clinical environment or lack accuracy. OBJECTIVE This study aimed to test the feasibility of a mobile neuroimaging-based clinical augmented reality (AR) and artificial intelligence (AI) framework, CLARAi, for objective pain detection and localization directly from the patient’s brain in real time. METHODS Clinical dental pain was triggered in 21 patients by hypersensitive tooth stimulation with 20 consecutive descending cold stimulations (32°C-0°C). We used a portable optical neuroimaging technology, functional near-infrared spectroscopy, to gauge their cortical activity during evoked acute clinical pain. The data were decoded using a neural network (NN)–based AI algorithm to classify hemodynamic response data into pain and no-pain brain states in real time. We tested the performance of several networks (NN with 7 layers, 6 layers, 5 layers, 3 layers, recurrent NN, and long short-term memory network) on reorganized data features for pain detection and localization in a simulated real-time environment. In addition, we also tested the feasibility of transmitting the neuroimaging data to an AR device, HoloLens, in the same simulated environment, allowing visualization of the ongoing cortical activity on a 3-dimensional brain template virtually plotted on the patients’ head during the clinical consultation. RESULTS The artificial neural network (3-layer NN) achieved an optimal classification accuracy of 80.37% (126,000/156,680) for pain and no-pain discrimination, with a positive likelihood ratio (PLR) of 2.35. We further explored a 3-class localization task of left-side pain, right-side pain, and no-pain states, and the convolutional NN-6 (6-layer NN) achieved the highest classification accuracy, at 74.23% (1040/1401) with a PLR of 2.02.
CONCLUSIONS Additional studies are needed to optimize and validate our prototype CLARAi framework for other pain conditions and neurologic disorders. However, we presented an innovative and feasible neuroimaging-based AR/AI concept that can potentially transform the human brain into an objective target to visualize and precisely measure and localize pain in real time where it is most needed: in the doctor’s office. INTERNATIONAL REGISTERED REPORT RR1-10.2196/13594
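The classifier results above are reported as accuracy plus a positive likelihood ratio (PLR), where PLR = sensitivity / (1 − specificity). A minimal sketch of how those two metrics fall out of a 2×2 confusion matrix, using hypothetical counts (not the study's actual confusion matrix, which the abstract does not report):

```python
def binary_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Accuracy and positive likelihood ratio (PLR) from a 2x2 confusion
    matrix of a binary pain / no-pain classifier.

    PLR = sensitivity / (1 - specificity): how many times more likely a
    "pain" prediction is in a true pain state than in a no-pain state.
    """
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "plr": sensitivity / (1.0 - specificity),
    }

# Hypothetical counts for illustration only:
m = binary_metrics(tp=70, fp=30, tn=70, fn=30)
# sensitivity = specificity = 0.7, so accuracy = 0.70 and PLR = 0.7/0.3 ≈ 2.33
```

A PLR above 2, as reported for both the binary and 3-class tasks, means a positive prediction roughly doubles the odds of a true pain state; a PLR of 1 would be an uninformative classifier.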
UNSTRUCTURED Pain is a complex experience that involves sensory-discriminative and cognitive-emotional neuronal processes. It has long been known across cultures that pain can be relieved by mindful breathing (MB). There is a common assumption that MB exerts its analgesic effect by interoception and distraction. Interoception means the conscious refocusing of the mind’s attention on the physical sensation of an organ function, while distraction consists of the competing attention of concurrent sensory experiences. In the current study, we dissected these central analgesic processes by imaging the brains of two groups of healthy subjects exposed to either a traditional MB (TMB) or a virtual reality breathing (VRB) protocol. The VRB protocol involved in-house–developed virtual reality 3D lungs that synchronized with the participants’ breathing cycles in real time, providing the participant with an immersive visual-auditory experience. We found that both breathing techniques led to a significant increase in pain thresholds after week-long practices, as measured by a thermal quantitative sensory test. However, their underlying analgesic neural mechanisms were opposite, as revealed by the functional near-infrared spectroscopy (fNIRS) data. The TMB technique induced a mind-body connection pattern in the brain: the anterior prefrontal cortex (aPFC) connected with and modulated other cortical regions toward a state of mindfulness, reappraising the ascending noxious inputs. The VRB practice, in contrast, induced a mind-body disconnection pattern, in which the overpowered audio-visual cortical regions functionally disconnected from the primary somatosensory cortex (S1), disengaging the central sensory-discriminative processing of the ascending noxious inputs through the immersive 3D experience.