Current acute pain intensity assessment tools are mainly based on patient self-report, which is impractical for non-communicative, sedated, or critically ill patients. In previous studies, various physiological signals have been observed qualitatively as potential pain intensity indices. Building on those observations, this study aims to develop a continuous pain monitoring method based on the classification of multiple physiological parameters. Heart rate (HR), breathing rate (BR), galvanic skin response (GSR) and facial surface electromyogram (sEMG) were collected from 30 healthy volunteers under thermal and electrical pain stimuli. The collected samples were labelled as no pain, mild pain or moderate/severe pain based on a self-reported visual analogue scale. The patterns of these three classes were first observed from the distributions of the 13 processed physiological parameters. Then, artificial neural network classifiers were trained, validated and tested with the physiological parameters. The average classification accuracy was 70.6%. When the same method was applied to the medians of each class in each test, the accuracy improved to 83.3%. Including facial sEMG improved the method's adaptivity to new subjects: the recognition accuracy of moderate/severe pain in leave-one-subject-out cross-validation rose from 74.9 ± 21.0% to 76.3 ± 18.1%. Among healthy volunteers, GSR, HR and BR correlated better with pain intensity variations than facial muscle activities did. The classification of multiple accessible physiological parameters can potentially provide a way to differentiate among no, mild and moderate/severe acute experimental pain.
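The abstract above describes a three-class classifier over 13 physiological parameters evaluated with leave-one-subject-out cross-validation. A minimal sketch of that setup follows, assuming a scikit-learn multilayer perceptron as a stand-in for the paper's unspecified ANN architecture; the synthetic data, samples per subject, and hyperparameters are illustrative, not taken from the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, samples_per_subject, n_features = 30, 20, 13

# Synthetic stand-in data: 13 physiological parameters per sample and a
# three-class label (0 = no pain, 1 = mild, 2 = moderate/severe).
X = rng.normal(size=(n_subjects * samples_per_subject, n_features))
y = rng.integers(0, 3, size=n_subjects * samples_per_subject)
groups = np.repeat(np.arange(n_subjects), samples_per_subject)

# Standardize the parameters, then classify with a small feed-forward
# network (hypothetical architecture; the paper does not specify one here).
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)

# Leave-one-subject-out cross-validation, as used in the study to measure
# adaptivity to unseen subjects.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"LOSO accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```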
Detecting QRS complexes or R-peaks from the electrocardiogram (ECG) is the basis for heart rate determination and heart rate variability analysis. Over the years, many different methods have been proposed to solve this problem. The vast majority of them are traditional rule-based algorithms that are vulnerable to noise. We propose a new R-peak detection method based on the Long Short-Term Memory (LSTM) network. LSTM networks excel at temporal modelling tasks involving long-term dependencies, which makes them well suited to ECG analysis. Additionally, we propose a data generator that creates noisy ECG data, which is used to train a noise-robust R-peak detector. Our initial testing shows that the proposed method outperforms traditional algorithms, with the greatest competitive edge on noisy ECG signals.
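A minimal sketch of such an LSTM-based R-peak detector follows, assuming PyTorch and a per-sample binary labelling scheme (1 at R-peak positions, 0 elsewhere). The architecture and the toy noisy-data generator are assumptions for illustration; the abstract specifies neither.

```python
import torch
import torch.nn as nn

class RPeakLSTM(nn.Module):
    def __init__(self, hidden_size=64):
        super().__init__()
        # Bidirectional LSTM over the raw single-lead ECG samples.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True, bidirectional=True)
        # Per-timestep logit: probability that this sample is an R-peak.
        self.head = nn.Linear(2 * hidden_size, 1)

    def forward(self, x):                   # x: (batch, time, 1)
        out, _ = self.lstm(x)
        return self.head(out).squeeze(-1)   # (batch, time) logits

def noisy_ecg_batch(batch=8, length=2048, noise_std=0.2):
    """Toy stand-in for the paper's noisy-ECG generator (not detailed in
    the abstract): unit spikes at regular 'beat' positions plus noise."""
    ecg = torch.zeros(batch, length)
    labels = torch.zeros(batch, length)
    peaks = torch.arange(100, length, 256)  # fixed, regular beat positions
    ecg[:, peaks] = 1.0                     # idealized R-peaks
    labels[:, peaks] = 1.0                  # target: 1 at R-peak samples
    ecg = ecg + noise_std * torch.randn(batch, length)
    return ecg.unsqueeze(-1), labels

model = RPeakLSTM()
x, y = noisy_ecg_batch()
loss = nn.BCEWithLogitsLoss()(model(x), y)  # one training step's loss
loss.backward()
print(f"loss on one synthetic batch: {loss.item():.4f}")
```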
Background: Accurate detection of clinically significant prostate cancer (csPCa), Gleason Grade Group ≥ 2, remains a challenge. Prostate MRI radiomics and blood kallikreins have been proposed as tools to improve the performance of biparametric MRI (bpMRI). Purpose: To develop and validate radiomics and kallikrein models for the detection of csPCa. Study Type: Retrospective. Population: A total of 543 men with a clinical suspicion of csPCa; 411 (76%, 411/543) had kallikreins available, and 360 of these (88%, 360/411) did not take 5-alpha-reductase inhibitors. Two data splits into training, validation (split 1: single center, n = 72; split 2: random 50% of the pooled datasets from all four centers) and testing (split 1: four centers, n = 288; split 2: the remaining 50%) were evaluated. Field Strength/Sequence: 3 T/1.5 T; TSE T2-weighted imaging, 3× SE DWI. Assessment: In total, 20,363 radiomic features calculated from manually delineated whole gland (WG) and bpMRI suspicion lesion masks were evaluated in addition to clinical parameters: prostate-specific antigen, four kallikreins, and MRI-based qualitative (PI-RADSv2.1/IMPROD bpMRI Likert) scores.
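The two data-split strategies described above can be sketched as follows with pandas; the dataframe, column names, and center labels are hypothetical stand-ins, since the study's actual data handling is not shown in the abstract.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical cohort table: one row per patient, with the recruiting center.
df = pd.DataFrame({
    "patient_id": np.arange(360),
    "center": rng.choice(["A", "B", "C", "D"], size=360),
})

# Split 1: train/validate on a single center, test on the remaining
# patients from all four centers.
train1 = df[df.center == "A"]
test1 = df.drop(train1.index)

# Split 2: random 50% of the pooled multi-center data for training and
# validation, the remaining 50% for testing.
train2 = df.sample(frac=0.5, random_state=0)
test2 = df.drop(train2.index)

print(len(train1), len(test1), len(train2), len(test2))
```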
The automatic detection of facial expressions of pain is needed to ensure accurate pain assessment of patients who are unable to self-report pain. To overcome the challenges of automatic systems for determining pain levels based on facial expressions in clinical patient monitoring, a surface electromyography method was tested for feasibility in healthy volunteers. In the current study, two types of gradually increasing experimental pain stimuli were induced in the thirty-one healthy volunteers who attended the study. We used surface electromyography to measure the activity of five facial muscles and detect facial expressions during pain induction. Statistical tests were used to analyze the continuous electromyography data, and supervised machine learning was applied to build a pain intensity prediction model. Muscle activation of the corrugator supercilii was most strongly associated with self-reported pain, and the levator labii superioris and orbicularis oculi showed a statistically significant increase in muscle activation when the pain stimulus reached the subjects' self-reported pain thresholds. The two features most strongly associated with pain, the waveform lengths of the corrugator supercilii and levator labii superioris, were selected for the prediction model. The pain prediction model achieved a c-index of 0.64. The most detectable differences in muscle activity during the pain experience were connected to eyebrow lowering, nose wrinkling and upper lip raising. As the performance of the prediction model remains modest, albeit with statistically significant ordinal classification, we suggest testing with a larger sample size to further explore the variables that affect variation in expressiveness and subjective pain experience.
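Waveform length, the feature named above, is a standard sEMG measure: the summed absolute sample-to-sample difference of the signal over a window. A minimal sketch follows; the sampling rate, window settings, and toy signal are assumptions for illustration, not the study's recording parameters.

```python
import numpy as np

def waveform_length(emg, window, step):
    """Waveform length per window: sum of |x[i+1] - x[i]| inside the window."""
    starts = range(0, len(emg) - window + 1, step)
    return np.array([np.abs(np.diff(emg[s:s + window])).sum() for s in starts])

rng = np.random.default_rng(0)
fs = 1000                                        # assumed sampling rate, Hz
emg = 0.05 * rng.standard_normal(5 * fs)         # toy 'corrugator' signal, 5 s
emg[2 * fs:] += 0.2 * rng.standard_normal(3 * fs)  # stronger activity after 2 s

wl = waveform_length(emg, window=fs, step=fs // 4)  # 1 s windows, 250 ms step
print(np.round(wl, 2))   # waveform length rises with muscle activity
```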