Deep learning has begun to revolutionize many industries, and applications of these methods in medicine are becoming increasingly commonplace. This study investigates the feasibility of tracking patients and clinical staff wearing Bluetooth Low Energy (BLE) tags in a radiation oncology clinic using artificial neural networks (ANNs) and convolutional neural networks (CNNs). The performance of these networks was compared against received signal strength indicator (RSSI) thresholding and triangulation. By utilizing temporal information, a combined CNN+ANN network correctly identified the location of the BLE tag with an accuracy of 99.9%, outperforming a CNN model (accuracy = 94%), a thresholding model employing majority voting (accuracy = 95%), and a triangulation classifier utilizing majority voting (accuracy = 95%). Future studies will seek to deploy this affordable real-time location system in hospitals to improve clinical workflow, efficiency, and patient safety.
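The thresholding-with-majority-voting baseline mentioned above can be sketched as follows. This is a minimal illustration, not the study's implementation: the receiver names, RSSI values, and window size are invented for the example, and the idea is simply to pick the room whose receiver hears the tag loudest in each sample, then take a majority vote over a short time window to suppress transient fluctuations.

```python
from collections import Counter

def nearest_receiver(rssi_by_receiver):
    """Pick the location whose receiver reports the strongest (least negative) RSSI."""
    return max(rssi_by_receiver, key=rssi_by_receiver.get)

def locate_with_majority_vote(rssi_samples):
    """rssi_samples: list of {receiver_name: rssi_dBm} dicts over a time window."""
    votes = [nearest_receiver(sample) for sample in rssi_samples]
    return Counter(votes).most_common(1)[0][0]

# Illustrative window of three RSSI samples (values in dBm are made up).
window = [
    {"exam_room_1": -62, "hallway": -71, "exam_room_2": -80},
    {"exam_room_1": -65, "hallway": -60, "exam_room_2": -82},  # transient hallway spike
    {"exam_room_1": -59, "hallway": -74, "exam_room_2": -79},
]
print(locate_with_majority_vote(window))  # -> exam_room_1
```

The CNN and CNN+ANN models in the study replace this hand-built vote with learned classifiers over the same kind of RSSI time series, which is where the temporal information pays off.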
The general language model BERT, pre-trained on the cross-domain text corpora BookCorpus and Wikipedia, achieves excellent performance on a range of natural language processing tasks through fine-tuning on downstream tasks. However, it still lacks the task-specific and domain-related knowledge needed to further improve its performance, and more detailed analyses of fine-tuning strategies are necessary. To address these problems, a BERT-based text classification model, BERT4TC, is proposed that constructs an auxiliary sentence to turn the classification task into a binary sentence-pair task, aiming to address the limited-training-data and task-awareness problems. The architecture and implementation details of BERT4TC are presented, along with a post-training approach for addressing BERT's domain challenge. Finally, extensive experiments are conducted on seven widely studied public datasets to analyze fine-tuning strategies from the perspectives of learning rate, sequence length, and hidden state vector selection. BERT4TC models with different auxiliary sentences and post-training objectives are then compared and analyzed in depth. The experimental results show that BERT4TC with a suitable auxiliary sentence significantly outperforms both typical feature-based methods and fine-tuning methods, achieving new state-of-the-art performance on multi-class classification datasets. On binary sentiment classification datasets, BERT4TC post-trained with a suitable domain-related corpus also achieves better results than the original BERT model.
INDEX TERMS: Natural language processing, text classification, bidirectional encoder representations from transformers, neural networks, language model.
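The auxiliary-sentence idea can be sketched in a few lines. This is a hedged illustration of the general transformation, not BERT4TC's exact template: a k-class example is expanded into k sentence pairs, one per candidate label, so the model answers a binary "does this auxiliary sentence match?" question for each pair. The label set and template wording below are invented for the example.

```python
# Illustrative label set and template; BERT4TC's actual auxiliary
# sentences and datasets may differ.
LABELS = ["world", "sports", "business", "sci/tech"]

def to_sentence_pairs(text, gold_label):
    """Expand one labeled example into (text, auxiliary_sentence, 0/1) triples."""
    pairs = []
    for label in LABELS:
        auxiliary = f"The topic of this text is {label}."
        pairs.append((text, auxiliary, int(label == gold_label)))
    return pairs

pairs = to_sentence_pairs("Stocks rallied after the earnings report.", "business")
for text, auxiliary, match in pairs:
    print(match, auxiliary)
```

Each resulting pair is then fed to BERT's standard sentence-pair input (the two segments separated by `[SEP]`), so the abundant pre-training on sentence-pair objectives is reused and each original example contributes several binary training instances.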
Matching an appropriate response to its multi-turn context is a crucial challenge in retrieval-based chatbots. Current studies construct multiple representations of the context and the response to facilitate response selection, but they use these representations in isolation and ignore the relationships among them. To address these problems, we propose a hierarchical aggregation network of multi-representation (HAMR) to fully leverage the abundant representations and enhance valuable information. First, we employ bidirectional recurrent neural networks (BiRNNs) to extract syntactic and semantic representations of sentences and use a self-aggregation mechanism to combine these representations. Second, we design a matching aggregation mechanism to fuse the different matching information between each utterance in the context and the response, which is generated by an attention mechanism. Treating the candidate response as part of the context, we integrate all of the matching information in chronological order and then accumulate the resulting vectors to calculate the final matching degree. An extensive empirical study on two multi-turn response selection datasets indicates that our proposed model achieves new state-of-the-art results.
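The attention step that generates matching information between an utterance and a candidate response can be sketched with NumPy. This is a minimal stand-in, assuming random token embeddings in place of HAMR's BiRNN-encoded representations, and it omits the self-aggregation and chronological accumulation layers: each utterance token attends over the response tokens to produce a response-aware representation of the utterance.

```python
import numpy as np

rng = np.random.default_rng(0)
utterance = rng.normal(size=(5, 8))   # 5 utterance tokens, embedding dim 8 (stand-ins)
response = rng.normal(size=(7, 8))    # 7 response tokens, embedding dim 8 (stand-ins)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

scores = utterance @ response.T    # (5, 7) token-token similarity scores
attn = softmax(scores, axis=1)     # each utterance token attends over response tokens
matched = attn @ response          # (5, 8) response-aware utterance representation
print(matched.shape)  # -> (5, 8)
```

In the full model, analogous matching representations are computed for every utterance in the context against the response, then fused and accumulated in chronological order to yield the final matching degree.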
Existing feature-based neural approaches for aspect-based sentiment analysis (ABSA) try to improve performance with pre-trained word embeddings and by modeling the relations between the text sequence and the aspect (or category), thus depending heavily on the quality of the word embeddings and on task-specific architectures. Although recently pre-trained language models such as BERT and XLNet have achieved state-of-the-art performance on a variety of natural language processing (NLP) tasks, they are still subject to the aspect-specificity, local feature-awareness, and task-agnosticism challenges. To address these challenges, this paper proposes an XLNet- and capsule-network-based model, XLNetCN, for ABSA. XLNetCN first constructs an auxiliary sentence to model the sequence-aspect relation and generate global aspect-specific representations, which enhances aspect-awareness and ensures the full pre-training of XLNet is exploited to improve task-agnostic capability. XLNetCN then employs a capsule network with the dynamic routing algorithm to extract the local and spatial hierarchical relations of the text sequence and yield local feature representations, which are merged with the global aspect-related representations for downstream classification via a softmax classifier. Experimental results show that XLNetCN significantly outperforms the classical BERT, XLNet, and traditional feature-based approaches on the two SemEval 2014 benchmark datasets, Laptop and Restaurant, achieving new state-of-the-art results.
INDEX TERMS: Aspect-based sentiment analysis, natural language processing, text analysis, deep learning, neural network.
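The dynamic routing algorithm the abstract refers to can be sketched in NumPy. This is an illustrative toy-sized implementation of routing-by-agreement between lower and higher capsules in general, not XLNetCN's exact configuration: prediction vectors are random stand-ins, and the capsule counts and dimensions are invented for the example.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Nonlinearity that shrinks vectors to length < 1 while keeping direction."""
    norm2 = (v ** 2).sum(axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * v / np.sqrt(norm2 + eps)

def dynamic_routing(u_hat, iters=3):
    """u_hat: (n_in, n_out, d) prediction vectors from lower capsules."""
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))                               # routing logits
    for _ in range(iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)                # (n_out, d) weighted sums
        v = squash(s)                                         # output capsule vectors
        b = b + (u_hat * v[None]).sum(axis=-1)                # agreement update
    return v

rng = np.random.default_rng(0)
v = dynamic_routing(rng.normal(size=(6, 3, 4)))  # 6 lower caps -> 3 output caps, dim 4
print(v.shape)  # -> (3, 4)
```

The agreement update increases the coupling between a lower capsule and the higher capsules whose outputs its predictions agree with, which is what lets the capsule layer capture the local, spatially hierarchical structure the abstract describes.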