Magnetic resonance imaging has been used to assess graft integrity and to study the remodeling of anterior cruciate ligament grafts morphologically in humans. The goal of the present study was to compare graft signal intensity and morphologic characteristics on magnetic resonance imaging with biomechanical and histologic parameters in a long-term animal model. Thirty sheep underwent anterior cruciate ligament reconstruction with an autologous Achilles tendon split graft and were sacrificed after 6, 12, 24, 52, or 104 weeks. Before sacrifice, all animals underwent plain and contrast-enhanced (gadolinium-diethylenetriamine pentaacetic acid) magnetic resonance imaging (1.5 T, proton density weighted, 2-mm sections) of the operated knee. The signal/noise quotient was calculated, and the data were correlated with the maximum load to failure, tensile strength, and stiffness of the grafts. Graft vascularity was determined immunohistochemically by staining for endothelial cells (factor VIII). We found that high signal intensity on magnetic resonance imaging reflects a decrease in the mechanical properties of the graft during early remodeling. Correlation analyses revealed significant negative linear correlations between the signal/noise quotient and load to failure, stiffness, and tensile strength. In general, the correlations for contrast-enhanced measurements of signal intensity were stronger than those for plain magnetic resonance imaging. Immunohistochemistry confirmed that contrast medium enhancement reflects the vascular status of the graft tissue during remodeling. We conclude that quantitatively determined magnetic resonance imaging signal intensity may be a useful tool for following the graft remodeling process noninvasively.
In the fifth generation (5G) of mobile broadband systems, Radio Resource Management (RRM) will reach unprecedented levels of complexity. To cope with ever more sophisticated RRM functionality and a growing variety of scenarios, while still making the prompt decisions 5G requires, this manuscript presents a lean 5G RRM architecture that capitalizes on recent advances in machine learning together with the large amount of data readily available in the network from measurements and system observations. The architecture relies on a single general-purpose learning framework, conceived for RRM, that directly uses the data gathered in the network. The complexity of RRM is shifted to the design of the framework, while the RRM algorithms derived from it are executed in a computationally efficient, distributed manner at the radio access nodes. The potential of this approach is verified in two pertinent scenarios, and future directions for applying machine learning to RRM are discussed.
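To make the division of labor concrete, the following is a minimal Python sketch of the centralized-learning, distributed-execution idea: a hypothetical framework distills network-wide measurement logs into a lookup policy that each radio access node can then execute with a single table access. The toy RRM task (choosing an action per quantized channel-quality bin) and all names are illustrative assumptions, not the paper's actual framework.

```python
# Minimal sketch of centralized learning with distributed execution.
# The toy task and all names are illustrative, not the paper's design.
import numpy as np

N_SINR_BINS = 16   # quantized channel-quality observations (assumed)
N_ACTIONS = 8      # e.g. candidate MCS indices (assumed)

def central_training(observation_log):
    """Learn a lookup policy offline from network-wide measurement logs.

    Each log entry is (sinr_bin, action, observed_throughput)."""
    value = np.zeros((N_SINR_BINS, N_ACTIONS))
    count = np.zeros((N_SINR_BINS, N_ACTIONS))
    for sinr_bin, action, throughput in observation_log:
        count[sinr_bin, action] += 1
        # incremental mean of the observed reward per (state, action) pair
        value[sinr_bin, action] += (
            throughput - value[sinr_bin, action]
        ) / count[sinr_bin, action]
    # the distilled policy is an argmax table: cheap to ship and execute
    return value.argmax(axis=1)

def node_side_decision(policy_table, sinr_bin):
    """Executed at the radio access node: a single table lookup."""
    return policy_table[sinr_bin]
```

The design point this illustrates is that all statistical heavy lifting stays in the central framework, while what reaches the access node is a compact artifact whose runtime cost is constant.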
Optimizing radio transmission power and user data rates in wireless systems via power control requires accurate and instantaneous knowledge of the system model. While this problem has been studied extensively in the literature, an efficient solution that approaches optimality with the limited information available in practical systems is still lacking. This paper presents a reinforcement learning framework for power control and rate adaptation in the downlink of a radio access network that closes this gap. We present a comprehensive design of the learning framework, including the characterization of the system state, the design of a general reward function, and the method for learning the control policy. System-level simulations show that our design quickly learns a power control policy that brings significant energy savings and fairness across users in the system.
Index Terms: power and rate control, reinforcement learning.
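As an illustration of the kind of learning loop such a framework could use, here is a minimal tabular Q-learning sketch for downlink power control. The toy channel model, the SINR-bin state, and the rate-minus-energy reward weighting are our own assumptions for the example; the paper's actual state characterization, reward function, and policy-learning method are more elaborate.

```python
# Tabular Q-learning sketch of downlink power control. The environment
# model, state discretization, and reward weights are assumptions made
# for this example only.
import numpy as np

rng = np.random.default_rng(0)
POWER_LEVELS = np.array([0.1, 0.25, 0.5, 1.0])  # Watts, hypothetical
N_STATES = 10                                    # quantized SINR bins
ALPHA, GAMMA, EPS, LAMBDA = 0.1, 0.9, 0.1, 0.5   # assumed hyperparameters

Q = np.zeros((N_STATES, len(POWER_LEVELS)))

def step(state, power):
    """Toy environment: higher power raises SINR but costs energy."""
    gain = 10 ** (-0.1 * (state + 1))            # crude path loss per bin
    sinr = power * gain / 1e-3                   # fixed noise floor
    reward = np.log2(1 + sinr) - LAMBDA * power  # rate minus energy penalty
    next_state = rng.integers(N_STATES)          # i.i.d. fading, for brevity
    return reward, next_state

state = rng.integers(N_STATES)
for _ in range(50_000):
    # epsilon-greedy action selection over the power levels
    a = rng.integers(len(POWER_LEVELS)) if rng.random() < EPS else Q[state].argmax()
    r, s_next = step(state, POWER_LEVELS[a])
    # standard Q-learning temporal-difference update
    Q[state, a] += ALPHA * (r + GAMMA * Q[s_next].max() - Q[state, a])
    state = s_next

print("learned power per SINR bin:", POWER_LEVELS[Q.argmax(axis=1)])
```

The energy-saving behavior the abstract reports corresponds here to the penalty term pulling the learned policy toward the lowest power level that still earns adequate rate in each channel state.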
Prediction of user traffic in cellular networks has attracted considerable attention as a means of improving resource utilization. In this paper, we study the problem of network traffic prediction and classification, employing standard machine learning and statistical time series prediction methods, namely long short-term memory (LSTM) and autoregressive integrated moving average (ARIMA), respectively. We present an extensive experimental evaluation of the designed tools over a real network traffic dataset. Within this analysis, we explore the impact of different parameters on the effectiveness of the predictions. We further extend our analysis to network traffic classification and the prediction of traffic bursts. The results, on the one hand, demonstrate the superior performance of LSTM over ARIMA in general, especially when the training time series is long enough and is augmented by a wisely selected set of features. On the other hand, the results shed light on the circumstances in which ARIMA performs close to optimal with lower complexity.
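The following sketch contrasts the two predictor families from the abstract on a synthetic traffic trace, purely to show the shape of each pipeline. The window length, the LSTM size, the (2, 1, 2) ARIMA order, and the synthetic daily-periodic series are illustrative assumptions, not the paper's dataset or tuned settings.

```python
# Minimal LSTM-vs-ARIMA forecasting comparison on a synthetic trace.
# All model settings below are assumptions for illustration only.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
import tensorflow as tf

# synthetic "hourly traffic": daily periodicity plus noise (60 days)
t = np.arange(24 * 60, dtype=float)
series = 10 + 5 * np.sin(2 * np.pi * t / 24) \
    + np.random.default_rng(0).normal(0, 0.5, t.size)
train, test = series[:-24], series[-24:]

# --- ARIMA: fit on the raw series, forecast one day ahead ---
arima_fc = ARIMA(train, order=(2, 1, 2)).fit().forecast(steps=24)

# --- LSTM: supervised windows of the last 24 samples -> next sample ---
W = 24
X = np.stack([train[i:i + W] for i in range(len(train) - W)])[..., None]
y = train[W:]
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(W, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# iterative multi-step forecast: feed each prediction back into the window
window = train[-W:].copy()
lstm_fc = []
for _ in range(24):
    nxt = float(model.predict(window[None, :, None], verbose=0)[0, 0])
    lstm_fc.append(nxt)
    window = np.append(window[1:], nxt)

for name, fc in (("ARIMA", arima_fc), ("LSTM", np.array(lstm_fc))):
    print(name, "RMSE:", np.sqrt(np.mean((fc - test) ** 2)))
```

Note how the trade-off in the abstract shows up even in this toy setup: the ARIMA fit is a single cheap call, while the LSTM needs windowed training data and an iterative forecast loop, which only pays off when the training series is long and feature-rich.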