Summary Path loss prediction models occupy a central role in wireless signal propagation because of the continuous need to deliver reliable, high-quality service to subscribers. However, adopting deterministic and empirical models for path loss characterization presents a sharp trade-off between simplicity and accuracy. On the one hand, empirical models are relatively simple to apply but are often inaccurate and inconsistent; deterministic models, on the other hand, are more accurate but complex to develop, time-consuming, and non-adaptable. Toward this end, this paper addresses the problems associated with the existing empirical and deterministic models by introducing machine learning algorithms for path loss prediction. The contribution of this paper is threefold. First, experimental data were collected in multi-transmitter scenarios via drive tests across six base transceiver stations, and path loss was derived from the received signal levels and analyzed. Second, two machine learning-based path loss prediction models were developed using the measured data as input variables: a radial basis function neural network (RBFNN) and a multilayer perceptron neural network (MLPNN). Both models were compared against the measured path loss, and the RBFNN proved more accurate, with lower root mean squared error (RMSE) than the MLPNN. Finally, the proposed machine learning-based models (MLPNN and RBFNN) were compared against five existing empirical models, and again the RBFNN produced the most accurate results.
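The measured drive-test data are not reproduced here, so the following is only a minimal sketch of the modelling idea: fit an MLP regressor and a simple Gaussian RBF network to (distance, path loss) samples and compare their RMSE. The synthetic data, layer sizes, and spread heuristic are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch (not the authors' implementation): compare an MLPNN and a
# Gaussian RBFNN on path-loss samples, with synthetic data standing in for
# the drive-test measurements.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
d_km = rng.uniform(0.05, 5.0, 500)                                # Tx-Rx separation (km)
pl_db = 120 + 35 * np.log10(d_km) + rng.normal(0, 6, d_km.size)   # illustrative path loss (dB)
X, y = d_km.reshape(-1, 1), pl_db

# MLPNN: one hidden layer, trained on (distance -> path loss)
mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0).fit(X, y)

# RBFNN: Gaussian basis functions centred by k-means, linear output weights
centres = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
sigma = np.ptp(X) / centres.shape[0]                               # shared spread (heuristic)
phi = np.exp(-((X - centres.T) ** 2) / (2 * sigma ** 2))
w, *_ = np.linalg.lstsq(np.c_[phi, np.ones(len(X))], y, rcond=None)
rbf_pred = np.c_[phi, np.ones(len(X))] @ w

print("MLPNN RMSE:", mean_squared_error(y, mlp.predict(X)) ** 0.5)
print("RBFNN RMSE:", mean_squared_error(y, rbf_pred) ** 0.5)
```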
Atmospheric impairment-induced attenuation is a prominent source of signal degradation in radio wave communication channels. Computation-based modeling of radio wave attenuation through the atmosphere is the stepwise application of relevant propagation models, data, and procedures to estimate, predictively and effectively, the losses induced in the propagated signal by atmospheric constituents. This contribution performs a detailed prognostic evaluation of radio wave attenuation due to rain, free space, atmospheric gases, and cloud at the ultra-high frequency (UHF) band. This is achieved by employing relevant empirical atmospheric data and suitable propagation models for robust prognostic modeling based on experimental measurements. Additionally, the extrapolative attenuation estimates and the performance analysis were produced using the stepwise propagation models and computation parameters commonly employed in Earth-satellite and terrestrial communications. Results indicate that attenuation rises steadily with increasing carrier frequency, with free-space loss being the dominant component. The attenuation levels due to rain, cloud, atmospheric gases, and free space also depend on droplet depths, sizes, composition, and statistical distribution. While moderate and heavy rain reached 3 dB and 4 dB of attenuation respectively, light rainfall reached about 2.5 dB. The results also reveal that attenuation induced by atmospheric gases and cloud is smaller than that induced by rain. These prognostic, empirically grounded attenuation results can provide first-hand information to radio transmission engineers on link budgets covering various atmospheric impairment effects during radio frequency network design, deployment, and management, particularly at the UHF band.
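As a hedged illustration of the kind of link-budget terms discussed above (not the paper's computation), the sketch below evaluates the standard free-space path loss formula and a rain term in the ITU-R style power-law form, gamma = k * R^alpha. The k and alpha values are placeholders; the real coefficients depend on frequency and polarisation.

```python
# Illustrative sketch: free-space path loss and rain-induced attenuation in
# power-law form. Coefficient values are assumed for demonstration only.
import math

def free_space_loss_db(f_mhz: float, d_km: float) -> float:
    """Free-space path loss: 32.45 + 20*log10(f_MHz) + 20*log10(d_km)."""
    return 32.45 + 20 * math.log10(f_mhz) + 20 * math.log10(d_km)

def rain_attenuation_db(rain_rate_mm_h: float, path_km: float,
                        k: float = 0.01, alpha: float = 1.0) -> float:
    """Path attenuation A = k * R^alpha * path length (dB); k, alpha are placeholders."""
    return k * rain_rate_mm_h ** alpha * path_km

for f in (700, 900, 2100):                 # example UHF carriers, MHz
    print(f"{f} MHz, 5 km link:",
          round(free_space_loss_db(f, 5.0), 1), "dB free space,",
          round(rain_attenuation_db(25.0, 5.0), 2), "dB rain (25 mm/h)")
```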
Modern cellular communication networks are already being strained by a large and steadily increasing number of mobile subscribers demanding better service quality. To deploy such networks reliably and manage them optimally, the radio signal attenuation loss along the path between a base station transmitter and a mobile station receiver must be estimated accurately. Although many log-distance-based linear models for path loss prediction in wireless cellular networks exist, radio frequency planning requires advanced non-linear models for more accurate path loss estimation, particularly in complex microcellular environments. The precision of the conventional models has been reported in several works to range from about 8 to 12 dB in terms of Root Mean Square Error (RMSE), which is high compared with the commonly accepted error limit of 0 to 6 dB. Toward this end, near-precise machine learning-based path loss prediction models become imperative. This work develops a distinctive multi-layer perceptron (MLP) neural network-based path loss model with a well-structured network architecture, empowered by grid-search-based hyperparameter tuning. The proposed model is designed for optimal path loss approximation between the mobile station and the base station. The hyperparameters examined include the number of neurons, the learning rate, and the number of hidden layers. The prediction accuracy of the developed MLP model, using different learning and training algorithms with the tuned best hyperparameter values, was evaluated on extensive experimental path loss datasets. The experimental path loss data were acquired via a field drive test conducted over an operational 4G LTE network in an urban microcellular environment. The results were assessed using several first-order statistical performance indicators and show that the prediction errors of the proposed MLP model compare favourably with the measured data and are lower than those obtained with conventional log-distance-based path loss models.
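A minimal sketch of the grid-search idea follows, assuming synthetic features in place of the 4G LTE drive-test data; the grid values and feature choices are illustrative, not the authors' exact configuration.

```python
# Hedged sketch: tune the number of neurons, number of hidden layers, and
# learning rate of an MLP regressor via grid search with cross-validation.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform([0.05, 1.0], [5.0, 30.0], size=(400, 2))   # distance (km), clutter height (m) - assumed features
y = 118 + 36 * np.log10(X[:, 0]) + 0.2 * X[:, 1] + rng.normal(0, 5, 400)

param_grid = {
    "hidden_layer_sizes": [(10,), (20,), (20, 10)],         # neurons per layer / layer count
    "learning_rate_init": [1e-3, 1e-2],
}
search = GridSearchCV(MLPRegressor(max_iter=5000, random_state=1),
                      param_grid, scoring="neg_root_mean_squared_error", cv=5)
search.fit(X, y)
print("best hyperparameters:", search.best_params_)
print("cross-validated RMSE:", -search.best_score_)
```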
The construction industry has always been slower than other sectors of the economy to adopt innovative technologies. Although some technologies, such as Building Information Modelling (BIM) and robotics, have been implemented, their adoption has faced challenges. Blockchain technology is also considered a game-changer for the construction sector, with the functionality and capability to improve the construction supply chain, transparency, sustainability, and the like. Hence, this study uses a system dynamics approach to conceptualize the complex causal interrelationships among the key factors influencing blockchain technology adoption in the construction industry. The analytical findings revealed that stakeholders' awareness and satisfaction, support from top management, and the development of standardized and compatible blockchain solutions would enhance adoption in construction firms and the wider industry. The study also emphasizes the need to integrate blockchain technology with existing technologies to facilitate the delivery of smart buildings and cities and to enhance the operation of modular integrated construction (MiC) projects both in Hong Kong and overseas.
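To make the system dynamics framing concrete, the toy sketch below integrates a single "adopting firms" stock whose inflow is reinforced by stakeholder awareness, top-management support, and an imitation loop. The causal structure and every parameter value are assumed for illustration and are not taken from the study.

```python
# Highly simplified stock-and-flow sketch of a system-dynamics adoption model.
# All parameters are hypothetical placeholders, not the study's estimates.
def simulate_adoption(steps: int = 40, dt: float = 1.0,
                      awareness: float = 0.6, mgmt_support: float = 0.5,
                      total_firms: float = 1000.0) -> list[float]:
    adopters = 10.0                        # initial stock of adopting firms
    history = [adopters]
    for _ in range(steps):
        pressure = adopters / total_firms              # imitation / word-of-mouth loop
        adoption_rate = (0.02 * awareness + 0.03 * mgmt_support + 0.3 * pressure) \
                        * (total_firms - adopters)     # limited by remaining non-adopters
        adopters += adoption_rate * dt
        history.append(adopters)
    return history

print(round(simulate_adoption()[-1]))      # adopting firms after 40 periods
```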