Abstract: The availability of location information has become a key factor in today's communications systems, enabling location-based services. In outdoor scenarios, the mobile terminal's position can be obtained with high accuracy thanks to the Global Positioning System (GPS) or to standalone cellular systems. However, the main limitation of GPS and cellular systems lies in indoor environments and in scenarios with deep shadowing effects, where the satellite or cellular signals are blocked. In this paper, we survey different technologies and methodologies for indoor and outdoor localization, with an emphasis on indoor methodologies and concepts. Additionally, we review different localization-based applications for which accurate location information is critical. Finally, a comprehensive discussion of the challenges in terms of accuracy, cost, complexity, security, and scalability is given. The aim of this survey is to provide a comprehensive overview of existing efforts as well as promising and anticipated directions for future work in indoor localization techniques and applications.
Cognitive radios are expected to play a major role in meeting the exploding traffic demand over wireless systems. A cognitive radio node senses the environment, analyzes the environment parameters, and then makes decisions for dynamic time-frequency-space resource allocation and management to improve the utilization of the radio spectrum. For efficient real-time processing, the cognitive radio is usually combined with artificial intelligence and machine-learning techniques so that an adaptive and intelligent allocation is achieved. This paper first presents cognitive radio networks, their resources, objectives, constraints, and challenges. It then introduces artificial intelligence and machine-learning techniques, emphasizes the role of learning in cognitive radios, and surveys the state of the art of machine-learning techniques in cognitive radios. The literature survey is organized by artificial intelligence technique: fuzzy logic, genetic algorithms, neural networks, game theory, reinforcement learning, support vector machines, case-based reasoning, entropy, Bayesian methods, Markov models, multi-agent systems, and the artificial bee colony algorithm. This paper also discusses cognitive radio implementation and the learning challenges foreseen in cognitive radio applications.

Introduction

According to the Cisco Visual Networking Index, global IP traffic will reach 168 exabytes per month by 2019 [1], and the number of devices will be three times the global population. In addition, resources in terms of power and bandwidth are scarce. Therefore, novel solutions are needed to minimize energy consumption and optimize resource allocation. Cognitive radio (CR) was introduced by Joseph Mitola III and Gerald Q. Maguire in 1999 for flexible spectrum access [2]. They defined cognitive radio as the integration of model-based reasoning with software radio technologies [3].
In 2005, Simon Haykin gave a review of the cognitive radio concept and treated it as brain-empowered wireless communications [4]. A cognitive radio is a radio or system that senses the environment, analyzes its transmission parameters, and then makes decisions for dynamic time-frequency-space resource allocation and management to improve the utilization of the radio electromagnetic spectrum. Generally, radio resource management aims at optimizing the utilization of various radio resources so that the performance of the radio system is improved. For instance, the authors in [5] proposed an optimal resource (power and bandwidth) allocation in cognitive radio networks (CRNs), specifically in the spectrum underlay scenario, while taking into consideration interference temperature limits. The optimization formulations provide optimal solutions for resource allocation, sometimes at the expense of global convergence, computation time, and...
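The sense-analyze-decide cycle described above can be illustrated with a minimal sketch. All channel names, energy values, and the detection threshold below are hypothetical, and the simple energy-detection rule is only one of many possible analysis steps, not a method prescribed by the surveyed papers:

```python
def analyze(measurements, threshold=0.5):
    """Treat channels whose sensed energy falls below the threshold
    as vacant (a simple energy-detection rule)."""
    return [ch for ch, energy in measurements.items() if energy < threshold]

def decide(vacant, measurements):
    """Among the vacant channels, pick the one with the lowest
    sensed energy to minimize interference with primary users."""
    if not vacant:
        return None
    return min(vacant, key=lambda ch: measurements[ch])

# Simulated sensing result: energy per channel (illustrative values).
sensed = {"ch1": 0.82, "ch2": 0.31, "ch3": 0.07, "ch4": 0.64}

vacant = analyze(sensed)            # channels deemed free
selected = decide(vacant, sensed)   # channel chosen for transmission
```

In a real cognitive radio, the sensing step would come from the RF front end and the decision would feed a learning component that adapts the threshold and channel choice over time.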
Driven by the special requirements of Low-power and Lossy Networks (LLNs), the IPv6 Routing Protocol for LLNs (RPL) was standardized by the IETF some six years ago to tackle the routing issue in such networks. Since its introduction, however, numerous studies have pointed out that, in its current form, RPL suffers from issues that limit its efficiency and domain of applicability. Thus, several solutions have been proposed in the literature in an attempt to overcome these identified limitations. In this survey, we aim mainly to provide a comprehensive review of these research proposals, assessing whether they have succeeded in overcoming the reported limitations related to the standard's core operations. Although some of RPL's weaknesses have been addressed successfully, the study found that the proposed solutions remain deficient in overcoming several others. Hence, the study investigates where such proposals still fall short and identifies the challenges and pitfalls to avoid, helping researchers formulate a clear foundation for the development of further successful extensions that would allow the protocol to be applied more widely.
Medium Access Control (MAC) protocols based on Time Division Multiple Access (TDMA) can improve the reliability and efficiency of Wireless Body Area Networks (WBANs). However, the traditional static TDMA techniques adopted by IEEE 802.15.4 and IEEE 802.15.6 do not sufficiently consider the channel status or the buffer requirements of the nodes within heterogeneous contexts. Although some solutions have been proposed to alleviate the effect of deep fades in the WBAN channel by adopting dynamic slot allocation, these solutions still suffer from reliability and energy efficiency issues and do not avoid deep channel fading. This paper presents two novel and generic TDMA-based techniques to improve WBAN reliability and energy efficiency. Both techniques synchronize nodes adaptively while accounting for their channel and buffer status in normal and emergency contexts. Extensive simulation experiments using various traffic rates and time slot lengths demonstrate that the proposed techniques improve reliability and energy efficiency compared to IEEE 802.15.4 and IEEE 802.15.6 in both the normal and emergency contexts. This improvement has been achieved in terms of packet loss (by up to 90%) and energy consumption (by up to 13%), confirming the significant enhancements made by the developed scheduling techniques.
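The idea of dynamic slot allocation driven by channel and buffer status can be sketched as follows. This is a minimal illustration of the general principle only: the fade threshold, the proportional-to-backlog rule, and all node values are assumptions for the example, not the scheduling rules of the paper or of IEEE 802.15.4/802.15.6:

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    channel_quality: float  # 0.0 (deep fade) .. 1.0 (good channel)
    buffer_occupancy: int   # packets queued for transmission

def allocate_slots(nodes, total_slots, fade_threshold=0.2):
    """Split the TDMA frame in proportion to each node's buffer
    backlog, skipping nodes whose channel is in deep fade so that
    slots are not wasted on transmissions likely to fail."""
    eligible = [n for n in nodes if n.channel_quality >= fade_threshold]
    demand = sum(n.buffer_occupancy for n in eligible)
    allocation = {n.name: 0 for n in nodes}
    if demand == 0:
        return allocation
    for n in eligible:
        allocation[n.name] = round(total_slots * n.buffer_occupancy / demand)
    return allocation

# Example frame of 8 slots: n2 is in deep fade and gets none,
# n1 and n3 share slots in proportion to their backlogs.
nodes = [Node("n1", 0.9, 6), Node("n2", 0.1, 4), Node("n3", 0.8, 2)]
frame = allocate_slots(nodes, total_slots=8)
```

A static TDMA scheduler, by contrast, would give n2 its fixed slots regardless of the fade, which is precisely the inefficiency the abstract describes.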