“…The methodology encompassed the evaluation of prospective technologies at all levels of the layered architecture, examination of design limitations specific to each layer, and the implementation of a uniform software development framework across multiple layers (utilizing identical code for different layers, known as SC-FDL). The proposed LA was implemented in a practical case study involving air/ground surveillance [15,16,18,19]. Subsequently, performance trials were carried out in various scenarios to evaluate its effectiveness using real-world data-intake situations.…”
Section: Discussion
“…Contemporary surveillance networks possess the ability to provide trajectories for various types of boats and aircraft in a global or, at the very least, expansive geographical range [18]. The two most commonly utilized systems for air and maritime surveillance are ADS-B and Automatic Identification System (AIS).…”
Section: Problem Definition
“…Emerging technologies encompass activity-based intelligence and the identification of patterns of life. Advanced analysis of trajectories extracted by surveillance systems offers a potential approach to studying these technologies [18]. Cluster algorithms are utilized to partition trajectories into distinct segments of interest.…”
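The excerpt above mentions partitioning surveillance trajectories into segments of interest. As an illustration only, a minimal sketch of one such segmentation is shown below: it splits a trajectory into "stop" and "move" segments by thresholding point-to-point speed. The threshold, the data layout, and the function name are assumptions for this sketch, not details taken from the cited works, which use full clustering algorithms.

```python
# Sketch: split a trajectory into "stop"/"move" segments by grouping
# consecutive points whose point-to-point speed falls on the same side
# of a threshold. This is a simplified stand-in for the clustering step
# described in the excerpt; threshold and data are illustrative.

def segment_trajectory(points, speed_threshold=1.0):
    """points: list of (t, x, y) tuples, sorted by time.
    Returns a list of (label, [points]) segments; the first point is
    assigned to "stop" by convention in this sketch."""
    segments = []
    for i, p in enumerate(points):
        if i == 0:
            label = "stop"
        else:
            t0, x0, y0 = points[i - 1]
            t1, x1, y1 = p
            # Euclidean distance over elapsed time gives the segment speed.
            speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / (t1 - t0)
            label = "move" if speed > speed_threshold else "stop"
        if segments and segments[-1][0] == label:
            segments[-1][1].append(p)   # extend the current segment
        else:
            segments.append((label, [p]))  # start a new segment
    return segments
```

For example, a vessel that sits still for three time steps and then travels at speed 5 yields one "stop" segment of three points followed by one "move" segment of two points.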
This study introduces a novel methodology designed to assess the accuracy of data processing in the Lambda Architecture (LA), an advanced big-data framework designed to process streaming (data in motion) and batch (data at rest) data. Distinct from prior studies that have focused on hardware performance and scalability evaluations, our research uniquely targets the intricate aspects of data-processing accuracy within the various layers of LA. The salient contribution of this study lies in its empirical approach. For the first time, we provide empirical evidence that validates previously theoretical assertions about LA, which have remained largely unexamined due to LA’s intricate design. Our methodology encompasses the evaluation of prospective technologies across all levels of LA, the examination of layer-specific design limitations, and the implementation of a uniform software development framework across multiple layers. Specifically, our methodology employs a unique set of metrics, including data latency and processing accuracy under various conditions, which serve as critical indicators of LA’s accurate data-processing performance. Our findings compellingly illustrate LA’s “eventual consistency”. Despite potential transient inconsistencies during real-time processing in the Speed Layer (SL), the system ultimately converges to deliver precise and reliable results, as informed by the comprehensive computations of the Batch Layer (BL). This empirical validation not only confirms but also quantifies the claims posited by previous theoretical discourse, with our results indicating a 100% accuracy rate under various severe data-ingestion scenarios. We applied this methodology in a practical case study involving air/ground surveillance, a domain where data accuracy is paramount. This application demonstrates the effectiveness of the methodology using real-world data-intake scenarios, thereby distinguishing this study from hardware-centric evaluations.
This study not only contributes to the existing body of knowledge on LA but also addresses a significant literature gap. By offering a novel, empirically supported methodology for testing LA, a methodology with potential applicability to other big-data architectures, this study sets a precedent for future research in this area, advancing beyond previous work that lacked empirical validation.
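The "eventual consistency" behavior described in the abstract can be illustrated with a toy model. In the sketch below, which is an assumption-laden simplification and not the authors' implementation, a Speed Layer keeps incremental counts over recently ingested events (and may lose some on the real-time path), while a Batch Layer periodically recomputes exact counts from the immutable master dataset; a query merges the two views. The class and method names are hypothetical.

```python
# Toy model of Lambda-style eventual consistency (illustrative only):
# the speed view may be transiently wrong, but every event lands in the
# master dataset, so a batch recomputation restores exact results.
from collections import Counter

class LambdaCounts:
    def __init__(self):
        self.master = []              # immutable event log (batch layer input)
        self.batch_view = Counter()   # exact counts, recomputed periodically
        self.speed_view = Counter()   # incremental counts, may be lossy

    def ingest(self, key, delivered=True):
        self.master.append(key)       # always appended to the master data
        if delivered:                 # the real-time path may drop events
            self.speed_view[key] += 1

    def run_batch(self):
        # Recompute the exact view from the full log, then reset the speed
        # view so it only covers events arriving after this batch run.
        self.batch_view = Counter(self.master)
        self.speed_view.clear()

    def query(self, key):
        # Serving layer: merge the batch view with the real-time delta.
        return self.batch_view[key] + self.speed_view[key]
```

Ingesting three events for a key while one is dropped on the real-time path yields a transiently low count of 2; after `run_batch()`, the query converges to the exact count of 3, mirroring the convergence behavior the study reports.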
“…This analysis can also be part of a malicious cyberactivity detection mechanism. Through the algorithmic processing of historical AIS data, it is possible to perform self-reported vessel trajectory reconstruction [48] or learning [49], or to detect suspicious vessel activities [50]. Historical data could also be used to understand the behavior of a ship [51,52].…”
Cybersecurity is becoming increasingly important in ensuring maritime data protection and operational continuity. Ships, ports, surveillance and navigation systems, industrial technology, and cargo and logistics systems all contribute to a complex maritime environment with a significant cyberattack surface. A wide range of cyberattacks is therefore possible in the maritime domain, with the potential to infect vulnerable information and communication systems and compromise safety and security. Navigation and surveillance systems, considered part of the maritime OT sensors, can improve maritime cyber situational awareness. This survey critically investigates whether the fusion of OT data, which are used to provide maritime situational awareness, may also improve the ability to detect cyberincidents in real time or near-real time. It includes a thorough analysis of the relevant literature, emphasizing RF as well as other sensors, and the data-fusion approaches that can help improve maritime cybersecurity.