2022 IEEE International Systems Conference (SysCon)
DOI: 10.1109/syscon53536.2022.9773922
A new SysML Model for UAV Swarm Modeling: UavSwarmML

Cited by 13 publications (8 citation statements)
References 15 publications
“…Actions allow a client to cancel the request before it is completed, track the progress of a request and obtain the final result.[43] The swarm developer can use the new diagram shown in Figure 8 to describe the mission of the swarm and, with this representation, implementation on ROS will be easier later. In this diagram, we choose the ellipse shape to represent the node and the rectangle shape to represent the topic.…”
Section: Phase 1: Modeling Using MBSE Methods
Citation type: mentioning (confidence: 99%)
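The excerpt above names the three capabilities of a ROS 2 action: cancelling an in-flight request, tracking its progress via feedback, and obtaining the final result. As a point of reference only, here is a minimal rclpy action-client sketch that exercises those three capabilities. It is not the paper's generated code; the Fibonacci action type, the node name, and the action name are assumptions borrowed from the standard ROS 2 examples.

```python
# Minimal sketch of a ROS 2 action client (rclpy), illustrating the three
# capabilities named in the excerpt: tracking progress via feedback,
# obtaining the final result, and cancelling a goal before completion.
# Assumes the Fibonacci example action shipped with example_interfaces;
# a swarm mission would substitute its own action type here.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from example_interfaces.action import Fibonacci  # example action type (assumption)


class MissionActionClient(Node):
    def __init__(self):
        super().__init__('mission_action_client')
        # An action client talks to one action server, identified by name.
        self._client = ActionClient(self, Fibonacci, 'fibonacci')

    def send_goal(self, order: int):
        goal = Fibonacci.Goal()
        goal.order = order
        self._client.wait_for_server()
        # Feedback callback = "track the progress of a request".
        send_future = self._client.send_goal_async(
            goal, feedback_callback=self._on_feedback)
        rclpy.spin_until_future_complete(self, send_future)
        goal_handle = send_future.result()
        if not goal_handle.accepted:
            self.get_logger().info('Goal rejected')
            return
        # goal_handle.cancel_goal_async() could be called here to
        # "cancel the request before it is completed".
        result_future = goal_handle.get_result_async()
        rclpy.spin_until_future_complete(self, result_future)
        # "Obtain the final result" once the server finishes.
        self.get_logger().info(f'Result: {result_future.result().result}')

    def _on_feedback(self, feedback_msg):
        self.get_logger().info(f'Feedback: {feedback_msg.feedback}')


def main():
    rclpy.init()
    node = MissionActionClient()
    node.send_goal(10)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```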
“…It plays an important role for the tracking and localization process, helping in eliminating drift errors for camera poses attached to the robot (Sheng et al., 2019; Hsiao et al., 2017). Subsequently, this keyframe is sent for further processing in the next stage, where it will be shaped into a preliminary map, a crucial part for the third stage of the workflow (Aloui et al., 2022; Zhang et al., 2020).…”
Section: Visual SLAM Paradigm
Citation type: mentioning (confidence: 99%)
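For orientation, the sketch below shows a generic keyframe-selection heuristic of the kind the excerpt alludes to: a frame is promoted to a keyframe when the camera has moved far enough or when tracking quality drops, and selected keyframes are then handed to the mapping stage. The thresholds, the Frame fields, and the decision rule are illustrative assumptions, not the criteria of the cited SLAM systems.

```python
# Generic keyframe-selection sketch (assumed heuristic, not the cited method).
from dataclasses import dataclass
import numpy as np


@dataclass
class Frame:
    pose: np.ndarray          # 4x4 camera-to-world transform
    tracked_ratio: float      # fraction of reference features still tracked


def is_new_keyframe(frame: Frame, last_keyframe: Frame,
                    min_translation=0.25,   # metres (assumed threshold)
                    min_rotation=0.20,      # radians (assumed threshold)
                    min_tracked=0.6):       # tracking-quality floor (assumed)
    """Decide whether `frame` should become the next keyframe."""
    # Relative motion between the last keyframe and the current frame.
    rel = np.linalg.inv(last_keyframe.pose) @ frame.pose
    translation = np.linalg.norm(rel[:3, 3])
    # Rotation angle recovered from the trace of the relative rotation matrix.
    rotation = np.arccos(np.clip((np.trace(rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0))
    moved_enough = translation > min_translation or rotation > min_rotation
    tracking_weak = frame.tracked_ratio < min_tracked
    # Keyframes selected here would feed the next stage, where they are
    # assembled into the preliminary map mentioned in the excerpt.
    return moved_enough or tracking_weak
```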
“…Recently, visual SLAM has changed a lot and made a big impact on robotics and computer vision (Khoyani and Amini, 2023). Along this journey, different V-SLAM methods have been created to tackle specific challenges in robot navigation, mapping, and understanding the surroundings (Aloui et al., 2022; Sun et al., 2017). To verify and compare these V-SLAM methods, important datasets have been created which played a crucial role in the field (Pal et al., 2022; Tian et al., 2023a).…”
Section: Visual SLAM Evolution and Datasets
Citation type: mentioning (confidence: 99%)
“…Table 2 provides the illustrative validation elements of four design requirements for the TurtleBot 3 Burger.[35] The table shows the requirements, the modeling information with SysML/ROS2ML, the simulation carried out in the ROS 2 environment, and the implementation on the real robot.…”
Section: Phase 3: Implementation of the Physical Robot
Citation type: mentioning (confidence: 99%)
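As a hedged illustration of what checking one such requirement in the ROS 2 environment could look like, the sketch below monitors commanded velocities in simulation against an assumed speed limit for the TurtleBot 3 Burger. The requirement, the topic, and the limit are illustrative assumptions and do not reproduce the paper's Table 2.

```python
# Illustrative only: a tiny ROS 2 node that checks one plausible design
# requirement (maximum commanded linear speed of a TurtleBot 3 Burger,
# assumed here to be 0.22 m/s) against a running simulation.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

MAX_LINEAR_SPEED = 0.22  # m/s, assumed requirement for the Burger model


class RequirementMonitor(Node):
    def __init__(self):
        super().__init__('requirement_monitor')
        # Listen to the velocity commands sent to the (simulated) robot.
        self.create_subscription(Twist, '/cmd_vel', self._check, 10)

    def _check(self, msg: Twist):
        if abs(msg.linear.x) > MAX_LINEAR_SPEED:
            self.get_logger().warn(
                f'Requirement violated: commanded {msg.linear.x:.2f} m/s '
                f'> {MAX_LINEAR_SPEED} m/s')


def main():
    rclpy.init()
    rclpy.spin(RequirementMonitor())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```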