2022
DOI: 10.1109/tcds.2021.3063273

Conditional Generative Adversarial Networks for Optimal Path Planning

Cited by 33 publications (19 citation statements)
References 22 publications
“…Deep learning techniques such as the Conditional Variational AutoEncoder (CVAE), CNNs, GANs and their variants have been widely used in motion planning, generating a processed configuration space in advance to guide the expansion of classical motion-planning algorithms [19][20][21][22][23][24][25][26][27][28][29][30].…”
Section: Module Replacement Algorithms
Confidence: 99%
“…The LEGO algorithm trains the CVAE model using target samples in bottleneck regions and diverse regions to plan with sparse road maps. Takahashi et al. [28] applied a GAN model with a U-Net backbone to learn heuristic functions for the grid-based A* algorithm, reducing the search cost in 2-D space, whereas Ma et al. [29] utilised a similar architecture to guide the sampling process of the classical sampling-based RRT* algorithm, which significantly accelerates convergence to the optimal path and generates a high-quality initial path. Figure 2 illustrates the experimental results of four maps presented in Ref.…”
Section: CVAE and GAN-related Pre-processing Module
Confidence: 99%
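To make the guided-sampling idea in the statement above concrete, here is a minimal sketch (not code from the cited papers) of how a predicted promising-region heatmap could bias the random-sampling step of an RRT*-style planner. The hand-made heatmap, the 50x50 map size, and the uniform_prob mixing parameter are all placeholder assumptions; in the cited work the heatmap would come from a trained GAN/U-Net generator conditioned on the map image and the start/goal states.

# Minimal sketch of promising-region-biased sampling for an RRT*-style planner.
# The heatmap below is a hand-made placeholder; in the cited works it would be
# produced by a trained GAN/U-Net generator conditioned on the map and the
# start/goal states.
import numpy as np

rng = np.random.default_rng(0)

def biased_sample(heatmap, uniform_prob=0.3):
    """With probability `uniform_prob` sample the whole grid uniformly,
    otherwise sample a cell in proportion to its predicted heatmap value."""
    h, w = heatmap.shape
    if rng.random() < uniform_prob:
        return int(rng.integers(h)), int(rng.integers(w))
    probs = heatmap.ravel() / heatmap.sum()
    idx = rng.choice(probs.size, p=probs)
    return int(idx // w), int(idx % w)

# Placeholder "prediction": a bright diagonal band between (5, 5) and (45, 45).
heatmap = np.full((50, 50), 1e-3)
for t in np.linspace(0.0, 1.0, 100):
    r, c = int(5 + 40 * t), int(5 + 40 * t)
    heatmap[max(r - 2, 0):r + 3, max(c - 2, 0):c + 3] = 1.0

samples = [biased_sample(heatmap) for _ in range(1000)]
in_band = sum(heatmap[r, c] == 1.0 for r, c in samples)
print(f"{in_band / len(samples):.0%} of samples fall inside the predicted band")

Mixing a small share of uniform samples, as above, is a common way to keep probabilistic completeness even when the predicted region misses part of the optimal path.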
“…In [27], Khan et al. use Graph Neural Networks (GNNs) to encode the topology of the state space, and two different methods are proposed, one for a constant graph and another for an incremental graph. [10] and [11] predict a promising region to guide the sampling. However, the effect of the connectivity quality of the promising region is not stated, and the connectivity of the promising region is not enhanced.…”
Section: Related Work
Confidence: 99%
“…In [10], Neural RRT* utilizes a nonuniform sampling distribution predicted by a Convolutional Neural Network (CNN). Generative Adversarial Networks (GANs) can also be used to achieve nonuniform sampling [11] [12]. However, these works do not discuss the effect of the connectivity of the predicted promising region.…”
Confidence: 99%
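The connectivity concern raised in the last two statements can be illustrated with a short sketch (an assumption-laden illustration, not code from the cited works): a flood fill that checks whether the start and goal cells lie in the same connected component of a thresholded promising-region mask. The region mask and the cell coordinates below are placeholders standing in for a thresholded CNN/GAN prediction.

# Sketch: check whether start and goal lie in the same connected component
# of a (placeholder) thresholded promising-region mask.
from collections import deque

import numpy as np

def connected(region, start, goal):
    """4-connected flood fill from `start`; True if `goal` is reached."""
    if not (region[start] and region[goal]):
        return False
    h, w = region.shape
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w and region[nr, nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

# Placeholder prediction with a gap: the promising region is disconnected,
# so a planner that samples only inside it cannot find a path without extra
# uniform samples or a repaired region.
region = np.zeros((20, 20), dtype=bool)
region[10, :8] = True
region[10, 12:] = True
print(connected(region, (10, 0), (10, 19)))  # False: the gap breaks connectivity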