2020
DOI: 10.1007/978-3-030-58580-8_8

TopoGAN: A Topology-Aware Generative Adversarial Network

Cited by 26 publications (14 citation statements) · References 31 publications
“…Based on the theory of algebraic topology (Munkres, 2018), persistent homology (Edelsbrunner et al, 2000; Edelsbrunner & Harer, 2010) extends the classical notion of homology and can capture the topological structures (e.g., loops, connected components) of the input data in a robust (Cohen-Steiner et al, 2007) manner. It has already been combined with various deep learning methods, including kernel machines (Reininghaus et al, 2015; Kusano et al, 2016; Carriere et al, 2017), convolutional neural networks (Hofer et al, 2017; Hu et al, 2019; Wang et al, 2020; Zheng et al, 2021), transformers (Zeng et al, 2021), connectivity loss (Chen et al, 2019; Hofer et al, 2019), and graph neural networks (Zhao et al, 2020; Yan et al, 2021; Zhao & Wang, 2019; Hofer et al, 2020; Carrière et al, 2020).…”
Section: Related Work
confidence: 99%
“…1 (left). This has motivated previous work defining segmentation loss functions that encourage topologically consistent segmentations via persistent homology (Hu et al, 2019; Wang et al, 2020a). Such losses compare the numbers of components and loops between ground truth and segmentation throughout a parameterized "filtration".…”
Section: Introduction
confidence: 98%
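The excerpt above describes losses that compare component and loop counts between a ground-truth mask and a predicted segmentation as a threshold sweeps through the probability map. The cited methods use full persistent homology; as a much simplified sketch of the underlying idea, one can count connected components of a probability map binarized at each threshold and penalize mismatches against the ground truth. The function name `component_counts` and the toy arrays below are illustrative, not from the cited papers:

```python
import numpy as np
from scipy.ndimage import label

def component_counts(prob_map, thresholds):
    """Count connected components of prob_map binarized at each threshold.

    Sweeping the threshold is a crude stand-in for a filtration: it records
    how 0-dimensional topology (number of components) changes as the map is
    thresholded at different levels.
    """
    counts = []
    for t in thresholds:
        _, n = label(prob_map >= t)  # label() returns (labeled array, count)
        counts.append(n)
    return counts

# Toy example: ground truth is one blob; the prediction has a weak column
# of probabilities that splits the blob at high thresholds.
gt = np.zeros((8, 8))
gt[2:6, 2:6] = 1.0
pred = gt.copy()
pred[2:6, 4] = 0.1

thresholds = [0.05, 0.5]
gt_counts = component_counts(gt, thresholds)      # one blob at both levels
pred_counts = component_counts(pred, thresholds)  # splits in two at 0.5
# A topology-style penalty: component-count mismatch across the sweep.
penalty = sum(abs(a - b) for a, b in zip(gt_counts, pred_counts))
```

Unlike the persistent-homology losses in the cited work, this count-based penalty is not differentiable; those methods instead match birth/death pairs in persistence diagrams to obtain gradients with respect to the probability map.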
“…By the stability theorem of PDs [25,71], close distances between shapes or functions on them imply close distances between their PDs; thus, computing diagram distances efficiently becomes important. This can benefit a growing list of applications, such as clustering [29,53,60], classification [17,56,72], and deep learning [76], that make use of topological persistence for analyzing data. The 1-Wasserstein (W 1 ) distance is a common distance for comparing persistence diagrams; Hera [50] is a widely used open-source software for it.…”
Section: Introduction
confidence: 99%
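The excerpt above concerns the 1-Wasserstein distance between persistence diagrams, which Hera computes with specialized geometric algorithms. As a small illustrative sketch (not Hera's algorithm), the distance can be computed by brute force as an optimal assignment: augment each diagram with the diagonal projections of the other's points, so unmatched points can be matched to the diagonal, then solve a linear assignment problem. The function name `wasserstein_1` and the L-infinity ground cost are assumptions for this sketch:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def wasserstein_1(d1, d2):
    """Brute-force 1-Wasserstein distance between two persistence diagrams.

    Each diagram is a sequence of (birth, death) points. Both sides are
    augmented with the diagonal projections of the other's points, so any
    point may be matched to the diagonal at its distance to the diagonal.
    Ground cost is the L-infinity distance, a standard choice for diagrams.
    """
    d1 = np.asarray(d1, dtype=float).reshape(-1, 2)
    d2 = np.asarray(d2, dtype=float).reshape(-1, 2)
    n, m = len(d1), len(d2)

    def diag(p):
        # Orthogonal projection of each point onto the diagonal y = x.
        c = (p[:, 0] + p[:, 1]) / 2.0
        return np.stack([c, c], axis=1)

    rows = np.vstack([d1, diag(d2)])  # n real points + m diagonal slots
    cols = np.vstack([d2, diag(d1)])  # m real points + n diagonal slots
    cost = np.max(np.abs(rows[:, None, :] - cols[None, :, :]), axis=2)
    cost[n:, m:] = 0.0                # diagonal-to-diagonal matches are free
    r, c = linear_sum_assignment(cost)
    return cost[r, c].sum()
```

This O((n+m)^3) assignment solve is only practical for small diagrams; the point of Hera and related software is to exploit the geometry of the plane to scale to large diagrams.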