2022
DOI: 10.48550/arxiv.2203.14335
Preprint

Deep Hierarchical Semantic Segmentation

Abstract: Humans are able to recognize structured relations in observations, allowing us to decompose complex scenes into simpler parts and abstract the visual world at multiple levels. However, such hierarchical reasoning ability of human perception remains largely unexplored in the current semantic segmentation literature. Existing work typically operates on flattened labels and predicts target classes exclusively for each pixel. In this paper, we instead address hierarchical semantic segmentation (HSS), which aims at structu…

Cited by 2 publications (4 citation statements)
References 72 publications (127 reference statements)
“…In evaluating the explicit use of the ICD hierarchy (Section III-D), our results indicate that HMC methods did not yield improvements. This contrasts with reports of substantial gains using HMC methods with neural networks [33,34], in particular for ICD classification using medication data [20]. For the latter, we found that our model's classification performance was generally superior to the results reported by Hansen et al. [20], as discussed in the next subsection.…”
Section: Exploiting the ICD Hierarchical Properties (contrasting)
confidence: 99%
“…The authors of [S9] introduce the TreeMin loss. This loss can be intuitively understood as always penalizing according to the worst relevant prediction in the hierarchy. One advantage of these logical-constraint approaches is that they do not require introducing new hyperparameters to be tuned.…”
Section: Appendix (mentioning)
confidence: 99%
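The excerpt above only conveys the intuition behind the TreeMin loss (penalize the worst relevant prediction on the hierarchy path), not its exact formulation. The PyTorch sketch below is a hypothetical illustration of that intuition, assuming per-class sigmoid scores and a precomputed binary ancestor matrix; the precise loss defined in [S9] may differ, and the function name and tensor layout here are assumptions made for this example.

```python
# Hypothetical TreeMin-style hierarchical loss sketch; not the exact loss from [S9].
import torch
import torch.nn.functional as F

def treemin_style_loss(logits, targets, ancestors):
    """
    logits:    (N, C) raw per-class scores.
    targets:   (N,) index of the ground-truth leaf class for each sample.
    ancestors: (C, C) binary tensor; ancestors[c, a] = 1 if class `a` is `c`
               itself or one of its ancestors in the class tree.
    """
    probs = torch.sigmoid(logits)                 # (N, C) per-class probabilities
    relevant = ancestors[targets].bool()          # (N, C) mask: true class and its ancestors
    # "Worst relevant prediction": lowest probability among the classes that
    # should be on for this sample (the leaf and every ancestor on its path).
    worst = probs.masked_fill(~relevant, 1.0).min(dim=1).values   # (N,)
    # Penalize that single worst score, here with binary cross-entropy against 1.
    return F.binary_cross_entropy(worst, torch.ones_like(worst))
```

Because the gradient flows only through the minimum-scoring relevant class, such a loss introduces no extra weighting hyperparameters, which matches the advantage the cited text attributes to logical-constraint approaches.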
“…InternImage [4] is a large-scale CNN that incorporates deformable convolution [5] variants, which allow it to capture long-range dependencies with adaptive spatial aggregation. The HSSN framework, introduced in [6], adapts existing segmentation networks to incorporate hierarchy information for improved learning. It forms a tree-structured hierarchy of the latent class dependencies, representing concepts and their relationships, which is used during training to map pixels to their classes and during inference to find the best root-to-leaf path in the class hierarchy.…”
Section: Introduction (mentioning)
confidence: 99%
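The citation above describes HSSN's inference step as finding the best root-to-leaf path in the class hierarchy. As a hedged illustration only, the sketch below shows one simple way such a decoding step could be implemented for per-pixel class score maps; HSSN's actual scoring and path-selection rule may differ. The `children` mapping and the additive path score used here are assumptions made for this example.

```python
# Hypothetical root-to-leaf decoding over a class tree; not HSSN's exact rule.
import numpy as np

def best_leaf_per_pixel(scores, children, root):
    """
    scores:   (C, H, W) per-class score maps for one image.
    children: dict mapping a class id to the list of its child class ids;
              a class with no children is a leaf.
    root:     id of the root class.
    Returns an (H, W) map of the leaf class chosen per pixel by maximizing
    the accumulated score along a root-to-leaf path.
    """
    _, H, W = scores.shape
    best_score = np.full((H, W), -np.inf)
    best_leaf = np.zeros((H, W), dtype=np.int64)

    def dfs(node, acc):
        acc = acc + scores[node]          # accumulate score along the current path
        kids = children.get(node, [])
        if not kids:                      # leaf: keep this path wherever it scores best
            better = acc > best_score
            best_score[better] = acc[better]
            best_leaf[better] = node
            return
        for child in kids:
            dfs(child, acc)

    dfs(root, np.zeros((H, W)))
    return best_leaf
```

A depth-first traversal like this guarantees that the predicted leaf for every pixel is consistent with all of its ancestors in the tree, which is the structural property the cited description emphasizes.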