2020
DOI: 10.1007/978-3-030-46147-8_9

Neural Message Passing for Multi-label Classification

Cited by 28 publications (37 citation statements). References 25 publications.
“…The attention mechanism is the crux behind many state-of-the-art sequence-to-sequence models used in machine translation and language processing 40 and it has recently shown good results on multi-label classification. 41 While the attention mechanism has also been recently adopted to perform learning of relationships among elements in material property prediction, 34,35 our model additionally uses the attention mechanism to perform learning of relationships among multiple material properties by acting on the output of the multivariate Gaussian model as opposed to the composition itself.…”
Section: Discussion
confidence: 99%
“…Higher-order property correlation learning proceeds via an attention graph neural network, whose description can be found in prior literature. 34,35,40,41 We use five attention layers, namely, the message-passing operations are executed five times. Each attention layer also includes an element-wise feed-forward MLP which has two layers of 128 neurons each.…”
Section: H-CLMP Model
confidence: 99%
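For a concrete picture of the stacked attention message passing this snippet describes, here is a minimal sketch in PyTorch: five attention layers, each followed by an element-wise two-layer MLP of 128 neurons, as stated above. The class names, embedding size, number of heads, residual connections, and the use of torch.nn.MultiheadAttention are assumptions for illustration, not the cited model's actual implementation.

```python
# Hedged sketch of stacked attention message-passing layers, assuming a
# transformer-style layer: self-attention followed by an element-wise MLP.
import torch
import torch.nn as nn

class AttentionMessagePassingLayer(nn.Module):
    def __init__(self, dim=128, heads=4):
        super().__init__()
        # Self-attention over node (or label) embeddings acts as one round
        # of message passing on a fully connected graph.
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Element-wise feed-forward MLP: two layers of 128 neurons each.
        self.ffn = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x):
        # x: (batch, num_nodes, dim)
        msg, _ = self.attn(x, x, x)
        x = self.norm1(x + msg)          # aggregate messages, residual connection
        x = self.norm2(x + self.ffn(x))  # per-node MLP, residual connection
        return x

# Five attention layers, i.e. message passing is executed five times.
model = nn.Sequential(*[AttentionMessagePassingLayer() for _ in range(5)])
x = torch.randn(2, 10, 128)  # toy batch: 2 samples, 10 nodes, 128-dim embeddings
out = model(x)               # shape (2, 10, 128)
```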
“…where t_p is true positive, f_p is false positive and f_n is false negative. High F1-macro scores usually indicate high performance on less frequent labels [45].…”
Section: Performance Metrics
confidence: 99%
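The formula truncated at the start of this snippet is, in all likelihood, the standard macro-averaged F1: per-label F1 = 2·t_p / (2·t_p + f_p + f_n), averaged over labels with equal weight, which is why it is sensitive to performance on infrequent labels. A minimal sketch computing it from per-label counts follows; the function name and toy counts are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of macro-averaged F1 from per-label (tp, fp, fn) counts.
def f1_macro(counts):
    """counts: list of (tp, fp, fn) tuples, one per label."""
    per_label = [2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 0.0
                 for tp, fp, fn in counts]
    return sum(per_label) / len(per_label)

# Toy example: one frequent label and one rare, poorly predicted label.
print(f1_macro([(90, 5, 5), (3, 2, 7)]))  # ~0.67, pulled down by the rare label
```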
“…Inspired by [21,32], we propose a PMP module to encode the state by taking into account the relation between an EHR and the hierarchical ICD structure, parent-child relations, and sibling relations of ICD codes, as shown in Figure 3. Formally, is defined as:…”
Section: Path Message Passing
confidence: 99%
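The defining equation of the PMP module is cut off in this snippet, so the following is only a hedged sketch of hierarchy-aware message passing over ICD codes with separate parent, child, and sibling relations, as the quoted text describes. The relation-specific linear maps, row-normalized relation matrices, and GRU-style state update are assumptions for illustration, not the cited module's actual definition.

```python
# Hedged sketch: relation-typed message passing over an ICD code hierarchy.
import torch
import torch.nn as nn

class PathMessagePassing(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Separate transforms for parent, child, and sibling relations.
        self.rel = nn.ModuleDict({r: nn.Linear(dim, dim)
                                  for r in ("parent", "child", "sibling")})
        self.update = nn.GRUCell(dim, dim)

    def forward(self, h, adj):
        # h: (num_codes, dim) code states; adj[r]: (num_codes, num_codes)
        # row-normalized relation matrix for relation r.
        msg = sum(adj[r] @ self.rel[r](h) for r in self.rel)
        return self.update(msg, h)  # gated update of each code's state

# Toy usage: 4 ICD codes, 16-dim states, dense relation matrices.
h = torch.randn(4, 16)
adj = {r: torch.softmax(torch.randn(4, 4), dim=-1)
       for r in ("parent", "child", "sibling")}
out = PathMessagePassing(16)(h, adj)  # shape (4, 16)
```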