2022
DOI: 10.1088/2632-2153/aca1f8
Incompleteness of graph neural networks for points clouds in three dimensions

Abstract: Graph neural networks (GNN) are very popular methods in machine learning and have been applied very successfully to the prediction of the properties of molecules and materials. First-order GNNs are well known to be incomplete, i.e., there exist graphs that are distinct but appear identical when seen through the lens of the GNN. More complicated schemes have thus been designed to increase their resolving power. Applications to molecules (and more generally, point clouds), however, add a geometric dimension to t…

Cited by 10 publications (13 citation statements)
References 55 publications
“…Taking into account the appropriate components in the loss and investigating appropriate weighting schemes is decisive for a successful training process and needs to be carefully investigated. 292,293 Even though graph-based representations considering particle positions as nodes and interparticle distances as edges have been widely adopted for the development of MLPs, it can be shown 294 that, even for simple systems, with such representations multiple molecular configurations can be represented by the same graph and thus are indistinguishable. This provides motivation for the inclusion of angular or directional information in the description of the local environments.…”
Section: Open Challenges and Future Outlook
Mentioning confidence: 99%
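The degeneracy described in the quote above — distinct configurations that share the same distance-based graph — can be illustrated with a classical one-dimensional homometric pair. This is a hypothetical illustration, not the cited paper's three-dimensional counterexample, but it shows the same phenomenon: two point sets that are not congruent yet have identical multisets of pairwise distances.

```python
from itertools import combinations

# Classical homometric pair: distinct 1D point sets whose multisets of
# pairwise distances coincide. (Illustrative sketch of the degeneracy
# discussed above; the cited paper constructs analogous 3D examples.)
A = [0, 1, 4, 10, 12, 17]
B = [0, 1, 8, 11, 13, 17]

def distance_multiset(points):
    """Sorted multiset of all pairwise distances."""
    return sorted(abs(p - q) for p, q in combinations(points, 2))

# Identical distance information...
same_distances = distance_multiset(A) == distance_multiset(B)

# ...yet the configurations are not related by translation or reflection,
# so a model that sees only pairwise distances cannot tell them apart.
congruent = (sorted(A) == sorted(B)
             or sorted(17 - p for p in A) == sorted(B))

print(same_distances, congruent)
```

A distance-only descriptor assigns both sets the same representation, which is exactly why the quoted passage motivates adding angular or directional information.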
“…This provides motivation for the inclusion of angular or directional information in the description of the local environments. 294 It would be highly beneficial for the evolution of this multidisciplinary community if exhaustive details of the strategies followed were reported, and if the failures and problems encountered were also discussed in depth, alongside any eventual successes. Interfacing the developed ML schemes with widely used open-source parallel molecular simulation engines is also critical for broadening the methods' efficiency, testing, and applicability.…”
Section: Open Challenges and Future Outlook
Mentioning confidence: 99%
“…59,60 Data augmentation may be able to avoid the need for equivariant models as well. 61 Another possibility is optimizing classical MD potentials against quantum data using automatic differentiation 62 or algorithmic improvements. 63,64 [Recursive Coarse-Graining] Despite the promise of current approaches to improving speed, there are inherent limitations to any simulation that attempts to keep track of the position and motion of every single atom in a system.…”
Section: Trade-off
Mentioning confidence: 99%
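The quote above mentions data augmentation as an alternative to building rotational equivariance into the model. A minimal sketch of that idea, under the assumption of simple rotations about one axis (the function names here are illustrative, not from the cited works): replicate each training configuration under random rotations, which leave all interatomic distances unchanged.

```python
import math
import random

def rotate_z(points, theta):
    """Rotate a list of (x, y, z) points about the z-axis by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

def augment(points, n_copies, rng):
    """Generate n_copies randomly rotated versions of a configuration."""
    return [rotate_z(points, rng.uniform(0.0, 2.0 * math.pi))
            for _ in range(n_copies)]

rng = random.Random(0)
cloud = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
copies = augment(cloud, 4, rng)

# Rotations preserve all pairwise distances, so each augmented copy is
# physically the same configuration seen in a different frame.
def pairwise(points):
    return sorted(math.dist(p, q)
                  for i, p in enumerate(points) for q in points[i + 1:])

original = pairwise(cloud)
invariant = all(
    all(abs(a - b) < 1e-9 for a, b in zip(original, pairwise(copy)))
    for copy in copies)
print(invariant)
```

Training on such replicated data encourages a non-equivariant model to learn rotational symmetry from examples rather than having it enforced architecturally.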
“…We now extend our study to the case of NN models. NNs are a large class of regression methods in atomistic ML. They are generally regarded as far more “flexible” than their linear counterparts, given the significantly larger number of weight parameters involved in training the model. One peculiarity of NN models is that they cannot be optimized in an analytical, deterministic way: model training is often carried out with iterative numerical methods and does not exactly reach the true minimum, which is an assumption underlying the formulation of the LPR.…”
Section: Extension to NN Models
Mentioning confidence: 99%
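The contrast drawn in the quote above — a closed-form optimum for linear models versus an iterative optimizer that only approaches the minimum — can be made concrete with a toy least-squares fit. This is an illustrative sketch, not the cited paper's actual models or data.

```python
# Fit y ≈ w * x (no intercept) to a small synthetic dataset.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]  # roughly y = x with noise

# Linear model: the least-squares optimum has a closed form.
w_star = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Iterative alternative: gradient descent on the mean squared error,
# stopped after a finite number of steps, as in typical NN training.
w = 0.0
lr = 0.01
for _ in range(200):
    grad = sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

# The iterate approaches, but in general does not exactly equal, w_star.
print(w_star, w)
```

For a genuine neural network the loss is non-convex, so the gap between the training result and the true minimum is the rule rather than the exception — which is the assumption underlying the LPR formulation mentioned in the quote.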