2022
DOI: 10.1109/tpami.2022.3202217
Fast Quaternion Product Units for Learning Disentangled Representations in $\mathbb {SO}(3)$

Cited by 13 publications (5 citation statements). References 53 publications.
“…It ensures global completeness of the model with message passing only in the 1-hop neighborhood, avoiding time-consuming calculations such as torsion angles in SphereNet or dihedral angles in GemNet. There are also other studies [199,309,298,292] exploiting the quaternion algebra to represent the 3D rotation group, which mathematically ensures SO(3) invariance during inference. Specifically, Yue et al [292] construct a quaternion message-passing module to distinguish molecular conformations caused by bond torsions.…”
Section: Table
confidence: 99%
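The SO(3) invariance these statements refer to comes from representing 3D rotations as unit quaternions. A minimal sketch of that representation (plain Python, for illustration only, not code from the paper): a vector v is rotated by a unit quaternion q via the Hamilton product q ⊗ (0, v) ⊗ q̄.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(v, q):
    """Rotate 3-vector v by unit quaternion q: q * (0, v) * conj(q)."""
    p = (0.0,) + tuple(v)
    _, x, y, z = quat_mul(quat_mul(q, p), quat_conj(q))
    return (x, y, z)

# A 90-degree rotation about the z-axis: q = (cos(θ/2), 0, 0, sin(θ/2))
half = math.pi / 4
q_z90 = (math.cos(half), 0.0, 0.0, math.sin(half))

print(rotate((1.0, 0.0, 0.0), q_z90))  # approximately (0, 1, 0)
```

Because unit quaternions compose and invert exactly like rotation matrices, a model built on such products inherits SO(3) behavior by construction rather than by learned approximation.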
“…This is required for QNNs composed of standard quaternion-valued building blocks such as fully connected or convolution layers, which can easily be checked by using the identity function as the activation function; this is in contrast to, e.g., [50], where the custom quaternion-valued architectures are non-linear by themselves. Also, all activation functions used are suitable for gradient-based optimization during the training phase and can be incorporated into arbitrary QNN architectures.…”
Section: Discussion
confidence: 99%
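The linearity check mentioned in the statement above can be made concrete. A minimal sketch (plain Python, with an arbitrary made-up weight; this is an illustration, not code from [50] or the paper): a quaternion "neuron" f(x) = w ⊗ x followed by the identity activation is linear over quaternions, so stacking such layers never introduces non-linearity on its own.

```python
def hamilton(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def neuron(x, w=(0.5, -1.0, 2.0, 0.25)):  # arbitrary illustrative weight
    # identity activation: the product is returned unchanged
    return hamilton(w, x)

# Additivity check: f(a + b) == f(a) + f(b) (componentwise, up to float error)
a = (1.0, 2.0, -1.0, 0.5)
b = (-0.5, 1.5, 3.0, -2.0)
lhs = neuron(tuple(ai + bi for ai, bi in zip(a, b)))
rhs = tuple(pa + pb for pa, pb in zip(neuron(a), neuron(b)))
print(lhs, rhs)
```

Since the Hamilton product is bilinear, the two printed quaternions agree, which is exactly why standard layers need a non-identity activation to be non-linear.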
“…A further application of recurrent quaternion models is [18], where quaternion RNNs, specifically Quaternion Gated Recurrent Units, are applied to sensor fusion for navigation and human activity recognition. Finally, [52] proposes quaternion product units and applies them to human action and hand action recognition, as well as to point cloud classification.…”
Section: Related Work
confidence: 99%