2020 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm50108.2020.00188
HexCNN: A Framework for Native Hexagonal Convolutional Neural Networks

Abstract: Hexagonal CNN models have shown superior performance in applications such as IACT data analysis and aerial scene classification due to their better rotation symmetry and reduced anisotropy. To realize hexagonal processing, existing studies mainly rely on the ZeroOut method, which imitates hexagonal kernels with zero-padded rectangular ones and causes substantial memory and computation overheads. We address this deficiency with a novel native hexagonal CNN framework named HexCNN. HexCNN takes hexagon-shaped input and performs forward and …
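The ZeroOut overhead mentioned in the abstract can be sketched as follows: a hexagonal neighborhood is embedded in a square kernel whose out-of-hexagon corner taps are zeroed, so a standard rectangular convolution still spends memory and multiply-adds on taps that contribute nothing. This is an illustrative reconstruction under the common axial-coordinate layout, not the paper's exact implementation; `hex_mask` and `zeroout_conv` are hypothetical names.

```python
import numpy as np

def hex_mask(radius):
    """Boolean mask over a (2r+1) x (2r+1) square kernel whose True taps
    lie within hexagonal distance `radius` in axial coordinates.
    The corner taps that fall outside the hexagon are False."""
    size = 2 * radius + 1
    mask = np.zeros((size, size), dtype=bool)
    for i in range(size):
        for j in range(size):
            dq, dr = i - radius, j - radius
            # axial hex distance: (|dq| + |dr| + |dq + dr|) / 2
            if (abs(dq) + abs(dr) + abs(dq + dr)) // 2 <= radius:
                mask[i, j] = True
    return mask

def zeroout_conv(image, kernel, radius):
    """Plain valid 2-D convolution with the out-of-hexagon kernel taps
    zeroed out. The zeroed taps still occupy memory and still cost
    multiply-adds -- the overhead a native hexagonal framework avoids."""
    k = kernel * hex_mask(radius)
    s = 2 * radius + 1
    H, W = image.shape
    out = np.zeros((H - s + 1, W - s + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + s, x:x + s] * k)
    return out
```

For `radius=1` the mask keeps 7 of the 9 taps (a 3x3 kernel with two opposite corners zeroed), so roughly 2/9 of the work is wasted; the fraction grows with kernel size.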

Cited by 4 publications (3 citation statements)
References 27 publications
“…These features can be used in the field of map coordinate positioning [10]. Using a convolutional neural network, the hexagonal representation can store a large amount of data on the same sampling points and reduce calculation time and quantization error [11], [12]. Zhao et al. also prove that a CNN with a hexagonal kernel has the same back-propagation as a standard CNN [12].…”
Section: Related Work
confidence: 99%
See 1 more Smart Citation
“…These features can be used in the field of map coordinate positioning [10]. Using the convolution neural network, the hexagonal representation can store a large amount of data on the same sampling points and reduce calculation time and quantization error [11], [12]. Zhao et al also prove that CNN with the hexagonal kernel has the same back-propagation as standard CNN [12].…”
Section: Related Workmentioning
confidence: 99%
“…Using a convolutional neural network, the hexagonal representation can store a large amount of data on the same sampling points and reduce calculation time and quantization error [11], [12]. Zhao et al. also prove that a CNN with a hexagonal kernel has the same back-propagation as a standard CNN [12]. Luo et al. and Hoogeboom et al. describe a set of construction methods for hexagonal structures that enable lossless rotation of hexagonal images and propose that hexagonal kernels could be used for G-CNNs [4], [13], [14].…”
Section: Related Work
confidence: 99%
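The lossless-rotation property cited above follows from a standard hexagonal-grid identity: in cube coordinates (x, y, z) with x + y + z = 0, a 60-degree rotation is (x, y, z) → (−z, −x, −y), which maps lattice cells exactly onto lattice cells. The sketch below illustrates this under an axial {(q, r): value} image representation; it is not code from the cited papers.

```python
def rotate60(q, r):
    """One 60-degree rotation of an axial hex coordinate about the
    origin. Derived from the cube-coordinate identity
    (x, y, z) -> (-z, -x, -y) with x = q, z = r, y = -q - r.
    Every lattice cell lands on a lattice cell, so no resampling
    (and no interpolation loss) is needed."""
    return -r, q + r

def rotate_hex_image(pixels, steps=1):
    """Rotate a hex image stored as {(q, r): value} by steps * 60 degrees.
    Six applications return the original image exactly."""
    for _ in range(steps % 6):
        pixels = {rotate60(q, r): v for (q, r), v in pixels.items()}
    return pixels
```

Square lattices only admit this exactness at multiples of 90 degrees, which is why hexagonal kernels are attractive for rotation-equivariant G-CNNs.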
“…They define the kernel size based on the number of levels of neighbouring elements. These hexagonal convolutions achieve higher efficiency than the ZeroOut method [40] due to their divide-and-conquer approach. Increasing the number of learnable parameters along the skip connections of a residual network architecture will again lead to the vanishing gradient problem, as mentioned in Sec.…”
Section: Hexagonal Convolution Operations
confidence: 99%
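The "levels of neighbouring elements" parameterisation above has a closed form: a full hexagonal kernel covering the centre cell plus k rings of neighbours contains 1 + 6·1 + 6·2 + … + 6·k = 3k(k+1) + 1 taps. A one-line sketch (the cited work's exact parameterisation may differ):

```python
def hex_kernel_size(levels):
    """Number of elements in a hexagonal kernel covering all cells up to
    `levels` rings of neighbours: 1 centre cell plus 6 * k cells in
    ring k, summing to 3 * levels * (levels + 1) + 1."""
    return 3 * levels * (levels + 1) + 1
```

So level 1 gives the familiar 7-cell neighbourhood and level 2 gives 19 cells, compared with 9 and 25 taps for square kernels of comparable reach.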