2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489545
A New Self-Organizing Neural Gas Model based on Bregman Divergences

Abstract: In this paper, a new self-organizing neural gas model, which we call the Growing Hierarchical Bregman Neural Gas (GHBNG), is proposed. Our proposal is based on the Growing Hierarchical Neural Gas (GHNG), in which Bregman divergences are incorporated in order to compute the winning neuron. This model has been applied to anomaly detection in video sequences together with a Faster R-CNN as an object detector module. Experimental results not only confirm the effectiveness of the GHBNG for the detection of anomalous …

Cited by 4 publications (3 citation statements). References 19 publications.
“…This work is based on the Growing Hierarchical Bregman Neural Gas (GHBNG) network [16], which can be regarded as a Growing Hierarchical Neural Gas (GHNG) model [15] in which Bregman divergences are employed to calculate the winning unit. Let φ : S → R be a strictly convex, real-valued function defined on a convex set S ⊆ R^D, where D is the dimension of the input data [3,4,19].…”
Section: The GHBNG Model
confidence: 99%
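The definition quoted above can be made concrete with a short sketch. It assumes only the standard Bregman divergence formula D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩; the function and variable names below are illustrative and not taken from the GHBNG paper. The winning unit is simply the prototype minimizing D_φ from the input:

```python
import numpy as np

def bregman_divergence(x, y, phi, grad_phi):
    """D_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||^2 yields the squared Euclidean distance.
phi_sq = lambda v: np.dot(v, v)
grad_sq = lambda v: 2.0 * v

# phi(x) = sum_i x_i log x_i yields the generalized KL divergence (entries > 0).
phi_kl = lambda v: np.sum(v * np.log(v))
grad_kl = lambda v: np.log(v) + 1.0

def winning_unit(x, prototypes, phi, grad_phi):
    """Index of the prototype with the smallest Bregman divergence from x."""
    divs = [bregman_divergence(x, w, phi, grad_phi) for w in prototypes]
    return int(np.argmin(divs))

x = np.array([0.2, 0.8])
prototypes = [np.array([0.5, 0.5]), np.array([0.25, 0.75])]
print(winning_unit(x, prototypes, phi_sq, grad_sq))  # → 1 (second prototype is closest)
```

With φ(x) = ||x||² the divergence reduces exactly to the squared Euclidean distance, which is why the citing works describe Bregman divergences as an extension of the classic Euclidean case: swapping φ changes the geometry of the winner selection without changing the rest of the algorithm.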
“…Therefore, by using Bregman divergences, the divergence most suitable to the input data can be selected. In this paper, a self-organizing neural network called the Growing Hierarchical Bregman Neural Gas (GHBNG) [16] is used for CQ; it is a variation of the GHNG model that adopts Bregman divergences as an extension of the classic Euclidean distance.…”
Section: Introduction
confidence: 99%
“…CollAR is based on the Growing When Required (GWR) neural network [12,13], which takes inspiration from the self-organizing map (SOM) [14,15], its extension the self-organizing incremental neural network (SOINN) [16,17], and the growing neural gas (GNG) [18-20]. Unlike solutions based on neural networks with a fixed architecture (i.e., where the number of layers and neurons is defined before training and does not change), GWR is an incremental neural network capable of automatically and dynamically adapting its architecture.…”
Section: Introduction
confidence: 99%
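The "grows when required" behavior mentioned in the statement above can be sketched in a few lines. This is a simplified, hedged rendering of the GWR insertion rule (best and second-best matching nodes, an activity threshold, and a habituation counter), not the exact update from the cited works; all parameter names and values below are illustrative:

```python
import numpy as np

def gwr_step(x, weights, edges, habituation, a_T=0.35, h_T=0.1, eps_b=0.1):
    """One simplified GWR update: grow a node when the best-matching unit
    is both poorly activated and already well trained (low habituation)."""
    d = [np.linalg.norm(x - w) for w in weights]
    b, s = np.argsort(d)[:2]          # best and second-best matching units
    activity = np.exp(-d[b])          # activity decays with distance to input
    if activity < a_T and habituation[b] < h_T:
        # Grow: insert a new node halfway between the input and the best match,
        # connect it to b and s, and drop the direct b-s edge.
        weights.append((weights[b] + x) / 2.0)
        habituation.append(1.0)       # a fresh node starts fully "unhabituated"
        new = len(weights) - 1
        edges.add((int(b), new))
        edges.add((int(s), new))
        edges.discard((int(min(b, s)), int(max(b, s))))
    else:
        # Otherwise adapt: move the winner toward the input and habituate it.
        weights[b] = weights[b] + eps_b * (x - weights[b])
        habituation[b] = max(0.0, habituation[b] - 0.05)
    return weights, edges, habituation

weights = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
edges = {(0, 1)}
habituation = [0.05, 0.05]
weights, edges, habituation = gwr_step(np.array([5.0, 5.0]), weights, edges, habituation)
print(len(weights))  # → 3: the distant input triggered growth
```

The contrast with fixed-architecture networks is visible directly: the node count is an output of training, not a hyperparameter chosen beforehand.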