2023
DOI: 10.1109/access.2023.3274938

Data Symmetries and Learning in Fully Connected Neural Networks

Abstract: How symmetries in the data constrain the learned weights of modern deep networks is still an open problem. In this work we study the simple case of fully connected shallow non-linear neural networks and consider two types of symmetries: full dataset symmetries, where the dataset X is mapped into itself by a transformation g, i.e. gX = X, and single data point symmetries, where gx = x for a data point x ∈ X. We prove and experimentally confirm that symmetries in the data are directly inherited at the level of the network…
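
To make the two notions concrete, here is a minimal Python sketch (an illustration only, not the paper's construction or experimental setup): a toy Z2 sign-flip action g(x) = -x, a dataset built to satisfy the full-dataset symmetry gX = X, and a fixed point satisfying the single-data-point symmetry gx = x.

```python
import numpy as np

# Illustrative sketch: a Z2 sign-flip symmetry g(x) = -x acting on a
# toy dataset X. The choice of g and of the dataset is an assumption
# for demonstration, not taken from the paper.

rng = np.random.default_rng(0)

def g(x):
    """Illustrative group action: reflection through the origin."""
    return -x

# Full-dataset symmetry, gX = X: build X as the union of samples and
# their images under g, so that g permutes X without changing it as a set.
half = rng.normal(size=(100, 2))
X = np.vstack([half, g(half)])

def sets_equal(A, B, tol=1e-12):
    """Check A = B as sets of points: every row of A has a match in B."""
    return all(np.min(np.linalg.norm(B - a, axis=1)) < tol for a in A)

print("gX = X:", sets_equal(g(X), X))  # True: dataset-level symmetry

# Single-data-point symmetry, gx = x: only fixed points of g satisfy it;
# for the sign flip, that is just the origin.
x_fixed = np.zeros(2)
print("gx = x:", np.allclose(g(x_fixed), x_fixed))  # True
```

Note that gX = X is an equality of sets, not an element-wise identity: g may permute the points of X, which is why the check above matches rows rather than comparing arrays position by position.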

Cited by 1 publication (1 citation statement)
References 34 publications
“…One must be careful when performing such classification, because there are DNNs that do not belong solely to a single group. For instance, Fully Connected Neural Networks (FCNNs) [270], also known as dense networks, can be applied to all four groups, because these DNNs are adaptable and thus can be utilized in different manners, as determined by a specific goal. A similar point of view is generally valid for stochastic neural networks [271], since one can introduce stochasticity into NNs belonging to any of the four main groups.…”
Section: Machine Learning Based On Direct Mathematical Procedures (mentioning)
confidence: 99%