2018
DOI: 10.1016/j.procs.2018.07.257
Analysing Neural Network Topologies: a Game Theoretic Approach

Cited by 21 publications (15 citation statements)
References 16 publications
“…Thus, biology and structural searches such as pruning suggest developing sparse neural network structures [24].…”
Section: Introduction (mentioning)
confidence: 99%
“…Therefore, heuristics [59], predictors [30], and estimators [28,60] are used and are under development to solve this issue. Interestingly, Shapley value has found a unique spot in the field of explainable machine learning [61] and is used to understand deeper and more complicated neural network architectures [61], prune the unnecessary elements [62], and even correct biased networks [60].…”
Section: Discussion (mentioning)
confidence: 99%
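As context for the Shapley-value uses mentioned above: because exact Shapley values require evaluating exponentially many coalitions, they are usually approximated, for example by averaging marginal contributions over random orderings of the "players" (e.g. neurons or layers). The sketch below is a generic illustration under that assumption, not the method of the citing papers; the player set and toy value function are invented for the example.

```python
# Hypothetical sketch: Monte Carlo estimation of Shapley values for network
# elements, given a characteristic function v(coalition) that scores the
# network with only that subset of elements active.
import random

def shapley_monte_carlo(players, value_fn, n_permutations=2000, seed=0):
    """Approximate each player's Shapley value by averaging its marginal
    contribution over random orderings of the players."""
    rng = random.Random(seed)
    contributions = {p: 0.0 for p in players}
    for _ in range(n_permutations):
        order = list(players)
        rng.shuffle(order)
        coalition = set()
        prev_value = value_fn(frozenset(coalition))
        for p in order:
            coalition.add(p)
            cur_value = value_fn(frozenset(coalition))
            contributions[p] += cur_value - prev_value
            prev_value = cur_value
    return {p: c / n_permutations for p, c in contributions.items()}

# Toy characteristic function: units 0 and 1 are only useful together, unit 2 is redundant.
def toy_value(coalition):
    return 1.0 if {0, 1} <= coalition else 0.2 * len(coalition & {0, 1})

print(shapley_monte_carlo(players=[0, 1, 2], value_fn=toy_value))
```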
“…The number of hidden layers, the activation functions, and the number of neurons per layer are the main parameters of a neural network and must be defined before starting the training [54], but despite the fact that there have been great advances in this field [76, 77], there is no formal method to optimize them [9].…”
Section: Methods (mentioning)
confidence: 99%
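Since the excerpt notes there is no formal method for choosing these hyperparameters, they are in practice often set by search. A small illustrative sketch using scikit-learn's MLPClassifier follows; the dataset, grid values, and scoring are arbitrary choices for the example, not from the cited work.

```python
# Hypothetical sketch: brute-force grid search over the hyperparameters named
# above (number of hidden layers, neurons per layer, activation function).
from itertools import product
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

best = None
for n_layers, n_units, act in product([1, 2], [16, 64], ["relu", "tanh"]):
    clf = MLPClassifier(hidden_layer_sizes=(n_units,) * n_layers,
                        activation=act, max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    score = clf.score(X_te, y_te)  # held-out accuracy for this configuration
    if best is None or score > best[0]:
        best = (score, n_layers, n_units, act)

print("best accuracy %.3f with %d layer(s), %d units, %s" % best)
```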