2018
DOI: 10.1007/978-3-319-98998-3_2

Optimized Artificial Neural Network System to Select an Exploration Algorithm for Robots on Bi-dimensional Grids

Cited by 2 publications (13 citation statements)
References 12 publications
“…The variable to be predicted in this application is defined as the maximum number of steps in which the robot reaches the target object for a given set of conditions. The experimental setup is described in [6] and [10]. The other three datasets were downloaded from the public UCI repository [11].…”
Section: Small Datasets
confidence: 99%
“…Often, the hyper-parameters are determined by a random search over a limited set of possibilities, picking the values that achieve the best performance. However, an optimization method proposed in [6] calculates the number of layers and the number of neurons per layer that maximize the performance of the neural network for one or two layers, and therefore yields the best descriptive metrics of system behavior. The previous work detailed in [6] shows an improvement in predictor performance when the Hill Climbing with Random Restart algorithm is applied to find the optimal ANN architecture, outperforming a simple Hill Climbing algorithm.…”
Section: Hyper-parameter Optimization
confidence: 99%
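The search procedure described in the statement above (hill climbing over layer and neuron counts, restarted from several random architectures, keeping the best result) can be sketched as follows. This is a minimal illustration, not the implementation from [6]: the 1..64 neuron range and the toy objective are assumptions; in practice the score would come from training the ANN with each candidate architecture and measuring its validation performance.

```python
import random

def hill_climb(score, start, neighbors, max_steps=200):
    """Greedy ascent: move to the best neighbor until no neighbor improves."""
    current = start
    for _ in range(max_steps):
        best = max(neighbors(current), key=score, default=current)
        if score(best) <= score(current):
            break
        current = best
    return current

def random_restart_hill_climb(score, sample, neighbors, restarts=10):
    """Run hill climbing from several random starting points; keep the best."""
    results = (hill_climb(score, sample(), neighbors) for _ in range(restarts))
    return max(results, key=score)

# Architecture search space: one or two hidden layers, 1..64 neurons each
# (range chosen for illustration).
def sample_arch():
    n_layers = random.choice([1, 2])
    return tuple(random.randint(1, 64) for _ in range(n_layers))

def arch_neighbors(arch):
    """Neighboring architectures: add or remove one neuron in one layer."""
    out = []
    for i, n in enumerate(arch):
        for m in (n - 1, n + 1):
            if 1 <= m <= 64:
                out.append(arch[:i] + (m,) + arch[i + 1:])
    return out

# Stand-in objective with a known optimum at 32 neurons per layer; a real run
# would train the network for `arch` and return its validation score.
def toy_score(arch):
    return -sum((n - 32) ** 2 for n in arch)

random.seed(0)
best = random_restart_hill_climb(toy_score, sample_arch, arch_neighbors, restarts=5)
```

The restarts matter because a single greedy climb can stall on whatever ridge its random start happens to reach; sampling several starts and keeping the best result is what the statement credits for the improvement over plain hill climbing.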