2016
DOI: 10.1007/978-3-319-32695-5_4
Introducing Kimeme, a Novel Platform for Multi-disciplinary Multi-objective Optimization

Abstract: Optimization processes are an essential element in many practical applications, such as in engineering, chemistry, logistics, finance, etc. To fill the knowledge gap between practitioners and optimization experts, we developed Kimeme, a new flexible platform for multidisciplinary optimization. A peculiar feature of Kimeme is that it can be used both for problem and algorithm design. It includes a rich graphical environment, a comprehensive set of post-processing tools, and an open-source library of state-of-the…

Cited by 5 publications
(3 citation statements)
References 21 publications
“…-Neural Network (NN): a NN trained by Resilient Propagation [24]. The NN configuration was chosen by applying the methodology described in [2] by means of the optimization software Kimeme [16], resulting in a network with three hidden layers of 91 nodes (Elliott symmetric activation function), 84 nodes (ramp activation function), and 68 nodes (Gaussian activation function), respectively. The single output node used a hyperbolic tangent activation function.…”
Section: Methods
confidence: 99%
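The activation functions named in the configuration above can be sketched in plain Java. These are common textbook definitions (the Elliott symmetric and ramp forms follow Encog's conventions at unit slope); the cited work used Encog's own implementations, not this code, so treat it as an illustrative assumption.

```java
// Sketch of the activation functions named in the cited NN configuration.
// Formulas assume common definitions; the cited work used Encog's classes
// (e.g. ActivationElliottSymmetric), not this code.
public class Activations {
    // Elliott symmetric: x / (1 + |x|), a cheap tanh-like squashing function
    static double elliottSymmetric(double x) {
        return x / (1.0 + Math.abs(x));
    }

    // Ramp: linear inside [low, high], clipped outside
    static double ramp(double x, double low, double high) {
        return Math.max(low, Math.min(high, x));
    }

    // Gaussian: exp(-x^2), peaking at 1 for x = 0
    static double gaussian(double x) {
        return Math.exp(-x * x);
    }

    // Hyperbolic tangent, used by the single output node
    static double tanh(double x) {
        return Math.tanh(x);
    }

    public static void main(String[] args) {
        System.out.println(elliottSymmetric(1.0)); // 0.5
        System.out.println(gaussian(0.0));         // 1.0
    }
}
```

Note the squashing functions (Elliott symmetric, tanh) bound their output, while the ramp is linear over its active range; mixing them across layers, as in the cited configuration, is a design choice explored by the optimization process itself.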
“…Since the computational cost of the GP algorithms is considerably higher than that of the other techniques (due to the parsing of a very large number of trees generated during the evolutionary process), we decided to run the two GP algorithms for a small number of generations (20) and a large number of predators and preys (respectively 100 and 500), to test their convergence under hard computational constraints and have a fair comparison with the other methods. As for NEAT and the NN, we used the open-source Java library Encog [12], coupled with Kimeme [16] as explained in [2]. Finally, the Multiple Regression algorithm was taken from the Apache Commons Java math library.…”
Section: Methods
confidence: 99%
“…In all our experiments (see the next section for further details), we used some of the state-of-the-art MOEAs available in Kimeme, a multi-disciplinary optimization platform introduced in [21,22]. The reasons for using Kimeme were manifold: first of all, Kimeme provides a rich set of state-of-the-art single- and multi-objective optimization algorithms, as well as an extensive post-processing toolkit.…”
Section: Proposed Approach
confidence: 99%