2017
DOI: 10.14311/nnw.2017.27.033

Influence of (P)RNGs onto GPA-ES Behaviors

Abstract: This paper describes the first attempt at a hardware implementation of the Multistream Compression (MSC) algorithm. The algorithm is transformed into a series of Finite State Machines with Datapath using the Register-Transfer methodology, and those state machines are then implemented in VHDL for a selected FPGA platform. The algorithm utilizes a special tree data structure, called the MSC tree, and a Left Tree Representation is introduced for its storage. Due to parallelism, the algorithm uses multiple port access to S…

Cited by 3 publications (3 citation statements)
References 11 publications

“…Raidl and Gunther in [11] introduced HGP (hybrid genetic programming), adding weights to the top-level tree members and optimizing them using a robust least-squares method. There are many modern approaches for GP optimization; for example, gradient descent [12,13], simulated annealing combined with the simplex method [14], particle swarm optimization (PSO) [15], multiple regression in the STROGANOFF method [16,17], evolutionary strategies [13,18,19], genetic algorithms [13], the self-organizing migrating algorithm (SOMA) [13,20], the Bison Seeker algorithm [21], and non-linear optimization using the Levenberg-Marquardt algorithm [22,23] can be used to optimize the constants.…”
Section: Optimization of Genetic Programming and Symbolic Regression (mentioning)
confidence: 99%
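The constant-optimization step surveyed in the statement above is easy to illustrate. Below is a minimal sketch, assuming a hypothetical GP-found expression c0*sin(c1*x) + c2 and synthetic data, of tuning the constants with the Levenberg-Marquardt algorithm via SciPy (one of the options listed, cf. [22,23]); it is not the exact method of any cited paper.

```python
# Minimal sketch: tuning the numeric constants of a fixed symbolic-regression
# expression with Levenberg-Marquardt. The expression and data are illustrative
# assumptions, not taken from the cited papers.
import numpy as np
from scipy.optimize import least_squares

def model(c, x):
    # Candidate expression found by GP; only its constants c are optimized here.
    return c[0] * np.sin(c[1] * x) + c[2]

def residuals(c, x, y):
    return model(c, x) - y

rng = np.random.default_rng(42)
x = np.linspace(0.0, 2.0 * np.pi, 100)
y = 1.5 * np.sin(2.0 * x) + 0.3 + rng.normal(0.0, 0.05, x.size)

# method="lm" selects Levenberg-Marquardt; x0 is the initial guess.
fit = least_squares(residuals, x0=[1.0, 1.8, 0.0], args=(x, y), method="lm")
print(fit.x)  # fitted constants, ideally near (1.5, 2.0, 0.3)
```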
“…Many artificial intelligence methods depend on the quality of the random number generator (function) used, but in this case, where the data feed a neural network that provides interpolation and noise reduction, a simpler generator can be used [6].…”
Section: Generation of the Data (mentioning)
confidence: 99%
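As a minimal sketch of the point about generator quality: a textbook linear congruential generator (LCG), one of the simplest (P)RNGs, sampling inputs for a training set. The LCG constants are the well-known Numerical Recipes parameters; the sampled function f is an illustrative assumption, not taken from [6].

```python
# Minimal sketch of the idea that a simple (P)RNG can suffice when the consumer
# is a noise-tolerant neural network: an LCG sampling training inputs.
import math

class LCG:
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""
    def __init__(self, seed=1):
        self.m = 2**32
        self.a = 1664525       # Numerical Recipes parameters
        self.c = 1013904223
        self.state = seed % self.m

    def random(self):
        # Returns a float in [0, 1).
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

def f(x):
    return math.sin(x)  # hypothetical function being sampled

gen = LCG(seed=7)
training_set = []
for _ in range(1000):
    x = 2 * math.pi * gen.random()
    training_set.append((x, f(x)))
```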
“…In this case, the method used in [2] or [5] is implemented. The starting population is generated as in the previous case.…”
Section: Genetic Algorithms with Tournament Selection (mentioning)
confidence: 99%
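Tournament selection itself is standard and easy to sketch; a minimal version follows, with the population encoding and fitness function as illustrative assumptions rather than the exact setup of [2] or [5].

```python
# Minimal sketch of tournament selection as commonly used in genetic algorithms:
# each parent is the fittest of k individuals drawn uniformly at random.
import random

def tournament_select(population, fitness, k=3):
    """Pick one parent: sample k individuals, return the best one.

    `fitness` maps an individual to a score; lower is better (minimization).
    """
    contestants = random.sample(population, k)
    return min(contestants, key=fitness)

# Usage: minimize f(x) = x^2 over a toy real-coded population.
population = [random.uniform(-10, 10) for _ in range(50)]
parent_a = tournament_select(population, fitness=lambda x: x * x)
parent_b = tournament_select(population, fitness=lambda x: x * x)
```

Larger tournament sizes k raise selection pressure toward the current best individuals, while k = 1 degenerates to uniform random selection.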