2015
DOI: 10.1109/tse.2015.2432024
GALE: Geometric Active Learning for Search-Based Software Engineering

Abstract: Multi-objective evolutionary algorithms (MOEAs) help software engineers find novel solutions to complex problems. When automatic tools explore too many options, they are slow to use and hard to comprehend. GALE is a near-linear-time MOEA that builds a piecewise approximation to the surface of best solutions along the Pareto frontier. For each piece, GALE mutates solutions towards the better end. In numerous case studies, GALE finds solutions comparable to those of standard methods (NSGA-II, SPEA2) using far fewer evaluations.
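The abstract's two key ideas, Pareto domination and mutating candidates toward the better end of a region, can be sketched in a few lines. This is a hedged illustration, not GALE's actual code: the toy objective functions and the `mutate_towards` helper are hypothetical stand-ins for the paper's geometric machinery.

```python
# Hedged sketch (not GALE's implementation): candidates are lists of
# decisions; both toy objectives are to be minimized.
def objectives(x):
    # hypothetical objectives: minimize the total and the spread
    return (sum(x), max(x) - min(x))

def dominates(a, b):
    """Binary Pareto domination: a dominates b if it is no worse on
    every objective and strictly better on at least one."""
    oa, ob = objectives(a), objectives(b)
    return (all(p <= q for p, q in zip(oa, ob))
            and any(p < q for p, q in zip(oa, ob)))

def mutate_towards(x, better, step=0.5):
    """Directed mutation (GALE-style, simplified): nudge each decision
    of x part-way towards a better candidate."""
    return [xi + step * (bi - xi) for xi, bi in zip(x, better)]
```

GALE's contribution is deciding *which* direction is "better" cheaply, by recursively splitting the population and only evaluating a few representative points per split, rather than evaluating every mutant as conventional MOEAs do.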

Cited by 31 publications (32 citation statements)
References 43 publications
“…If the model's accuracy (trained with more data) does not improve compared to the previous iteration (with less data), then a life is lost. This termination criterion is widely used in the field of evolutionary algorithms to determine the degree of convergence [40].…”
Section: Residual-based: "Build An Accurate Model" (mentioning)
confidence: 99%
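The "lives" termination criterion quoted above can be sketched in a few lines of Python. This is a hedged illustration, not code from the cited work: `evaluate_round`, the `eps` improvement threshold, and the default of three lives are all assumptions.

```python
def run_until_no_improvement(evaluate_round, lives=3, eps=0.01):
    """Hedged sketch of a 'lives' stopping rule: each round trains with
    more data; if accuracy fails to improve by at least eps over the
    best seen so far, one life is lost. Stop when all lives are gone.
    `evaluate_round` is a hypothetical callback returning the model's
    accuracy for round i."""
    best, i = float("-inf"), 0
    while lives > 0:
        acc = evaluate_round(i)
        if acc > best + eps:
            best = acc       # improvement: keep all remaining lives
        else:
            lives -= 1       # no improvement: lose a life
        i += 1
    return best, i           # best accuracy and rounds consumed
```

A usage example: with accuracies 0.5, 0.6, 0.61, 0.61, … the run improves twice, then burns its three lives on the plateau and stops after five rounds.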
“…This paper significantly extends the authors' prior work. In 2014, Krall & Menzies proposed GALE [35]-[37], which solved multi-objective problems via a combination of methods. A subsequent report [64] found that GALE needlessly over-elaborated some aspects of its design.…”
Section: Relation To Prior Work (mentioning)
confidence: 99%
“…• At the other extreme, the XOMO model discussed below is a much smaller model with continuous-valued decisions and no constraints. • In between these two extremes, we added the POM3 model (which also uses continuous-valued decisions), since prior work showed that POM3 is very slow to optimize [35]. Another reason to use the models described below is the existence of prior results from these models [35]-[37], [39], [58].…”
Section: Benchmarks (mentioning)
confidence: 99%
“…For the reader unfamiliar with the mutation technique of step 3 and the cdom scoring of step 5, we note that these are standard practice in the search-based SE community [24,25]. • Count the number of times, n1, that dk is associated with a "best" objective score.…”
Section: A. Reporting The Results (mentioning)
confidence: 99%
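The cdom scoring mentioned in this excerpt refers to continuous domination, which compares two objective vectors via an exponential loss rather than binary Pareto domination. The sketch below is one common formulation from the search-based SE literature, offered as an illustration under the assumption that all objectives are minimized; it is not necessarily the exact definition used in the cited paper.

```python
import math

def cdom_less(x, y, weights=None):
    """Hedged sketch of continuous domination (cdom): x 'cdom-beats' y
    when moving from x to y loses more than moving from y to x. The
    weight -1 marks an objective to be minimized."""
    n = len(x)
    w = weights or [-1.0] * n

    def loss(a, b):
        # mean exponential loss of trading vector a for vector b
        return sum(-math.exp(wi * (ai - bi) / n)
                   for ai, bi, wi in zip(a, b, w)) / n

    return loss(x, y) < loss(y, x)
```

Unlike binary domination, which returns a tie for many pairs in high-dimensional objective spaces, this loss is a scalar, so any two candidates can be ranked, which is why it is often preferred for many-objective problems.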