2003
DOI: 10.1007/3-540-36599-0_17
Genetic Programming with Boosting for Ambiguities in Regression Problems

Cited by 4 publications (3 citation statements)
References 2 publications
“…One of the difficulties in comparing with published work is that, for the two alternatives presented here, the fitness function used is a φ-divergence, very useful for computing the distance between two distributions but unusual in GP. As far as we know, the only previous GP work explicitly addressing the redundancy problem [9] is much better suited to function decomposition, as in problem P_a, than to more complex IP. Our SR-GP system can also easily solve P_a, while MD-GP is much better suited to tackling IP where complex densities have to be estimated.…”
Section: Discussion
confidence: 99%
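The statement above notes that a φ-divergence is used as the GP fitness, measuring the distance between two distributions. A minimal sketch of that idea follows, using the Kullback-Leibler divergence (the φ-divergence with φ(t) = t·log t) between histograms of model outputs and target values. The binning, smoothing constant, and sample data are illustrative assumptions, not the cited authors' implementation.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for two discrete distributions over the same bins.

    KL divergence is a phi-divergence with phi(t) = t*log(t); eps guards
    against log(0) when a bin is empty. (Smoothing choice is an assumption.)
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def histogram(samples, bins, lo, hi):
    """Normalized histogram of samples over [lo, hi) with equal-width bins."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in samples:
        i = min(int((x - lo) / width), bins - 1)  # clamp the upper edge
        counts[i] += 1
    total = len(samples)
    return [c / total for c in counts]

# Fitness of a candidate model: divergence between the distribution of its
# outputs and the target distribution (lower is better). Sample values here
# are made up for illustration.
target = histogram([0.1, 0.2, 0.2, 0.8, 0.9], bins=4, lo=0.0, hi=1.0)
candidate = histogram([0.15, 0.25, 0.3, 0.7, 0.85], bins=4, lo=0.0, hi=1.0)
fitness = kl_divergence(candidate, target)
```

Used this way, the divergence compares whole output distributions rather than pointwise errors, which is what makes it suitable for ambiguous inverse problems where several outputs are valid for one input.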
“…In the GP field, several studies relate to IP solving (see for example [4,3,11,5]), but very few of them have investigated the question of redundancy. A notable exception, however, can be found in [9] and will be discussed further. To overcome the non-uniqueness of the solution, instead of predicting the conditional mean, one way is to ensure that the output of an inverse model is at least one of the expected solutions [6].…”
Section: Multiple Outputs
confidence: 94%