2019
DOI: 10.1016/j.jspi.2018.08.001

Aggregation using input–output trade-off

Abstract: In this paper, we introduce a new learning strategy based on a seminal idea of Mojirsheibani (1999, 2000, 2002a, 2002b), who proposed a smart method for combining several classifiers, relying on a consensus notion. In many aggregation methods, the prediction for a new observation x is computed by building a linear or convex combination over a collection of basic estimators r1(x), …, rm(x) previously calibrated using a training data set. Mojirsheibani proposes to compute the prediction associated with a new…
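
Although the abstract is truncated, the consensus idea it refers to can be sketched. Below is a minimal, hypothetical Python sketch of a Mojirsheibani-style combined regression estimator: a training point contributes to the prediction at x only when all basic machines r1, …, rm predict similarly at that point and at x. The function name, the unanimity rule with threshold eps, and the fallback are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def consensus_predict(x, machines, X_train, y_train, eps=0.1):
    """Consensus aggregation sketch (hypothetical names and threshold).

    A training point (x_i, y_i) votes only if every basic machine
    predicts similarly (within eps) at x_i and at the query x; the
    prediction is the average response of the voting points.
    """
    r_x = np.array([r(x) for r in machines])  # machine predictions at the query
    votes = []
    for x_i, y_i in zip(X_train, y_train):
        r_xi = np.array([r(x_i) for r in machines])
        if np.all(np.abs(r_x - r_xi) <= eps):  # unanimity over the m machines
            votes.append(y_i)
    # Fallback assumption: plain average of the machines if no point agrees.
    return float(np.mean(votes)) if votes else float(np.mean(r_x))
```

Each element of machines is any pre-fitted predictor wrapped as a callable, for instance lambda x: model.predict([x])[0].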

Cited by 7 publications (7 citation statements) · References 22 publications

“…An alternative definition of combined estimator suggests mixing the consensus idea with information about distances between inputs (Fischer and Mougeot (2019)). This is a way to limit the influence, if any, of a bad estimator, while at the same time using information on the geometry of the inputs.…”
Section: Consensual Aggregation Combined To Input Distance (mentioning confidence: 99%)
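
As a rough sketch of the input-distance variant mentioned in this excerpt: each training point is weighted both by its distance to the query in the input space and by the disagreement between the machines' predictions, so a single bad estimator shifts the weights smoothly rather than vetoing points outright. The Gaussian kernels and the bandwidths alpha and beta are assumptions for illustration, not the exact estimator of Fischer and Mougeot (2019).

```python
import numpy as np

def mixcobra_sketch(x, machines, X_train, y_train, alpha=1.0, beta=1.0):
    """Input-output weighted aggregation (illustrative, not the exact estimator).

    Each training point is weighted by a Gaussian kernel in the input
    distance and a Gaussian kernel in the prediction (consensus) distance.
    """
    x = np.asarray(x, dtype=float)
    r_x = np.array([r(x) for r in machines])  # machine predictions at the query
    weights = np.empty(len(y_train))
    for i, x_i in enumerate(X_train):
        d_in = np.linalg.norm(x - np.asarray(x_i, dtype=float))   # geometry of the inputs
        d_out = np.linalg.norm(r_x - np.array([r(x_i) for r in machines]))  # machine agreement
        weights[i] = np.exp(-d_in ** 2 / alpha) * np.exp(-d_out ** 2 / beta)
    # Soft weights downweight, rather than veto, points where one machine misbehaves.
    return float(np.dot(weights, y_train) / (weights.sum() + 1e-12))
```
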
“…In the second step, for each divergence, a very simple predictive model is fit per cluster. The final step provides an adaptive global predictive model by aggregating, thanks to a consensus idea introduced by Mojirsheibani (1999), several models built for the different instances, corresponding to the different Bregman divergences (see also Mojirsheibani (2000); Balakrishnan and Mojirsheibani (2015); Biau et al (2016); Fischer and Mougeot (2019)). We name this procedure the KFC procedure for K-means/Fit/Consensus.…”
Section: Introduction (mentioning confidence: 99%)
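
The KFC pipeline described in this excerpt can be sketched as follows, assuming NumPy arrays X and y, using ordinary Euclidean K-means (one instance of a Bregman divergence) and per-cluster linear models; the names and model choices are assumptions. In the full procedure, one such predictor would be built per Bregman divergence, and the resulting collection combined with a consensus rule like the one sketched after the abstract.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def kfc_sketch(X, y, n_clusters=3):
    """K-means / Fit steps of the KFC idea (Euclidean divergence only).

    Clusters the inputs, fits one simple linear model per cluster, and
    returns a global predictor that routes a query to its cluster's model.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(X)
    models = [
        LinearRegression().fit(X[km.labels_ == k], y[km.labels_ == k])
        for k in range(n_clusters)
    ]

    def predict(x):
        k = int(km.predict(np.atleast_2d(x))[0])
        return float(models[k].predict(np.atleast_2d(x))[0])

    return predict

# Consensus step (sketch): build one predictor per divergence/configuration
# and combine them, e.g. machines = [kfc_sketch(X, y, n_clusters=k) for k in (2, 3, 5)],
# then aggregate with a consensus rule such as consensus_predict above.
```
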
“…In a work parallel to ours, the idea of using the distance between points in the output space is also explored by Fischer and Mougeot [8], where weights are assigned to points based on proximity of the prediction in the output space and the training data. However, the method employed (which we will now refer to as MixCobra) also uses the input data while constructing the aggregate.…”
Section: Related Work (mentioning confidence: 99%)
“…This method has also been applied in filtering to improve image denoising (see Guedj and Rengot (2020)). Moreover, consensual aggregation methods such as Biau et al (2016), Fischer and Mougeot (2019) and the present method are also incorporated in a three-step methodology called the KFC procedure, which combines unsupervised clustering and supervised prediction for (energy) data modeling (see Has et al (2021)). Such an idea of consensual aggregation was also used in unsupervised classification, known as Clustering Aggregation (see, for example, Gionis et al (2005) and Wu et al (2012)).…”
Section: Introduction (mentioning confidence: 99%)