The goal of a boosting algorithm is to maximize the minimum margin on the training set. By minimax duality, this goal can be converted into minimizing the maximum edge. This idea motivates LPBoost and its variants (including TotalBoost, SoftBoost, and ERLPBoost), which solve the optimization problem by linear programming. These algorithms ignore the strong classifier and minimize only the maximum edge of the weak classifiers, so that every weak classifier's edge is at most γ. This paper shows that the edge of the strong classifier may exceed the maximum edge of the weak classifiers, and proposes a novel boosting algorithm that introduces the strong classifier into the optimization problem, constraining the edges of both the weak and the strong classifiers to be at most γ. Furthermore, we justify the introduction of the strong classifier using minimax theory. We compared our algorithm with other approaches, including AdaBoost, LPBoost, TotalBoost, SoftBoost, and ERLPBoost, on UCI benchmark datasets. Simulation studies show that our algorithm converges faster than SoftBoost and ERLPBoost, and the benchmark comparison illustrates the competitiveness of our approach in terms of running time and generalization error.
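To make the edge-minimization problem concrete, the following is a minimal sketch of the standard LPBoost dual described above: find a distribution d over the m training examples that minimizes the maximum edge γ across the n weak hypotheses. All names (lpboost_dual, the matrix U) are illustrative, and the paper's proposed variant would add one further constraint row bounding the strong classifier's edge by γ as well; that row is not shown here because its exact form comes from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def lpboost_dual(U):
    """Solve the LPBoost dual as a linear program.

    U is an (m, n) matrix with U[i, j] = y_i * h_j(x_i), so column j
    dotted with d gives the edge of weak hypothesis h_j under d.
    Variables are (d_1, ..., d_m, gamma); the objective is gamma.
    """
    m, n = U.shape
    c = np.zeros(m + 1)
    c[-1] = 1.0                                  # minimize gamma
    # Edge constraints: sum_i d_i * y_i * h_j(x_i) <= gamma for every j
    A_ub = np.hstack([U.T, -np.ones((n, 1))])
    b_ub = np.zeros(n)
    # d must be a probability distribution: sum_i d_i = 1
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]    # d_i >= 0, gamma free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    d, gamma = res.x[:m], res.x[-1]
    return d, gamma
```

In a full boosting loop this LP would be re-solved each round after the weak learner returns a new hypothesis (i.e., a new column of U), with the resulting d used to reweight the examples.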