Multiconlitron is a general theoretical framework for constructing piecewise linear classifiers. However, it typically contains a relatively large number of linear functions, resulting in a complicated model structure and poor generalization ability. Pruning redundant or excessive components is therefore a necessary step. We propose a novel greedy method, the greedy support multiconlitron algorithm (GreSMA), to simplify the multiconlitron. In GreSMA, a greedy selection procedure is applied first: it generates the initial linear boundaries, each of which separates the maximum number of training samples in the current iteration. In this way, a minimal set of decision functions is established. In the second stage of GreSMA, a boundary adjustment procedure retrains each classification boundary between the convex hulls of local subsets, rather than between individual samples, so that the adjusted boundary fits the data more closely. Experiments on both synthetic and real-world datasets show that GreSMA produces a minimal multiconlitron with better performance. It meets the criterion of Occam's razor, since a simpler model helps prevent over-fitting and improves generalization ability. More significantly, the proposed method contains no parameters that depend on the datasets and makes no assumptions about the underlying statistical distribution of the samples. It should therefore be regarded as an attractive advance in piecewise linear learning within the general framework of the multiconlitron.
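As a rough illustration of the greedy selection stage described above, the following Python sketch iteratively fits linear boundaries that keep one class on the positive side and peels off the other-class samples each boundary separates. It is only a simplified approximation under stated assumptions: the function names (fit_separator, greedy_select), the use of scikit-learn's LinearSVC as the linear separator, and the stopping rules are illustrative choices and are not taken from the paper, which selects, at each iteration, the boundary separating the maximum number of samples.

```python
# Illustrative sketch of a greedy selection stage for building a small set of
# linear decision functions. This is NOT the authors' GreSMA implementation;
# it only mimics the idea of greedily accumulating boundaries until the
# remaining samples of the opposite class are all separated.
import numpy as np
from sklearn.svm import LinearSVC

def fit_separator(X_pos, Y_neg_subset):
    """Fit one linear boundary trying to keep all of X_pos on the positive side."""
    X = np.vstack([X_pos, Y_neg_subset])
    y = np.hstack([np.ones(len(X_pos)), -np.ones(len(Y_neg_subset))])
    clf = LinearSVC(C=10.0, max_iter=10000)
    clf.fit(X, y)
    return clf

def greedy_select(X_pos, Y_neg, max_boundaries=20):
    """Greedily add linear boundaries; each iteration fits one separator against
    the still-unseparated negative samples and removes those it pushes to the
    negative side."""
    remaining = Y_neg.copy()
    boundaries = []
    while len(remaining) > 0 and len(boundaries) < max_boundaries:
        clf = fit_separator(X_pos, remaining)
        separated_mask = clf.decision_function(remaining) < 0  # negatives now excluded
        if not separated_mask.any():   # no further progress: stop
            break
        boundaries.append(clf)
        remaining = remaining[~separated_mask]
    return boundaries

if __name__ == "__main__":
    # Small synthetic two-class example (hypothetical data, for demonstration only).
    rng = np.random.default_rng(0)
    X_pos = rng.normal(loc=0.0, scale=0.5, size=(50, 2))
    Y_neg = rng.normal(loc=2.0, scale=1.0, size=(80, 2))
    pieces = greedy_select(X_pos, Y_neg)
    print(f"greedy stage kept {len(pieces)} linear boundaries")
```

In this sketch the number of retained boundaries plays the role of the minimal set of decision functions; the paper's second stage, boundary adjustment between convex hulls of local subsets, is not shown here.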