Decision tree (DT) is a well-known classification technique in machine learning. It is easy to understand and interpret and is widely used in real-world applications. However, decision trees face several challenges, such as class imbalance, overfitting, and the curse of dimensionality. The current study addresses the curse-of-dimensionality problem using a partitioning technique in which the features are divided into multiple mutually exclusive sets, each assigned to a block. A genetic algorithm selects the features and assigns them to the blocks according to a Ferrers diagram, and a CART decision tree is built on each block. A majority-voting scheme then combines the classes predicted by the individual classifiers and outputs the majority class. The method is evaluated on 4 datasets from the UCI repository and shows approximately 9%, 3%, and 5% improvement compared with CART, Bagging, and AdaBoost, respectively.
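
As a rough illustration of the block-wise ensemble idea described above, the sketch below partitions the feature indices into mutually exclusive blocks, trains one CART tree per block with scikit-learn, and combines the predictions by majority vote. It is a minimal sketch, not the authors' implementation: a simple random partition stands in for the genetic-algorithm and Ferrers-diagram assignment, the helper names (`partition_features`, `fit_block_trees`, `majority_vote`) are hypothetical, and the breast-cancer dataset is used only as a convenient example, not necessarily one of the 4 evaluation datasets.

```python
# Sketch of a block-wise CART ensemble with majority voting.
# NOTE: the random feature partition below is a placeholder for the
# GA / Ferrers-diagram assignment proposed in the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def partition_features(n_features, n_blocks, rng):
    """Assign each feature index to exactly one block (mutually exclusive sets)."""
    shuffled = rng.permutation(n_features)
    return np.array_split(shuffled, n_blocks)


def fit_block_trees(X, y, blocks):
    """Train one CART tree per feature block, each seeing only its own columns."""
    return [DecisionTreeClassifier(random_state=0).fit(X[:, b], y) for b in blocks]


def majority_vote(trees, blocks, X):
    """Each tree predicts on its own block; the most frequent class label wins."""
    votes = np.stack([t.predict(X[:, b]) for t, b in zip(trees, blocks)])  # (n_trees, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)


rng = np.random.default_rng(42)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

blocks = partition_features(X.shape[1], n_blocks=5, rng=rng)
trees = fit_block_trees(X_tr, y_tr, blocks)
y_pred = majority_vote(trees, blocks, X_te)
print("block-ensemble accuracy:", np.mean(y_pred == y_te))
```

Because the blocks are disjoint, each tree works in a lower-dimensional subspace, which is the mechanism the paper relies on to mitigate the curse of dimensionality; the voting step plays the same role here as in the described method.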