Support vector machine (SVM) algorithms have been widely used for classification in many different areas. However, a single SVM classifier is constrained by the inherent limitations of the algorithm. This paper proposes a novel method, called support vector machine chains (SVMC), in which multiple SVM classifiers are chained together in a special structure, such that each learner is built with one fewer feature than the previous stage. This paper also proposes a new voting mechanism, called tournament voting, in which the outputs of the classifiers compete in groups: the most common result within each group advances to the next round, and the class label that wins the final round is assigned as the final prediction. Experiments were conducted on 14 real-world benchmark datasets. The experimental results showed that SVMC (88.11%) achieved higher average accuracy than SVM (86.71%), owing to the combination of feature selection, sampling, and the chain structure with multiple models. Furthermore, the proposed tournament voting achieved higher accuracy than standard majority voting. The results also showed that the proposed SVMC method outperformed state-of-the-art methods with a 6.88% improvement in average accuracy.
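To make the two mechanisms described above concrete, the following is a minimal Python sketch using scikit-learn's SVC. It assumes a group size of three for tournament voting, drops trailing feature indices at each stage rather than using the paper's feature-ordering criterion, and uses default RBF kernels; these choices are illustrative assumptions, not the authors' specification.

```python
# Illustrative sketch only: the feature-ranking rule, group size, kernel,
# and tie-breaking are assumptions, not the paper's exact settings.
from collections import Counter
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def build_svm_chain(X_train, y_train, n_stages):
    """Train one SVM per stage, using one fewer feature at each stage."""
    chain = []
    n_features = X_train.shape[1]
    for stage in range(n_stages):
        cols = list(range(n_features - stage))  # assumed: drop trailing features
        clf = SVC(kernel="rbf").fit(X_train[:, cols], y_train)
        chain.append((clf, cols))
    return chain


def tournament_vote(predictions, group_size=3):
    """Group the predicted labels; the most common label in each group
    advances to the next round until a single winner remains."""
    labels = list(predictions)
    while len(labels) > 1:
        labels = [
            Counter(labels[i:i + group_size]).most_common(1)[0][0]
            for i in range(0, len(labels), group_size)
        ]
    return labels[0]


X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

chain = build_svm_chain(X_tr, y_tr, n_stages=5)
stage_preds = np.array([clf.predict(X_te[:, cols]) for clf, cols in chain])

# Final label per test sample via tournament voting over the chain's outputs.
final = np.array([tournament_vote(stage_preds[:, j])
                  for j in range(stage_preds.shape[1])])
print("accuracy:", (final == y_te).mean())
```

In this sketch, ties within a group fall back to the first-seen label; the full method may resolve ties differently, and the number of stages would normally be tied to the dataset's feature count.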