The Fuzzy Neural Network (FNN) and the Support Vector Machine (SVM) are two prominent and powerful learning models widely used for classification and regression. The FNN offers significant advantages in local representation and human-like reasoning. However, its drawback is that its learning algorithms (e.g., backpropagation) focus on minimizing empirical risk alone. In contrast, the SVM minimizes empirical and expected risks simultaneously, which theoretically leads to excellent generalization performance. Nevertheless, this depends on choosing an adequate kernel function consistent with the properties of the data. Several approaches have been proposed to exploit the advantages of one model to compensate for the disadvantages of the other by combining SVM kernels with FNNs. In this paper, we show that a Takagi-Sugeno-Kang (TSK)-type fuzzy neural network is, in fact, equivalent to an SVM with an adaptive kernel based on the fuzzy rules generated by the FNN. Consequently, the last layer of the FNN can be learned using SVM concepts, thereby gaining the generalization advantage of the SVM. On the one hand, the proposed method is an SVM with an adaptive, fuzzy-rule-based kernel; on the other hand, it is a TSK-FNN with SVM-based learning. The FNN with SVM-based learning therefore inherits the benefit of the SVM, namely minimizing both training and testing errors, while avoiding the SVM's main disadvantage: finding an appropriate kernel and tuning its parameters. Indeed, the kernel defined by the FNN adapts to the data characteristics, since it is derived from the fuzzy rules generated by the FNN itself. As a result, the proposed method is expected to outperform both an SVM with conventional kernels and a conventionally trained fuzzy neural network on classification and regression tasks. This expectation is confirmed by the results reported in this paper.
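The core equivalence can be illustrated with a minimal sketch: the normalized firing strengths of the TSK rules act as an explicit feature map, and their inner product defines a valid (positive semidefinite) kernel. The Gaussian membership functions and the rule parameters (`centers`, `sigmas`) below are illustrative assumptions, standing in for the premise parameters a trained FNN would supply; this is not the paper's exact formulation.

```python
import numpy as np

def tsk_rule_features(x, centers, sigmas):
    """Normalized firing strengths of Gaussian TSK rules for input x.

    centers, sigmas: (n_rules, n_dims) arrays -- hypothetical premise
    parameters, as would be produced by the FNN's rule-generation layer.
    """
    # Firing strength of each rule: product of Gaussian memberships
    # over the input dimensions (computed as a sum in the exponent).
    firing = np.exp(-(((x - centers) ** 2) / (2.0 * sigmas ** 2)).sum(axis=1))
    # Normalization layer of the TSK-FNN: strengths sum to one.
    return firing / firing.sum()

def fuzzy_kernel(x1, x2, centers, sigmas):
    """Adaptive kernel induced by the rule base: the inner product of
    the normalized firing-strength feature vectors of the two inputs."""
    return tsk_rule_features(x1, centers, sigmas) @ tsk_rule_features(x2, centers, sigmas)
```

Because the kernel is an inner product of an explicit feature map, it is symmetric and positive semidefinite by construction, so it can be plugged into a standard SVM solver; its shape adapts to the data through the rule parameters rather than being fixed in advance like an RBF or polynomial kernel.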